WO2023214569A1 - Motion assistance system and method performed in a motion assistance system - Google Patents

Motion assistance system and method performed in a motion assistance system

Info

Publication number
WO2023214569A1
Authority
WO
WIPO (PCT)
Prior art keywords
motion
assistance system
site
user
therapist
Prior art date
Application number
PCT/JP2023/017081
Other languages
English (en)
Japanese (ja)
Inventor
昌宏 粕谷
達也 關
Original Assignee
株式会社メルティンMmi
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社メルティンMmi
Publication of WO2023214569A1

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61H PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H1/00 Apparatus for passive exercising; Vibrating apparatus; Chiropractic devices, e.g. body impacting devices, external devices for briefly extending or aligning unbroken bones
    • A61H1/02 Stretching or bending or torsioning apparatus for exercising
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00 Commerce
    • G06Q30/06 Buying, selling or leasing transactions
    • G06Q30/0601 Electronic shopping [e-shopping]

Definitions

  • the present invention relates to a motion assist system and a method performed in the motion assist system.
  • the inventor of the present invention discovered the usefulness of using a movement assist device such as a training device for assisting body movements as a lifestyle infrastructure, and as a result of extensive research, invented the present invention.
  • the present invention provides an operation assistance system capable of communicating with external devices.
  • the present invention provides, for example, the following items.
  • (Item 1) A motion assistance system comprising: a movement assisting device configured to detect a biosignal of a wearer and to assist the wearer's movements based on the biosignal; voice recognition means configured to recognize voice input to the movement assisting device; communication means configured to perform communication between the movement assisting device and an external device; and control means configured to control the movement assisting device and/or the communication means based on the voice.
  • the external device includes a server device of an EC site.
  • the control means is configured to perform an operation on the EC site via the communication means based on the movement of the wearer detected by the movement assisting device.
  • the motion assistance system according to any one of the above items, wherein the control means is configured to: determine characteristics of a product to be searched for on the EC site based on the wearer's movements; and transmit an instruction for searching the EC site for products having the determined characteristics to the server device of the EC site via the communication means.
  • the motion assistance system according to any one of the above items, wherein the movements include at least one of a movement indicating the size of the product and a movement indicating the weight of the product.
  • the movement assistance system according to any one of the above items, further comprising a changing means for changing the manner in which products on the EC site are presented to the wearer based on the state of the wearer's wearing site.
  • the motion assistance system according to any one of the above items, wherein the changing means determines whether or not the state of the wearing site is a state in which the product can be handled, and, when it is determined that the state of the wearing site is not a state in which the product can be handled, changes the manner in which the product is presented.
  • the external device includes a training guidance device.
  • the motion assistance system according to any one of the above items, wherein the motion assisting device is configured such that the motion and/or state of the drive section of the motion assisting device is reflected on the training guidance device, and/or the motion and/or state of the drive section of the training guidance device is reflected on the motion assisting device.
  • the motion assistance system according to any one of the above items further comprising audio output means configured to output audio.
  • a method performed in a motion assistance system comprising a motion assisting device, voice recognition means, and control means, the method comprising: the voice recognition means recognizing voice; the control means performing an operation on an EC site based on the recognized voice; the motion assisting device worn by a wearer detecting the wearer's motion; and the control means performing an operation on the EC site based on the detected motion.
  • a server device for electronic commerce on an EC site, the server device being configured to communicate with the motion assistance system according to any one of the above items, the server device comprising: means for receiving, from the motion assistance system, an instruction to search for products on the EC site; means for receiving the state of the attachment site of the motion assisting device of the motion assistance system; means for searching for products according to the instruction; and means for changing the manner in which the products are presented based on the state.
  • the server device according to any one of the above items, wherein the changing means determines whether or not the state of the attachment site is a state in which the product can be handled, and, when it is determined that the state of the attachment site is not a state in which the product can be handled, changes the manner in which the product is presented.
  • a method for electronic commerce on an EC site comprising: receiving an instruction to search for products on the EC site from the operation assistance system described in any one of the above items; receiving the state of the attachment site of the motion assisting device of the motion assisting system; searching for products according to said instructions; and changing the manner in which the product is presented based on the state.
  • a program for electronic commerce on an EC site, the program being executed on a server device including a processor unit and causing the processor unit to perform processing comprising: receiving, from the motion assistance system according to any one of the above items, an instruction to search for products on the EC site; receiving the state of the attachment site of the motion assisting device of the motion assistance system; searching for products according to the instruction; and changing the manner in which the products are presented based on the state.
  • a method for supporting electronic commerce, comprising: providing a user with the motion assistance system according to any one of the above items; the motion assistance system communicating with a server device of an EC site according to voice input by the user; the motion assistance system transmitting to the server device of the EC site a purchase instruction for purchasing a product from the EC site; and receiving a commission from a provider of the server device of the EC site in response to the server device of the EC site performing purchase processing.
  • a server device for supporting remote training, the server device being configured to communicate with at least one motion assistance system according to any one of the above items and at least one training guidance device,
  • the at least one motion assistance system includes a first motion assistance system
  • the server device includes: means for receiving the content of the training guidance that the therapist gave to the user of the first motion assistance system via the training guidance device and the characteristics of the therapist; means for receiving the effectiveness of the training instruction and the characteristics of the user;
  • a server device comprising: storage means for storing characteristics of the therapist, contents of training guidance by the therapist, characteristics of the person to whom the training guidance is given, and effects of the training guidance in association with each other.
  • the at least one operation assistance system further includes a second operation assistance system
  • the server device includes: means for receiving characteristics of a second user of the second motion assistance system; further comprising: means for referring to information stored in the storage means and determining at least one therapist who should provide training guidance to the second user based on the characteristics of the second user; The server device according to any one of the above items.
  • (Item 20) means for receiving conditions for a therapist to be employed; The server device according to any one of the above items, further comprising: means for referring to information stored in the storage means and determining at least one therapist to employ based on the conditions.
  • a method for supporting remote training comprising: Receiving the content of the training guidance provided by the therapist to the user of the movement assistance system according to any one of the above items and the characteristics of the therapist via the training guidance device; receiving the effectiveness of the training instruction and the characteristics of the user; A method comprising: storing characteristics of the therapist, contents of training guidance by the therapist, characteristics of the person receiving the training guidance, and effects of the training guidance in association with each other.
  • a program for supporting remote training the program being executed on a server device including a processor unit, the program comprising: Receiving the content of the training guidance provided by the therapist to the user of the movement assistance system according to any one of the above items and the characteristics of the therapist via the training guidance device; receiving the effectiveness of the training instruction and the characteristics of the user; A program that causes the processor unit to perform a process that includes storing characteristics of the therapist, contents of training guidance by the therapist, characteristics of the partner of the training guidance, and effects of the training guidance in association with each other.
  • (Item 23) A system for electronic commerce on an EC site, wherein the server device is configured to communicate with at least one operation assistance system according to any one of the above items,
  • the motion assistance system is determining a product to be searched for on the EC site based on the voice input to the movement assisting device; transmitting an instruction to search for the determined product on the EC site to the server device via the communication means;
  • the server device includes: receiving from the operation assistance system an instruction to search for the product on the EC site; searching for products according to said instructions;
  • the motion assistance system is determining characteristics of the product based on the movement of the wearer detected by the movement assisting device, and the instructions include sending the product having the determined characteristics to the EC site.
  • the server device includes: receiving the state of the attachment site of the movement assisting device; further configured to: change the manner in which the product is presented based on the state; The system according to any one of the above items, wherein the motion assistance system is configured to present the product to the wearer in the modified manner.
  • the wearer can use the movement assisting device as part of his/her daily life infrastructure, and can receive various types of support using the movement assisting device used in his/her daily life.
  • Diagram showing an example of the flow of a new service using a motion assistance device
  • Diagram showing an example of the flow of a new service using a motion assistance device
  • Diagram showing an example of the configuration of a motion assistance system 100 of the present invention
  • Diagram showing an example of the peripheral configuration of the motion assistance system 100 of the present invention
  • Diagram showing an example of a flow of processing performed in the motion assistance system 100 and the server device 200 of the EC site
  • Diagram showing an example of a flow of processing performed in the motion assistance system 100 and the training guidance device 400
  • the inventor of the present invention has developed a new service using a motion assist device.
  • the service allows users to receive various types of support after an injury by using a motion assist device, which assists with physical movements, as part of their daily-life infrastructure.
  • a user can use a movement assist device for its intended purpose of assisting movement in daily life (e.g., hand and finger assistance), and can also communicate with a remote therapist via the movement assist device. This allows the user to consult a remote therapist or receive remote training guidance from a remote therapist via the movement assist device.
  • a user can use the movement assisting device as intended for movement assistance in daily life (for example, assisting with fingers), and can also connect to the Internet via the movement assisting device.
  • the movement assisting device can be connected to, for example, a home TV (or tablet terminal, etc.), so that the remote therapist's situation can be displayed on the TV (or tablet terminal, etc.), or a web page on the Internet can be displayed on the TV (or tablet terminal, etc.).
  • the inventor of the present invention believes that when a user with a disability selects and purchases products on an e-commerce site by moving his or her impaired fingers, the user feels joy and satisfaction while also being able to buy daily necessities easily, and that this joy, satisfaction, and ease of purchasing can lead to an improvement in the user's motivation for training and QOL.
  • training refers to medical training involving medical practice (e.g., rehabilitation (more specifically, medical rehabilitation), training using medical equipment) and non-medical training that does not involve medical treatment (for example, training using non-medical equipment such as health equipment). Therefore, the person who instructs the "training", i.e., the "therapist", may be a medical professional with medical qualifications, or may be a person without medical qualifications.
  • FIGS. 1A and 1B show an example of the flow of a new service using a motion assist device.
  • the user U who wears the motion assist device 10 can receive various types of support via the motion assist device 10.
  • the user U of this service may be a user who has symptoms of paralysis in the body, typically a user who has suffered from a stroke and has symptoms of paralysis in the hands and fingers.
  • the user U can receive movement assistance from the movement assistance device 10.
  • the motion assistance device 10 can be provided to the user U from the service provider P.
  • the motion assisting device 10 can, for example, detect a biosignal from the user U, determine the motion intended by the user U based on the biosignal, and assist the user U's motion.
  • biological signal refers to a signal emitted by a living body.
  • Biological signals include, but are not limited to, for example, myoelectric signals indicating the activity of the body's muscles, electrocardiographic signals indicating the activity of the heart, electroencephalograms indicating the activity of the brain, and nerve signals transmitted in nerve cells.
  • a biosignal is a signal whose specific value is determined by specific information (e.g., a specific movement of the living body, such as flexing the fingers, extending the fingers, abducting the fingers, or adducting the fingers).
  • the biological signal is typically a myoelectric signal.
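  • For illustration only, the following is a minimal Python sketch of the detect-and-assist loop described above (detect a myoelectric signal, estimate the intended motion, drive the attachment site). The class names, the RMS threshold, and the one-joint actuator model are hypothetical and are not taken from this publication.

```python
# Minimal sketch of a biosignal-driven assist loop (hypothetical names and thresholds).
from dataclasses import dataclass

@dataclass
class FingerActuator:
    """Stands in for the drive section of the movement assisting device."""
    position: float = 0.0  # 0.0 = fully extended, 1.0 = fully flexed

    def drive(self, target: float) -> None:
        # A real device would command a wire drive, pneumatic actuator, or motor torque.
        self.position = max(0.0, min(1.0, target))

def estimate_intent(emg_rms: float, flex_threshold: float = 0.3) -> str:
    """Map a myoelectric RMS amplitude to an intended motion (toy rule)."""
    return "flex" if emg_rms >= flex_threshold else "extend"

def assist_step(emg_rms: float, actuator: FingerActuator) -> str:
    intent = estimate_intent(emg_rms)
    actuator.drive(1.0 if intent == "flex" else 0.0)
    return intent

if __name__ == "__main__":
    actuator = FingerActuator()
    for sample in [0.05, 0.42, 0.10, 0.55]:  # fake EMG RMS samples
        print(sample, assist_step(sample, actuator), actuator.position)
```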
  • the movement assist device 10 may be, for example, a movement assist device as described in International Publication No. 2021/157390 (incorporated herein by reference).
  • the motion assisting device 10 can typically assist the user U's finger motions.
  • user U can receive remote training guidance from a remote therapist.
  • FIG. 1A shows an example of the flow in this situation.
  • the movement assisting device 10 is equipped with a microphone and a speaker, and the user U can receive remote training guidance from the therapist while talking to the therapist at the hospital H via the movement assisting device 10, for example.
  • the movement assisting device 10 can detect the user's U movement.
  • In step S1, the detected motion data of the user U is transmitted to the hospital H.
  • a training guidance device (robot hand) owned by the therapist receives user U's motion data.
  • the training guidance device can move its finger portions according to user U's motion data, as if user U were moving his or her fingers. Thereby, the therapist can check user U's finger movements even though he or she is remote. Furthermore, the training guidance device can reproduce the smoothness of user U's finger movements according to user U's motion data, so that the therapist can also confirm the range of motion of user U's fingers even though he or she is remote.
  • the therapist can operate the training guidance device.
  • the training guidance device can detect the movements and applied forces by the therapist.
  • In step S2, the movement data detected from the therapist is transmitted to the movement assisting device 10 of the user U.
  • the movement assisting device 10 can move the finger on which the movement assisting device 10 is attached, as if the therapist were moving the user U's finger, according to the movement data provided by the therapist.
  • the therapist can know the state of spasticity, rigidity, contracture, etc. of the user U by feeling the reaction force and movement of the training guidance device against the applied force and movement.
  • the therapist can further move the training guidance device according to the state of spasticity, rigidity, contracture, etc. of the user U. In this way, user U can receive training guidance from a therapist even though he or she is remote. At this time, it is preferable that the conversation between the user U and the therapist is maintained.
  • the call may be a video call with an image displayed on a display device with a camera (for example, a TV with a web camera, a tablet terminal, etc.) to which the motion assistance device 10 is connected. It is preferable that the communication between the training guidance device and the movement assisting device 10 be substantially real-time communication even if there is some loss.
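  • As a rough illustration of the bidirectional exchange in steps S1 and S2 (the wearer's motion data going to the training guidance device and the therapist's motion data coming back to the movement assisting device 10), here is a small Python sketch; the JSON message shape and field names are assumptions, not part of this publication.

```python
# Sketch of the bidirectional exchange in steps S1/S2 (message shape is an assumption).
import json
import time

def make_motion_message(source: str, joint_angles: dict, forces: dict) -> str:
    """Serialize one motion/force sample for near-real-time transmission."""
    return json.dumps({
        "source": source,                  # "movement_assist_device" or "training_guidance_device"
        "timestamp": time.time(),
        "joint_angles_deg": joint_angles,  # e.g. {"index_mcp": 35.0}
        "forces_n": forces,                # e.g. {"index_tip": 1.2}
    })

def apply_remote_motion(message: str) -> dict:
    """On the receiving side, decode the message and return the targets to mirror."""
    data = json.loads(message)
    return {"targets": data["joint_angles_deg"], "from": data["source"]}

if __name__ == "__main__":
    # S1: the wearer's motion is sent to the therapist's robot hand ...
    s1 = make_motion_message("movement_assist_device", {"index_mcp": 35.0}, {"index_tip": 0.8})
    # S2: ... and the therapist's motion comes back to the wearer's device.
    s2 = make_motion_message("training_guidance_device", {"index_mcp": 60.0}, {"index_tip": 1.5})
    print(apply_remote_motion(s1))
    print(apply_remote_motion(s2))
```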
  • the user U can also consult with a therapist about training through a video call.
  • the therapist is in the hospital, but the therapist does not necessarily need to be in the hospital.
  • the therapist can provide training guidance to user U from any environment (for example, home) as long as the environment is network connectable. This could lead to changes in the way therapists work and increase employment opportunities for therapists.
  • steps S3 to S8 can be performed as additional steps.
  • In step S3, the content of the training guidance provided by the therapist is transmitted to the service provider P.
  • the content of the training guidance includes, for example, the therapist's actions detected by the training guidance device (e.g., the magnitude of the force applied by the therapist to the training guidance device, the duration of the force applied by the therapist to the training guidance device, the direction of the force applied by the therapist to the training guidance device, etc.).
  • the content of the training guidance may include, for example, the content instructed to the user U by the therapist. Communication between the training guidance device and the service provider P does not need to be substantially real-time, and is preferably lossless communication even if there is some delay.
  • the characteristics of the therapist can also be transmitted.
  • the characteristics of the therapist may include, for example, the therapist's work experience (eg, years of work, number of patients treated, therapist's specialty, therapist's weak area), and therapist's attributes (eg, age, gender).
  • Step S3 may be performed simultaneously with step S2, or may be performed after step S2.
  • the movement data detected from the therapist may be transmitted to user U's movement assisting device 10 via the service provider P in step S2.
  • In step S4, the effectiveness of the training guidance received from the therapist is transmitted to the service provider P.
  • the effects of training guidance may include, for example, changes over time in the range of motion of the fingers and changes over time in the force exerted by the fingers.
  • the characteristics of user U can also be transmitted.
  • the characteristics of the user U include, for example, the state of the user U (for example, the range of motion of a joint, the state of paralysis (Brunnström stage, or the degree of spasticity, rigidity, or contracture)), the attributes of the user U (for example, age, gender), user U's training history (for example, the period or number of times user U received training guidance, and the period during which user U used motion assistance system 100).
  • the history of guidance by the therapist and the prognosis or characteristics of the user who received the guidance are stored as big data at the service provider P.
  • the history of guidance by the therapist and the prognosis or characteristics of the user who received the guidance are stored in association with each other. This big data can be used for various purposes.
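  • The association described above (therapist characteristics, guidance content, recipient characteristics, and guidance effects stored together as big data) could be represented, for example, by a record like the following Python sketch; the field names and example values are hypothetical.

```python
# Sketch of a record associating therapist, guidance, recipient, and effect (field names assumed).
from dataclasses import dataclass, field
from typing import List

@dataclass
class GuidanceRecord:
    therapist_traits: dict  # e.g. {"years": 12, "specialty": "hand", "age": 40}
    guidance_content: dict  # e.g. forces/durations detected by the training guidance device
    user_traits: dict       # e.g. {"brunnstrom_stage": 4, "age": 71}
    effect: dict            # e.g. {"rom_change_deg": 8.0, "grip_force_change_n": 1.5}

@dataclass
class GuidanceStore:
    records: List[GuidanceRecord] = field(default_factory=list)

    def add(self, record: GuidanceRecord) -> None:
        self.records.append(record)

if __name__ == "__main__":
    store = GuidanceStore()
    store.add(GuidanceRecord({"years": 12, "specialty": "hand"},
                             {"max_force_n": 3.0, "duration_s": 45},
                             {"brunnstrom_stage": 4},
                             {"rom_change_deg": 8.0}))
    print(len(store.records))
```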
  • this big data can be used when matching users and therapists.
  • In step S5, the user U2, who is a new user of the motion assist device 10, sends a request to the service provider P to receive training guidance.
  • the characteristics of the user U2 may also be sent to the service provider P.
  • the characteristics of the user U2 include, for example, the state of the user U2 (for example, the range of motion of a joint, the state of paralysis (Brunnström stage, or the degree of spasticity, rigidity, or contracture)), the attributes of the user U2 (for example, age, gender).
  • the service provider P will refer to big data and search for a therapist who is compatible with the user U2.
  • a therapist who is compatible with user U2 may be, for example, a therapist who has experience improving symptoms like those of user U2, or a therapist who is good at providing training guidance for user U2's symptoms.
  • In step S6, the found therapist is presented to the user U2. If the user U2 approves, he or she can receive training guidance from that therapist.
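  • One possible way to use the stored big data for such matching is sketched in Python below; the scoring heuristic (same Brunnström stage plus observed range-of-motion improvement) is purely illustrative and is not described in this publication.

```python
# Sketch of matching a new user to therapists based on stored guidance outcomes (toy scoring).
from typing import Dict, List

def match_score(record: Dict, user_traits: Dict) -> float:
    """Score a past record by trait similarity and observed effect (assumed heuristic)."""
    same_stage = record["user_traits"].get("brunnstrom_stage") == user_traits.get("brunnstrom_stage")
    effect = record["effect"].get("rom_change_deg", 0.0)
    return (10.0 if same_stage else 0.0) + effect

def recommend_therapists(records: List[Dict], user_traits: Dict, top_n: int = 3) -> List[str]:
    best: Dict[str, float] = {}
    for r in records:
        name = r["therapist"]
        best[name] = max(best.get(name, 0.0), match_score(r, user_traits))
    return sorted(best, key=best.get, reverse=True)[:top_n]

if __name__ == "__main__":
    records = [
        {"therapist": "T1", "user_traits": {"brunnstrom_stage": 4}, "effect": {"rom_change_deg": 8.0}},
        {"therapist": "T2", "user_traits": {"brunnstrom_stage": 2}, "effect": {"rom_change_deg": 12.0}},
    ]
    print(recommend_therapists(records, {"brunnstrom_stage": 4}))  # T1 ranks first here
```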
  • this big data can be used by hospitals when recruiting therapists.
  • the hospital H2 that wants to hire a new therapist transmits to the service provider P the conditions for the therapist to be hired.
  • the conditions for the therapist to be hired include, for example, the work experience of the therapist to be hired (e.g., years of work, number of patients treated, the therapist's strengths and weaknesses) and the attributes of the therapist to be hired (e.g., age, gender).
  • the service provider P will refer to big data and search for therapists that meet the conditions. For example, a therapist who satisfies all of the conditions or a therapist who satisfies some of the conditions is searched.
  • In step S8, the found therapists are presented to the hospital H2. Hospital H2 can contact a proposed therapist and proceed with the hiring process.
  • service provider P may request hospital H to pay a fee.
  • the hospital H2 can search for and hire a desired therapist.
  • FIG. 1B shows an example of the flow in this situation.
  • In step S11, the service provider P provides the motion assist device 10 to the user U.
  • the user U can be assisted in the movement of the fingers by the movement assisting device 10.
  • the movement assisting device 10 is equipped with a microphone, and the user U can shop on the EC site by voice operation, for example.
  • the user U can perform voice input to the motion assist device 10 by speaking the name of the desired product along with a wake word for activating voice operation (for example, "MELTz", the nickname of the motion assist device 10).
  • the user U can vocally input an instruction to purchase athletic shoes by saying, "MELTz, I want athletic shoes.”
  • User U can input an instruction to purchase rice by voice, for example, by saying, "MELTz, buy rice.”
  • In step S12, the input instruction is transmitted from the movement assisting device 10 to the server S of the EC site.
  • the server S executes the purchase process according to the instruction.
  • when multiple products match the instruction, they can be presented to the user U as purchase candidates.
  • the purchase candidates may be presented, for example, by voice from a speaker included in the motion assisting device 10, or by being displayed on a display device (for example, a TV) to which the motion assisting device 10 is connected.
  • User U can select a desired product from the presented purchase candidates. For example, the user U may select a desired product by voice, or may select a desired product by operating a cursor displayed on a display device.
  • the user U can operate the cursor by moving the finger on which the movement assisting device 10 is attached. For example, flexing your fingers moves the cursor down, extending your fingers moves the cursor up, abducting your fingers moves the cursor to the left, and adducting your fingers moves the cursor to the right. It is possible to link user U's finger movements and cursor movements. This is preferable because it encourages user U to train.
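  • The finger-motion-to-cursor mapping described above can be sketched as follows; the step size and the screen-coordinate convention are assumptions.

```python
# Sketch of the finger-motion-to-cursor mapping (direction convention assumed: +y is up).
CURSOR_DELTAS = {
    "flex":   (0, -1),   # flexing the fingers moves the cursor down
    "extend": (0, +1),   # extending the fingers moves the cursor up
    "abduct": (-1, 0),   # abducting the fingers moves the cursor left
    "adduct": (+1, 0),   # adducting the fingers moves the cursor right
}

def move_cursor(position: tuple, finger_motion: str, step: int = 10) -> tuple:
    dx, dy = CURSOR_DELTAS.get(finger_motion, (0, 0))
    return (position[0] + dx * step, position[1] + dy * step)

if __name__ == "__main__":
    pos = (100, 100)
    for motion in ["flex", "flex", "adduct", "extend"]:
        pos = move_cursor(pos, motion)
    print(pos)  # (110, 90)
```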
  • the user U can perform a product search in conjunction with the information obtained from the motion assist device 10.
  • For example, when the user indicates a desired size with his or her fingers, the movement assist device measures the size indicated by the fingers, the measured size and the content of the utterance are transmitted to the server S, and the server S searches for a container corresponding to that size.
  • a similar operation may be performed while holding an existing container.
  • In that case, the characteristics of the container are identified by image recognition, and since the size is determined together with the state of the fingers holding it, target items can be narrowed down with high accuracy.
  • the motion assist device 10 may be equipped with an IMU (inertial measurement unit), and the IMU may detect hand movements or gestures. Even in the case of a large object that cannot be grasped with the fingers, by drawing its outline with the hand and saying, "MELTz, I want a container of this size," the IMU recognizes the approximate size and matching items are then searched for. In addition, it is also possible to search by the weight of an object being held.
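  • As an illustration of turning a measured finger spread or an IMU-traced outline into a size constraint for the product search, here is a small Python sketch; the tolerance value and the outline-extent heuristic are assumptions.

```python
# Sketch of building a size filter from a finger spread or a hand-traced outline (values assumed).
from typing import Dict, List

def size_query_from_finger_span(span_mm: float, tolerance: float = 0.2) -> Dict:
    """Build a search constraint from the spread of the fingers measured by the device."""
    return {"min_size_mm": span_mm * (1 - tolerance), "max_size_mm": span_mm * (1 + tolerance)}

def size_query_from_outline(points_mm: List[tuple], tolerance: float = 0.2) -> Dict:
    """Approximate the size of an outline traced by hand (e.g., from IMU dead-reckoning)."""
    xs = [p[0] for p in points_mm]
    ys = [p[1] for p in points_mm]
    extent = max(max(xs) - min(xs), max(ys) - min(ys))
    return {"min_size_mm": extent * (1 - tolerance), "max_size_mm": extent * (1 + tolerance)}

if __name__ == "__main__":
    print(size_query_from_finger_span(150.0))
    print(size_query_from_outline([(0, 0), (400, 0), (400, 300), (0, 300)]))
```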
  • the user U can search for and select a desired product using his or her disabled hands and fingers, and the user U can purchase the product. As mentioned above, this may lead to user U feeling joy and satisfaction, which in turn may lead to user U's motivation for training.
  • However, if the purchased product is a product that cannot be handled by user U's hands (for example, a product that cannot be grasped with paralyzed hands, a product that cannot be operated with paralyzed fingers, or a product that cannot be grasped or operated even with the use of the motion assist device 10), user U will lose motivation. Therefore, this service can introduce a mechanism that prevents user U from searching for, selecting, or purchasing such products, or that provides a solution allowing user U to handle such products.
  • the manner in which products are presented to the user U can be devised so that the user U can select a product that takes into account user U's physical condition or the specifications of the movement assisting device 10. For example, when user U searches for a product that is too large to be grasped within the range of motion of user U's paralyzed hand, the product can be presented to user U with a lowered search ranking so that it does not appear at the top of the search results, because it cannot be grasped with the paralyzed hand.
  • Alternatively, a notice stating that the product cannot be grasped with the paralyzed hand may be added to the display of the product presented to the user U.
  • Alternatively, a recommendation for an auxiliary tool to help grasp the product with the paralyzed hand may be presented to the user U.
  • Similarly, when user U searches for a product that is too heavy to hold even with the assistance of the movement assisting device 10, the product may be lowered in the search ranking and presented to user U so that it does not come to the top of the search results.
  • Alternatively, a note stating that the product cannot be held even if the motion assist device 10 is used may be attached to the product display and presented to the user U, or a recommendation for upgrading the motion assist device 10 may be presented to the user U.
  • the physical condition of the user U can be detected, for example, by the movement assisting device 10 that the user U is wearing.
  • the physical condition of the user U may also be determined, for example, from the physical condition (e.g., range of motion of the fingers, level of paralysis of the fingers, etc.) revealed through training guidance by a therapist, i.e., from the big data described above with reference to FIG. 1A.
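  • One possible way to implement the presentation changes described above (demoting and annotating products that are too large or too heavy for the wearer) is sketched below in Python; the field names, thresholds, and ranking rule are illustrative assumptions.

```python
# Sketch of demoting or annotating products the wearer cannot handle (fields and thresholds assumed).
from typing import Dict, List

def annotate_for_wearer(products: List[Dict], max_grip_mm: float, max_weight_g: float) -> List[Dict]:
    """Add notices and a demotion penalty for products outside the wearer's capabilities."""
    ranked = []
    for p in products:
        penalty = 0
        notes = []
        if p["size_mm"] > max_grip_mm:
            penalty += 1
            notes.append("may be too large to grasp with the paralyzed hand")
        if p["weight_g"] > max_weight_g:
            penalty += 1
            notes.append("may be too heavy to hold even with the assist device")
        ranked.append({**p, "notes": notes, "rank_key": (penalty, -p["relevance"])})
    return sorted(ranked, key=lambda p: p["rank_key"])

if __name__ == "__main__":
    products = [
        {"name": "large jar", "size_mm": 200, "weight_g": 900, "relevance": 0.9},
        {"name": "small jar", "size_mm": 80, "weight_g": 300, "relevance": 0.8},
    ]
    for p in annotate_for_wearer(products, max_grip_mm=120, max_weight_g=500):
        print(p["name"], p["notes"])
```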
  • the movement assisting device 10 may also give the user U an impression (for example, of hardness) of the searched product.
  • For example, the movement assist device 10 can apply resistance to the fingers to simulate the hardness of the product, so that the user U can experience the feeling of touching the product.
  • the purchase process is performed.
  • In step S13, the purchased product is provided to user U.
  • the user U can shop at the EC site using the motion assist device 10 (and TV) that he or she usually uses.
  • This is particularly advantageous for elderly users who are not accustomed to devices such as smartphones and personal computers, for example. This is because users can use e-commerce sites without using unfamiliar equipment or small screens. This could lead to lowering the hurdles for elderly people to use e-commerce sites.
  • the above aspect is also advantageous for EC site providers in that, by lowering the hurdles for elderly people to use EC sites, EC site providers can attract elderly customers. Furthermore, companies that offer products on e-commerce sites also benefit in that they can capture elderly customers and run targeted advertisements aimed at the elderly.
  • In step S14, a fee is paid to the service provider P from the EC site provider in response to the purchase of the product via the movement assisting device 10.
  • Since the service provider P collects the usage fee for the movement assisting device 10 from the user U and also collects a fee from the provider of the EC site, the user can receive the service at a low price.
  • the service provider P may collect advertising fees from companies that provide products on the EC site.
  • the user U can make a phone call, for example.
  • User U can make a voice call or a video call with any other person in addition to the above-mentioned therapist.
  • a telephone directory in the motion assisting device 10
  • the user U can, for example, do research on a search engine.
  • the user U can perform voice input to the motion assist device 10 by speaking the thing he or she wants to search for along with a wake word for activating voice operation (for example, "MELTz", the nickname of the motion assist device 10).
  • User U can input an instruction to check today's weather by voice, for example, by saying, "MELTz, what's the weather like today?"
  • the user U can vocally input an instruction to find out the route from his home to Kayabacho by saying, "MELTz, what is the route to Kayabacho?”
  • the movement assisting device 10 can search the instruction content by transmitting the instruction to the search engine server.
  • the search results may be presented by voice from a speaker included in the movement assisting device 10, or by being displayed on a display device (for example, a TV) to which the movement assisting device 10 is connected.
  • FIG. 2 shows an example of the configuration of the motion assistance system 100 of the present invention.
  • the movement assistance system 100 includes a movement assistance device 10, a voice recognition means 110, a communication means 120, and a control means 130.
  • the movement assisting device 10 is configured to detect biological signals of the wearer and assist the wearer's movements based on the biological signals.
  • the movement assisting device 10 can have any configuration as long as it includes a detection section for detecting biosignals of the wearer and a drive section for driving the wearer's attachment site based on the biosignals.
  • the movement assisting device 10 can receive a control signal from the control unit 130 and be controlled according to the control signal.
  • the detection unit may be any biosignal sensor, typically a myoelectric sensor.
  • the biosignal detected by the detection unit may be passed to the control means 130.
  • the driving unit may be any driving means, for example, it may be configured to drive the attachment site (for example, a finger) by a wire drive, or it may be configured to drive the attachment site by a pneumatic actuator. Alternatively, the mounting portion may be driven by the torque of a motor.
  • the drive unit can drive the attachment site according to the control signal.
  • the control signal may be generated by the control unit 130 based on the biological signal detected by the detection unit.
  • the movement assisting device 10 can drive the attachment part as intended by the wearer who is trying to move the attachment part.
  • the control signal may be generated by the control unit 130 based on a signal from an external device described below.
  • the movement assisting device 10 can reflect the operation and/or state of the drive unit of the external device, which is a training guidance device, on the movement assisting device 10, for example. For example, when a therapist moves a finger on a training guidance device at hand, the motion is reflected on the motion assist device 10, causing the wearer to move the corresponding finger. For example, when a therapist maintains a finger portion of a training guidance device at hand in a specific state, that state is reflected on the movement assisting device 10, and the wearer's corresponding finger is maintained in that state.
  • the movement assisting device 10 can include a movement/state detection section that detects the movement of the wearer and/or the state of the wearing site.
  • the motion/state detection unit may detect, as the state of the attachment site, a state of paralysis of the attachment site (for example, spasticity, rigidity, or contracture), a range of motion of the attachment site, or both.
  • the motion/state detection section includes, for example, an angle sensor capable of detecting the angle of a joint of the attachment site, a force sensor capable of detecting the force applied to the attachment site, etc., but is not limited thereto.
  • the operation/state detection section may be, for example, an IMU (inertial measurement unit), a torque sensor, a pressure sensor, or the like.
  • the IMU can detect the wearer's gestures.
  • the weight of an object held by the wearer can be detected using a torque sensor or a pressure sensor.
  • the motion assistance device 10 can include audio input means (for example, a microphone).
  • the voice input to the movement assisting device 10 is passed to the voice recognition means 110.
  • the voice recognition means 110 is configured to recognize the voice input to the movement assisting device 10.
  • the speech recognition means 110 can recognize speech using techniques known in the field of speech recognition.
  • the voice recognition means 110 can recognize the voice input following the wake word as an instruction. For example, in the example described above with reference to FIGS. 1A and 1B, "MELTz" is used as the wake word.
  • the voice recognized by the voice recognition means 110 may be processed as voice data, converted into text, or converted into a corresponding command, and the resulting data is passed from the voice recognition means 110 to the control means 130.
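  • For illustration, a minimal Python sketch of wake-word handling and conversion of an utterance into a command follows; the wake word "MELTz" comes from the examples above, while the parsing rules and command names are assumptions.

```python
# Sketch of wake-word handling and utterance-to-command conversion (parsing rules assumed).
WAKE_WORD = "MELTz"

def parse_utterance(utterance: str):
    """Return (command, argument) if the utterance starts with the wake word, else None."""
    if not utterance.startswith(WAKE_WORD):
        return None
    rest = utterance[len(WAKE_WORD):].lstrip(", ").rstrip(".")
    if rest.lower().startswith("i want"):
        return ("search_ec_site", rest[len("i want"):].strip())
    if rest.lower().startswith("buy"):
        return ("purchase", rest[len("buy"):].strip())
    return ("web_search", rest)

if __name__ == "__main__":
    print(parse_utterance("MELTz, I want athletic shoes."))   # ('search_ec_site', 'athletic shoes')
    print(parse_utterance("MELTz, buy rice."))                # ('purchase', 'rice')
    print(parse_utterance("What's the weather like today?"))  # None (no wake word)
```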
  • the communication means 120 is configured to perform communication between the movement assisting device 10 and an external device.
  • the communication means 120 can perform communication between the motion assist device 10 and an external device via a network.
  • the type of network does not matter.
  • the network may be, for example, the Internet or a LAN.
  • the communication means 120 can receive a signal from the motion assisting device 10 via the control means 130 and transmit it to an external device.
  • the communication means 120 can receive a signal from an external device and transmit it to the motion assistance device 10 via the control means 130.
  • the communication means 120 can transmit any signal to an external device and receive any signal from the external device.
  • the communication means 120 can, for example, transmit signals for controlling the external device to the external device.
  • the communication means 120 can receive, for example, a signal for controlling the movement assisting device 10 from an external device.
  • the communication means 120 can, for example, send/receive signals for voice or video calls to/from an external device.
  • the external device may be any device external to the motion assistance system 100.
  • the external device may typically be a server device that provides a website, preferably a server device that provides an EC site.
  • the external device may be a server device that provides a search engine.
  • the external device may be a training guidance device.
  • the training guidance device is a device used by a person who instructs training (eg, a therapist (occupational therapist, etc.), a doctor, etc.).
  • the training guidance device can include a structure that imitates a mounting site on which the movement assisting device 10 is mounted.
  • the training guidance device can include a hand-shaped device (ie, a robot hand).
  • the training guidance device can include a lower limb type device.
  • the external device may be a server device that manages the operation assistance system.
  • the server device can manage the operation assistance system and its users in association with each other.
  • the server device can store the motion assistance system and the characteristics of its users in association with each other in a connected database section.
  • the user characteristics include, for example, the state of the user U (e.g., joint range of motion, state of paralysis (Brunnström stage, or degree of spasticity, rigidity, or contracture)), user attributes (e.g., age, gender), the user's training history (for example, the period or number of times the user received training guidance, and the period during which the user used the motion assistance system 100).
  • For example, the state of the attachment site detected by the motion assisting device of the motion assistance system (e.g., range of motion of the attachment site, state of paralysis of the attachment site, etc.) may be transmitted to the server device, and the state of the attachment site may be stored in the database section in association with the user.
  • For example, the user U's condition (e.g., Brunnström stage, etc.) may be transmitted to the server device, and the state of user U can be stored in the database section in association with the user.
  • the stored information may be used later by a therapist, a server device of an EC site, or a motion assistance system.
  • This server device can also manage a movement assistance system in association with a therapist who provides training guidance to the user of the movement assistance system.
  • the server device can store, in a connected database section, the motion assistance system, its user information, and the characteristics of the therapist who provided training guidance to the user in association with each other.
  • the characteristics of the therapist include, for example, at least one of the therapist's work experience (eg, years of work, number of patients treated, areas of strength, areas of weakness), and attributes of the therapist (eg, age, gender).
  • the contents of the training guidance may be stored in association with the therapist and the user.
  • the content of the training guidance may be transmitted from the training guidance device to the server device during or after the training guidance, and the content of the training guidance may be stored in the database unit in association with the user and the therapist.
  • the effects of training guidance may be stored in association with the therapist and the user.
  • the effect of the training instruction may be transmitted from the motion assistance system 100 to the server device, and the effect of the training instruction may be stored in the database unit in association with the user and the therapist.
  • the stored information may be used later by a therapist, a server device of an EC site, or a motion assistance system.
  • the control means 130 is configured to control the movement assisting device 10 and/or the communication means 120.
  • the control means 130 can control the motion assistance device 10 and/or the communication means 120 based on the voice recognized by the voice recognition means 110.
  • control means 130 can control the communication means 120 to transmit the voice data of the voice recognized by the voice recognition means 110 to the external device.
  • the control means 130 can control the communication means 120 to transmit a control signal generated based on the speech recognized by the speech recognition means 110 to an external device.
  • control means 130 can control the movement assisting device 10 with a control signal generated based on the voice recognized by the voice recognition means 110.
  • control means 130 can control the movement assistance device 10 to change the operation mode or assistance strength of the movement assistance device 10 in accordance with the voice.
  • control means 130 can determine a product to be searched for on the EC site based on the voice recognized by the voice recognition means 110, and generate a command to search for the product on the EC site.
  • the control means 130 can control the communication means 120 and transmit a signal to an external device based on the movement of the wearer detected by the movement assisting device 10 or other means. This allows operations on the external device to be performed and/or information to be transmitted to the external device.
  • control means 130 can perform operations on the EC site via the communication means 120 based on the movement of the wearer detected by the movement assisting device. For example, the control means 130 can transmit a control signal to an external device (the server device of the EC site) via the communication means 120 so that the wearer can operate a cursor by moving the finger on which the movement assisting device 10 is attached. For example, the control means 130 can move the cursor downward when the wearer flexes the finger, move the cursor upward when the wearer extends the finger, move the cursor to the left when the wearer abducts the finger, and move the cursor to the right when the wearer adducts the finger.
  • the control means 130 can transmit information represented by the wearer's actions to an external device (an EC site server device) via the communication means 120.
  • the wearer can convey his/her intention by, for example, moving the finger or the entire hand on which the movement assisting device 10 is attached.
  • For example, when the wearer indicates a size with a movement of the fingers or hand, the movement assisting device 10 detects the movement, and the control means 130 estimates the size based on the detected movement and can transmit a signal indicating the size to an external device.
  • Alternatively, a camera included in a display device may detect the movement, and the control means 130 can estimate the size based on the detected movement and transmit a signal indicating the size to an external device.
  • For example, when the wearer holds an item, the movement assisting device 10 detects the movement and the magnitude of the force, and the control means 130 can estimate the weight based on the detected movement and force magnitude and transmit a signal indicating the weight to an external device.
  • the control means 130 can determine the characteristics of the product to be searched for on the EC site based on the wearer's actions, and can transmit a command for searching the EC site for products having the determined characteristics to the server device of the EC site via the communication means. For example, when the wearer spreads his or her fingers to indicate an intended size, the control means 130 determines the characteristic (i.e., the size) of the product to be searched for based on that movement, and a command for searching for products having that size can be sent to the server device of the EC site.
  • Similarly, when the wearer indicates an intended weight by a motion, the control means 130 determines the characteristic (i.e., the weight) of the item to be searched for based on that motion, and a command to search for products having that weight can be sent to the server device of the EC site.
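  • The command sent to the server device of the EC site could, for example, combine the spoken product name with the gesture-derived size or weight, as in the following sketch; the JSON payload shape and field names are assumptions.

```python
# Sketch of a search request combining the spoken product name with gesture-derived filters (payload assumed).
import json

def build_search_request(product_name: str, size_mm: float = None, weight_g: float = None) -> str:
    """Compose the instruction the control means sends to the EC site's server device."""
    payload = {"action": "search", "query": product_name, "filters": {}}
    if size_mm is not None:
        payload["filters"]["approx_size_mm"] = size_mm    # from a finger-spread or IMU gesture
    if weight_g is not None:
        payload["filters"]["approx_weight_g"] = weight_g  # from torque/pressure sensing while holding an object
    return json.dumps(payload)

if __name__ == "__main__":
    print(build_search_request("container", size_mm=150.0))
    print(build_search_request("dumbbell", weight_g=2000.0))
```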
  • This aspect is also preferable because it encourages training of the wearer.
  • the wearer can feel pleasure and satisfaction by, for example, moving their disabled fingers and fingers to search for a desired product, which can lead to motivation for training.
  • control means 130 can control the movement assisting device 10 so that the movement assisting device 10 provides resistance at the attachment site in order to represent characteristics (for example, hardness, ease of movement, etc.) of the product searched for on the EC site.
  • Thereby, the wearer can feel the characteristics of the searched product (e.g., hardness, ease of movement, etc.) when moving the finger with the motion assist device 10 attached, and can get an impression of the product before purchasing it.
  • the motion assistance system 100 can be connected to any display device (for example, a television, a tablet terminal, etc.). Thereby, the signal received from the external device can be displayed on the display device.
  • the display device may include a speaker. This makes it possible to output audio.
  • audio can be output based on an audio signal received from an external device (eg, an audio signal from a voice call or a video call).
  • the display device may include a camera. This allows video to be input. For example, images can be input for video calls.
  • the camera can also be used to detect the wearer's movements. Furthermore, in order to further improve the accuracy of motion detection, a depth camera may be used as the camera, or a camera and LiDAR may be used together.
  • the display device may be a component of the movement assistance system 100 or may be an external component of the movement assistance system 100.
  • the motion assistance system 100 can further include audio output means (not shown).
  • the audio output means is any means capable of outputting audio.
  • the audio output means may be, for example, a speaker included in the display device, or may be a speaker separate from the display device.
  • the audio output means can output audio based on an audio signal received from an external device (for example, an audio signal from a voice call or a video call). This makes it possible to make a phone call via the motion assistance system 100.
  • the motion assisting system 100 may further include a changing means for changing the manner in which information to be presented to the wearer is presented based on the state of the attachment site of the motion assisting device 10. More specifically, the changing means can change the manner in which products on the EC site are presented to the wearer based on the state of the wearer.
  • the condition of the attachment site may be, for example, the degree of paralysis of the attachment site (for example, Brunnström stage, or the degree of spasticity, rigidity, or contracture), or the range of motion of the attachment site. However, it may be both.
  • the state of the attachment site may be, for example, a state of involuntary movement of the attachment site.
  • the state of the attachment site may also be represented, for example, by a biological signal (for example, signal intensity) such as myoelectricity associated with the attachment site.
  • the changing means may change the presentation mode so that products whose size cannot be handled (for example, cannot be grasped) within the range of motion of the wearing site or in a paralyzed state are presented to the wearer with a lowered search ranking.
  • the changing means can change the presentation mode so that the product is presented to the wearer with a note stating that the product cannot be grasped due to the range of motion of the wearing part or in a paralyzed state.
  • the changing means can change the presentation mode so as to recommend an auxiliary tool for grasping a product that is too large to be grasped due to the range of motion of the attachment site or in a paralyzed state.
  • the changing means can change the presentation mode so that products that cannot be handled (e.g., cannot be operated or finely controlled) in a state of involuntary movement of the wearing site (e.g., a tremor state) are presented to the wearer with a lowered search ranking.
  • the changing means can change the presentation mode so that the product is presented to the wearer with a note stating that the product cannot be handled in a state of involuntary movement of the wearing part.
  • the changing means can change the presentation mode so as to recommend an auxiliary tool for handling a product that cannot be handled with the involuntary movement of the attachment site.
  • In this way, the manner in which products on the EC site are presented to the wearer changes depending on the state of the wearing site.
  • For example, the changing means determines whether the state of the wearing site is a state in which the product on the EC site can be handled, and, if it is determined that the state of the wearing site is not a state in which the product can be handled, the manner in which the product is presented can be changed.
  • the changing means may, for example, obtain the state of the attachment site by detecting it with the movement assisting device 10, or may obtain it from the database section of the server device that stores the history of past training guidance.
  • For example, a product on the EC site may be tagged in advance with the states of the attachment site in which the product can be handled or the states in which it cannot be handled, and the changing means can determine from the tag whether the state of the attachment site is a state in which the product on the EC site can be handled.
  • the state of the wearing part that can handle the product or the state of the wearing part that cannot handle the product can be determined, for example, from reviews and feedback of the product by other EC site users.
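  • A minimal sketch of this can-handle determination using per-product tags and the state of the attachment site follows; the tag names, state fields, and rules are illustrative assumptions.

```python
# Sketch of the can-handle check using product tags and the attachment-site state (tag names assumed).
from typing import Dict

def can_handle(product: Dict, site_state: Dict) -> bool:
    """Compare product handling tags against the current state of the attachment site."""
    req = product.get("handling_requirements", {})
    if site_state.get("rom_deg", 0) < req.get("min_rom_deg", 0):
        return False
    if site_state.get("tremor", False) and req.get("fine_control", False):
        return False
    return True

def presentation_mode(product: Dict, site_state: Dict) -> str:
    return "normal" if can_handle(product, site_state) else "demoted_with_notice"

if __name__ == "__main__":
    product = {"name": "pill organizer", "handling_requirements": {"min_rom_deg": 40, "fine_control": True}}
    print(presentation_mode(product, {"rom_deg": 60, "tremor": False}))  # normal
    print(presentation_mode(product, {"rom_deg": 60, "tremor": True}))   # demoted_with_notice
```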
  • the changing means can be implemented by the control means 130, for example. After receiving the information on the searched products from the server device of the EC site, the control means 130 changes the presentation mode based on the state of the attachment site, and the changed presentation mode can be presented to the wearer via the display device or the audio output means of the motion assistance system 100.
  • the changing means can also be implemented, for example, by the server device of the EC site. After the products are searched, the changing means of the server device changes the presentation mode based on the state of the attachment site when presenting the searched products to the wearer, and the changed presentation mode can be presented to the wearer via the display device or the audio output means of the motion assistance system 100.
  • Each component of the motion assistance system 100 may be implemented as a component separate from the motion assistance device 10, or may be mounted on the motion assistance device 10.
  • the voice recognition means 110, the communication means 120, and the control means 130 can be implemented by a processor, and this processor is installed in the motion assistance device 10 as the same processor as the processor of the motion assistance device 10 or as a different processor. Alternatively, it can be implemented as an external processor of the motion assist device 10.
  • FIG. 3 shows an example of the peripheral configuration of the motion assistance system 100 of the present invention.
  • At least one motion assistance system 100 is connected to at least one server device 200 and at least one training guidance device 400 via a network 500.
  • Network 500 may be any type of network.
  • Network 500 may be, for example, the Internet or a LAN.
  • Network 500 may be a wired network or a wireless network.
  • Although two movement assistance systems 100, one server device 200, and two training guidance devices 400 are shown in FIG. 3, the numbers of movement assistance systems 100, server devices 200, and training guidance devices 400 are not limited. An arbitrary number of movement assistance systems 100, an arbitrary number of server devices 200, and an arbitrary number of training guidance devices 400 can be connected via the network 500.
  • One motion assistance system 100 is, for example, a system used by one wearer.
  • the wearer may be, for example, a patient who has suffered a stroke, and the motion assist device 10 of the motion assist system 100 may be a device for training stroke patients.
  • One training guidance device 400 is, for example, a device used by a person who instructs one person in training (for example, a therapist (occupational therapist, etc.), a doctor, etc.).
  • the training guidance device 400 can have a structure that imitates the attachment site on which the movement assisting device 10 is worn.
  • the training guidance device 400 can include a hand-shaped device (i.e., a robot hand).
  • the training guidance device 400 can include a lower limb type device.
  • the server device 200 can include an interface, a processor section, and a memory section.
  • the interface exchanges information with the outside of the server device 200.
  • the processor unit of the server device 200 can receive information from outside the server device 200 via the interface, and can transmit information to the outside of the server device 200.
  • the interface can exchange information in any format.
  • the processor unit executes processing of the server device 200 and controls the operation of the server device 200 as a whole.
  • the processor section reads a program stored in the memory section and executes the program. This allows the server device 200 to function as a system that executes desired steps.
  • the processor section may be implemented by a single processor or by multiple processors.
  • the memory unit stores programs required to execute the processing of the server device 200, data required to execute the programs, and the like.
  • the memory unit may store, for example, a program for the processing for supporting electronic commerce described later, a program for the processing for supporting remote training described later, and the like.
  • the program may be preinstalled in the memory unit.
  • the program may be installed in the memory unit by being downloaded via a network.
  • the program may be stored in a computer-readable storage medium and installed in the memory unit by reading the computer-readable storage medium.
  • the memory section may be implemented by any storage means.
  • the server device 200 may be, for example, a server device of an EC site.
  • the server device 200 is connected to a database section 300.
  • the database section 300 can store information about products sold on the EC site. In the following, an example will be explained in which the server device 200 is a server device of an EC site.
  • the server device 200 may have a configuration for electronic commerce.
  • the server device 200 can include means for receiving, from the motion assistance system 100, an instruction to search for products on the EC site.
  • the motion assistance system 100 can determine the product to be searched for on the EC site based on the voice input to the movement assistance device 10, and can generate an instruction to search for that product on the EC site.
  • the motion assistance system 100 can also determine characteristics (e.g., size, weight) of the product to be searched for on the EC site based on the user's movement, and can generate an instruction to search the EC site for products having those characteristics.
  • the motion assistance system 100 transmits the generated command to the server device 200, and the server device 200 can receive this command.
  • This receiving means may be implemented, for example, by an interface that controls communication with the outside of the server device 200.
  • the server device 200 can include means for receiving the state of the attachment site of the motion assist device 10 of the motion assist system 100.
  • the state of the attachment site may be, for example, the degree of paralysis of the attachment site (for example, the Brunnström stage, or the degree of spasticity, rigidity, or contracture), the range of motion of the attachment site, or both.
  • the state of the attachment site may be acquired by detection by the movement assisting device 10, or may be acquired from the database section of the server device 200 that stores the history of past training guidance.
  • This receiving means may be implemented, for example, by an interface that controls communication with the outside of the server device 200.
  • the server device 200 can include means for searching for products according to the received command.
  • the server device 200 can search for the product specified by the command from the database unit 300 that stores product information.
  • the server device 200 can include means for changing the manner in which the searched product is presented to the user according to the received state of the attachment site. This is the changing means described above.
  • the server device 200 can support electronic commerce, for example, by the following processing. Specifically, the processor of the server device 200 can: receive, from the motion assistance system 100, an instruction to search for products on the EC site; receive the state of the attachment site of the motion assist device 10 of the motion assistance system 100; search for products according to the instruction; and change the manner in which the products are presented based on the state.
  • when changing the manner in which the product is presented based on the state, the processor of the server device 200 can further: determine whether the attachment site is in a state in which the product can be handled; and change the manner in which the product is presented when it is determined that the attachment site is not in such a state.
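  • A minimal sketch of this server-side processing is shown below, assuming a simple in-memory product catalog and a keyword-only search command; the names used here (e.g., EcServer, handle_request, graspable_within_rom_deg) are illustrative assumptions and not an actual implementation of the server device 200.

```python
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class Product:
    name: str
    graspable_within_rom_deg: float  # minimum range of motion needed to grasp

class EcServer:
    """Illustrative stand-in for the EC-site server device 200."""

    def __init__(self, catalog: List[Product]) -> None:
        self.catalog = catalog

    def handle_request(self, command: Dict, rom_deg: float) -> List[Dict]:
        # 1) receive the search instruction, 2) receive the attachment-site state
        #    (simplified here to a range-of-motion value), 3) search, and
        # 4) change the presentation based on the state.
        hits = [p for p in self.catalog if command["keyword"] in p.name]
        presented = []
        for p in hits:
            entry = {"name": p.name, "note": ""}
            if rom_deg < p.graspable_within_rom_deg:
                entry["note"] = "May be difficult to grasp in the current state."
            presented.append(entry)
        # Products flagged with a note are pushed to the end of the list.
        presented.sort(key=lambda e: e["note"] != "")
        return presented

if __name__ == "__main__":
    server = EcServer([Product("athletic shoes", 5.0), Product("athletic gloves", 30.0)])
    print(server.handle_request({"keyword": "athletic"}, rom_deg=10.0))
```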
  • the server device 200 may be, for example, a server device of a service provider P that provides the motion assistance system 100.
  • In the following, an example will be explained in which the server device 200 is a server device of a service provider P that provides the motion assistance system 100.
  • the server device 200 can be used to support remote training.
  • the server device 200 can include means for receiving the content of the training guidance given by the therapist to the user of the movement assistance system 100 via the training guidance device 400 and the characteristics of the therapist, and means for receiving the effect of the training guidance and the characteristics of the user.
  • the content of the training guidance can be received from the training guidance device 400 via the network 500, for example.
  • the effectiveness of the training guidance and the user characteristics can be received from the motion assistance system 100 via the network 500, for example.
  • These receiving means may be implemented, for example, by an interface that controls communication with the outside of the server device 200.
  • the content of the training guidance can include, for example, the therapist's actions detected by the training guidance device 400 (e.g., the magnitude of the force applied by the therapist to the training guidance device, the duration of the force applied by the therapist to the training guidance device, the direction of the force applied by the therapist to the training guidance device, etc.).
  • the content of the training guidance may include, for example, content instructed by a therapist to the user.
  • the characteristics of the therapist may include, for example, the therapist's work experience (e.g., years of work, number of patients treated, areas of strength, areas of weakness) and attributes of the therapist (e.g., age, gender).
  • the effects of training guidance may include, for example, changes over time in the range of motion of the fingers and changes over time in the force exerted by the fingers.
  • the characteristics of the user can include, for example, the user's condition (e.g., the range of motion of a joint, the state of paralysis (the Brunnström stage, or the degree of spasticity, rigidity, or contracture)), the user's attributes (e.g., age, gender), and the user's training history (for example, the period or number of times the user received training guidance, or the period during which the user used the motion assistance system 100).
  • the server device 200 is connected to a database section 300, and the received information can be stored in the database section 300.
  • the database unit 300 may store characteristics of the therapist, contents of training guidance by the therapist, characteristics of the person receiving the training guidance, and effects of the training guidance in association with each other.
  • the data stored in the database unit 300 can be used as big data. For example, by referring to the database unit 300, it becomes possible to know whether the training guidance given by a therapist was effective, for what kinds of people it was effective, and for what kinds of people it was not effective.
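  • One possible way to associate these pieces of information in the database section 300 is a simple keyed record, sketched below with hypothetical field names (e.g., GuidanceRecord, rom_change_deg); the actual schema is not specified in this disclosure.

```python
from dataclasses import dataclass, asdict
from typing import List
import json

@dataclass
class GuidanceRecord:
    """One training-guidance session, with the associated pieces of information."""
    therapist_id: str
    therapist_profile: dict   # e.g. {"years": 12, "specialty": "hand rehab"}
    guidance_content: dict    # e.g. {"force_n": 3.5, "duration_s": 20, "direction": "flexion"}
    user_profile: dict        # e.g. {"age": 67, "brunnstrom": 3}
    effect: dict              # e.g. {"rom_change_deg": 5, "grip_change_n": 1.2}

class GuidanceDatabase:
    """In-memory stand-in for the database section 300."""
    def __init__(self) -> None:
        self.records: List[GuidanceRecord] = []

    def store(self, record: GuidanceRecord) -> None:
        self.records.append(record)

    def effective_sessions(self, min_rom_gain_deg: float) -> List[GuidanceRecord]:
        """Example 'big data' query: which sessions improved range of motion?"""
        return [r for r in self.records
                if r.effect.get("rom_change_deg", 0) >= min_rom_gain_deg]

if __name__ == "__main__":
    db = GuidanceDatabase()
    db.store(GuidanceRecord("T001", {"years": 12}, {"force_n": 3.5},
                            {"brunnstrom": 3}, {"rom_change_deg": 5}))
    print(json.dumps([asdict(r) for r in db.effective_sessions(3.0)], indent=2))
```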
  • the server device 200 can be used to match a new user of the movement assistance system 100 with a therapist.
  • the server device 200 may include means for receiving the second user's characteristics from the motion assist system 100 (second motion assist system) used by the new user (second user).
  • the second user characteristics may be received via the network 500 from the second motion assistance system 100, for example.
  • This receiving means, like the above-mentioned receiving means, can be implemented by an interface that controls communication with the outside of the server device 200.
  • the characteristics of the second user may include, for example, the state of the second user (e.g., the range of motion of a joint, the state of paralysis (the Brunnström stage, or the degree of spasticity, rigidity, or contracture)) and the attributes of the second user (e.g., age, gender).
  • the server device 200 can include means for determining at least one therapist who should provide training guidance to the second user.
  • the determining means can refer to the database unit 300 and determine at least one therapist based on the characteristics of the second user.
  • the at least one therapist may be, for example, a therapist who is compatible with the second user, a therapist who has experience in improving symptoms like those of the second user, a therapist who is good at providing training guidance for the second user's symptoms, or the like.
  • the determining means can be implemented, for example, by the processor section of the server device 200. Since the database unit 300 stores the guidance experience of each therapist in association with its effectiveness, the server device 200 can make such a determination.
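  • The following is a sketch of how such a determining means could pick a therapist for a second user from the stored records; the similarity measure used here (matching on Brunnström stage and averaging the recorded range-of-motion gain) is only one hypothetical choice, not the method of the disclosure.

```python
from dataclasses import dataclass
from typing import Dict, List, Optional

@dataclass
class SessionRecord:
    therapist_id: str
    user_brunnstrom: int
    rom_gain_deg: float  # recorded effect of the guidance

def match_therapist(history: List[SessionRecord],
                    second_user_brunnstrom: int) -> Optional[str]:
    """Return the therapist with the best average recorded effect for users
    whose paralysis state resembles that of the second user."""
    gains: Dict[str, List[float]] = {}
    for rec in history:
        # Consider only sessions with users at a similar Brunnstrom stage.
        if abs(rec.user_brunnstrom - second_user_brunnstrom) <= 1:
            gains.setdefault(rec.therapist_id, []).append(rec.rom_gain_deg)
    if not gains:
        return None
    return max(gains, key=lambda tid: sum(gains[tid]) / len(gains[tid]))

if __name__ == "__main__":
    history = [SessionRecord("T001", 3, 5.0), SessionRecord("T002", 3, 8.0),
               SessionRecord("T002", 4, 7.5), SessionRecord("T001", 6, 1.0)]
    print(match_therapist(history, second_user_brunnstrom=3))  # -> "T002"
```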
  • the server device 200 can be used to match therapists with employers who want to employ them.
  • the server device 200 can be provided with means for receiving the conditions of the therapist to be employed from a terminal device used by an employer who wishes to employ the therapist.
  • the conditions of the therapist to be hired can be received via the network 500 from a terminal device (personal computer, smartphone, tablet, etc.), for example.
  • This receiving means, like the above-mentioned receiving means, can be implemented by an interface that controls communication with the outside of the server device 200.
  • the conditions for the therapist to be hired can include, for example, the work experience of the therapist to be hired (e.g., years of work, number of patients treated, areas of strength, areas of weakness) and the attributes of the therapist to be hired (e.g., age, gender).
  • the server device 200 can include means for determining at least one therapist to employ.
  • the determining means can refer to the database unit 300 and determine at least one therapist based on the conditions of the therapist to be employed.
  • the at least one therapist may be a therapist who meets all of the conditions for a therapist to be employed, or a therapist who meets some of the conditions for a therapist to be employed.
  • the determining means can be implemented, for example, by the processor section of the server device 200. Since the database section 300 stores the guidance experience and characteristics of each therapist in association with each other, the server device 200 can make such a determination.
  • the server device 200 can send the information of the determined at least one therapist back to the terminal device and present it to the employer.
  • Employers can proceed with the hiring process based on the information provided.
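  • For the employer-side matching, the determining means could filter stored therapist profiles against the received conditions, as in the following sketch; the profile fields (e.g., years_of_work, strengths) and the full/partial-match policy are assumptions for illustration.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class TherapistProfile:
    therapist_id: str
    years_of_work: int
    patients_treated: int
    strengths: List[str]  # e.g. ["hand rehabilitation"]

def match_for_employer(profiles: List[TherapistProfile],
                       min_years: int,
                       required_strength: str) -> List[TherapistProfile]:
    """Therapists meeting all conditions first, partial matches afterwards."""
    full = [p for p in profiles
            if p.years_of_work >= min_years and required_strength in p.strengths]
    partial = [p for p in profiles
               if p not in full and (p.years_of_work >= min_years
                                     or required_strength in p.strengths)]
    return full + partial

if __name__ == "__main__":
    pool = [TherapistProfile("T001", 10, 300, ["hand rehabilitation"]),
            TherapistProfile("T002", 3, 40, ["gait training"])]
    for p in match_for_employer(pool, min_years=5, required_strength="hand rehabilitation"):
        print(p.therapist_id)
```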
  • the server device 200 may include means for receiving a fee from an employer in response to the employment of the determined at least one therapist.
  • the receiving means can receive the fee using any method.
  • the fee can be received, for example, by cash, card payment, cryptocurrency, bank transfer, code payment, or the like.
  • the server device 200 can support remote training by, for example, the following processing.
  • the processor of the server device 200 can: receive the content of the training guidance that the therapist gave to the user of the movement assistance system 100 via the training guidance device 400 and the characteristics of the therapist; receive the effect of the training guidance and the characteristics of the user; and store the characteristics of the therapist, the content of the training guidance by the therapist, the characteristics of the person who received the training guidance, and the effect of the training guidance in the database section in association with each other.
  • the processor of the server device 200 may further: receive the characteristics of a second user of a second motion assistance system; and, with reference to the information stored in the database section, determine at least one therapist who should provide training guidance to the second user based on the characteristics of the second user. In addition to or instead of this, the processor of the server device 200 may further: receive conditions for the therapist to be employed; and, with reference to the information stored in the database section, determine at least one therapist to employ based on the conditions.
  • the server device 200 can be used to support electronic commerce.
  • the server device 200 can manage a plurality of motion assistance systems 100 and their respective users.
  • the server device 200 can provide the user with an environment in which the EC site can be used via the motion assistance system 100.
  • the server device 200 can receive a commission from the provider of the EC site in response to the user purchasing a product on the EC site via the motion assistance system 100.
  • the server device 200 can receive advertising fees from a company that places advertisements on the EC site in response to the user purchasing a product on the EC site via the motion assistance system 100.
  • the server device 200 can receive advertising fees using any method.
  • the server device 200 can receive advertising fees, for example, by cash, card payment, cryptocurrency, bank transfer, code payment, or the like.
  • FIG. 4 shows an example of a flow of a process performed in the motion assistance system 100 and the server device 200 of the EC site. In this example, a process in which a user of the motion assistance system 100 purchases a product on the EC site will be described.
  • the user performs a voice input to the movement assisting device 10 that is being worn to start the process of purchasing a product on the EC site.
  • the voice input to initiate the process is, for example, a combination of a specific wake word, the product the user desires to purchase, and a word suggesting a purchase (e.g., "I want", "Buy", etc.).
  • the user utters, "MELTz, I want athletic shoes.”
  • the user makes a gesture with his hand and says, "MELTz, I want a container this big.”
  • In step S401, the voice recognition means 110 of the motion assistance system 100 recognizes the voice input to the motion assistance device 10.
  • the voice recognition means 110 determines the action to be taken based on the voice. For example, in the example described above, based on the voice saying "MELTz, I want athletic shoes," the voice recognition means 110 determines that the action to be taken is to purchase a product and that the desired product is "athletic shoes." In the other example described above, based on the voice and gesture of "MELTz, I want a container this big," the voice recognition means 110 determines that the action to be taken is to purchase a product and that the desired product is a "container" of the size recognized from the gesture.
  • In step S402, the control means 130 of the movement assistance system 100 generates a command for the action to be taken, and transmits it to the server device 200 via the communication means 120.
  • a command to purchase "athletic shoes” is sent to the server device 200 of the EC site.
  • a command to purchase a "container" of the size recognized by the gesture is transmitted to the server device 200 of the EC site.
  • In step S403, the server device 200 of the EC site searches for the product to be purchased.
  • the server device 200 can search for a product to purchase from product information stored in the database unit 300.
  • the server device 200 searches for "athletic shoes.”
  • the server device 200 searches for a “container” of the size recognized by the gesture.
  • In step S404, the server device 200 of the EC site transmits information about the products found in step S403 to the motion assistance system 100. For example, if a plurality of products are found in step S403, the server device 200 transmits information on the plurality of products found to the motion assistance system 100.
  • the motion assistance system 100 can present the received product information to the user.
  • the movement assistance system 100 may present the product information to the user by voice, or may present the product information to the user by displaying it on a display device connected to the movement assistance system 100. The user can refer to the presented product information and decide whether or not to purchase the product.
  • the movement assistance system 100 can change the manner in which the product is presented based on the state of the user's wearing site.
  • the motion assistance system 100 receives the state of the user's attachment site.
  • the motion assistance system 100 may use the state of the attachment site detected by the motion assistance device 10, or may receive the state of the attachment site from a database section that stores the past training history.
  • the motion assistance system 100 determines whether the attachment site is in a state in which the product found in step S403 can be handled, and if it determines that the attachment site is not in such a state, it can change the manner in which the product is presented.
  • For example, if the product found in step S403 is of a size that cannot be grasped given the range of motion or the paralyzed state of the attachment site, the motion assistance system 100 can change the presentation mode so that the product is presented to the wearer with a lowered search ranking, can change the presentation mode so that the product is presented to the wearer with a cautionary note stating that it cannot be grasped given the range of motion or the paralyzed state of the attachment site, or can change the presentation mode so as to recommend an auxiliary tool for grasping the product.
  • If the product found in step S403 is a product that cannot be handled (for example, cannot be operated or finely controlled) while the attachment site is in a state of involuntary movement (for example, a tremor state), the motion assistance system 100 can change the presentation mode so that the product is presented to the wearer with a lowered search ranking, can change the presentation mode so that the product is presented to the wearer with a cautionary note stating that it cannot be handled while the attachment site is in a state of involuntary movement, or can change the presentation mode so as to recommend an auxiliary tool for handling the product. In this way, the manner in which the product found in step S403 is presented to the wearer changes depending on the state of the wearer's attachment site. As a result, it is possible to avoid situations that demotivate the user, such as the user struggling to move his or her impaired body only to purchase a product that cannot be handled in his or her physical condition.
  • the user can input selection of a product and/or input purchase of a product via the movement assisting device 10 that the user is wearing.
  • For example, by performing a specific action required by the motion assistance system 100 (for example, repeating bending and stretching of the fingers a predetermined number of times within a predetermined time, repeating abduction and adduction of the fingers a predetermined number of times within a predetermined time, or holding the fingers in a predetermined posture (for example, maintaining a V-sign posture for a predetermined period of time)), the user can input the purchase of a product.
  • Alternatively, when the product information is displayed on a display device connected to the motion assistance system 100, the user can input selection of a product and/or purchase of a product by operating a cursor displayed on the display device.
  • In step S405, the movement assisting device 10 detects a movement by the user.
  • the motion/state detection section of the motion assist device 10 can detect motions by the user.
  • the control means 130 of the movement assistance system 100 can generate instructions for operation at the EC site based on the detected movement. For example, the control means 130 may generate a command each time a user's motion is detected, or may generate a command after a certain motion is detected.
  • In step S406, the control means 130 of the motion assistance system 100 transmits the generated command to the server device 200 via the communication means 120.
  • In step S407, the server device 200 selects a product according to the command and performs purchase processing for the selected product.
  • the server device 200 can perform a product purchase process by making a payment using the user's payment information that has been acquired in advance.
  • the server device 200 may request the user to input payment information, and when the payment information is input, the server device 200 can perform product purchase processing.
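  • The motion-based confirmation in steps S405 to S407 could, for example, be realized by counting flexion and extension repetitions inside a time window, as sketched below; the threshold values and the PurchaseServer stub are assumptions for illustration, not part of the disclosure.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class MotionEvent:
    timestamp_s: float
    kind: str  # e.g. "finger_flexion", "finger_extension"

def purchase_confirmed(events: List[MotionEvent],
                       required_cycles: int = 3,
                       window_s: float = 10.0) -> bool:
    """True if the user completed the required number of flexion/extension
    cycles within the time window (the 'specific action' of step S405)."""
    flexions = [e.timestamp_s for e in events if e.kind == "finger_flexion"]
    if len(flexions) < required_cycles:
        return False
    # Check that the last `required_cycles` flexions all fall in one window.
    recent = flexions[-required_cycles:]
    return (recent[-1] - recent[0]) <= window_s

class PurchaseServer:
    """Stub standing in for the purchase processing of the server device 200."""
    def purchase(self, product_id: str, payment_info: dict) -> str:
        return f"purchased {product_id} with {payment_info['method']}"

if __name__ == "__main__":
    events = [MotionEvent(t, "finger_flexion") for t in (1.0, 3.5, 6.0)]
    if purchase_confirmed(events):
        server = PurchaseServer()
        print(server.purchase("athletic-shoes-42",
                              {"method": "card", "token": "pre-registered"}))
```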
  • In step S408, a notification to the effect that the product purchase process has been completed is sent to the motion assistance system 100.
  • the purchased product is delivered to the user.
  • the user can use the movement assistance system 100 to shop at the EC site.
  • Users can use the EC site via the movement assist device 10 that they usually wear, without having to use a device such as a smartphone; this can, for example, lower the barrier to using the EC site for users who are not accustomed to devices such as smartphones.
  • Since selection of a product and/or purchase of a product is input by the user's own actions, the user can also be encouraged to train. This aspect is particularly preferable for users with finger paralysis.
  • the user's motion may be detected in step S401.
  • the user's actions at this time include at least one of an action indicating the size of the product and an action indicating the weight of the product. In this way, the user specifies the size and/or weight of the desired product through actions, which can also serve as training for the user and is particularly preferable for users with paralyzed fingers.
  • FIG. 5A shows an example of a flow in a process performed in the motion assistance system 100 and the training guidance device 400.
  • In this example, a process will be described in which the hand and finger movements of the user of the motion assistance system 100 are reflected on the robot hand of the training guidance device 400, and the therapist instructs the user in training based on them.
  • the user performs a voice input to the movement assisting device 10 that is being worn to start the process of receiving training guidance from a therapist.
  • the voice input to start the process may be, for example, a combination of a specific wake word and a phrase that suggests training guidance (e.g., "Start training guidance", "Connect me to the therapist AAA", etc.).
  • the user utters, "MELTz, connect me to the therapist AAA.”
  • In step S501, the voice recognition means 110 of the motion assistance system 100 recognizes the voice input to the motion assistance device 10.
  • the voice recognition means 110 determines the action to be taken based on the voice. For example, in the example described above, the voice recognition means 110 determines that the action to be taken is to start training guidance based on the voice "MELTz, connect me to the therapist AAA."
  • In step S502, the control means 130 of the movement assistance system 100 generates a command for the action to be taken, and transmits it to the training guidance device 400 via the communication means 120.
  • For example, in the example described above, a command to start training guidance is sent to the training guidance device 400 of the therapist AAA.
  • In step S503, the training guidance device 400 of the therapist AAA starts training guidance in response to the command to start training guidance.
  • the training guidance device 400 starts a call (voice call or video call) with the motion assistance system 100. Thereby, the therapist can start training guidance for the user.
  • the therapist instructs the user to move their hands and fingers.
  • the user moves his/her fingers according to the instructions.
  • In step S504, the movement assisting device 10 detects the movement and/or state of the user's fingers.
  • the motion/state detection section of the movement assisting device 10 can detect the motion and/or state of the user's fingers.
  • the control means 130 of the motion assistance system 100 may generate commands indicating the detected motion and/or condition. For example, the control means 130 may generate a command each time an action and/or state by the user is detected, or may generate a command after a certain action and/or state is detected.
  • In step S505, the control means 130 of the motion assistance system 100 transmits the generated command to the training guidance device 400 via the communication means 120.
  • In step S506, the training guidance device 400 reflects the user's motion and/or state on the robot hand according to the received command. This allows the therapist to check whether the user is able to move his or her fingers as instructed. Furthermore, the therapist can check the condition of the user's fingers (for example, the degree of contracture of the fingers).
  • the therapist can further continue training guidance or change the level of training guidance after confirming the movement and/or condition of the user's fingers.
  • the user can receive appropriate training guidance from the comfort of his or her home, for example, without directly meeting a therapist. This is particularly useful, for example, in situations where face-to-face contact is limited.
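  • A highly simplified sketch of the command path in steps S504 to S506 is shown below: the user's detected finger states are packaged into commands and mirrored on the robot hand. The message format and the RobotHand interface are hypothetical assumptions; the real system would transmit the commands over the network 500.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class FingerState:
    finger: str         # e.g. "index"
    flexion_deg: float  # detected joint angle

class RobotHand:
    """Stand-in for the robot hand of the training guidance device 400."""
    def __init__(self) -> None:
        self.pose = {}

    def apply(self, command: dict) -> None:
        # Reflect the received user motion on the corresponding finger (step S506).
        self.pose[command["finger"]] = command["flexion_deg"]

def stream_user_motion(detected: List[FingerState], hand: RobotHand) -> None:
    """Steps S504-S506: detect the user's finger motion, generate a command per
    detected state, and deliver it to the training guidance device."""
    for state in detected:
        command = {"finger": state.finger, "flexion_deg": state.flexion_deg}
        hand.apply(command)  # in the real system this crosses the network 500

if __name__ == "__main__":
    hand = RobotHand()
    stream_user_motion([FingerState("index", 35.0), FingerState("thumb", 10.0)], hand)
    print(hand.pose)  # therapist-side view of the user's fingers
```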
  • FIG. 5B shows an example of a flow in a process performed in the motion assistance system 100 and the training guidance device 400.
  • In this example, a process will be described in which the motion of the fingers of the robot hand of the training guidance device 400 is reflected on the motion assist device 10 of the motion assist system 100, so that the therapist instructs the user in training by moving the user's fingers.
  • the user performs a voice input to the movement assisting device 10 that is being worn to start the process of receiving training guidance from a therapist.
  • the voice input to initiate the process is, for example, a combination of a specific wake word and a phrase suggesting training guidance (e.g., "start training guidance", "connect me to the therapist", etc.).
  • the user utters, "MELTz, start training guidance.”
  • In step S511, the voice recognition means 110 of the motion assistance system 100 recognizes the voice input to the motion assistance device 10.
  • the voice recognition means 110 determines the action to be taken based on the voice. For example, in the above-mentioned example, the voice recognition means 110 determines that the action to be taken is to start training guidance based on the voice saying "MELTz, start training guidance.”
  • In step S512, the control means 130 of the movement assistance system 100 generates a command for the action to be taken, and transmits it to the training guidance device 400 via the communication means 120. For example, in the example described above, a command to start training guidance is sent to the training guidance device 400.
  • In step S513, the therapist AAA's training guidance device 400 starts training guidance in response to the command to start training guidance.
  • the training guidance device 400 starts a call (voice call or video call) with the motion assistance system 100. Thereby, the therapist can start training guidance for the user.
  • the therapist instructs the user to move his or her hands and fingers in accordance with the movement support provided by the therapist.
  • the user moves his/her fingers according to the instructions.
  • the therapist moves the fingers of the robot hand at hand as if they were the user's fingers.
  • In step S514, the training guidance device 400 detects the motion and/or state of the fingers of the robot hand.
  • The training guidance device 400 can generate commands indicating the detected motion and/or state.
  • the training guidance device 400 may generate a command each time a user's action and/or state is detected, or may generate a command after a certain action and/or state is detected.
  • In step S515, the training guidance device 400 transmits the generated command to the movement assisting device 10.
  • the control means 130 of the motion assist system 100 can receive the command via the communication means 120.
  • In step S516, the control means 130 of the motion assist system 100 reflects the motion and/or state of the robot hand on the motion assist device 10 in accordance with the received command. Thereby, the user can move his or her fingers while receiving movement support, as if the therapist were directly assisting the user in moving his or her fingers.
  • the state of the user's fingers at this time may be reflected on the robot hand by, for example, a process similar to steps S504 to S506.
  • the user can receive appropriate training guidance without having to meet a therapist directly, for example, from the comfort of his or her home. This is particularly useful, for example, in situations where face-to-face contact is limited.
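  • The reverse path of FIG. 5B (steps S514 to S516) can be sketched symmetrically to the previous example: the therapist's manipulation of the robot hand is detected, sent as commands, and reproduced by the motion assist device as movement support. The AssistDevice class and the command format below are illustrative assumptions only.

```python
from typing import Dict, Iterable

class AssistDevice:
    """Stand-in for the motion assist device 10 worn by the user."""
    def __init__(self) -> None:
        self.assist_targets: Dict[str, float] = {}

    def reflect(self, command: Dict) -> None:
        # Step S516: drive the wearer's finger toward the therapist's posture.
        self.assist_targets[command["finger"]] = command["flexion_deg"]

def relay_therapist_motion(robot_hand_states: Iterable[Dict],
                           device: AssistDevice) -> None:
    """Steps S514-S515: the training guidance device detects the robot-hand
    finger states and transmits them as commands to the assist device."""
    for state in robot_hand_states:
        command = {"finger": state["finger"], "flexion_deg": state["flexion_deg"]}
        device.reflect(command)  # in the real system, sent over the network 500

if __name__ == "__main__":
    device = AssistDevice()
    relay_therapist_motion([{"finger": "index", "flexion_deg": 60.0}], device)
    print(device.assist_targets)  # targets the assist device will help reach
```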
  • Information obtained in the training guidance processes described above with reference to FIGS. 5A and 5B (for example, the movement, force, time, etc. with which the therapist moved the training guidance device 400, or the movement, force, time, etc. with which the user moved the motion assist device 10 of the motion assistance system 100) can be sent to the server device 200 and stored as big data.
  • The present invention is useful in that it provides a motion assistance system capable of communicating with external devices.
  • 10: movement assistance device; 100: movement assistance system; 110: voice recognition means; 120: communication means; 130: control means

Landscapes

  • Health & Medical Sciences (AREA)
  • Epidemiology (AREA)
  • Pain & Pain Management (AREA)
  • Physical Education & Sports Medicine (AREA)
  • Rehabilitation Therapy (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Rehabilitation Tools (AREA)

Abstract

One object of the present invention is to provide a motion assistance system capable of communicating with an external device. A motion assistance system according to the present invention comprises: a motion assistance device configured to detect a biological signal of a wearer and to assist the wearer's movement based on the biological signal; voice recognition means configured to recognize speech input to the motion assistance device; communication means configured to perform communication between the motion assistance device and an external device; and control means configured to control the motion assistance device and/or the communication means based on the input speech.
PCT/JP2023/017081 2022-05-02 2023-05-01 Système d'assistance au mouvement et procédé appliqué dans un système d'assistance au mouvement WO2023214569A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022076068 2022-05-02
JP2022-076068 2022-05-02

Publications (1)

Publication Number Publication Date
WO2023214569A1 true WO2023214569A1 (fr) 2023-11-09

Family

ID=88646522

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/017081 WO2023214569A1 (fr) 2022-05-02 2023-05-01 Système d'assistance au mouvement et procédé appliqué dans un système d'assistance au mouvement

Country Status (1)

Country Link
WO (1) WO2023214569A1 (fr)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH10151223A (ja) * 1996-11-25 1998-06-09 Mitsubishi Electric Corp ウェルネスシステム
US20150106241A1 (en) * 2013-07-02 2015-04-16 John A. Lucido 3-d immersion technology in a virtual store
JP2018038784A (ja) * 2016-09-02 2018-03-15 パナソニックIpマネジメント株式会社 起立動作支援装置、起立動作支援方法およびプログラム
JP2020512039A (ja) * 2016-12-08 2020-04-23 セイスミック ホールディングス インコーポレイテッド 補助外装スーツと共に使用するパッチシステム

Similar Documents

Publication Publication Date Title
US20220288462A1 (en) System and method for generating treatment plans to enhance patient recovery based on specific occupations
US10175654B2 (en) Smartwatch device and method
Aggogeri et al. Robotics for rehabilitation of hand movement in stroke survivors
US20240131394A1 (en) System and method for implementing a treatment machine description language
US20180330810A1 (en) Physical therapy monitoring algorithms
US9256711B2 (en) Systems, computer medium and computer-implemented methods for providing health information to employees via augmented reality display
EP3067808B1 (fr) Procédé et appareil permettant de fournir des informations de patient collaboratives
Pramuka et al. Telerehabilitation technologies: accessibility and usability
JP6397817B2 (ja) サービス提供管理システム
Díaz et al. Development of a robotic device for post-stroke home tele-rehabilitation
US20190172571A1 (en) Enhanced assistive mobility devices
Pulikottil et al. A voice control system for assistive robotic arms: preliminary usability tests on patients
Kabir et al. The Impact of Spinal Cord Injury on Participation in Human-Centered Research
KR101519808B1 (ko) 3차원 공간 센서를 활용한 재활 치료 시스템
US20240170166A1 (en) Systems and methods for automated pricing, conduction, and transcription of telemedicine encounters
Chellal et al. Robot-assisted rehabilitation architecture supported by a distributed data acquisition system
Munteanu et al. Multimodal technologies for seniors: challenges and opportunities
Hreha et al. We all can call: Enhancing accessible cell phone usage for clients with spinal cord injury
WO2023214569A1 (fr) Système d'assistance au mouvement et procédé appliqué dans un système d'assistance au mouvement
JP2021022401A (ja) リハビリ計画作成支援装置、リハビリ計画作成支援システム、リハビリ計画作成支援方法、リハビリ計画作成支援コンピュータプログラム
JP2021108046A (ja) 学習支援システム
Vogiatzaki et al. Maintaining mental wellbeing of elderly at home
Wolff et al. Dynamic assessment of the upper extremity: a review of available and emerging technologies
Winters Telerehabilitation interface strategies for enhancing access to health services for persons with diverse abilities and preferences
JP7354955B2 (ja) 提示システム、提示方法およびプログラム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23799498

Country of ref document: EP

Kind code of ref document: A1