CN113681541B - Exoskeleton control system and method based on Internet of things - Google Patents


Info

Publication number
CN113681541B
CN113681541B (application CN202110927884.5A)
Authority
CN
China
Prior art keywords: exoskeleton, camera, controller, user, gait
Prior art date
Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Application number
CN202110927884.5A
Other languages
Chinese (zh)
Other versions
CN113681541A (en)
Inventor
王天
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hangzhou Chengtian Technology Development Co Ltd
Original Assignee
Hangzhou Chengtian Technology Development Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hangzhou Chengtian Technology Development Co Ltd filed Critical Hangzhou Chengtian Technology Development Co Ltd
Priority to CN202110927884.5A priority Critical patent/CN113681541B/en
Publication of CN113681541A publication Critical patent/CN113681541A/en
Application granted granted Critical
Publication of CN113681541B publication Critical patent/CN113681541B/en

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00: Programme-controlled manipulators
    • B25J9/0006: Exoskeletons, i.e. resembling a human figure
    • B25J9/16: Programme controls
    • B25J9/1679: Programme controls characterised by the tasks executed
    • B25J9/1694: Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697: Vision controlled systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Rehabilitation Tools (AREA)

Abstract

The invention relates to the technical field of exoskeleton cooperative control, in particular to an exoskeleton control system and method based on the Internet of things. The invention uses existing cameras and controllers in the activity space, together with mature image recognition techniques, to recognize movement intent and/or gait and thereby control the exoskeleton used in that space.

Description

Exoskeleton control system and method based on Internet of things
Technical Field
The invention relates to the technical field of exoskeleton cooperative control, in particular to an exoskeleton control system and method based on the Internet of things.
Background
In existing synchronous gait control of exoskeletons, sole pressure sensors, inclination sensors and the like are generally used to sense a person's movement intent or gait, so as to realize cooperative human-machine movement of the exoskeleton and its wearer. However, physical sensors have a time lag and are prone to misjudgment on uneven ground or when the user goes up and down stairs. Bioelectrical sensing can directly reflect the wearer's movement intent and responds quickly, but it is more expensive and less technically mature than physical sensors.
Internet of Things technology is now widely deployed, and interconnection among smart devices, including surveillance cameras in homes, rehabilitation centers and other public places, is common. The invention aims to use this existing equipment, combined with mature image recognition techniques, to cooperatively control exoskeletons used in specific spaces, thereby overcoming the deficiencies of the prior art.
Disclosure of Invention
The invention aims to overcome the defects of the prior art, and provides an exoskeleton control system and method based on the Internet of things, so as to solve the problems of misjudgment, hysteresis and the like in the existing exoskeleton cooperative control and reduce the cost of an exoskeleton.
An exoskeleton control system based on the Internet of things comprises a controller, at least one camera, at least one exoskeleton and a plurality of adsorption type detection units;
the adsorption-type detection unit comprises a light-sensing point, an acceleration sensor, a gyroscope, a magnet and a shell; the light-sensing point is fixed at the top of the shell, and the acceleration sensor and the gyroscope are arranged inside the shell; the magnet is arranged at the bottom of the shell; the adsorption-type detection units are adsorbed onto the joints of the exoskeleton; the camera detects the light-sensing points in real time, and the acceleration sensor and the gyroscope are wirelessly connected to the controller; the outermost layer of the exoskeleton is made of a magnetic material;
a controller in communication with the camera and the exoskeleton, respectively;
the camera is arranged in the motion space of the exoskeleton user and is used for acquiring the image information of the photosites in the motion space of the exoskeleton user;
the controller receives the image information of the light-sensing points and combines it with the detection data of each adsorption-type detection unit's acceleration sensor and gyroscope to obtain the gait characteristics of the rehabilitation user, including stride, cadence, the basic gait curve and, in the active rehabilitation mode, the resistance coefficients of the different phases of the walking cycle;
these gait characteristics are compared with the user's historical gait data prestored in the controller; the controller then compares them against a rehabilitation recovery model or recovery curve and sets the exoskeleton robot's operating parameters for the next rehabilitation session.
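As a rough illustration of how two of these gait characteristics (stride and cadence) might be recovered from the camera-tracked light-sensing points, the sketch below estimates them from the forward positions of an ankle photosite. The heel-strike heuristic, the velocity threshold, and the omission of the IMU fusion are all assumptions for illustration, not details taken from the patent:

```python
def gait_features(ankle_x, timestamps):
    """Estimate stride length (m) and cadence (Hz) from tracked ankle-photosite
    forward positions sampled at the given timestamps (s). A heel strike is
    approximated as the instant forward velocity drops to near zero."""
    # forward velocity by finite differences
    v = [(ankle_x[i + 1] - ankle_x[i]) / (timestamps[i + 1] - timestamps[i])
         for i in range(len(ankle_x) - 1)]
    # heel strikes: velocity falls from swing speed to (near) standstill
    strikes = [i for i in range(1, len(v)) if v[i - 1] > 0.05 and v[i] <= 0.05]
    if len(strikes) < 2:
        return None  # not enough strides observed yet
    n = len(strikes) - 1
    stride = (ankle_x[strikes[-1]] - ankle_x[strikes[0]]) / n  # metres per stride
    period = (timestamps[strikes[-1]] - timestamps[strikes[0]]) / n
    return {"stride_m": stride, "cadence_hz": 1.0 / period}
```

In a full system the accelerometer and gyroscope data would be fused with these image-derived estimates to reduce occlusion errors; that fusion step is omitted here.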
Further, no separate locomotor intent and/or gait recognition device is provided on the exoskeleton.
Further, an exoskeleton ID which can be identified by the controller through images is arranged on the exoskeleton.
Further, the camera comprises a driving device for driving the camera to rotate in all directions.
Further, the controller is connected with the camera and the exoskeleton in a close-range communication mode.
Furthermore, the close-range communication mode is Bluetooth, WIFI or ZigBee.
An exoskeleton control method based on the Internet of things comprises the following steps:
step 1: the camera arranged in the exoskeleton activity space acquires images in the exoskeleton user activity space in real time and transmits the images to the controller;
step 2: the controller identifies the image acquired by the camera and judges whether the exoskeleton equipment exists in the image; if yes, executing step 3, and if no, returning to step 1;
Step 3: further judging from the image whether the exoskeleton is worn on a user; if so, identifying the exoskeleton ID arranged on the exoskeleton, and the controller establishes a close-range communication connection with the exoskeleton corresponding to that ID;
Step 4: the controller identifies the movement intent and/or gait of the user from the image acquired by the camera, and controls the exoskeleton to act cooperatively with the user according to the recognition result.
Further, the method also comprises step 5: the controller controls the camera driving device in real time according to the position of the exoskeleton user in the image, adjusting the camera angle to follow the exoskeleton user so that the user stays at the center of the image.
Further, the method also comprises step 6: the controller controls the camera driving device in real time according to the position of the exoskeleton user in the image, adjusting the camera angle so that the front or side of the user is captured as fully as possible, for better recognition of movement intent or gait.
Further, in step 4, in addition to the user's movement intent and/or gait, the controller identifies the terrain where the user is currently located from the image acquired by the camera, and controls the exoskeleton to act cooperatively with the user according to both the intent/gait recognition result and the terrain recognition result.
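Steps 1 to 4 above can be sketched as one pass of a control cycle. The recognizer interface, class shapes, and returned command format below are hypothetical placeholders introduced for illustration, not part of the patent:

```python
def control_cycle(frame, recognizer, connected):
    """One pass of steps 1-4: detect an exoskeleton in the camera frame,
    check it is worn, connect via its image-readable ID, then recognize
    intent and emit a command for the exoskeleton."""
    exo = recognizer.find_exoskeleton(frame)        # step 2: is one in view?
    if exo is None:
        return ("idle", None)                       # back to step 1
    if not exo.worn:                                # step 3: worn on a user?
        return ("idle", None)
    if exo.id not in connected:
        connected.add(exo.id)                       # establish short-range link
    intent = recognizer.intent(frame)               # step 4: recognize intent
    return ("command", {"exo_id": exo.id, "intent": intent})
```

In practice `recognizer` would wrap the image-recognition models; here it is any object exposing `find_exoskeleton` and `intent`.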
Compared with the prior art, the invention has the beneficial effects that:
the camera and the controller arranged in the activity space of the exoskeleton user are original equipment of the Internet of things in the activity space, such as the camera and the controller playing a monitoring role. The exoskeleton intelligent control system and the control method thereof have the advantages that the existing cameras and controllers outside and mature image recognition technology are utilized to control the exoskeleton used in the activity space in the aspects of movement intention and/or gait recognition, based on the mode, independent movement intention and/or gait recognition devices are not arranged on the exoskeleton, the cost can be reduced, and the cameras and the controllers are original equipment of the internet of things and do not need to be additionally arranged.
In addition, as the mature image recognition technology is adopted for recognizing the movement intention and/or the gait, compared with a physical sensor, the recognition accuracy is better, and the speed is quicker in time, because the movement intention of the user can be judged through the related actions of other limbs of the user before the limbs using the exoskeleton act.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the application and together with the description serve to explain the application and not to limit the application. In the drawings:
figure 1 is a schematic diagram of an exoskeleton control system of the present invention.
Fig. 2 is a schematic diagram of the communication connections between the controller and the exoskeleton of the present invention.
Fig. 3 is a flow chart of the exoskeleton control method of the present invention.
FIG. 4 is a schematic structural diagram of an absorption-type detecting unit according to the present invention.
FIG. 5 is a schematic view of the installation of the absorption detection unit of the present invention.
Detailed Description
Embodiments of the present application will be described in detail with reference to the drawings and examples, so that how to implement technical means to solve technical problems and achieve technical effects of the present application can be fully understood and implemented.
Example one
As shown in fig. 1 and 2, the exoskeleton control system based on the Internet of things according to this embodiment includes a controller 2, at least one camera 1, and at least one exoskeleton 3. The controller communicates with the camera and the exoskeleton respectively. The camera is arranged in the exoskeleton user's activity space, for example on a wall or a column, and acquires images of that space in real time. The activity space is the space where the exoskeleton user is located, or a preset space in which the user can move, such as a place in a home, hospital or rehabilitation center; it may be closed or open. The controller acquires and recognizes the images collected by the camera in order to identify the exoskeleton user's movement intent and/or gait and control the exoskeleton's actions according to the recognition result. Specifically, the user's movement intent can be recognized by determining the user's body posture in real time through image recognition; for example, when image recognition shows the user sitting with the upper body leaning forward, it is determined that the user intends to stand up and move forward.
A specific movement intent can also be represented by preset limb actions: when the user performs a preset action, the controller recognizes the corresponding intent. For example, waving the palm forward may represent the intent to walk forward, and twisting the head twice to the left or right may represent the intent to turn left or right; the controller recognizes these actions from the images collected by the camera. For gait recognition, the controller can acquire the body's motion posture in real time through image recognition, such as joint angles and stride, and thereby recognize the gait directly. Alternatively, multiple gaits can be prestored in the controller, each with a preset stride and cadence, and a specific gait can be selected by a preset limb action: for example, extending one finger means walking with the first preset gait, and extending two fingers means walking with the second. During recognition, if the controller detects a preset limb action, it controls the exoskeleton according to the corresponding intent and/or gait; otherwise it detects the user's body posture and/or motion posture in real time through image recognition to recognize the intent and/or gait and control the exoskeleton.
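The preset-action scheme described above amounts to a lookup from recognized gestures to intents and gaits, falling back to real-time posture recognition when no preset action is seen. The gesture labels and preset names below are hypothetical illustrations of that mapping:

```python
# Hypothetical gesture labels, as the description suggests:
# wave palm forward = walk forward, two head twists = turn,
# N extended fingers = preset gait N.
GESTURES = {
    "wave_palm_forward":  ("walk_forward", None),
    "head_twist_left_x2": ("turn_left", None),
    "head_twist_right_x2": ("turn_right", None),
}

def interpret(gesture, fingers=0):
    """Map a recognized preset limb action to an (intent, gait) pair.
    Returns (None, None) when no preset action matches, signalling a
    fall-back to real-time posture/gait recognition."""
    if gesture in GESTURES:
        return GESTURES[gesture]
    if gesture == "extend_fingers" and fingers >= 1:
        # each preset gait carries its own preset stride and cadence
        return ("walk", f"preset_gait_{fingers}")
    return (None, None)
```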
That is, external cameras and controllers perform the movement intent and/or gait recognition, replacing the recognition devices mounted on conventional exoskeleton equipment. The controller can also simultaneously identify the terrain where the user is currently located and control the exoskeleton's actions according to both the intent/gait recognition result and the terrain recognition result. The controller can either provide the recognized movement intent and/or gait as input signals to the exoskeleton's master control system, which then drives the exoskeleton accordingly, or directly control the exoskeleton's power device according to the recognition result, in which case the controller also acts as the exoskeleton's master control system. Controlling the exoskeleton's actions mainly means controlling its power device, covering at least the assist force, the action speed and the swing angle of the exoskeleton.
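The last sentence names the three quantities the controller ultimately commands. The sketch below translates a recognized intent into those three quantities; every numeric value and preset name is an illustrative placeholder, not a figure from the patent:

```python
def exoskeleton_command(intent, gait=None):
    """Turn a recognized intent/gait into the three controlled quantities:
    assist force (N), action speed (m/s) and swing angle (degrees).
    All numbers are illustrative placeholders."""
    presets = {
        "walk_forward": {"assist_n": 40.0,  "speed_mps": 0.8, "swing_deg": 25.0},
        "stand_up":     {"assist_n": 120.0, "speed_mps": 0.1, "swing_deg": 5.0},
        "stop":         {"assist_n": 0.0,   "speed_mps": 0.0, "swing_deg": 0.0},
    }
    cmd = dict(presets.get(intent, presets["stop"]))  # unknown intent: halt safely
    if gait == "slow":
        cmd["speed_mps"] *= 0.5  # a preset gait can scale the commanded speed
    return cmd
```

Whether this mapping lives in the external controller or in the exoskeleton's own master control system is, per the description, a deployment choice.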
As shown in fig. 4 and 5, the adsorption type detection unit 6 includes a photosite 1, an acceleration sensor 3, a gyroscope 4, a magnet 5, and a housing 2;
the light sensing point 1 is fixed at the top of the shell 2, and the acceleration sensor 3 and the gyroscope 4 are arranged in the shell 2; the magnet 5 is arranged at the bottom of the shell 2; the adsorption type detection unit 6 adsorbs each joint part of the exoskeleton 7; the camera detects the photosites in real time, and the acceleration sensor and the gyroscope are in wireless connection with the controller;
the present invention is directed to exoskeletons for use in specific activity spaces (including but not limited to a user's home, hospital, rehabilitation center, etc.) where it is generally believed that the user will not wear the exoskeleton device out of the space. When the exoskeleton is in the activity space of the user, image recognition can be carried out through a camera and a controller in the activity space so as to acquire the movement intention and/or the gait of the user. Thus, there is no need for separate locomotor intent and/or gait recognition devices on the exoskeleton, which reduces the manufacturing costs of the exoskeleton.
The exoskeleton is provided with an exoskeleton ID, and the controller can acquire the exoskeleton ID of the exoskeleton in the working state in the image and establish communication connection with the exoskeleton corresponding to the exoskeleton ID. The exoskeleton ID may be a serial number provided outside the exoskeleton or may be a specific pattern, etc., as long as it can be used to determine the identity of the exoskeleton.
Further, if there are multiple exoskeletons within the image captured by the camera, the controller can identify whether each exoskeleton is worn on the user, the user's intent to move and/or gait, and the exoskeleton ID, respectively.
The camera comprises a driving device for driving the camera to rotate in all directions. Since the user moves, the camera driving device is controlled so that the camera follows the user, maintaining the optimum angle for capturing images. When several exoskeleton devices are in the working state at the same time, the camera angle can be controlled so that all of them are captured by the camera simultaneously.
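Driving the camera to keep the user centered can be sketched as a proportional pan/tilt correction on the tracked user's position in the frame. The gain, deadband, and field-of-view factors below are assumed values for illustration:

```python
def pantilt_step(cx, cy, width, height, gain=0.1, deadband=0.05):
    """Return (pan, tilt) corrections in degrees that steer the tracked
    user's bounding-box centre (cx, cy) toward the image centre.
    A small deadband avoids jitter when the user is nearly centred."""
    ex = (cx - width / 2) / width     # normalized horizontal error, -0.5..0.5
    ey = (cy - height / 2) / height   # normalized vertical error
    pan = gain * ex * 60 if abs(ex) > deadband else 0.0   # 60/40: assumed FoV scale
    tilt = gain * ey * 40 if abs(ey) > deadband else 0.0
    return pan, tilt
```

Called once per frame, this keeps the user near the image centre; tracking several exoskeletons at once would instead centre on the bounding box enclosing all of them.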
The controller is connected with the camera and the exoskeleton by close-range communication, such as Bluetooth, WIFI or ZigBee. This ensures that the controller, the camera and the exoskeleton cannot be too far apart, limiting the exoskeleton user's range of movement and avoiding the danger of the user leaving the preset activity space.
The exoskeleton can also be provided with a reminder device: when the exoskeleton user moves beyond the close-range communication distance, i.e. out of communication range with the controller, the reminder device on the exoskeleton alerts the user or an attendant by sound, light and/or vibration.
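The out-of-range reminder can be sketched as a debounced check on link quality, so that a single dropped reading does not trigger a false alarm. The RSSI threshold and the consecutive-loss limit are assumed values, not from the patent:

```python
def link_status(rssi_dbm, lost_count, limit=3, threshold=-85):
    """Debounced out-of-range detection for the close-range link.
    rssi_dbm is the latest signal strength (None if no reading).
    Returns (new lost_count, trigger_reminder): the reminder fires only
    after `limit` consecutive weak or missing readings."""
    weak = rssi_dbm is None or rssi_dbm < threshold
    lost_count = lost_count + 1 if weak else 0  # any good reading resets
    return lost_count, lost_count >= limit
```

When `trigger_reminder` becomes true, the exoskeleton-side reminder device would activate its sound, light and/or vibration alert.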
Example two
As shown in fig. 3, the exoskeleton control method based on the internet of things according to the embodiment can use the exoskeleton control system according to the embodiment, and the control method includes:
step 1: at least one camera arranged in the exoskeleton motion space acquires images in the exoskeleton user motion space in real time and transmits the images to the controller;
step 2: the controller identifies the image acquired by the camera and judges whether the exoskeleton equipment exists in the image; if yes, executing step 3, and if no, returning to step 1;
Step 3: further judging from the image whether the exoskeleton is worn on the user; if so, identifying the exoskeleton ID arranged on the exoskeleton, and the controller establishes a close-range communication connection with the exoskeleton corresponding to that ID;
Step 4: the controller identifies the movement intent and/or gait of the user from the image collected by the camera, and controls the exoskeleton to act cooperatively with the user according to the recognition result.
Further, the method also comprises step 5: the controller controls the camera driving device in real time according to the position of the exoskeleton user in the image, adjusting the camera angle to follow the exoskeleton user.
Following the exoskeleton user means either keeping the user at the center of the image or capturing the user's front or side as fully as possible. Keeping the user centered captures the surroundings as fully as possible, helping the controller perceive the user's environment; capturing the front or side allows movement intent or gait to be judged more accurately. Following therefore improves recognition of movement intent or gait.
Of course, if several exoskeleton devices are in the working state at the same time, the camera can instead be controlled so that all of them are captured simultaneously.
Further, in step 4, in addition to the user's movement intent and/or gait, the controller identifies the terrain where the user is currently located from the image acquired by the camera, and controls the exoskeleton to act cooperatively with the user according to both the intent/gait recognition result and the terrain recognition result.
Combining the terrain recognition result during control helps, on one hand, to eliminate misjudgments and, on the other, to predict the user's movement intent and/or gait from the current terrain, so the user can be better assisted in walking. Terrain recognition includes, but is not limited to, recognizing ground flatness, obstacles and stairs. On particular terrain the user is likely to walk in a particular way: for example, when the ground is found to be uneven (e.g. has a hole) or obstructed, the user can reasonably be expected to lengthen the stride or walk around the uneven ground or obstacle, and the movement intent and/or gait can be predicted from this expectation.
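That terrain-conditioned expectation can be sketched as a simple prior over the expected gait. The terrain labels, stride values, and mode names are all illustrative assumptions:

```python
def predict_gait(terrain, default_stride=0.6):
    """Bias the expected gait by the recognized terrain: a longer stride to
    step over a hole or obstacle, a short cautious stride on stairs,
    otherwise normal level walking. Values are illustrative placeholders."""
    if terrain in ("obstacle", "hole"):
        return {"stride_m": default_stride * 1.5, "mode": "step_over"}
    if terrain == "stairs":
        return {"stride_m": 0.3, "mode": "stair_climb"}
    return {"stride_m": default_stride, "mode": "level_walk"}
```

In the full system this prediction would be cross-checked against the image-based intent/gait recognition, letting the controller both pre-position the exoskeleton and reject recognition results that contradict the terrain.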
Based on this control mode, no separate movement intent and/or gait recognition device is arranged on the exoskeleton, which reduces cost, and the camera and controller are existing Internet of Things equipment requiring no additional installation. In addition, because mature image recognition is used for movement intent and/or gait recognition, accuracy is better and response is faster than with physical sensors.
Certain terms are used throughout the description and claims to refer to particular components. As one skilled in the art will appreciate, manufacturers may refer to a component by different names; this specification and the claims do not distinguish between components that differ in name but not in function. In the following description and in the claims, the terms "include" and "comprise" are used in an open-ended fashion and should be interpreted to mean "including, but not limited to". "Substantially" means within an acceptable error range: a person skilled in the art can solve the technical problem within a certain error range and substantially achieve the technical effect. Furthermore, the term "coupled" encompasses any direct or indirect electrical coupling; thus, if one device is coupled to another, the connection may be a direct electrical coupling or an indirect electrical coupling via other devices and couplings. The following description is of the preferred embodiments for carrying out the present application, but is made for the purpose of illustrating its general principles and is not to be taken in a limiting sense. The protection scope of the present application shall be subject to the definitions of the appended claims.
It is noted that the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a good or system that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such good or system. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of additional like elements in articles of commerce or systems including such elements.
The foregoing description shows and describes several preferred embodiments of the invention, but as aforementioned, it is to be understood that the invention is not limited to the forms disclosed herein, but is not to be construed as excluding other embodiments and is capable of use in various other combinations, modifications, and environments and is capable of changes within the scope of the inventive concept as expressed herein, commensurate with the above teachings, or the skill or knowledge of the relevant art. And that modifications and variations may be effected by those skilled in the art without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (7)

1. An exoskeleton control system based on the Internet of things, characterized by comprising a controller, at least one camera, at least one exoskeleton and a plurality of adsorption-type detection units; the adsorption-type detection unit comprises a light-sensing point, an acceleration sensor, a gyroscope, a magnet and a shell; the light-sensing point is fixed at the top of the shell, and the acceleration sensor and the gyroscope are arranged inside the shell; the magnet is arranged at the bottom of the shell; the adsorption-type detection units are respectively adsorbed on the head, shoulders, chest, abdomen, hand joints, hands, hip joints, knee joints and feet of the exoskeleton; the camera detects the light-sensing points in real time, and the acceleration sensor and the gyroscope are wirelessly connected to the controller; the outermost layer of the exoskeleton is made of a magnetic material; the controller communicates with the camera and the exoskeleton respectively; the camera is arranged in the activity space of the exoskeleton user and acquires image information of the light-sensing points in that space; the controller receives the image information of the light-sensing points and combines it with the detection data of the acceleration sensors and gyroscopes of the adsorption-type detection units to obtain the gait characteristics of the rehabilitation user, including stride, cadence, the basic gait curve and, in the active rehabilitation mode, the resistance coefficients of the different phases of the walking cycle; the gait characteristics are compared with the user's historical gait data prestored in the controller, the controller compares a rehabilitation recovery model or recovery curve, and sets the exoskeleton robot's operating parameters for the next rehabilitation session; the camera comprises a driving device for driving the camera to rotate in all directions; when a plurality of exoskeleton devices are in the working state at the same time, the angle of the camera is controlled so that all exoskeletons in the working state can be captured by the camera simultaneously; the controller can also simultaneously identify the terrain where the user is located and control the action of the exoskeleton according to both the movement intent and/or gait recognition result and the terrain recognition result.
2. An internet of things based exoskeleton control system as claimed in claim 1 wherein the system can determine patient or health care personnel ID from face recognition.
3. The internet of things-based exoskeleton control system of claim 1 wherein the controller is coupled to the camera and the exoskeleton in a close range communication.
4. The internet of things-based exoskeleton control system of claim 3 wherein the close range communication means is Bluetooth, WIFI or ZigBee.
5. An Internet of things-based exoskeleton control method, characterized in that the method uses the Internet of things-based exoskeleton control system of claim 1 and comprises the steps of:
step 1: the camera arranged in the exoskeleton activity space acquires images in the exoskeleton user activity space in real time and transmits the images to the controller;
step 2: the controller identifies the image acquired by the camera and judges whether a patient exists in the image; if yes, executing step 3, and if no, returning to step 1;
Step 3: further judging from the image whether the exoskeleton is worn on the user; if so, identifying the exoskeleton ID arranged on the exoskeleton, and the controller establishes a close-range communication connection with the exoskeleton corresponding to that ID;
Step 4: issuing the exoskeleton robot's operating parameters before rehabilitation, including stride, cadence, the basic gait curve and, in the active rehabilitation mode, the resistance coefficients of the different phases of the walking cycle.
6. The method for exoskeleton control based on the Internet of things of claim 5, further comprising step 5: the controller controls the camera driving device in real time according to the position of the exoskeleton user in the image, adjusting the camera angle to follow the patient.
7. The method for exoskeleton control based on the Internet of things as claimed in claim 6, wherein following the patient means keeping the exoskeleton user at the center position of the image, or capturing the front or side of the exoskeleton user as fully as possible, so as to better identify the movement intent or gait.
CN202110927884.5A 2021-08-12 2021-08-12 Exoskeleton control system and method based on Internet of things Active CN113681541B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110927884.5A CN113681541B (en) 2021-08-12 2021-08-12 Exoskeleton control system and method based on Internet of things

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110927884.5A CN113681541B (en) 2021-08-12 2021-08-12 Exoskeleton control system and method based on Internet of things

Publications (2)

Publication Number Publication Date
CN113681541A (en) 2021-11-23
CN113681541B (en) 2022-11-25

Family

ID=78579690

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110927884.5A Active CN113681541B (en) 2021-08-12 2021-08-12 Exoskeleton control system and method based on Internet of things

Country Status (1)

Country Link
CN (1) CN113681541B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114663775B (en) * 2022-05-26 2022-08-12 河北工业大学 Method for identifying stairs in exoskeleton robot service environment
CN115070732A (en) * 2022-06-30 2022-09-20 中国农业科学院都市农业研究所 Agricultural exoskeleton robot detector device and method based on machine vision

Citations (1)

Publication number Priority date Publication date Assignee Title
CN111930135A (en) * 2020-08-12 2020-11-13 深圳航天科技创新研究院 Active power-assisted control method and device based on terrain judgment and exoskeleton robot

Family Cites Families (7)

Publication number Priority date Publication date Assignee Title
KR102172954B1 (en) * 2013-11-08 2020-11-02 삼성전자주식회사 A walk-assistive robot and a method for controlling the walk-assistive robot
WO2018050191A1 (en) * 2016-09-14 2018-03-22 Aalborg Universitet A human intention detection system for motion assistance
CN112223253B (en) * 2019-07-15 2022-08-02 上海中研久弋科技有限公司 Exoskeleton system, exoskeleton identification control method, electronic device and storage medium
CN110524525B (en) * 2019-10-05 2022-04-26 河北工业大学 Lower limb exoskeleton control method
CN112669964A (en) * 2019-10-16 2021-04-16 深圳市迈步机器人科技有限公司 Power exoskeleton and rehabilitation evaluation method based on same
CN111631923A (en) * 2020-06-02 2020-09-08 中国科学技术大学先进技术研究院 Neural network control system of exoskeleton robot based on intention recognition
CN113063411A (en) * 2020-06-29 2021-07-02 河北工业大学 Exoskeleton evaluation system and method of use thereof

Also Published As

Publication number Publication date
CN113681541A (en) 2021-11-23

Similar Documents

Publication Publication Date Title
CN113681541B (en) Exoskeleton control system and method based on Internet of things
US11389686B2 (en) Robotically assisted ankle rehabilitation systems, apparatuses, and methods thereof
KR102163284B1 (en) Wearable robot and control method for the same
CN108836757A (en) A kind of assisted walk exoskeleton robot system with self-regulation
CN110405736B (en) Walking aid control method and system, exoskeleton robot and terminal
JP5186723B2 (en) Communication robot system and communication robot gaze control method
US8532841B2 (en) Rehabilitation device
KR101203669B1 (en) walking assisting device having user recognition function
JP2011516915A (en) Motion content-based learning apparatus and method
CN106112985B (en) Exoskeleton hybrid control system and method for lower limb walking aid machine
CN109350459A (en) A kind of auxiliary walking devices and its walk help control method
CN105662789B (en) A kind of exoskeleton system based on sound limb motion monitoring, control deformed limb motion
CN107305138A (en) Basketball action identification method and system based on wrist attitude detection
CN112405504B (en) Exoskeleton robot
Mun et al. Development and evaluation of a novel overground robotic walker for pelvic motion support
JP2024505468A (en) Hand movement detection device, control method, rehabilitation device and autonomous control system
CN106214163A (en) The artificial psychology of a kind of lower limb malformation postoperative straightening rehabilitation teaches device
KR100555990B1 (en) Walking Training Support Robot with the distributed control of Pneumatic actuator
KR200458671Y1 (en) Apparatus for assisting muscular strength
CN106325306A (en) Camera assembly device of robot and photographing and tracking method of camera assembly device
CN209220856U (en) A kind of assisted walk exoskeleton robot system with self-regulation
CN111728827B (en) Power lower limb exoskeleton control method, device and system
KR101697958B1 (en) Walking System
JP2021126218A (en) Walking assist device, walking assist method, and walking assist program
CN110292508A (en) A kind of exoskeleton rehabilitation robot control system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
PE01 Entry into force of the registration of the contract for pledge of patent right

Denomination of invention: An Exoskeleton Control System and Method Based on the Internet of Things

Granted publication date: 20221125

Pledgee: Hangzhou High-tech Financing Guarantee Co.,Ltd.

Pledgor: HANGZHOU CHENGTIAN TECHNOLOGY DEVELOPMENT Co.,Ltd.

Registration number: Y2024980003981