CN113681541B - Exoskeleton control system and method based on Internet of things - Google Patents


Info

Publication number
CN113681541B
CN113681541B (application CN202110927884.5A)
Authority
CN
China
Prior art keywords
exoskeleton
camera
controller
user
gait
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110927884.5A
Other languages
Chinese (zh)
Other versions
CN113681541A
Inventor
王天
Current Assignee
Hangzhou Chengtian Technology Development Co Ltd
Original Assignee
Hangzhou Chengtian Technology Development Co Ltd
Priority date
Filing date
Publication date
Application filed by Hangzhou Chengtian Technology Development Co Ltd
Priority to CN202110927884.5A
Publication of CN113681541A
Application granted
Publication of CN113681541B
Legal status: Active
Anticipated expiration: not stated

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Program-controlled manipulators
    • B25J9/0006 Exoskeletons, i.e. resembling a human figure
    • B25J9/16 Program controls
    • B25J9/1679 Program controls characterised by the tasks executed
    • B25J9/1694 Program controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697 Vision controlled systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Rehabilitation Tools (AREA)

Abstract

The present invention relates to the technical field of exoskeleton cooperative control, and in particular to an exoskeleton control system and method based on the Internet of Things, comprising a controller, at least one camera and at least one exoskeleton. The controller is communicatively connected to the camera and to the exoskeleton. The camera is installed in the exoskeleton user's activity space and collects images of that space; the controller acquires and analyzes the collected images to recognize the user's motion intention and/or gait, optimizes the exoskeleton control parameters accordingly, and thereby realizes a personalized rehabilitation plan with strategy adjustments at the different stages of rehabilitation. The invention uses existing cameras and controllers, together with mature image recognition technology, to control motion-intention and/or gait recognition for any exoskeleton used within the activity space.

Figure 202110927884

Description

An Exoskeleton Control System and Method Based on the Internet of Things

Technical Field

The present invention relates to the technical field of exoskeleton cooperative control, and in particular to an exoskeleton control system and method based on the Internet of Things.

Background

In existing exoskeleton gait-synchronization control, plantar pressure sensors, inclination sensors and the like are typically used to sense the wearer's motion intention or gait, so as to achieve coordinated human-machine motion between the exoskeleton and the wearer. However, such physical sensors lag in time and are prone to misjudgment on uneven ground or when climbing stairs. Bioelectric sensing can reflect the wearer's motion intention directly and responds faster than physical sensors, but it is more expensive and the technology is not yet mature.

Internet of Things technology is now widely deployed: in homes, rehabilitation centers and other public spaces, it is common for smart devices, including surveillance cameras, to interoperate. For an exoskeleton used within such a space, the present invention aims to exploit this existing equipment, combined with mature image recognition technology, to control the exoskeleton cooperatively and thereby overcome the defects of the prior art.

Summary of the Invention

The purpose of the present invention is to overcome the deficiencies of the prior art by providing an exoskeleton control system and method based on the Internet of Things, so as to avoid the misjudgment and lag of existing cooperative exoskeleton control while reducing the cost of the exoskeleton.

An exoskeleton control system based on the Internet of Things comprises a controller, at least one camera, at least one exoskeleton, and a plurality of adsorption-type detection units.

Each adsorption-type detection unit comprises a photosensitive point, an acceleration sensor, a gyroscope, a magnet and a housing. The photosensitive point is fixed on top of the housing; the acceleration sensor and gyroscope are mounted inside the housing; the magnet is set at the bottom of the housing. The detection units are magnetically attached at the joints of the exoskeleton, whose outermost layer is made of magnetic material. The camera tracks the photosensitive points in real time, and the acceleration sensor and gyroscope are wirelessly connected to the controller.

The controller is communicatively connected to the camera and to the exoskeleton.

The camera is installed in the exoskeleton user's activity space and collects image information of the photosensitive points within that space.

The controller receives the photosensitive-point image information and combines it with the detection data of the acceleration sensors and gyroscopes in the adsorption-type detection units to obtain the gait features of the rehabilitation patient. The gait features include stride length, cadence, the basic gait curve, and the resistance coefficients of the different phases of a walking cycle in active rehabilitation mode.
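The patent does not disclose a specific extraction algorithm, so the following is only a minimal sketch of how stride length and cadence could be derived from the camera-tracked trajectory of one foot's photosensitive point: footfalls are detected as the moments when the marker comes to rest, and strides and periods are averaged between consecutive footfalls. All names, thresholds and units are illustrative assumptions.

```python
def gait_features(times, foot_x, still_threshold=0.05):
    """Estimate stride length (m) and cadence (strides/min) from the
    walking-direction position of one foot's photosensitive point.

    times   : sample timestamps in seconds (strictly increasing)
    foot_x  : marker position in metres along the walking direction
    Returns a dict, or None if fewer than two footfalls were seen.
    """
    events, positions = [], []
    was_moving = False            # assume the recording starts at rest
    for i in range(1, len(times)):
        v = abs(foot_x[i] - foot_x[i - 1]) / (times[i] - times[i - 1])
        moving = v > still_threshold
        if was_moving and not moving:      # foot just came to rest: footfall
            events.append(times[i])
            positions.append(foot_x[i])
        was_moving = moving
    if len(events) < 2:
        return None
    strides = [b - a for a, b in zip(positions, positions[1:])]
    periods = [b - a for a, b in zip(events, events[1:])]
    return {
        "stride_m": sum(strides) / len(strides),
        "cadence_spm": 60.0 / (sum(periods) / len(periods)),
    }
```

In practice the accelerometer and gyroscope data would be fused with this camera-derived estimate; the sketch shows only the image-side computation.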

These gait features are compared with the patient's historical gait data prestored in the controller; the controller then checks them against a rehabilitation recovery model or recovery curve and sets the exoskeleton robot's operating parameters for the next rehabilitation session.
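The comparison step above can be sketched as a simple update rule: compare the measured stride with the stored history and a target from the recovery curve, then move the assist level for the next session accordingly. The proportional rule, the 0-to-1 assist scale and all names are assumptions for illustration; the patent does not disclose a specific update law.

```python
def next_session_assist(current_stride, history_strides, target_stride,
                        assist, step=0.05):
    """Return the assist fraction (0 = user does all the work, 1 = full
    assist) to use in the next rehabilitation session."""
    baseline = sum(history_strides) / len(history_strides)
    if current_stride < baseline and current_stride < target_stride:
        # Regression relative to history and short of target: support more.
        return min(1.0, assist + step)
    # On track (or target reached): gradually hand work back to the user.
    return max(0.0, assist - step)
```

A real controller would update several parameters at once (cadence, resistance coefficients per gait phase) against the full recovery model; the single-variable rule only shows the shape of the feedback.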

Further, no separate motion-intention and/or gait recognition device is provided on the exoskeleton.

Further, the exoskeleton carries an exoskeleton ID that the controller can recognize from the images.

Further, the camera includes a driving device for rotating the camera in all directions.

Further, the controller is connected to the camera and to the exoskeleton by short-range communication.

Further, the short-range communication is Bluetooth, Wi-Fi or ZigBee.

An exoskeleton control method based on the Internet of Things comprises the following steps:

Step 1: a camera installed in the exoskeleton activity space collects images of the exoskeleton user's activity space in real time and transmits them to the controller.

Step 2: the controller analyzes the images collected by the camera and determines whether an exoskeleton device is present in the image; if so, proceed to Step 3; otherwise, return to Step 1.

Step 3: from the same images, further determine whether the exoskeleton is worn on a user; if so, recognize the exoskeleton ID on the device and have the controller establish a short-range communication connection with the exoskeleton corresponding to that ID.

Step 4: the controller recognizes the user's motion intention and/or gait from the images collected by the camera and controls the exoskeleton according to the recognition result, so that the exoskeleton moves in coordination with the user.
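The four steps above can be sketched as one loop iteration. Because the patent relies on generic "mature image recognition technology" rather than any particular algorithm, the vision, link and drive components are passed in as plain objects; every class and method name here is an illustrative assumption, not part of the disclosure.

```python
class DemoVision:
    """Stands in for the image-recognition stage (Steps 2-4)."""
    def detect_worn_exoskeleton(self, frame):
        # Steps 2-3: return the exoskeleton ID when a worn exoskeleton
        # is visible in the frame, else None.
        return frame.get("exo_id") if frame.get("worn") else None
    def recognize_intent(self, frame):
        # Step 4: motion intention and/or gait from the same frame.
        return {"intent": frame.get("intent", "stand"), "gait": "walk"}

class DemoLink:
    """Stands in for the short-range (Bluetooth/Wi-Fi/ZigBee) connection."""
    def connect(self, exo_id):
        return f"conn:{exo_id}"

class DemoDrive:
    """Stands in for the exoskeleton's power-unit interface."""
    def __init__(self):
        self.sent = []
    def send(self, conn, command):
        self.sent.append((conn, command))

def control_step(frame, vision, link, drive):
    """One pass of Steps 2-4 for a single camera frame; Step 1 is the
    caller feeding frames to this function in a loop."""
    exo_id = vision.detect_worn_exoskeleton(frame)
    if exo_id is None:
        return None                      # no worn exoskeleton: back to Step 1
    conn = link.connect(exo_id)          # Step 3: connect by recognized ID
    command = vision.recognize_intent(frame)
    drive.send(conn, command)            # Step 4: coordinated actuation
    return command
```

The demo classes exist only so the loop is runnable; in a deployment they would wrap the actual detector, radio stack and motor controller.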

Further, the method includes Step 5: according to the position of the exoskeleton user in the image, the controller drives the camera in real time to adjust its angle and follow the user, keeping the user at the center of the image.

Further, the method includes Step 6: according to the position of the exoskeleton user in the image, the controller drives the camera in real time to adjust its angle so as to capture the user's front or side as fully as possible, allowing better recognition of motion intention or gait.

Further, in Step 4 the controller recognizes not only the user's motion intention and/or gait from the images but also the terrain the user currently occupies, and controls the exoskeleton according to both the intention/gait recognition result and the terrain recognition result, so that the exoskeleton moves in coordination with the user.

Compared with the prior art, the beneficial effects of the present invention are:

The camera and controller in the exoskeleton user's activity space are IoT devices already present in that space, for example a camera and controller originally serving surveillance purposes. That is, the invention uses existing external cameras and controllers, together with mature image recognition technology, to perform motion-intention and/or gait recognition for the exoskeleton used within the space. As a result, no separate recognition device needs to be fitted to the exoskeleton, which reduces its cost, and no additional camera or controller hardware has to be installed.

In addition, because mature image recognition technology performs the motion-intention and/or gait recognition, its accuracy is better than that of physical sensors and its response is faster in time: the user's intention can be inferred from the movements of other limbs before the exoskeleton-assisted limbs even move.

Brief Description of the Drawings

The drawings described here provide a further understanding of the application and form a part of it; the illustrative embodiments and their descriptions explain the application and do not improperly limit it. In the drawings:

Fig. 1 is a schematic diagram of the exoskeleton control system of the present invention.

Fig. 2 is a schematic diagram of the communication connection between the controller and the exoskeleton of the present invention.

Fig. 3 is a flow chart of the exoskeleton control method of the present invention.

Fig. 4 is a schematic structural diagram of the adsorption-type detection unit of the present invention.

Fig. 5 is a schematic diagram of the installation of the adsorption-type detection unit of the present invention.

Detailed Description

The embodiments of the present application are described in detail below with reference to the drawings and examples, so that the process by which the application applies technical means to solve technical problems and achieve technical effects can be fully understood and implemented.

Embodiment 1

As shown in Figs. 1 and 2, the IoT-based exoskeleton control system of this embodiment comprises a controller 2, at least one camera 1 and at least one exoskeleton 3. The controller is communicatively connected to the camera and to the exoskeleton. The camera is installed in the exoskeleton user's activity space, for example on a wall or on a column, and collects images of that space in real time. The activity space is the space occupied by the exoskeleton user, or a predefined space in which the user may move about, such as a home, hospital or rehabilitation center; it may be enclosed or open. The controller acquires and analyzes the images collected by the camera to recognize the user's motion intention and/or gait, and controls the exoskeleton according to the recognition result.

Motion intention can be recognized from the images in two ways. First, the user's body posture can be judged in real time by image recognition: for example, when the images show the user sitting but leaning the upper body forward, the controller concludes that the user intends to stand up and move forward. Second, preset body gestures can denote specific intentions: when the user makes a preset gesture, the controller recognizes the corresponding intention. For example, waving a palm forward may denote the intention to walk forward, and turning the head left or right twice may denote the intention to turn in that direction; when the controller recognizes one of these gestures in the camera images, it assigns the corresponding intention.

Gait can likewise be recognized either directly, by using image recognition to obtain the body's motion posture in real time (joint angles, stride length and so on), or via preset gaits stored in the controller, each with a preset stride length and cadence and selected by a preset gesture. For example, extending one finger may select the first preset gait and extending two fingers the second; the controller recognizes the gesture in the images to obtain the user's gait.

When performing motion-intention and/or gait recognition, if the controller detects a preset gesture it controls the exoskeleton according to the intention and/or gait assigned to that gesture; otherwise it detects the user's body posture and/or motion posture in real time by image recognition to identify the intention and/or gait and control the exoskeleton.

In other words, an external camera and controller perform the motion-intention and/or gait recognition, replacing the recognition devices mounted on traditional exoskeletons. The controller may also recognize the terrain the user currently occupies and control the exoskeleton according to both the intention/gait result and the terrain result. Controlling the exoskeleton by the recognition result can mean either that the controller supplies the user's motion intention and/or gait as an input signal to the exoskeleton's master control system, which then drives the exoskeleton, or that the controller drives the exoskeleton's power unit directly, in which case the controller also serves as the master control system. Controlling the exoskeleton's motion chiefly means controlling its power unit, including at least the magnitude of assistance, the speed of motion and the swing angle.
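The last sentence names three controlled quantities: assistance magnitude, motion speed and swing angle. A minimal sketch of turning a recognition result into a power-unit command is to clamp each requested value into the actuator's safe envelope; the limits and field names below are illustrative assumptions, not values from the patent.

```python
def actuator_command(assist_frac, speed_mps, swing_deg,
                     max_torque_nm=40.0, max_speed_mps=1.2, max_swing_deg=30.0):
    """Map a recognition result (requested assist fraction, walking speed,
    joint swing) to a clamped command for the exoskeleton power unit."""
    def clamp(v, lo, hi):
        return max(lo, min(hi, v))
    return {
        "torque_nm": clamp(assist_frac, 0.0, 1.0) * max_torque_nm,
        "speed_mps": clamp(speed_mps, 0.0, max_speed_mps),
        "swing_deg": clamp(swing_deg, -max_swing_deg, max_swing_deg),
    }
```

Clamping at this layer keeps the safety envelope in one place regardless of whether the command originates in the controller or in the exoskeleton's own master control system.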

As shown in Figs. 4 and 5, the adsorption-type detection unit 6 comprises a photosensitive point 1, an acceleration sensor 3, a gyroscope 4, a magnet 5 and a housing 2.

The photosensitive point 1 is fixed on top of the housing 2; the acceleration sensor 3 and gyroscope 4 are mounted inside the housing 2; the magnet 5 is set at the bottom of the housing 2. The detection units 6 are magnetically attached at the joints of the exoskeleton 7. The camera tracks the photosensitive points in real time, and the acceleration sensor and gyroscope are wirelessly connected to the controller.

The present invention targets exoskeletons used within a specific activity space (including but not limited to the user's home, a hospital or a rehabilitation center); it is generally assumed that the user will not leave that space wearing the device. While the exoskeleton is inside the space, the camera and controller there perform image recognition to obtain the user's motion intention and/or gait. Therefore no separate recognition device is needed on the exoskeleton, which reduces its manufacturing cost.

The exoskeleton carries an exoskeleton ID; the controller can read the ID of any working exoskeleton in the image and establish a communication connection with the corresponding device. The ID may be a serial number on the outside of the exoskeleton, a specific pattern, or any other mark that can identify the device.
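Once OCR or pattern matching has read the mark off the housing, establishing the connection reduces to a lookup in a table of paired devices. A minimal sketch, in which the registry contents and the normalization rule are illustrative assumptions:

```python
def resolve_exoskeleton(recognized_mark, registry):
    """Normalize the mark read from the image and return the paired device
    address for the short-range link, or None for an unknown device."""
    key = recognized_mark.strip().upper()
    return registry.get(key)

# Illustrative pairing table; a real controller would populate this during
# device enrollment.
PAIRED = {"EXO-07": "bt://aa:bb:cc:dd:ee:07"}
```

Returning None sends the method back to Step 1 rather than connecting to an unrecognized device.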

Further, if multiple exoskeletons appear in the image captured by the camera, the controller can determine separately for each one whether it is worn on a user, what that user's motion intention and/or gait is, and what its exoskeleton ID is.

The camera includes a driving device for rotating it in all directions. Because the user moves, the driving device can be actuated to keep the best shooting angle and make the camera follow the user. When several exoskeletons are working at the same time, the camera angle can be controlled so that all of them are captured simultaneously.

The controller is connected to the camera and to the exoskeleton by short-range communication, such as Bluetooth, Wi-Fi or ZigBee. Short-range communication guarantees that the controller, camera and exoskeleton remain close together, and it also bounds the user's range of movement, preventing the user from leaving the predefined activity space and encountering danger.
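The range-bounding property can be made explicit by estimating distance from received signal strength, using the standard log-distance path-loss model that is common in BLE ranging. The calibration values (transmit power at 1 m, path-loss exponent, 10 m limit) are illustrative assumptions, not values from the patent.

```python
def rssi_to_distance_m(rssi_dbm, tx_power_dbm=-59.0, path_loss_n=2.0):
    """Estimated distance in metres under the log-distance path-loss model:
    d = 10 ** ((txPower - RSSI) / (10 * n))."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10.0 * path_loss_n))

def out_of_range(rssi_dbm, limit_m=10.0):
    """True when the estimated distance exceeds the allowed activity radius,
    i.e. when the exoskeleton's reminder device should trigger."""
    return rssi_to_distance_m(rssi_dbm) > limit_m
```

RSSI-based distance is noisy in practice, so a deployment would smooth the readings and add hysteresis before raising the sound/light/vibration alert described below.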

A reminder device may also be provided on the exoskeleton: when the user moves beyond the short-range communication distance, i.e. out of the controller's communication range, the reminder device alerts the user or an attendant by sound, light and/or vibration.

Embodiment 2

As shown in Fig. 3, the IoT-based exoskeleton control method of this embodiment can use the exoskeleton control system of Embodiment 1 and comprises:

Step 1: at least one camera installed in the exoskeleton activity space collects images of the exoskeleton user's activity space in real time and transmits them to the controller.

Step 2: the controller analyzes the images collected by the camera and determines whether an exoskeleton device is present in the image; if so, proceed to Step 3; otherwise, return to Step 1.

Step 3: from the same images, further determine whether the exoskeleton is worn on a user; if so, recognize the exoskeleton ID on the device and have the controller establish a short-range communication connection with the exoskeleton corresponding to that ID.

Step 4: the controller recognizes the user's motion intention and/or gait from the images collected by the camera and controls the exoskeleton according to the recognition result, so that the exoskeleton moves in coordination with the user.

Further, the method includes Step 5: according to the position of the exoskeleton user in the image, the controller drives the camera in real time to adjust its angle and follow the user.

Following the exoskeleton user means keeping the user at the center of the image, or capturing the user's front or side as fully as possible. Keeping the user centered captures the most of the user's surroundings, helping the controller perceive the environment around the user. Capturing the front or side as fully as possible is preferred because motion intention and gait can be judged more accurately from those views, so that following mode yields better recognition.
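The centering behaviour can be sketched as proportional pan/tilt control driven by how far the user's bounding-box centre sits from the image centre. The gain and the sign convention (positive tilt = camera up) are illustrative assumptions about the camera's driving device.

```python
def follow_step(cx, cy, frame_w, frame_h, gain_deg=10.0):
    """Return (pan, tilt) increments in degrees that drive the tracked
    target centre (cx, cy), in pixels, toward the middle of the frame."""
    dx = (cx - frame_w / 2.0) / (frame_w / 2.0)   # -1 (left) .. +1 (right)
    dy = (cy - frame_h / 2.0) / (frame_h / 2.0)   # -1 (top)  .. +1 (bottom)
    return gain_deg * dx, -gain_deg * dy          # tilt up when target is high
```

Each camera frame yields one small correction, so the camera converges on the user without overshooting; the "front or side" mode would instead optimize the viewing angle using the recognized body orientation.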

Of course, if several exoskeletons are working at the same time, the camera may instead be controlled so that all of the working exoskeletons are captured simultaneously.

Further, in Step 4 the controller recognizes not only the user's motion intention and/or gait from the images but also the terrain the user currently occupies, and controls the exoskeleton according to both the intention/gait recognition result and the terrain recognition result, so that the exoskeleton moves in coordination with the user.

Combining terrain recognition into the control helps eliminate misjudgments on the one hand, and on the other hand allows the user's motion intention and/or gait to be predicted from the current terrain, assisting walking more effectively. Terrain recognition includes, but is not limited to, recognizing ground flatness, obstacles and stairs. On special terrain the user is likely to adopt a particular walking style: for example, on uneven ground (say, with a pit) or before an obstacle, the user can reasonably be expected to lengthen the stride to step over it or to walk around it, and this expectation lets the controller predict the user's motion intention and/or gait.
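The terrain-aware pre-adjustment described above can be illustrated with a simple lookup: shorten the stride and raise assistance on stairs or uneven ground, lengthen the stride to step over an obstacle. Every label and scale factor here is an assumption for the sketch; the patent only states that such a prediction is made.

```python
TERRAIN_RULES = {
    "flat":     {"stride_scale": 1.0, "assist_scale": 1.0},
    "uneven":   {"stride_scale": 0.8, "assist_scale": 1.2},
    "stairs":   {"stride_scale": 0.6, "assist_scale": 1.4},
    "obstacle": {"stride_scale": 1.3, "assist_scale": 1.2},  # step over it
}

def predict_gait(base_stride_m, terrain):
    """Predicted stride (m) and assistance scale for the recognized terrain;
    unknown labels fall back to the flat-ground rule."""
    rule = TERRAIN_RULES.get(terrain, TERRAIN_RULES["flat"])
    return base_stride_m * rule["stride_scale"], rule["assist_scale"]
```

Feeding this prediction into the Step 4 command both anticipates the user's adaptation and filters out intent misreadings that contradict the terrain.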

With this control scheme, no separate motion-intention and/or gait recognition device needs to be fitted to the exoskeleton, which reduces cost, and the camera and controller are existing IoT devices requiring no additional installation. Moreover, because mature image recognition technology performs the recognition, its accuracy is better than that of physical sensors and its response is faster in time.

Certain terms are used in the description and claims to refer to particular components. Hardware manufacturers may call the same component by different names; the description and claims distinguish components by function rather than by name. "Comprising", as used throughout, is an open term and should be read as "including but not limited to". "Approximately" means within an acceptable error range in which a person skilled in the art can solve the stated technical problem and substantially achieve the stated technical effect. "Coupled" includes any direct or indirect electrical coupling: a first device coupled to a second device may be electrically coupled to it directly, or indirectly through other devices or coupling means. The description that follows sets out preferred modes of implementing the application; it illustrates the general principles of the application and does not limit its scope, which is defined by the appended claims.

It should be noted that the terms "include", "comprise", and any variants thereof are intended to cover a non-exclusive inclusion, so that a product or system comprising a series of elements includes not only those elements but also other elements not expressly listed, or elements inherent to such a product or system. Absent further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of additional identical elements in the product or system that comprises it.

The above shows and describes several preferred embodiments of the present invention. As noted, however, the invention is not limited to the forms disclosed herein, and this disclosure should not be read as excluding other embodiments; the invention can be used in various other combinations, modifications and environments, and can be altered within the scope of the inventive concept described herein through the above teachings or through skill or knowledge in the relevant field. Alterations and changes made by those skilled in the art that do not depart from the spirit and scope of the invention shall all fall within the protection scope of the appended claims.

Claims (7)

1. An exoskeleton control system based on the Internet of things, comprising a controller, at least one camera, at least one exoskeleton and a plurality of adsorption-type detection units; each adsorption-type detection unit comprises a light-sensing point, an acceleration sensor, a gyroscope, a magnet and a housing; the light-sensing point is fixed at the top of the housing, the acceleration sensor and the gyroscope are arranged inside the housing, and the magnet is arranged at the bottom of the housing; the adsorption-type detection units are attached respectively to the head, shoulders, chest, abdomen, hand joints, hands, hip joints, knee joints and feet of the exoskeleton; the camera detects the light-sensing points in real time, and the acceleration sensor and the gyroscope are wirelessly connected to the controller; the outermost layer of the exoskeleton is made of magnetic material; the controller communicates with the camera and the exoskeleton, respectively; the camera is arranged in the movement space of the exoskeleton user and acquires image information of the light-sensing points in that space; the controller receives the image information of the light-sensing points and, combining it with the detection data of the acceleration sensors and gyroscopes, obtains the gait characteristics of the rehabilitation patient, the gait characteristics comprising stride, step frequency, a basic gait curve and the resistance coefficients of the different phases of the walking cycle in the active rehabilitation mode; the gait characteristics are compared with the patient's historical gait characteristic data prestored in the controller, the controller matches them against a rehabilitation recovery model or recovery curve, and sets the operating parameters of the exoskeleton robot for the next rehabilitation session; the camera comprises a driving device for driving the camera to rotate in all directions; when a plurality of exoskeletons are in the working state at the same time, the angle of the camera is controlled so that all exoskeletons in the working state are captured by the camera simultaneously; the controller also simultaneously identifies the terrain where the user is located, and controls the action of the exoskeleton according to the movement-intention and/or gait recognition result and the terrain recognition result.
2. The Internet of things-based exoskeleton control system of claim 1, wherein the system determines the ID of a patient or health care worker by face recognition.
3. The Internet of things-based exoskeleton control system of claim 1, wherein the controller communicates with the camera and the exoskeleton via close-range communication.
4. The Internet of things-based exoskeleton control system of claim 3, wherein the close-range communication means is Bluetooth, Wi-Fi or ZigBee.
5. A control method of the Internet of things-based exoskeleton control system of claim 1, comprising the steps of:
step 1: a camera arranged in the exoskeleton activity space acquires images of the exoskeleton user's activity space in real time and transmits them to the controller;
step 2: the controller recognizes the image acquired by the camera and judges whether a patient is present in the image; if yes, step 3 is executed; if not, the method returns to step 1;
step 3: further judging from the image whether the exoskeleton is worn by the user; if so, the exoskeleton ID arranged on the exoskeleton is identified, and the controller establishes a close-range communication connection with the exoskeleton corresponding to that exoskeleton ID;
step 4: issuing the operating parameters of the exoskeleton robot before rehabilitation, the operating parameters comprising stride, step frequency, a basic gait curve and the resistance coefficients of the different phases of the walking cycle in the active rehabilitation mode.
6. The Internet of things-based exoskeleton control method of claim 5, further comprising: step 5: the controller controls the camera driving device in real time according to the position of the exoskeleton user in the image, so as to adjust the camera angle and follow the patient.
7. The Internet of things-based exoskeleton control method of claim 6, wherein following the patient means keeping the exoskeleton user at the center of the image, or capturing the front or side of the exoskeleton user as far as possible, so as to better identify movement intention or gait.
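The camera-following behaviour of claims 6 and 7 (adjust the camera angle so the user stays near the image centre) can be sketched as a simple proportional controller on the user's pixel offset. The control law, gain, deadband and parameter names below are illustrative assumptions; the patent only states that the controller drives the camera in real time according to the user's position in the image.

```python
def pan_correction(user_x_px, frame_width_px, fov_deg, gain=0.5, deadband_px=20):
    """Return a pan adjustment in degrees that nudges the camera so the
    exoskeleton user drifts back toward the horizontal image centre.

    Hypothetical sketch: gain, deadband and the small-angle pixel-to-degree
    conversion are assumptions, not taken from the patent.
    """
    offset_px = user_x_px - frame_width_px / 2
    if abs(offset_px) <= deadband_px:
        return 0.0                          # close enough: avoid jitter
    deg_per_px = fov_deg / frame_width_px   # small-angle approximation
    return gain * offset_px * deg_per_px
```

The same law applied to the vertical offset would give the tilt correction; running both each frame implements the "follow the patient" loop of step 5.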
CN202110927884.5A 2021-08-12 2021-08-12 Exoskeleton control system and method based on Internet of things Active CN113681541B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110927884.5A CN113681541B (en) 2021-08-12 2021-08-12 Exoskeleton control system and method based on Internet of things


Publications (2)

Publication Number Publication Date
CN113681541A CN113681541A (en) 2021-11-23
CN113681541B true CN113681541B (en) 2022-11-25

Family

ID=78579690

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110927884.5A Active CN113681541B (en) 2021-08-12 2021-08-12 Exoskeleton control system and method based on Internet of things

Country Status (1)

Country Link
CN (1) CN113681541B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114663775B (en) * 2022-05-26 2022-08-12 河北工业大学 Method for identifying stairs in exoskeleton robot service environment
CN115070732B (en) * 2022-06-30 2025-01-17 中国农业科学院都市农业研究所 Machine vision agricultural exoskeleton robot detector device and method

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111930135A (en) * 2020-08-12 2020-11-13 深圳航天科技创新研究院 Active power assist control method, device and exoskeleton robot based on terrain judgment

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102172954B1 (en) * 2013-11-08 2020-11-02 삼성전자주식회사 A walk-assistive robot and a method for controlling the walk-assistive robot
CN110072678A (en) * 2016-09-14 2019-07-30 奥尔堡大学 The mankind for moving auxiliary are intended to detection system
CN112223253B (en) * 2019-07-15 2022-08-02 上海中研久弋科技有限公司 Exoskeleton system, exoskeleton identification control method, electronic device and storage medium
CN110524525B (en) * 2019-10-05 2022-04-26 河北工业大学 A kind of lower limb exoskeleton control method
CN112669964A (en) * 2019-10-16 2021-04-16 深圳市迈步机器人科技有限公司 Power exoskeleton and rehabilitation evaluation method based on same
CN111631923A (en) * 2020-06-02 2020-09-08 中国科学技术大学先进技术研究院 Neural Network Control System of Exoskeleton Robot Based on Intention Recognition
CN113063411A (en) * 2020-06-29 2021-07-02 河北工业大学 Exoskeleton evaluation system and method of use

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111930135A (en) * 2020-08-12 2020-11-13 深圳航天科技创新研究院 Active power assist control method, device and exoskeleton robot based on terrain judgment


Similar Documents

Publication Publication Date Title
US12112541B2 (en) Bed system
US11389686B2 (en) Robotically assisted ankle rehabilitation systems, apparatuses, and methods thereof
Paulo et al. ISR-AIWALKER: Robotic walker for intuitive and safe mobility assistance and gait analysis
US11150081B2 (en) Thermal sensor position detecting device
CN113681541B (en) Exoskeleton control system and method based on Internet of things
CN111278398A (en) Semi-supervised intent recognition system and method
WO2020140271A1 (en) Method and apparatus for controlling mobile robot, mobile robot, and storage medium
JP2011516915A (en) Motion content-based learning apparatus and method
JP5186723B2 (en) Communication robot system and communication robot gaze control method
WO2018084170A1 (en) Autonomous robot that identifies persons
TWM644361U (en) Haptic guiding system
CN108042142A (en) A kind of wearable human body attitude detection and myodynamia measuring system
CN105014676A (en) Robot motion control method
KR101398880B1 (en) Wearable robot with humanoid function and control method of the same
KR101847918B1 (en) Rehabilitation method and system for using motion sensing band
KR20200042265A (en) Robot control system and robot control method using the same
CN111401334B (en) Non-contact mapping type action recognition equipment and method by using sensor
JP7678473B2 (en) robot
JP4839939B2 (en) Autonomous mobile device
CN113243906A (en) Motion monitoring and analysis system and method
KR20220095235A (en) automatic placement of masks
JP4878462B2 (en) Communication robot
AU2018236851B2 (en) System and method for controlling a prosthetic device
KR20160012393A (en) Apparatus and method for generating control signal based on finger gesture
WO2019075926A1 (en) Tactile sensation self-adaptive massage robot and control method therefor

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
PE01 Entry into force of the registration of the contract for pledge of patent right

Denomination of invention: An Exoskeleton Control System and Method Based on the Internet of Things

Granted publication date: 20221125

Pledgee: Hangzhou High-tech Financing Guarantee Co.,Ltd.

Pledgor: HANGZHOU CHENGTIAN TECHNOLOGY DEVELOPMENT Co.,Ltd.

Registration number: Y2024980003981

PC01 Cancellation of the registration of the contract for pledge of patent right

Granted publication date: 20221125

Pledgee: Hangzhou High-tech Financing Guarantee Co.,Ltd.

Pledgor: HANGZHOU CHENGTIAN TECHNOLOGY DEVELOPMENT Co.,Ltd.

Registration number: Y2024980003981

EE01 Entry into force of recordation of patent licensing contract

Application publication date: 20211123

Assignee: Hangzhou Jintou Finance Leasing Co.,Ltd.

Assignor: HANGZHOU CHENGTIAN TECHNOLOGY DEVELOPMENT Co.,Ltd.

Contract record no.: X2025330000460

Denomination of invention: An Exoskeleton Control System and Method Based on the Internet of Things

Granted publication date: 20221125

License type: Common License

Record date: 20251127

PE01 Entry into force of the registration of the contract for pledge of patent right

Denomination of invention: An Exoskeleton Control System and Method Based on the Internet of Things

Granted publication date: 20221125

Pledgee: Hangzhou Jintou Finance Leasing Co.,Ltd.

Pledgor: HANGZHOU CHENGTIAN TECHNOLOGY DEVELOPMENT Co.,Ltd.

Registration number: Y2025330001629