WO2022073467A1 - Escort massage dual-arm multi-task parallel processing robot device - Google Patents
- Publication number
- WO2022073467A1 (PCT/CN2021/122528)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- massage
- module
- robot
- claw
- arm
- Prior art date
Classifications
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61H—PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
- A61H7/00—Devices for suction-kneading massage; Devices for massaging the skin by rubbing or brushing not otherwise provided for
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J11/00—Manipulators not otherwise provided for
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10L—SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
- G10L15/00—Speech recognition
Definitions
- The invention belongs to the field of artificial-intelligence robotics and relates to robot technology and artificial-intelligence image acquisition and recognition.
- Multi-sensing devices collect muscle information, gravity, pressure, and other sensor information.
- A client terminal connects to the robot controller to collect object and facial image information and to drive self-service massage and object fetching, for use in families, hospitals, nursing homes, nursing institutions, hotels, and other service settings.
- Through an intelligent voice-chat, massage, fetching, and escort service robot design, the invention reduces human error, realizes autonomous escort, massage, fetching, and dual-arm service, and achieves multi-task parallel processing.
- The purpose of the present invention is to overcome the above shortcomings of the prior art and to provide an escort massage dual-arm multi-task parallel processing robot device that collects sensory-nerve and motor-nerve information together with multi-sensor information.
- The voice module provides voice interaction between the main control system and the user, with human-like voice chat.
- The camera vision module collects face images and images of common objects in the escort environment and recognizes the object images. A multimedia display screen supports robot-human interaction, multimedia playback, Internet browsing, and Internet communication.
- The robot arm carries a massage claw with a built-in infrared generator and vibration generator, a retractable soft-finger kneading massage head, a bimodal sliding peristaltic massage head, a vibrating airbag hammer massage head, and a plush-material massage palm.
- The machine claw, robot arm, vision module, radar, and mobile base provide autonomous positioning, navigation, and movement: the robot moves to an object's position, picks it up with the robot claw, and sorts and tidies it.
- The present invention provides a voice device for human-computer interaction and remote voice commands, an autonomous mobile device, a robotic-arm massage device, and a robotic-arm fetching device for remote and autonomous escort, massage, fetching, and service.
- An escort massage dual-arm multi-task parallel processing robot device includes:
- A robot controller, connected to the robot arm module, the multi-sensing device, the electromyography (EMG) acquisition device, and the massage device, used to control the robot.
- A voice module, connected to the robot controller, used for voice interaction between the main control system and the user, with human-like voice chat.
- A camera vision module, connected to the robot controller and the robot arm, used to collect face images and images of common objects in the escort environment. Face images assist in detecting body parts and locating the face, chest, back, legs, arms, waist, hands, feet, and joints. Object images are recognized to assist the robot claw in picking up objects. Video is captured in real time to monitor the escort environment.
- A multimedia display screen module, connected to the robot controller by an application connector. It can be detached from the robot body, is portable, communicates with the robot over Bluetooth or Wi-Fi, and is used for multimedia playback, web-page display, Internet browsing, and communication.
- A robot arm module, connected to the robot controller, the massage claw, and the robot claw. The massage claw massages the upper limbs, chest, back, waist, hands, feet, and joints; the robot claw picks up objects.
- A robot claw picking module, connected to the robot controller and the robot arm, used for picking up objects.
- A massage claw module, connected to the robot controller and the robot arm. The massage claw and robot arm have a detachable plug-in design: the claw is portable and communicates with the robot arm through a Wi-Fi/Bluetooth module and a connection plug. The massage claw includes a retractable soft-finger kneading massage head, a bimodal sliding peristaltic massage head, a vibrating airbag hammer massage head, and a plush-material massage palm, used for imitation-hand kneading massage, rolling peristaltic massage, hammer massage, and plush-material stroking massage.
- An EMG acquisition device, connected to the robot controller, used to collect muscle information, muscle state, and sensory-nerve and motor-nerve information.
- A multi-sensing information acquisition module, connected to the robot controller, used to collect pressure, gravity, and other sensor information.
- An infrared module, connected to the robot controller and the robot claw, used to emit infrared light; it is the massage claw's infrared generator.
- A vibrator, connected to the robot controller and the robot claw, used for massage vibration; it is the massage claw's vibration generator.
- A radar autonomous movement module, connected to the robot controller and the mobile base, used for autonomous positioning and navigation.
- The robot controller is connected to the voice module, which handles interaction between the robot and the user, including voice recognition, voice-text interconversion, voice guidance, voice commands, voice companionship, and voice medical question-and-answer. The user can access video of the escort environment through a cloud client, and the voice module supports remote voice communication.
- The robot controller is connected to the camera vision acquisition module, which collects face images; identifies the face, legs, arms, waist, and joints; returns the position information of these body parts; and locates the positions of the legs, arms, waist, and joints.
- It also collects object images and applies the object recognition module to identify objects in the escort environment.
- Through image-parameter settings, an improved machine-learning method classifies people, items, and equipment by combined color, shape, and outline features; the vision module uses this improved machine-learning method together with a deep neural network to collect images of common objects in the escort environment, recognize them, and assist the robot claw in fetching. It intelligently identifies colors, numbers, letters, text, and special signs; recognizes people, items, and equipment; and feeds back their location information.
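The description names only "comprehensive features such as color, shape, and outline" plus an "improved machine learning method" without disclosing either. As a purely illustrative sketch under those assumptions, a feature vector combining a mean-color descriptor with crude shape and contour proxies, fed to a nearest-centroid classifier, could look like this (all descriptor choices are hypothetical, not from the patent):

```python
import numpy as np

def comprehensive_features(img):
    """Concatenate simple color, shape, and contour proxies for one image.

    img: H x W x 3 uint8 array. Illustrative stand-in for the patent's
    'color, shape, outline' comprehensive features.
    """
    color = img.reshape(-1, 3).mean(axis=0) / 255.0            # mean RGB
    mask = img.mean(axis=2) > 32                               # crude foreground mask
    area = mask.mean()                                         # shape proxy: fill ratio
    edges = np.abs(np.diff(mask.astype(float), axis=0)).sum()  # contour proxy
    return np.concatenate([color, [area, edges / mask.size]])

class NearestCentroid:
    """Minimal classifier standing in for the undisclosed 'improved method'."""
    def fit(self, X, y):
        self.labels_ = sorted(set(y))
        self.centroids_ = {c: np.mean([x for x, t in zip(X, y) if t == c], axis=0)
                           for c in self.labels_}
        return self
    def predict(self, x):
        return min(self.labels_, key=lambda c: np.linalg.norm(x - self.centroids_[c]))

def make(color):
    """Toy solid-color 8x8 'object image' used only for this demonstration."""
    img = np.zeros((8, 8, 3), np.uint8)
    img[...] = color
    return img

# Train on reddish "cup" patches vs bluish "book" patches, then classify.
X = [comprehensive_features(make(c)) for c in [(200, 30, 30)] * 3 + [(30, 30, 200)] * 3]
y = ["cup"] * 3 + ["book"] * 3
clf = NearestCentroid().fit(X, y)
print(clf.predict(comprehensive_features(make((190, 40, 40)))))  # → cup
```

A production system would replace these toy descriptors with learned features, but the fit/predict structure of the classification step stays the same.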
- The EMG acquisition module is connected to the robot controller and the robot arm and collects muscle information from the chest, waist, back, upper limbs, lower limbs, hands, and feet, including muscle contraction mode, static and dynamic force state, muscle-fatigue state, sensory-nerve and motor-nerve conduction, repetitive electrical stimulation, motor-unit number estimation, and sympathetic skin response. A deep-learning algorithm autonomously adjusts training intensity, training period, and number of training repetitions.
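The claims name only "a deep learning algorithm" for adapting the training parameters, with no concrete rule. The following toy stand-in shows the shape of such an adjustment, using a hypothetical scalar fatigue estimate derived from EMG (all thresholds and step sizes are illustrative):

```python
def adjust_training(fatigue, intensity, period_s):
    """Toy fatigue-driven adjustment of training parameters.

    fatigue: hypothetical EMG-derived estimate in [0, 1] (0 = fresh, 1 = exhausted).
    intensity: current training intensity in [0, 1].
    period_s: current rest/training period in seconds.
    """
    if fatigue > 0.7:                    # back off when EMG indicates muscle fatigue
        return max(0.1, intensity - 0.2), period_s + 60
    if fatigue < 0.3:                    # ramp up while the muscle is fresh
        return min(1.0, intensity + 0.1), period_s
    return intensity, period_s           # hold steady in the middle band

print(adjust_training(0.8, 0.5, 300))    # fatigued: lower intensity, longer period
```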
- The multi-sensing information acquisition module is connected to the robot controller and the robot arm and collects gravity, pressure, and direction information. Based on the returned readings, the main system communicates with the sensors and adjusts the parameter values of the robot arm and the gravity device.
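The sensor-driven adjustment described above is a closed feedback loop: the arm's commanded parameters are nudged toward a setpoint using the returned pressure readings. A minimal sketch, with gain and step limit chosen purely for illustration (the patent specifies no control law):

```python
def adjust_pressure(current, target, gain=0.5, max_step=0.2):
    """One feedback step: move commanded massage pressure toward the
    pressure-sensor target. Gain and step limit are illustrative."""
    step = gain * (target - current)
    step = max(-max_step, min(max_step, step))   # rate-limit the change
    return current + step

p = 1.0               # commanded pressure (arbitrary units)
for _ in range(20):   # iterate toward a 2.0 setpoint from sensor feedback
    p = adjust_pressure(p, 2.0)
print(round(p, 3))
```

The rate limit matters for a device in contact with a person: even a large sensed error only changes the applied force gradually.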
- The multimedia LCD module is connected to the robot controller by an application connector. It can be detached from the robot body, is portable, and communicates with the robot over Bluetooth and Wi-Fi. Through the user interface of the robot system it controls robot massage, music, and voice dialogue, supports human-like interactive chat and Internet-based online learning, and is used to play multimedia and display network information.
- The robot controller is connected to the radar, the camera, and the mobile base.
- Information collected by the radar is sent to the main-system client through message- and service-based communication to build a map of the scene.
- The main system communicates with the mobile chassis: it publishes the created map information to the chassis nodes, which accept the map and realize autonomous navigation.
- Image information collected by the camera is sent to the main-system client through service-based communication and relayed to the robot arm to realize action planning.
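The "message and service" wording above matches the publish/subscribe conventions of ROS-style middleware, though the patent does not name a framework. The following self-contained sketch (an in-process bus, no ROS dependency; the mapping rule is a toy) shows the radar-to-map-to-chassis flow:

```python
class Bus:
    """Minimal in-process stand-in for topic-based message communication."""
    def __init__(self):
        self.subs = {}
    def subscribe(self, topic, cb):
        self.subs.setdefault(topic, []).append(cb)
    def publish(self, topic, msg):
        for cb in self.subs.get(topic, []):
            cb(msg)

bus = Bus()
chassis_maps = []

# Mobile-chassis node: accepts published map info for autonomous navigation.
bus.subscribe("map", chassis_maps.append)

def on_radar_scan(scan):
    # Toy "mapping": mark ranges closer than 1.0 m as occupied cells,
    # then publish the resulting grid as the created map information.
    grid = [1 if r < 1.0 else 0 for r in scan]
    bus.publish("map", grid)

# Main system: subscribes to radar scans and republishes the built map.
bus.subscribe("radar", on_radar_scan)
bus.publish("radar", [0.4, 2.5, 0.8])
print(chassis_maps[-1])  # → [1, 0, 1]
```

In a real deployment the radar scan, occupancy grid, and navigation stack would each be their own node; the decoupling shown here is the point of the pattern.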
- The robot controller is connected to the camera module and the massage device; the massage device includes the robot arm module and the massage claw module.
- The robot arm module is connected to the robot controller and the massage claws and massages the upper limbs, waist, and lower limbs.
- The massage claw module includes a retractable soft-finger kneading massage head, a bimodal sliding peristaltic massage head, a vibrating airbag hammer massage head, and a plush-material massage palm, for imitation-hand kneading massage, rolling peristaltic massage, hammer massage, and plush-material stroking massage.
- The robot controller, robot arm, and vision module are connected; the camera publishes visual information on the body parts and returns position information for the chest, waist, back, upper limbs, lower limbs, hands, feet, and joints.
- The main system and the massage device receive the visual information, detect the face and body parts, and precisely locate each position on the body.
- Through the user interface, the user selects the body massage range (chest, waist, back, upper limbs, lower limbs, hands, feet, joints), delineates the massage position, chooses among massage techniques (rubbing, plucking, pushing, pressing, point-pressing, pinching, patting, striking, bouncing, rolling, palm-kneading, finger-kneading, shaking, holding, vibration, and infrared massage), and sets massage parameters for frequency, intensity, and applied force. The robot then positions itself, moves to the body massage position, and follows the planned sequence of movements to provide massage companionship, relieve fatigue, repair muscle tissue, and reduce fat.
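The user-interface selection above amounts to expanding chosen (region, technique, frequency, intensity) tuples into an ordered action plan, with each technique routed to one of the named massage-claw actuators. A hypothetical sketch (the mapping, field names, and parameter values are illustrative, not disclosed in the patent):

```python
# Hypothetical routing from selected massage technique to the actuator
# named in the description; keys and labels are illustrative.
ACTUATOR = {
    "kneading": "soft-finger kneading head",
    "rolling":  "bimodal sliding peristaltic head",
    "hammer":   "vibrating airbag hammer head",
    "stroking": "plush massage palm",
}

def plan_session(selection):
    """Expand UI selections into an ordered per-region action plan."""
    return [{"region": region, "method": method, "actuator": ACTUATOR[method],
             "frequency_hz": freq, "intensity": intensity}
            for region, method, freq, intensity in selection]

plan = plan_session([("waist", "kneading", 2.0, 0.6),
                     ("back",  "rolling",  1.5, 0.4)])
print(plan[0]["actuator"])  # → soft-finger kneading head
```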
- The robot controller is connected to the camera module and the picking device; the picking device includes the robot arm module and the machine picking-claw module.
- The robot arm module is connected to the robot controller and the machine picking claw for picking up objects. Objects in the escort environment are picked and placed through the user interface or by voice selection.
- The visual information published by the camera includes object information and environmental scene information from the escort environment. The robot main controller and the machine claw device receive the visual information, identify the object, and return its position; the robot arm then moves to that position.
- Through this escort massage dual-arm multi-task parallel processing robot device, the invention provides a voice device for human-computer interaction and remote voice commands, an autonomous moving device, and robot-arm-mounted massage-claw and picking-claw devices for remote and autonomous massage and fetching.
- It eases the high pressure and complexity of escort work.
- The massage intensity and the movement parameters over the muscle tissue can be adjusted adaptively, which greatly improves work efficiency and reduces the cost of human escort.
- Efficient escort can thus be realized, assisting nursing in hospitals, nursing homes, and nursing institutions, as well as accompaniment in families, hotels, and other service settings.
- Fig. 1 is a schematic diagram of the robot device modules in the specification of this application. Fig. 1 is labeled:
- 100 remote control device; 101 robot controller; 102 voice module; 103 radar autonomous movement module; 104 camera vision module; 105 machine claw picking module; 106 massage claw module; 107 machine arm module; 108 multi-sensing information acquisition module; 109 multimedia display module; 110 myoelectric acquisition device.
- Fig. 2 is a schematic diagram of the composition of the robot device in the description of the application.
- The purpose of the present invention is to design a remotely controllable robot that replaces human labor: it realizes remote image acquisition by the machine and effectively solves autonomous and remote nursing, accompaniment imaging, voice communication, video communication, massage, and object moving.
- Using artificial-intelligence robot technology from the automation field, the machine's movement, positioning, and navigation are controlled remotely and autonomously, and the robotic arms retrieve objects.
- An escort massage dual-arm multi-task parallel processing robot device includes:
- The robot controller 101, connected to the robot arm module 107, the multi-sensing device 108, the myoelectric acquisition device 110, and the massage device, used to control the robot.
- The voice module 102, connected to the robot controller 101, used for voice interaction between the main control system and the user, with human-like voice chat.
- The camera vision module 104, connected to the robot controller 101 and the robot arm 107, collects face images and common object images in the escort environment. Face images assist in detecting body parts and locating the face, chest, back, legs, arms, waist, hands, feet, and joints. Object images are recognized to assist the robot claw in picking up objects. Video is captured in real time to monitor the escort environment.
- The multimedia display screen module 109, connected to the robot controller 101 by an application connector, can be detached from the robot body, is portable, communicates with the robot over Bluetooth or Wi-Fi, and is used for multimedia playback, web-page display, Internet browsing, and communication.
- The robot arm module 107, connected to the robot controller 101, the massage claw 106, and the machine claw 105. The massage claw 106 massages the upper limbs, waist, and lower limbs; the machine claw 105 picks up objects.
- The robot claw picking module 105, connected to the robot controller 101 and the robot arm 107, used for picking up objects.
- The massage claw module 106, connected to the robot controller 101 and the robot arm 107. The massage claw and robot arm 107 have a detachable plug-in design, are portable, and communicate through the Wi-Fi/Bluetooth module and connection plug. The massage claw 106 includes a retractable soft-finger kneading massage head, a bimodal sliding peristaltic massage head, a vibrating airbag hammer massage head, and a plush-material massage palm, for imitation-hand kneading massage, rolling peristaltic massage, hammer massage, and plush-material stroking massage.
- The electromyography acquisition device 110, connected to the robot controller 101, collects muscle information, muscle state, and sensory-nerve and motor-nerve information.
- The multi-sensing information collection module 108, connected to the robot controller 101, collects pressure, gravity, and other sensor information.
- The infrared module 208, connected to the robot controller 101 and the machine claw 105, emits infrared light and serves as the infrared generator of the massage claw 106.
- The vibrator 209, connected to the robot controller 101 and the machine claw 105, provides massage vibration and serves as the vibration generator of the massage claw 106.
- The radar autonomous movement module 103, connected to the robot controller 101 and the mobile base 210, provides autonomous positioning and navigation.
- An administrator or other managing user communicates with the main control system 101 through the remote control module 100: the remote end sends control commands, the main control system 101 communicates with the robotic arm 107, and the camera 214 collects face images.
- The remote control module 100 remotely controls robotic arm 107A; robotic arm 107B moves to the position of the body part, collects body pictures, detects the body part, and assists positioning.
- Images of common objects in the escort environment are collected and recognized to assist the robot claw in picking up objects. Video is collected in real time to monitor the escort environment, intelligently identifying colors, numbers, letters, characters, and special signs, and feeding back the location information of people, items, and equipment.
- The voice device 216 includes a sound-pickup device, a microphone, and a speaker. It captures voice information, accepts user voice input through the microphone, and interacts with the user through the speaker, supporting voice guidance, text-voice interconversion, speech synthesis, and voice wake-up.
- The radar autonomous positioning and navigation module communicates with the mobile chassis 210 and the radar. Information collected by the radar is sent to the main system through message- and service-based communication to build a map of the scene. The created map information is published to the mobile chassis node, which accepts it and realizes autonomous navigation.
- The body massage range is selected through the user interface, covering the chest, waist, back, upper limbs, lower limbs, hands, feet, and joints.
- At the massage location, the user chooses among techniques such as rubbing, plucking, pushing, pressing, point-pressing, pinching, patting, striking, bouncing, rolling, palm-kneading, finger-kneading, shaking, and holding.
- The main control system 101 communicates with the camera 214, which collects images of the face, chest, back, legs, arms, waist, hands, feet, and joints, locates them, and returns the image position information of each body part.
- Robot arm 211A carries the massage devices: 201, 202, and 203, retractable soft-finger kneading massage heads; 204 and 205, bimodal sliding rolling massage heads; 206, a plush-material massage palm; 207, a vibrating airbag hammer massage head; 208, an infrared generator; and 209, a vibration massager.
- The target massage area on the body (face, chest, back, legs, arms, waist, hands, feet, joints) is located and robot arm 211A moves to it. The system publishes the selected massage technique (rubbing, plucking, pushing, pressing, point-pressing, pinching, patting, striking, bouncing, rolling, palm-kneading, finger-kneading, shaking, or holding) together with frequency, strength, gravity, pressure, and distance information. Based on the information published by the myoelectric acquisition device 206, the gravity sensor 208, and the pressure sensor 209, the robotic arm receives and subscribes to the muscle, gravity, pressure, and multi-sensing information, and the massage device executes the action planned for the received massage mode, frequency, and intensity. For example, if the user selects a pinching action, the soft-finger kneading massage heads 201, 202, and 203 complete it according to the planned hand-pinch motion.
- The bimodal sliding rolling massage heads 204 and 205 complete sliding and rolling according to the planned bimodal action. If the user chooses stroking or hammering, the plush-material massage palm 206 strokes and the vibrating airbag hammer massage head 207 hammers according to their planned motions.
- Rubbing, plucking, pushing, pressing, point-pressing, pinching, patting, striking, bouncing, rolling, palm-kneading, finger-kneading, shaking, and holding form the repertoire of planned massage movements.
- The planned movements complete the massage and rehabilitation actions.
- Objects in the escort environment are selected for pick-and-place through the user interface or by voice.
- The visual information published by the camera includes object information and environmental scene information. The robot main controller and the machine claw device receive the visual information, identify the object, and move the robot to the placement area according to the returned object position.
- The vision camera 214 collects object images and applies the object recognition module to identify and confirm the objects to pick and place.
- Robot arm 211B and robot claw 217 are planned and configured according to joint angles, joint limits, specified joint shapes, joint trajectory positions, velocity components, joint velocities, motion constraints, and the target trajectory and speed, and the planned trajectory is executed. Pose parameters are set for the objects the target pose can pick up, and grasping, pick-and-place, and grasp-pose parameters are set and matched to the target pose for robot arm 211B and robot claw 217.
- Robot arm 211B and picking claw 217 are initialized, and the grasp, object position, grasping gesture, picking area, target position, and storage-box placement position are set.
- Robot arm 211B and picking claw 217 grasp the object (initializing the grasp target, creating the claw's open and closed postures, and setting the retreat parameters), move it, and place it in storage box 218.
- Robot arm 211B then grasps, organizes, and places items from storage box 218 and moves them to the placement area 1000 required by the user.
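The initialize, locate, grasp, move, and place sequence for arm 211B and claw 217 described above can be outlined as a simple ordered plan. This is an illustrative sketch only; the real device would drive joint trajectories through its motion planner, and the pose tuples here are placeholders:

```python
def pick_and_place(object_pose, box_pose):
    """Return the ordered step plan for one fetch cycle.

    object_pose, box_pose: placeholder (x, y, z) positions in meters,
    standing in for the located object and the storage-box placement pose.
    """
    return [
        ("init",  "open claw"),     # initialize arm and picking claw
        ("move",  object_pose),     # arm moves to the located object
        ("grasp", "close claw"),    # close the claw to the grasp pose
        ("move",  box_pose),        # carry the object to the storage box
        ("place", "open claw"),     # release into the box
    ]

steps = pick_and_place((0.3, 0.1, 0.05), (0.0, -0.4, 0.2))
print([name for name, _ in steps])  # → ['init', 'move', 'grasp', 'move', 'place']
```

Separating the plan (this list) from execution is the usual design: each step can be validated against joint limits and motion constraints before the arm moves.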
Claims (8)
- An escort massage dual-arm multi-task parallel processing robot device, characterized in that it comprises: a robot controller, connected to the robot arm module, the multi-sensing device, the electromyography acquisition device, and the massage device, used to control the robot; a voice module, connected to the robot controller, used for voice interaction between the robot main system and the user, with human-like voice chat; a camera vision module, connected to the robot controller and the robot arm, used to collect face images and images of common objects in the escort environment, the face images assisting in detecting body parts and locating the face, chest, back, legs, arms, waist, hands, feet, and joints, the object images being recognized to assist the machine claw in fetching objects, and video being collected in real time to monitor the escort environment; a multimedia display screen module, connected to the robot controller by an application connector, detachable from the robot body, portable, communicating with the robot over Bluetooth and Wi-Fi, used for multimedia playback, web-page display, Internet browsing, and communication; a robot arm module, connected to the robot controller, the massage claw, and the machine claw, the massage claw massaging the upper limbs, waist, and lower limbs and the machine claw fetching objects; a machine claw fetching module, connected to the robot controller and the robot arm, used for fetching objects; a massage claw module, connected to the robot controller and the robot arm, the massage claw and robot arm being of detachable plug-in design, portable, and communicating with the robot arm through a Wi-Fi/Bluetooth module and a connection plug, the massage claw comprising a retractable soft-finger kneading massage head, a bimodal sliding peristaltic massage head, an airbag hammer massage head, and a plush-material massage palm, for imitation-hand kneading massage, rolling peristaltic massage, hammer massage, and plush-material stroking massage; an electromyography acquisition device, connected to the robot controller, used to collect muscle information, muscle state, and sensory-nerve and motor-nerve information; a multi-sensing information acquisition module, connected to the robot controller, used to collect pressure, gravity, and other sensor information; an infrared module, connected to the robot controller and the machine claw, used to emit infrared light, serving as the massage claw's infrared generator; a vibrator, connected to the robot controller and the machine claw, used for massage vibration, serving as the massage claw's vibration generator; and a radar autonomous movement module, connected to the robot controller and the mobile base, used for autonomous positioning and navigation.
- An escort massage dual-arm multi-task parallel processing robot device, characterized in that the voice module is connected to the robot controller and is used for interaction between the robot and the user, including voice recognition, voice-text interconversion, voice guidance, voice commands, voice companionship, and voice medical question-and-answer; video of the escort environment is accessed through the user's cloud client, and the voice module supports remote voice communication.
- An escort massage dual-arm multi-task parallel processing robot device, characterized in that the camera vision acquisition module is connected to the robot controller and is used to collect face images; identify the face, legs, arms, waist, and joints; return the position information of the body parts; and locate the positions of the legs, arms, waist, and joints. It collects object images and applies the object recognition module to identify objects near the escort environment; through image-parameter settings, it classifies people, items, and equipment by the combined color, shape, and outline features of an improved machine-learning method; using the improved machine-learning method and a deep neural network within the vision module, it collects and recognizes images of common objects in the escort environment, assists the machine claw in fetching objects, intelligently identifies colors, numbers, letters, text, and special signs, recognizes people, items, and equipment, and feeds back their location information; it collects video of the escort environment in real time, monitors the escort environment, and gives the user's cloud client real-time access to the escort-environment video.
- An escort massage dual-arm multi-task parallel processing robot device, characterized in that the electromyography acquisition module is connected to the robot controller and the robot arm and is used to collect muscle information of the chest, waist, back, upper limbs, lower limbs, hands, and feet, including muscle contraction mode, static and dynamic force state, muscle-fatigue state, sensory-nerve and motor-nerve conduction, repetitive electrical stimulation, motor-unit number estimation, and sympathetic skin response, with a deep-learning algorithm autonomously adjusting training intensity, training period, and number of training sessions.
- An escort massage dual-arm multi-task parallel processing robot device, characterized in that the multi-sensing information acquisition module is connected to the robot controller and the robot arm and is used to collect gravity, pressure, and direction information; based on the returned information, the robot controller communicates with the multiple sensors and, according to the multi-sensing information it receives, adjusts the parameter values of the robot arm and the gravity device.
- An escort massage dual-arm multi-task parallel processing robot device, characterized in that the multimedia LCD module is connected to the robot controller by an application connector, can be detached from the robot body, is portable, and communicates with the robot over Bluetooth and Wi-Fi; through the user interface of the robot system it controls robot massage, music, voice dialogue, human-like interactive chat, and Internet-based online learning, and it is used to play multimedia and display network information.
- An escort massage dual-arm multi-task parallel processing robot device, characterized in that the robot controller is connected to the radar, the camera, and the mobile base; information collected by the radar is sent to the robot main system through message- and service-based communication to build a map of the scene; the robot main system communicates with the mobile chassis by publishing the created map information and communicating with the chassis nodes, which accept the map information and realize autonomous navigation; and image information collected by the camera is sent through service-based communication to the robot main-system client, which communicates with the robot arm to realize action planning.
- An escort massage dual-arm multi-task parallel processing robot device, characterized in that the massage device is connected to the robot controller and the camera module and comprises a robot arm module and a massage claw module; the robot arm module is connected to the robot controller and the massage claw and massages the upper limbs, waist, and lower limbs; the massage claw module comprises a retractable soft-finger kneading massage head, a bimodal sliding peristaltic massage head, a vibrating airbag hammer massage head, and a plush-material massage palm, for imitation-hand kneading massage, rolling peristaltic massage, hammer massage, and plush-material stroking massage; the robot controller, robot arm, and vision module are connected; the camera publishes visual information on the body parts and returns position information for the chest, waist, back, upper limbs, lower limbs, hands, feet, and joints; the main system and the massage device receive the visual information, detect the face and body parts, and accurately locate each position on the body; through the user interface, the body massage range is selected (chest, waist, back, upper limbs, lower limbs, hands, feet, joints), the massage position is delineated, the kneading, pinching, stroking, hammering, sliding, peristaltic, vibration, or infrared massage mode is chosen, the frequency, intensity, and force parameters are set, and the robot positions itself, moves to the body massage position, and follows the planned sequence of actions to provide massage companionship, relieve fatigue, repair muscle tissue, and reduce fat.
- An escort massage dual-arm multi-task parallel processing robot device, characterized in that the fetching device is connected to the robot controller and the camera module and comprises a robot arm module and a machine fetching-claw module; the robot arm module is connected to the robot controller and the machine fetching claw for fetching objects; objects in the escort environment are selected for pick-and-place through the user interface or by voice; the visual information published by the camera includes object information and environmental scene information in the escort environment; the object is located, the robot main controller and the machine fetching-claw device receive the visual information, identify the object, and return its position; the robot arm moves to that position according to the position information; the target in the escort environment is recognized visually, the machine fetching claw acts to pick up and place the object, and its actions are planned by configuring the position and angle parameters of the robot arm and fetching claw together with the grasp, move, and placement parameters; the machine fetching claw moves, places, organizes, and arranges items and equipment.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202022230819.0 | 2020-10-09 | ||
CN202022230819 | 2020-10-09 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022073467A1 (zh) | 2022-04-14 |
Family
ID=81125589
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2021/122528 WO2022073467A1 (zh) | 2020-10-09 | 2021-10-07 | Dual-arm multi-task parallel-processing companion-care massage robot device
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2022073467A1 (zh) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113797088A (zh) * | 2021-08-31 | 2021-12-17 | 中科尚易健康科技(北京)有限公司 | Linkage control method and control system for a mechanical arm and a bed |
CN116308949A (zh) * | 2023-02-21 | 2023-06-23 | 京大(北京)技术有限公司 | Community home-based rehabilitation training robot |
Citations (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20120066157A (ko) * | 2010-12-14 | 2012-06-22 | 김현겸 | Artificial-intelligence robot massage device |
CN106821718A (zh) * | 2017-03-24 | 2017-06-13 | 徐加国 | Fully intelligent scanning-massage health-care method and robot for relaxing meridians and activating collaterals |
CN206748435U (zh) * | 2017-01-17 | 2017-12-15 | 五邑大学 | Intelligent companion-care robot |
CN107582039A (zh) * | 2017-08-28 | 2018-01-16 | 陕西舜洋电子科技有限公司 | Intelligent health-care robot for the elderly |
CN107693295A (zh) * | 2017-09-26 | 2018-02-16 | 北京联合大学 | Massage chair with electromyographic control device |
CN107752984A (zh) * | 2017-11-15 | 2018-03-06 | 李玉东 | Highly intelligent general-practice medical robot based on big data |
CN207415376U (zh) * | 2017-10-20 | 2018-05-29 | 深圳市前海安测信息技术有限公司 | Multifunctional health-monitoring robot |
CN110123623A (zh) * | 2019-05-16 | 2019-08-16 | 湖北工业大学 | Intelligent rehabilitation massage robot |
US20200035237A1 (en) * | 2019-07-09 | 2020-01-30 | Lg Electronics Inc. | Communication robot and method for operating the same |
CN111343958A (zh) * | 2017-11-29 | 2020-06-26 | 美的集团股份有限公司 | Massage robot using machine vision |
CN111437174A (zh) * | 2020-04-16 | 2020-07-24 | 深圳瀚维智能医疗科技有限公司 | Physiotherapy massage robot |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108187310B (zh) | Limb movement-intention understanding and upper-limb rehabilitation training robot based on force-sense and posture information, and control method thereof | |
WO2022073467A1 (zh) | Dual-arm multi-task parallel-processing companion-care massage robot device | |
CN109172066B (zh) | Intelligent prosthetic hand based on voice control and visual recognition, and system and method thereof | |
Chu et al. | The helping hand: An assistive manipulation framework using augmented reality and tongue-drive interfaces | |
Dragusanu et al. | Design, development, and control of a tendon-actuated exoskeleton for wrist rehabilitation and training | |
CN109940636A (zh) | Humanoid robot for commercial performances | |
CN106943300A (zh) | Acupoint-pressing method and system for therapy, health care and body shaping | |
CN212421309U (zh) | Remote control device for a legged robot | |
Galambos et al. | Vibrotactile force feedback for telemanipulation: Concept and applications | |
CN111687847A (zh) | Remote control device and control-interaction method for a legged robot | |
Imran et al. | Design of an Affordable Prosthetic Arm Equipped With Deep Learning Vision-Based Manipulation | |
Bhuvaneswari et al. | Humanoid robot based physiotherapeutic assistive trainer for elderly health care | |
Chu et al. | Hands-free assistive manipulator using augmented reality and tongue drive system | |
Pacchierotti | Cutaneous haptic feedback for robotics and Virtual Reality | |
Huang et al. | Enhancing Telecooperation Through Haptic Twin for Internet of Robotic Things: Implementation and Challenges | |
CN114010184A (zh) | Motion-data acquisition and mirroring method for a planar rehabilitation robot | |
CN210377375U (zh) | Somatosensory interaction device | |
CN211044788U (zh) | Demonstration system | |
Castellini et al. | Gaze tracking for robotic control in intelligent teleoperation and prosthetics | |
Boboc et al. | Learning new skills by a humanoid robot through imitation | |
Xu | Design, Development, and Control of an Assistive Robotic Exoskeleton Glove Using Reinforcement Learning-Based Force Planning for Autonomous Grasping | |
KR101348940B1 (ko) | Sensory reproduction system for transmitting the physical interaction that occurs during remote operation of a robot | |
CN113878595B (zh) | Raspberry Pi-based humanoid physical robot system | |
Setapen et al. | Beyond teleoperation: Exploiting human motor skills with marionet | |
WO2022073468A1 (zh) | Surgical treatment and rehabilitation robot device | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 21876985 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
NENP | Non-entry into the national phase |
Ref country code: JP |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 21876985 Country of ref document: EP Kind code of ref document: A1 |
|
32PN | Ep: public notification in the ep bulletin as address of the adressee cannot be established |
Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 22/09/2023) |
|