WO2022073467A1 - Companion-care massage dual-arm multi-task parallel processing robot device (一种陪护按摩双臂多任务并行处理机器人装置) - Google Patents

Companion-care massage dual-arm multi-task parallel processing robot device

Info

Publication number
WO2022073467A1
WO2022073467A1 (PCT/CN2021/122528; CN2021122528W)
Authority
WO
WIPO (PCT)
Prior art keywords
massage
module
robot
claw
arm
Prior art date
Application number
PCT/CN2021/122528
Other languages
English (en)
French (fr)
Inventor
谈斯聪
于皓
于梦非
Original Assignee
谈斯聪
于皓
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 谈斯聪 and 于皓
Publication of WO2022073467A1 publication Critical patent/WO2022073467A1/zh


Classifications

    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61HPHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H7/00Devices for suction-kneading massage; Devices for massaging the skin by rubbing or brushing not otherwise provided for
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J11/00Manipulators not otherwise provided for
    • GPHYSICS
    • G10MUSICAL INSTRUMENTS; ACOUSTICS
    • G10LSPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
    • G10L15/00Speech recognition

Definitions

  • The invention belongs to the field of artificial-intelligence robotics and relates to robotics, image acquisition, and image-recognition artificial intelligence.
  • Multi-sensor devices collect muscle information, gravity, pressure, and other sensor information.
  • A client terminal connects to the robot controller to collect images of objects and faces and to perform autonomous massage and object fetching; the device is intended for families, hospitals, nursing homes, care institutions, hotels, and other service industries.
  • Through its design as an intelligent voice-chatting, massage, fetching, and companion-care service robot, the invention effectively prevents human error, realizes autonomous companion care with dual-arm massage and fetching service, and achieves multi-task parallel processing.
  • The purpose of the present invention is to overcome the above shortcomings and deficiencies of the prior art and to provide a companion-care massage dual-arm multi-task parallel processing robot device that collects EMG muscle information, muscle state, sensory-nerve and motor-nerve information, and multi-sensor information.
  • The voice module provides voice interaction between the main control system and the user, with human-like voice chat.
  • The camera vision module collects face images and images of common objects in the care environment and recognizes the common objects. The multimedia display screen provides robot-human interaction, multimedia playback, Internet browsing, and Internet communication.
  • The robot arm carries a massage claw with a built-in infrared generator and a vibration generator for massage, a retractable soft-body finger kneading massage head, a double-peak sliding peristaltic massage head, a vibrating-airbag hammer massage head, and a plush-material massage palm.
  • The robot picking claw, robot arm, vision module, radar, and mobile base provide autonomous positioning, navigation, and movement: the robot moves to the position of an object, picks it up with the claw, and tidies and cleans.
  • The present invention provides a voice device for human-computer interaction and remote voice commands, an autonomous moving device, a robot-arm massage device, and a robot-arm picking device for remote, autonomous companion care, massage, fetching, and service.
  • A companion-care massage dual-arm multi-task parallel processing robot device comprises:
  • a robot controller, connected to the robot arm module, the multi-sensor device, the EMG acquisition device, and the massage device, and used to control the robot;
  • a voice module, connected to the robot controller and used for voice interaction between the main control system and the user, with human-like voice chat;
  • a camera vision module, connected to the robot controller and the robot arm, used to collect face images and images of common objects in the care environment. The face images assist in detecting body parts and locating the positions of the face, chest, back, legs, arms, waist, hands, feet, and joints; the object images are recognized to assist the robot claw in picking up objects; video is captured in real time to monitor the care environment;
  • a multimedia display screen module, connected to the robot controller by an application connector. It can be detached from the robot body, is portable, communicates with the robot over Bluetooth or Wi-Fi, and is used for multimedia playback, web-page display, Internet browsing, and communication;
  • a robot arm module, connected to the robot controller, the massage claw, and the robot claw. The massage claw massages the upper limbs, chest, back, waist, hands, feet, and joints; the robot claw picks up objects;
  • a robot claw picking module, connected to the robot controller and the robot arm and used for picking up objects;
  • a massage claw module, connected to the robot controller and the robot arm. The massage claw and the robot arm are of separable design, plug-in mounted and removed, and portable, and communicate with the robot arm through a Wi-Fi/Bluetooth module and a connection plug. The massage claw comprises a retractable soft-body finger kneading massage head, a double-peak sliding peristaltic massage head, a vibrating-airbag hammer massage head, and a plush-material massage palm, used for hand-imitating kneading massage, rolling peristaltic massage, hammer massage, and plush-material stroking massage;
  • an EMG acquisition device, connected to the robot controller and used to collect muscle information, muscle state, and sensory-nerve and motor-nerve information;
  • a multi-sensor information acquisition module, connected to the robot controller and used to collect pressure, gravity, and other sensor information;
  • an infrared module, connected to the robot controller and the robot claw, used to emit infrared light; it is the infrared generator of the massage claw;
  • a vibrator: the vibration module is connected to the robot controller and the robot claw and used for massage vibration; it is the vibration generator of the massage claw;
  • a radar autonomous movement module, connected to the robot controller and the mobile base for autonomous positioning and navigation.
  • The robot controller is connected to the voice module, which handles interaction between the robot and the user, including speech recognition, speech-text interconversion, voice guidance, voice commands, voice companionship, and voice medical question-and-answer. Through the user's cloud client, the video of the care environment can be accessed, and the voice module supports remote voice communication.
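The voice-command path above (recognized speech text mapped to a controller action) can be sketched as a simple keyword router. This is an illustrative assumption only: the keyword table, command names, and routing rule are not part of the patent disclosure, which does not specify how recognized text is interpreted.

```python
# Hypothetical sketch: routing recognized voice text to robot commands.
# The keyword table and command names are invented for illustration.

COMMANDS = {
    "massage": "START_MASSAGE",
    "fetch": "PICK_OBJECT",
    "stop": "STOP_ALL",
    "chat": "VOICE_CHAT",
}

def route_voice_command(recognized_text: str) -> str:
    """Map the text returned by speech recognition to a controller command."""
    text = recognized_text.lower()
    for keyword, command in COMMANDS.items():
        if keyword in text:
            return command
    return "UNKNOWN"  # fall back: ask the user to repeat, or continue chatting

print(route_voice_command("Please massage my back"))   # START_MASSAGE
print(route_voice_command("fetch the water bottle"))   # PICK_OBJECT
```

A production system would replace the keyword table with the intent output of the speech-recognition stack, but the controller-side dispatch would look much like this.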
  • The robot controller is connected to the camera vision acquisition module, which collects face images; recognizes the face, legs, arms, waist, and joints; returns the position information of the body parts; and locates the positions of the legs, arms, waist, and joints. Object images are collected and passed to the object recognition module, which identifies objects near the care environment. Through image parameter settings, an improved machine-learning method classifies people, items, and equipment from combined features such as colour, shape, and outline; the improved machine-learning method and a deep neural network inside the vision module collect and recognize images of common objects in the care environment and assist the robot claw in picking them up. Colours, numbers, letters, text, and special signs are intelligently recognized; people, items, and equipment are identified; and their location information is fed back.
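To make the feature-based classification step concrete, here is a minimal sketch: a mean colour and a bounding-box aspect ratio stand in for the "colour, shape, outline" features, and a nearest-centroid rule stands in for the improved machine-learning method. Every class name, centroid, and measurement below is a made-up example, not data from the disclosure.

```python
# Illustrative nearest-centroid classifier over hand-picked features.
# Feature vector: (normalized R, G, B, width/height aspect ratio).
import math

def features(mean_rgb, width, height):
    r, g, b = mean_rgb
    return (r / 255.0, g / 255.0, b / 255.0, width / height)

# Invented class centroids standing in for a trained model.
CLASS_CENTROIDS = {
    "cup":    features((200, 200, 210), 80, 100),
    "bottle": features((90, 140, 90), 60, 180),
    "towel":  features((240, 230, 220), 160, 90),
}

def classify(mean_rgb, width, height):
    """Assign the class whose centroid is nearest in feature space."""
    x = features(mean_rgb, width, height)
    return min(CLASS_CENTROIDS, key=lambda c: math.dist(x, CLASS_CENTROIDS[c]))

print(classify((205, 198, 212), 78, 102))  # cup
print(classify((95, 138, 88), 58, 175))    # bottle
```

A real pipeline would extract these features from segmented camera images and train the classifier on labelled data; the nearest-centroid rule only illustrates the decision step.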
  • The EMG acquisition module is connected to the robot controller and the robot arm and collects muscle information of the chest, waist, back, upper limbs, lower limbs, hands, and feet, including muscle contraction pattern, static and dynamic force state, muscle fatigue state, sensory-nerve and motor-nerve conduction, repetitive electrical stimulation, motor unit number estimation, and sympathetic skin response; a deep-learning algorithm autonomously adjusts the training intensity, training period, and number of training sessions.
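The feedback idea above — EMG-derived fatigue driving the adjustment of training intensity and duration — can be sketched with a simple rule. Note the hedges: a drop in EMG median frequency is a commonly used fatigue indicator, but the thresholds below are invented, and the patent itself specifies a deep-learning algorithm rather than this hand-written rule.

```python
# Hedged sketch: adjust massage/training parameters from an EMG fatigue
# estimate. Thresholds and step sizes are illustrative assumptions.

def adjust_massage(intensity, duration_min, median_freq_hz, baseline_freq_hz):
    """Return (new_intensity, new_duration) from an EMG fatigue estimate."""
    # 0.0 = fresh muscle, values toward 1.0 = increasing fatigue.
    fatigue = 1.0 - median_freq_hz / baseline_freq_hz
    if fatigue > 0.3:           # pronounced fatigue: back off strongly
        return max(1, intensity - 2), duration_min * 0.5
    if fatigue > 0.15:          # mild fatigue: reduce slightly
        return max(1, intensity - 1), duration_min * 0.75
    return intensity, duration_min  # no change

print(adjust_massage(5, 20, 60.0, 100.0))  # strong fatigue -> (3, 10.0)
print(adjust_massage(5, 20, 95.0, 100.0))  # fresh -> (5, 20)
```

In the disclosed device, the learned model would replace the two thresholds, but the control loop — measure, estimate fatigue, re-parameterize the arm — is the same shape.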
  • The multi-sensor information acquisition module is connected to the robot controller and the robot arm and collects gravity, pressure, and direction information. Based on the returned information, the main system communicates with the multi-sensor devices and adjusts the parameter values of the robot arm and the gravity device.
  • The multimedia LCD display module is connected to the robot controller by an application connector. It can be detached from the robot body, is portable, and communicates with the robot over Bluetooth and Wi-Fi. Through the user interface of the robot system it controls robot massage, music, voice dialogue, human-like interactive chat, and Internet-based online learning, and it plays multimedia and displays network information.
  • The robot controller is connected to the radar, the camera, and the mobile base. The information collected by the radar is sent to the main-system client through message and service communication to build a map of the scene. The main system communicates with the mobile chassis: it publishes the created map information to the mobile chassis nodes, which accept the map information to realize autonomous navigation. The image information collected by the camera is sent to the main-system client through service communication and on to the robot arm to realize motion planning.
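The message/service pattern described above resembles a ROS-style publish/subscribe design (nodes publishing a map topic that the chassis node subscribes to). The following stand-in bus in plain Python only illustrates that pattern; the topic name and message contents are invented, and the actual device presumably uses real robot middleware.

```python
# Minimal publish/subscribe bus illustrating the map hand-off between the
# main system and the mobile chassis node. Topic names are assumptions.
from collections import defaultdict

class MessageBus:
    def __init__(self):
        self._subs = defaultdict(list)

    def subscribe(self, topic, callback):
        self._subs[topic].append(callback)

    def publish(self, topic, message):
        for cb in self._subs[topic]:
            cb(message)

bus = MessageBus()
received = []

# The mobile chassis node accepts the map built from radar scans...
bus.subscribe("/map", received.append)

# ...after the main system publishes the map it created from radar data.
bus.publish("/map", {"resolution": 0.05, "cells": [[0, 1], [1, 0]]})

print(received[0]["resolution"])  # 0.05
```

With real middleware, `subscribe`/`publish` become node-level topic operations and the map message a typed occupancy grid, but the data flow is the one the description names.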
  • The robot controller is connected to the camera module and the massage device; the massage device comprises the robot arm module and the massage claw module. The robot arm module is connected to the robot controller and the massage claws and massages the upper limbs, waist, and lower limbs. The massage claw module comprises a retractable soft-body finger kneading massage head, a double-peak sliding peristaltic massage head, a vibrating-airbag hammer massage head, and a plush-material massage palm, used for hand-imitating kneading massage, rolling peristaltic massage, hammer massage, and plush-material stroking massage.
  • The robot controller, robot arm, and vision module are connected. The camera publishes visual information on the body parts, returning the positions of the chest, waist, back, upper limbs, lower limbs, hands, feet, and joints. The main system and the massage device receive the visual information, detect the face and the body parts, and accurately locate each body position. Through the user interface the user selects the body massage range (chest, waist, back, upper limbs, lower limbs, hands, feet, joints), delineates the massage position, and chooses a massage method (rubbing, plucking, pushing, pressing, point-pressing, nail-pinching, pinching, patting, striking, flicking, rolling, palm kneading, finger kneading, vibrating, shaking, grasping, twisting, rocking, vibration, or infrared massage) together with the frequency, intensity, and gravity massage parameters. The robot locates and moves to the body massage position and follows the planned sequence of movements to provide massage companionship, relieve fatigue, repair muscle tissue, and reduce fat.
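The selection step above — a user-chosen method plus frequency/intensity parameters turned into a planned action sequence for a specific massage head — can be sketched as a lookup table. The head assignments loosely follow the reference numerals of Fig. 2 (201-207), but the action lists themselves are invented for illustration; the patent does not enumerate the planned motions.

```python
# Illustrative mapping from a selected massage method to a massage head
# and a planned action sequence. Action lists are assumptions.

MASSAGE_PLANS = {
    "kneading": {"head": "soft finger heads 201-203",
                 "actions": ["approach", "close fingers", "knead", "release"]},
    "rolling":  {"head": "double-peak heads 204-205",
                 "actions": ["approach", "slide", "roll", "retract"]},
    "hammer":   {"head": "airbag hammer head 207",
                 "actions": ["approach", "inflate", "tap", "retract"]},
    "stroking": {"head": "plush palm 206",
                 "actions": ["approach", "stroke", "retract"]},
}

def plan_massage(method, region, frequency_hz, intensity):
    """Bundle the user's selections into one executable massage plan."""
    plan = MASSAGE_PLANS[method]
    return {"region": region, "head": plan["head"],
            "frequency_hz": frequency_hz, "intensity": intensity,
            "actions": plan["actions"]}

p = plan_massage("kneading", "back", 2.0, 3)
print(p["head"])     # soft finger heads 201-203
print(p["actions"])  # ['approach', 'close fingers', 'knead', 'release']
```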
  • The robot controller is connected to the camera module and the picking device; the picking device comprises the robot arm module and the robot picking claw module. The robot arm module is connected to the robot controller and the picking claw for picking up objects; objects in the care environment are selected for pick-and-place through the user interface or by voice. The visual information published by the camera includes the object information and scene information of the care environment. The robot main controller and the picking claw device receive the visual information, identify the object, and return its position; according to the position information, the robot arm moves to the object.
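The fetch pipeline above (camera returns an object position, the base moves within arm reach, the claw picks and places the object) can be sketched as a short planner. Coordinates, the reach threshold, and the storage-box label are invented for illustration.

```python
# Sketch of the fetch pipeline: move the base within reach, grasp, place.
import math

def fetch(object_pos, base_pos, arm_reach=0.8):
    """Plan base motion and grasp for an object at object_pos (metres)."""
    dx, dy = object_pos[0] - base_pos[0], object_pos[1] - base_pos[1]
    distance = math.hypot(dx, dy)
    steps = []
    if distance > arm_reach:
        # Stop arm_reach short of the object along the approach line.
        scale = (distance - arm_reach) / distance
        goal = (base_pos[0] + dx * scale, base_pos[1] + dy * scale)
        steps.append(("move_base", goal))
    steps.append(("grasp", object_pos))
    steps.append(("place", "storage box 218"))
    return steps

for step in fetch(object_pos=(3.0, 4.0), base_pos=(0.0, 0.0)):
    print(step)
```

In the device this planning is distributed between the radar navigation module (the `move_base` step) and the arm/claw motion planner (the `grasp` and `place` steps).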
  • Through the companion-care massage dual-arm multi-task parallel processing robot device, the invention provides a remotely controlled robot voice device for human-computer interaction and remote voice commands, an autonomous moving device, and a robot arm carrying a massage claw device and a picking claw device for remote, autonomous massage and fetching.
  • The problems of high pressure and complicated work in companion care are thereby alleviated.
  • The massage intensity and the movement parameters for the muscle tissue can be adjusted adaptively, which greatly improves work efficiency and reduces the cost of human care.
  • Efficient companion care can thus be realized, assisting hospitals, nursing homes, care institutions, families, hotels, and other service institutions.
  • Fig. 1 is a schematic diagram of the modules of the robot device in the specification of this application. Reference numerals in Fig. 1:
  • 100 - remote control device; 101 - robot controller; 102 - voice module; 103 - radar autonomous movement module; 104 - camera vision module; 105 - robot claw picking module; 106 - massage claw module; 107 - robot arm module; 108 - multi-sensor information acquisition module; 109 - multimedia display module; 110 - EMG acquisition device.
  • Fig. 2 is a schematic diagram of the composition of the robot device in the specification of this application. Reference numerals in Fig. 2:
  • The purpose of the present invention is to design a remotely controllable robot that replaces human work, realize remote image acquisition by the machine, and effectively solve autonomous and remote acquisition of nursing and companion-care images, voice communication, video communication, massage, and object fetching.
  • Artificial-intelligence robotics from the field of automation is used for remote and autonomous control of machine movement, positioning, and navigation, and the robot arm is used to fetch objects.
  • A companion-care massage dual-arm multi-task parallel processing robot device comprises:
  • the robot controller 101, connected to the robot arm module 107, the multi-sensor device 108, the EMG acquisition device 110, and the massage device, and used to control the robot;
  • the voice module 102, connected to the robot controller 101 and used for voice interaction between the main control system and the user, with human-like voice chat;
  • the camera vision module 104, connected to the robot controller 101 and the robot arm 107, used to collect face images and images of common objects in the care environment. The face images assist in detecting body parts and locating the positions of the face, chest, back, legs, arms, waist, hands, feet, and joints; the object images are recognized to assist the robot claw in picking up objects; video is captured in real time to monitor the care environment;
  • the multimedia display screen module 109, connected to the robot controller 101 by an application connector. It can be detached from the robot body, is portable, communicates with the robot over Bluetooth or Wi-Fi, and is used for multimedia playback, web-page display, Internet browsing, and communication;
  • the robot arm module 107, connected to the robot controller 101, the massage claw 106, and the picking claw 105. The massage claw 106 massages the upper limbs, waist, and lower limbs, and the picking claw 105 picks up objects;
  • the robot claw picking module 105, connected to the robot controller 101 and the robot arm 107 and used for picking up objects;
  • the massage claw module 106, connected to the robot controller 101 and the robot arm 107. The massage claw and the robot arm 107 are of separable design, plug-in mounted and removed, and portable, and communicate with the robot arm through a Wi-Fi/Bluetooth module and a connection plug. The massage claw 106 comprises a retractable soft-body finger kneading massage head, a double-peak sliding peristaltic massage head, a vibrating-airbag hammer massage head, and a plush-material massage palm, used for hand-imitating kneading massage, rolling peristaltic massage, hammer massage, and plush-material stroking massage;
  • the EMG acquisition device 110, connected to the robot controller 101 and used to collect muscle information, muscle state, and sensory-nerve and motor-nerve information;
  • the multi-sensor information acquisition module 108, connected to the robot controller 101 and used to collect pressure, gravity, and other sensor information;
  • the infrared module 208, connected to the robot controller 101 and the picking claw 105, used to emit infrared light; it is the infrared generator of the massage claw 106;
  • the vibrator 209: the vibration module is connected to the robot controller 101 and the picking claw 105 and used for massage vibration; it is the vibration generator of the massage claw 106;
  • the radar autonomous movement module 103, connected to the robot controller 101 and the mobile base 210 for autonomous positioning and navigation.
  • A management user such as an administrator uses the remote control module 100 to communicate with the main control system 101 and sends control commands remotely. The main control system 101 communicates with the robot arm 107, and the camera 214 collects face images; from the images of the face, chest, waist, back, upper limbs, lower limbs, hands, feet, and joints, the positions of these body parts are returned and located. The remote control module 100 remotely controls robot arm 107A, and robot arm 107B moves to the position of the body part, collects body pictures, detects the body part, and assists positioning.
  • Through the camera 214 and the image parameter settings, an improved machine-learning method classifies people, items, and equipment from combined features such as colour, shape, and outline; with the improved machine-learning method and a deep neural network, images of common objects in the care environment are collected and recognized to assist the robot claw in picking up objects. Video is collected in real time to monitor the care environment; colours, numbers, letters, characters, and special signs are intelligently recognized, and the location information of people, items, and equipment is fed back.
  • The voice device 216 includes a sound collection (pickup) device, a microphone device, and a speaker. The pickup device obtains voice information; the user's voice is input through the microphone, and human-computer interaction takes place through the speaker, with voice guidance, text-speech interchange, speech synthesis, and voice wake-up.
  • The radar autonomous positioning and navigation module: the main control system communicates with the mobile chassis 210 and the radar, and the information collected by the radar is sent to the main system through message and service communication to build a map of the scene. The created map information is published to the mobile chassis node, and the mobile chassis accepts the map information to realize autonomous navigation.
  • The user selects the body massage range through the user interface, including the massage range of the chest, waist, back, upper limbs, lower limbs, hands, feet, and joints, and delineates the massage position. The user chooses the massage method (rubbing, plucking, pushing, pressing, point-pressing, nail-pinching, pinching, patting, striking, flicking, rolling, palm kneading, finger kneading, vibrating, shaking, grasping, twisting, rocking, vibration, or infrared massage) together with the frequency, intensity, and gravity massage parameters.
  • Robot arm 211A: the main control system 101 communicates with the camera 214; from the image information of the face, chest, back, legs, arms, waist, hands, feet, and joint positions collected by the camera 214, the image position of each body part is located and returned. Robot arm 211A carries the massage devices: retractable soft-body finger kneading massage heads 201, 202, 203; double-peak sliding rolling massage heads 204, 205; plush-material massage palm 206; vibrating-airbag hammer massage head 207; infrared generator 208; and vibrator 209.
  • The target massage area on the face, chest, back, legs, arms, waist, hands, feet, or joints is located, and robot arm 211A moves to the massage area. The main control system publishes the user-selected massage method together with frequency, intensity, gravity, pressure, and distance information. According to the information published by the EMG acquisition device, the gravity sensor, and the pressure sensor, the robot arm receives and subscribes to the muscle, gravity, pressure, and multi-sensor information, and the massage device executes the planned action corresponding to the received massage method, frequency, and intensity. For example, if the user selects a pinching action, the soft-body finger kneading massage heads 201, 202, 203 complete the pinch according to the planned hand pinching action; if the user selects a sliding rolling action, the double-peak sliding rolling massage heads 204, 205 complete it according to the planned double-peak sliding rolling action; if the user selects stroking or a hammer action, the plush-material massage palm 206 strokes and the vibrating-airbag hammer massage head 207 performs the inflated massage-hammer action as planned.
  • Rubbing, plucking, pushing, pressing, point-pressing, nail-pinching, pinching, patting, striking, flicking, rolling, palm kneading, finger kneading, vibrating, shaking, grasping, twisting, and rocking form the series of massage movements. The planned movements in the planned massage area complete the massage rehabilitation actions.
  • Objects in the care environment are selected for pick-and-place through the user interface or by voice. The visual information published by the camera includes the object information and scene information of the care environment. The robot main controller and the picking claw device receive the visual information, identify the object, and, according to the returned object position, the robot moves to the pick-and-place area.
  • In the pick-and-place area, the vision camera 214 collects object images, and the object recognition module identifies and confirms the object to be picked and placed.
  • Robot arm 211B and picking claw 217 execute the planned trajectory according to the motion plan and the configured joint angles, joint limits, specified joint configuration, joint trajectory positions, velocity components, joint velocities, motion constraints, target trajectory, and velocity settings. For the objects that can be picked at the target pose, the robot pose parameters and the parameters of robot arm 211B and picking claw 217 are set for grasping and pick-and-place, and the grasp pose parameters are set and matched to the target pose.
  • Robot arm 211B and picking claw 217 are initialized with the grasp placement, the object position, the grasp posture of the object, the picking area, the target position, and the placement position of the storage box. Robot arm 211B and picking claw 217 grasp the object (initializing the grasp target, creating the open and closed postures of the gripper, and setting the retreat parameters), move it, and place it in the storage box 218. Robot arm 211B then grasps, organizes, and arranges items from the storage box 218 and moves them to the placement area 1000 required by the user.
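One small piece of the trajectory configuration above, the joint-limit check, can be sketched concretely: each target joint angle is clamped to its configured limit before the planned trajectory is executed. The six limit pairs below are illustrative values for a generic six-joint arm, not figures from the disclosure.

```python
# Minimal joint-limit clamp for a hypothetical six-joint arm.
# Limit pairs (radians) are invented for illustration.

JOINT_LIMITS = [(-3.14, 3.14), (-1.57, 1.57), (-2.0, 2.0),
                (-3.14, 3.14), (-1.57, 1.57), (-3.14, 3.14)]

def clamp_to_limits(target_angles):
    """Return the target joint angles clamped to the configured limits."""
    return [min(max(a, lo), hi)
            for a, (lo, hi) in zip(target_angles, JOINT_LIMITS)]

print(clamp_to_limits([0.5, 2.0, -2.5, 0.0, -1.0, 4.0]))
# [0.5, 1.57, -2.0, 0.0, -1.0, 3.14]
```

A full planner would also enforce the velocity components and motion constraints the text lists, typically by validating the whole trajectory rather than a single waypoint.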

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Dermatology (AREA)
  • Epidemiology (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • Acoustics & Sound (AREA)
  • Multimedia (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Computational Linguistics (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Pain & Pain Management (AREA)
  • Physical Education & Sports Medicine (AREA)
  • Rehabilitation Therapy (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Manipulator (AREA)

Abstract

A companion-care massage dual-arm multi-task parallel processing robot device comprises: a robot controller (101), a voice module (102), a camera vision module (104), a multimedia display screen module (109), a robot arm module (107), a robot claw picking module (105), a massage claw module (106), an EMG acquisition device (110), a multi-sensor information acquisition module (108), an infrared module (208), a vibrator (209), and a radar autonomous movement module (103). The robot replaces a human caregiver, fetches objects autonomously under remote voice control, collects feedback information, and completes rehabilitation movements autonomously.

Description

Companion-care massage dual-arm multi-task parallel processing robot device
Technical Field
The invention belongs to the field of artificial-intelligence robotics and relates to robotics, image acquisition, and image-recognition artificial intelligence.
Background Art
Companion care consumes a great deal of manpower and time, so a machine is used in place of a person for remote and autonomous voice companionship, massage, fetching, and other care services. In families, hospitals, nursing homes, and care institutions, human companion care is costly and service quality is poor. To address the heavy manpower demands, replacing the person with a machine, designing a dedicated massage-claw massage device and a robot-arm picking claw, recognizing objects in the care environment with a vision module, and moving to fetch objects with radar guidance have become important topics in companion care. The robot device, artificial-intelligence algorithms, and motion-planning algorithms address poor care service, high cost, low efficiency, poor caregiver attitude, and frequent mistakes, and are currently applied in families, hospitals, nursing homes, care institutions, hotels, and other service industries.
The robot arm, multi-sensor and EMG acquisition devices, massage claw, and robot picking claw provide remote autonomous companion care. A voice device and a multimedia LCD screen support autonomous chatting over the Internet, and the multi-sensor devices collect muscle information, gravity, pressure, and other sensor information.
A client terminal connects to the robot controller to collect image information of objects and of faces and body parts and to perform autonomous massage and fetching, for use in families, hospitals, nursing homes, care institutions, hotels, and other service industries. Through its design as an intelligent voice-chatting, massage, fetching, and companion-care service robot, the invention effectively prevents human error, realizes autonomous companion care with dual-arm massage and fetching service, and achieves multi-task parallel processing.
Technical Problem
The purpose of the present invention is to overcome the above shortcomings and deficiencies of the prior art and to provide a companion-care massage dual-arm multi-task parallel processing robot device: a machine replaces the person, the robot is controlled remotely and autonomously, and it collects EMG muscle information, muscle state, sensory-nerve and motor-nerve information, and multi-sensor information. The voice module provides voice interaction between the main control system and the user, with human-like voice chat. The camera vision module collects face images and images of common objects in the care environment and recognizes the common objects. The multimedia display screen provides robot-human interaction, multimedia playback, Internet browsing, and Internet communication. The robot arm carries a massage claw with a built-in infrared generator and a vibration generator for massage, a retractable soft-body finger kneading massage head, a double-peak sliding peristaltic massage head, a vibrating-airbag hammer massage head, and a plush-material massage palm, used for hand-imitating kneading massage, rolling peristaltic massage, hammer massage, and plush-material stroking massage of the chest, waist, back, upper limbs, lower limbs, hands, feet, and joints. The robot picking claw, robot arm, vision module, radar, and mobile base provide autonomous positioning, navigation, and movement: the robot moves to the position of an object, picks it up with the claw, tidies, and cleans.
The device addresses caregiver error, busy care schedules, and heavy responsibility and pressure: autonomous robot operation relieves the work pressure, improves care efficiency, reduces cost, and increases flexibility. The invention provides a voice device for human-computer interaction and remote voice commands, an autonomous moving device, a robot-arm massage device, and a robot-arm picking device for remote, autonomous companion care, massage, fetching, and service.
Technical Solution
A companion-care massage dual-arm multi-task parallel processing robot device comprises:
A robot controller, connected to the robot arm module, the multi-sensor device, the EMG acquisition device, and the massage device, and used to control the robot.
A voice module, connected to the robot controller and used for voice interaction between the main control system and the user, with human-like voice chat.
A camera vision module, connected to the robot controller and the robot arm, used to collect face images and images of common objects in the care environment. The face images assist in detecting body parts and locating the positions of the face, chest, back, legs, arms, waist, hands, feet, and joints; the object images are recognized to assist the robot claw in picking up objects; video is captured in real time to monitor the care environment.
A multimedia display screen module, connected to the robot controller by an application connector. It can be detached from the robot body, is portable, communicates with the robot over Bluetooth or Wi-Fi, and is used for multimedia playback, web-page display, Internet browsing, and communication.
A robot arm module, connected to the robot controller, the massage claw, and the robot claw. The massage claw massages the upper limbs, chest, back, waist, hands, feet, and joints; the robot claw picks up objects.
A robot claw picking module, connected to the robot controller and the robot arm and used for picking up objects.
A massage claw module, connected to the robot controller and the robot arm. The massage claw and the robot arm are of separable design, plug-in mounted and removed, and portable, and communicate with the robot arm through a Wi-Fi/Bluetooth module and a connection plug. The massage claw comprises a retractable soft-body finger kneading massage head, a double-peak sliding peristaltic massage head, a vibrating-airbag hammer massage head, and a plush-material massage palm, used for hand-imitating kneading massage, rolling peristaltic massage, hammer massage, and plush-material stroking massage.
An EMG acquisition device, connected to the robot controller and used to collect muscle information, muscle state, and sensory-nerve and motor-nerve information.
A multi-sensor information acquisition module, connected to the robot controller and used to collect pressure, gravity, and other sensor information.
An infrared module, connected to the robot controller and the robot claw, used to emit infrared light; it is the infrared generator of the massage claw.
A vibrator: the vibration module is connected to the robot controller and the robot claw and used for massage vibration; it is the vibration generator of the massage claw.
A radar autonomous movement module, connected to the robot controller and the mobile base for autonomous positioning and navigation.
The robot controller is connected to the voice module, which handles interaction between the robot and the user, including speech recognition, speech-text interconversion, voice guidance, voice commands, voice companionship, and voice medical question-and-answer. The video of the care environment is accessed through the user's cloud client, and the voice module supports remote voice communication.
The robot controller is connected to the camera vision acquisition module, which collects face images; recognizes the face, legs, arms, waist, and joints; returns the position information of the body parts; and locates the positions of the legs, arms, waist, and joints. Object images are collected and the object recognition module identifies objects near the care environment. Through image parameter settings, an improved machine-learning method classifies people, items, and equipment from combined features such as colour, shape, and outline; the improved machine-learning method and deep neural network inside the vision module collect and recognize images of common objects in the care environment and assist the robot claw in picking up objects. Colours, numbers, letters, text, and special signs are intelligently recognized; people, items, and equipment are identified; and their location information is fed back. Through the image parameter settings, video of the care environment is collected in real time for monitoring and is accessed in real time through the user's cloud client.
The EMG acquisition module is connected to the robot controller and the robot arm and collects muscle information of the chest, waist, back, upper limbs, lower limbs, hands, and feet, including muscle contraction pattern, static and dynamic force state, muscle fatigue state, sensory-nerve and motor-nerve conduction, repetitive electrical stimulation, motor unit number estimation, and sympathetic skin response; a deep-learning algorithm autonomously adjusts the training intensity, training period, and number of training sessions.
The multi-sensor information acquisition module is connected to the robot controller and the robot arm and collects gravity, pressure, and direction information. Based on the returned information, the main system communicates with the multi-sensor devices and, according to the multi-sensor information received by the main control system, adjusts the parameter values of the robot arm and the gravity device.
The multimedia LCD display module is connected to the robot controller by an application connector. It can be detached from the robot body, is portable, and communicates with the robot over Bluetooth and Wi-Fi. Through the user interface of the robot system it controls robot massage, music, voice dialogue, human-like interactive chat, and Internet-based online learning, and it plays multimedia and displays network information.
The robot controller is connected to the radar, the camera, and the mobile base. The information collected by the radar is sent through message and service communication to the main-system client to build a map of the scene. The main system communicates with the mobile chassis: it publishes the created map information and communicates with the mobile chassis nodes, which accept the map information to realize autonomous navigation. The image information collected by the camera is sent through service communication to the main-system client and on to the robot arm to realize motion planning.
The robot controller is connected to the camera module and the massage device; the massage device comprises the robot arm module and the massage claw module. The robot arm module is connected to the robot controller and the massage claw and massages the upper limbs, waist, and lower limbs. The massage claw module comprises a retractable soft-body finger kneading massage head, a double-peak sliding peristaltic massage head, a vibrating-airbag hammer massage head, and a plush-material massage palm, used for hand-imitating kneading massage, rolling peristaltic massage, hammer massage, and plush-material stroking massage. The robot controller, robot arm, and vision module are connected; the camera publishes visual information on the body parts, returning the positions of the chest, waist, back, upper limbs, lower limbs, hands, feet, and joints. The main system and the massage device receive the visual information, detect the face and the body parts, and accurately locate each body position. Through the user interface the user selects the body massage range (chest, waist, back, upper limbs, lower limbs, hands, feet, joints), delineates the massage position, and chooses a massage method (rubbing, plucking, pushing, pressing, point-pressing, nail-pinching, pinching, patting, striking, flicking, rolling, palm kneading, finger kneading, vibrating, shaking, grasping, twisting, rocking, vibration, or infrared massage) together with the frequency, intensity, and gravity massage parameters. The robot locates and moves to the body massage position and follows the planned sequence of movements to provide massage companionship, relieve fatigue, repair muscle tissue, and reduce fat.
The robot controller is connected to the camera module and the picking device; the picking device comprises the robot arm module and the robot picking claw module. The robot arm module is connected to the robot controller and the picking claw for fetching objects. Objects in the care environment are selected for pick-and-place through the user interface or by voice. The visual information published by the camera includes the object information and scene information of the care environment. The robot main controller and the picking claw device receive the visual information, identify the object, and return its position; according to the position information, the robot arm moves there. The target in the care environment is recognized visually, the picking claw acts to pick up and place objects, and the picking claw motion is planned: by configuring the position and angle parameters of the robot arm and picking claw and planning the grasp, move, and placement parameters, the picking claw moves, places, organizes, and arranges items and equipment.
Beneficial Effects
Through the companion-care massage dual-arm multi-task parallel processing robot device, the invention provides a remotely controlled robot voice device for human-computer interaction and remote voice commands, an autonomous moving device, and a robot arm carrying a massage claw device and a picking claw device for remote, autonomous massage and fetching. The problems of high pressure and complicated work in companion care are alleviated. At the same time, based on the collected EMG data and multi-sensor data and the feedback they provide, the massage intensity and the movement parameters for the muscle tissue are adjusted adaptively, which greatly improves work efficiency and reduces the cost of human care. The invention thus realizes efficient companion care, assisting hospitals, nursing homes, care institutions, families, hotels, and other service institutions.
Brief Description of the Drawings
Fig. 1 is a schematic diagram of the modules of the robot device in the specification of this application. Reference numerals in Fig. 1:
100 - remote control device; 101 - robot controller; 102 - voice module; 103 - radar autonomous movement module; 104 - camera vision module; 105 - robot claw picking module; 106 - massage claw module; 107 - robot arm module; 108 - multi-sensor information acquisition module; 109 - multimedia display screen module; 110 - EMG acquisition device.
Fig. 2 is a schematic diagram of the composition of the robot device in the specification of this application. Reference numerals in Fig. 2:
201, 202, 203 - retractable soft-body finger kneading massage heads; 204, 205 - double-peak sliding peristaltic massage heads; 206 - plush-material massage palm; 207 - vibrating-airbag hammer massage head; 208 - infrared generator; 209 - vibrator; 210 - mobile base; 211 - robot arm; 212 - multi-sensor device; 213 - EMG collector; 214 - camera; 215 - multimedia screen; 216 - voice device; 217 - robot picking claw; 218 - storage device.
 
Embodiments of the Invention
The purpose of the present invention is to design a remotely controllable robot that replaces human work and realizes remote image acquisition by the machine, while effectively solving autonomous and remote acquisition of nursing and companion-care images, voice communication, video communication, massage, and mobile fetching. Artificial-intelligence robotics from the field of automation is used for remote and autonomous control of machine movement, positioning, and navigation, and the robot arm is used to fetch objects.
Autonomous and remote voice commands are realized: the robot's voice module and vision module provide voice interaction and remote voice and video communication, and the robot autonomously controls machine movement and fetching. Human error is eliminated; remote and autonomous robot movement and massage are realized; and the efficiency of intelligent acquisition, voice communication, video communication, massage, and mobile fetching is improved, solving the problems of remote companion care and nursing. For a better understanding of the above technical solution, the invention is described in further detail below with reference to the embodiments and drawings, but the embodiments of the invention are not limited thereto.
The overall approach by which the technical solution in the implementation of this application solves the above technical problems is as follows:
Embodiment 1:
A companion-care massage dual-arm multi-task parallel processing robot device comprises:
A robot controller 101, connected to the robot arm module 107, the multi-sensor device 108, the EMG acquisition device 110, and the massage device, and used to control the robot.
A voice module 102, connected to the robot controller 101 and used for voice interaction between the main control system and the user, with human-like voice chat.
A camera vision module 104, connected to the robot controller 101 and the robot arm 107, used to collect face images and images of common objects in the care environment. The face images assist in detecting body parts and locating the positions of the face, chest, back, legs, arms, waist, hands, feet, and joints; the object images are recognized to assist the robot claw in picking up objects; video is captured in real time to monitor the care environment.
A multimedia display screen module 109, connected to the robot controller 101 by an application connector. It can be detached from the robot body, is portable, communicates with the robot over Bluetooth or Wi-Fi, and is used for multimedia playback, web-page display, Internet browsing, and communication.
A robot arm module 107, connected to the robot controller 101, the massage claw 106, and the picking claw 105. The massage claw 106 massages the upper limbs, waist, and lower limbs; the picking claw 105 fetches objects.
A robot claw picking module 105, connected to the robot controller 101 and the robot arm 107 and used for fetching objects.
A massage claw module 106, connected to the robot controller 101 and the robot arm 107. The massage claw and the robot arm 107 are of separable design, plug-in mounted and removed, and portable, and communicate with the robot arm through a Wi-Fi/Bluetooth module and a connection plug. The massage claw 106 comprises a retractable soft-body finger kneading massage head, a double-peak sliding peristaltic massage head, a vibrating-airbag hammer massage head, and a plush-material massage palm, used for hand-imitating kneading massage, rolling peristaltic massage, hammer massage, and plush-material stroking massage.
An EMG acquisition device 110, connected to the robot controller 101 and used to collect muscle information, muscle state, and sensory-nerve and motor-nerve information.
A multi-sensor information acquisition module 108, connected to the robot controller 101 and used to collect pressure, gravity, and other sensor information.
An infrared module 208, connected to the robot controller 101 and the picking claw 105, used to emit infrared light; it is the infrared generator of the massage claw 106.
A vibrator 209: the vibration module is connected to the robot controller 101 and the picking claw 105 and used for massage vibration; it is the vibration generator of the massage claw 106.
A radar autonomous movement module 103, connected to the robot controller 101 and the mobile base 210 for autonomous positioning and navigation.
A managing user, such as an administrator, communicates with the main control system 101 through the remote control module 100 and issues control commands remotely. The main control system 101 communicates with the robot arm 107 and uses the camera 214 to capture face images; based on the images of the face, chest, waist, back, upper limbs, lower limbs, hands, feet, and joints, the positions of these parts are returned, locating each body part. The remote control module 100 remotely controls robot arms 107A and 107B to move to the body-part position, capture body images, detect body parts, and assist in positioning.
Through the camera 214 and the setting of image parameters, persons, items, and equipment are classified by combined features such as color, shape, and contour using an improved machine-learning method. Combining the improved machine-learning method with a deep neural network, images of common objects in the care environment are captured and recognized, assisting the claw in fetching objects. Video is captured in real time to monitor the care environment; colors, digits, letters, text, and special markings are recognized intelligently, and the positions of persons, items, and equipment are fed back.
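Classifying persons, items, and equipment from combined color, shape, and contour features can be illustrated with a toy nearest-centroid rule over a feature vector. This is only a stand-in for the "improved machine-learning method", which the document does not specify; the feature layout and class centroids are invented for the example:

```python
def dist2(a, b):
    """Squared Euclidean distance between two equal-length feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b))


def classify(centroids, features):
    """Label a feature vector with the class whose centroid is nearest.

    `centroids` maps a class label to a reference feature vector, e.g.
    (color score, shape score, contour score).
    """
    return min(centroids, key=lambda label: dist2(centroids[label], features))
```

A learned classifier (or the deep network mentioned above) would replace this rule in practice; the sketch only shows how combined features map to one of the three categories.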
Voice module 216: the voice device 216 comprises a sound-collection microphone device and a loudspeaker. The pickup device obtains voice information; user speech is input through the microphone, and human-machine interaction takes place through the loudspeaker, covering voice guidance, text-speech conversion, speech synthesis, and voice wake-up.
Radar autonomous positioning and navigation: the main control system 101 communicates with the mobile base 210 and the radar. Information collected by the radar is sent to the main system via message- and service-based communication, so that a map of the scene is built autonomously. The created map information is published to the mobile-base node; the mobile base receives the map information and performs autonomous navigation.
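The message/service style of communication described above (radar to main system, main system to mobile-base node) can be sketched with a toy publish/subscribe bus. The real middleware is not named in the document, so `Bus` and the topic names are assumptions:

```python
class Bus:
    """Toy topic bus standing in for the message/service middleware."""

    def __init__(self):
        self._subs = {}

    def subscribe(self, topic, callback):
        self._subs.setdefault(topic, []).append(callback)

    def publish(self, topic, message):
        for callback in self._subs.get(topic, []):
            callback(message)


bus = Bus()
scans, maps_received = [], []


def on_scan(scan):
    """Main system: fold each radar scan into the map and republish it."""
    scans.append(scan)
    bus.publish("map", {"cells": len(scans)})


bus.subscribe("scan", on_scan)                 # main system listens to radar
bus.subscribe("map", maps_received.append)     # mobile-base node listens for maps

bus.publish("scan", {"range": 1.2})            # radar publishes two scans
bus.publish("scan", {"range": 0.8})
```

After the two scans, the mobile-base node holds the latest map and could start navigating against it, mirroring the map-then-navigate order in the description.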
Following the voice guidance and voice commands of the voice device 216, the user selects the body massage range through the user interface, covering the chest, waist, back, upper limbs, lower limbs, hands, feet, and joints, and marks out the massage position on the interface. The user selects the massage technique (rubbing, grasping, pushing, pressing, point-pressing, pinching, kneading, patting, striking, flicking, rolling, palm-kneading, finger-kneading, vibrating, shaking, gripping, scrubbing, or rocking), optionally with vibration or infrared, and sets the massage parameters of frequency, intensity, and force.
The main control system 101 communicates with the camera 214; from the captured images of the face, chest, back, legs, arms, waist, hands, feet, and joints, it locates and returns the image position of each body part. Robot arm 211A carries the massage devices: the retractable soft-body finger kneading massage heads 201, 202, 203; the twin-peak sliding/rolling massage heads 204, 205; the plush-material massage palm 206; the vibrating-airbag hammer massage head 207; the infrared generator 208; and the vibrator 209. After the target massage region on the face, chest, back, legs, arms, waist, hands, feet, or joints is located, robot arm 211A moves to the region.
According to the user-selected massage technique published by the main control system 101, together with the frequency, intensity, force, pressure, and distance information, and according to the information published by the EMG collector 213 and the gravity and pressure sensors, the robot arm subscribes to the muscle, gravity, pressure, and other multi-sensor information, and the massage device executes the received technique, frequency, and intensity with the correspondingly planned motion. If the user selects a pinching motion, the soft-body thick-finger kneading massage heads 201, 202, 203 complete the pinch according to the planned hand pinching motion. If the user selects a rolling or sliding motion, the twin-peak sliding/rolling massage heads 204, 205 complete it according to the planned twin-peak sliding/rolling motion. If the user selects stroking or hammering, the plush-material massage palm 206 performs the planned stroking, and the vibrating-airbag hammer massage head 207 performs stroking and, once inflated, the hammer motion. The full series of massage motions is thus completed over the planned massage region with the planned motions, accomplishing the massage and rehabilitation actions.
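The feedback idea above, in which the arm subscribes to EMG and pressure information and the massage device adjusts accordingly, can be sketched as one proportional adjustment step. The control law, gain, and limits are illustrative assumptions; the document specifies only that intensity is adapted from the sensed data:

```python
def adjust_intensity(current, target_pressure, measured_pressure,
                     gain=0.5, low=0.0, high=1.0):
    """One proportional step of massage intensity toward the target pressure.

    The result is clamped to [low, high] so the device never exceeds its
    configured intensity limits.
    """
    step = gain * (target_pressure - measured_pressure)
    return max(low, min(high, current + step))
```

Called once per sensor update, this raises intensity while measured pressure is below the target and lowers it when contact pressure overshoots.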
Following the voice guidance and voice commands of the voice device 216, an object in the care environment is selected for pick-and-place through the user interface or by voice. The camera publishes visual information, including object information in the care environment and scene information. The robot main controller and the retrieval claw device receive the visual information, recognize the object, and, based on the returned object position, the robot moves to the pick-up or placement area. In the pick-up/placement area 1000, using the vision module 104 and the improved neural-network method, the vision camera 214 captures object images and the object-recognition module identifies and confirms the object to be picked or placed. Robot arm 211B and retrieval claw 217 then execute the planned trajectory according to the motion plan and the configured joint angles, joint limits, specified joint configurations, joint constraints, joint trajectory positions, velocity components, joint velocities, motion constraints, target trajectory, and velocity settings. The robot pose parameters for objects that can be picked at the target pose are set, together with the parameters of robot arm 211B and retrieval claw 217, and the grasping, pick-and-place, and grasp-pose parameters are set and matched to the target pose.
Robot arm 211B and retrieval claw 217 are initialized, together with the placement and grasping positions of the object, the grasp-pose object, the pick-up area, the target position, and the placement position of the storage box. Robot arm 211B and retrieval claw 217 perform the grasp (initializing the grasp object and creating the claw's open and close poses and the retreat-from-target parameters), move the object, and place it in the storage box 218; robot arm 211B then grasps items from the storage box 218, tidies and arranges them, and moves them to the placement area 1000 requested by the user.
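Before the claw executes a grasp, the planned joint trajectory must respect the joint limits and constraints listed above. A minimal validity check; the angle units (radians) and the limit values in the test are assumptions chosen for the example:

```python
def within_limits(trajectory, limits):
    """True if every waypoint's joint angles fall inside their limits.

    `trajectory` is a sequence of waypoints, each a tuple of joint angles;
    `limits` gives one (low, high) pair per joint, in the same order.
    """
    return all(
        low <= angle <= high
        for waypoint in trajectory
        for angle, (low, high) in zip(waypoint, limits)
    )
```

A planner would run this check (plus velocity and collision constraints, omitted here) on every candidate trajectory before commanding arm 211B, rejecting plans that violate any joint limit.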

Claims (8)

  1. A companion-care massage dual-arm multi-task parallel-processing robot device, characterized in that the device comprises:
    a robot controller, the robot controller being connected to the robot arm module, the multi-sensor device, the EMG acquisition device, and the massage device, and used to control the robot;
    a voice module, the voice module being connected to the robot controller and used for voice interaction between the robot main system and the user, including human-like voice chat;
    a camera vision module, the camera vision module being connected to the robot controller and the robot arm and used to capture face images and images of common objects in the care environment; the face images assist in detecting body parts and locating the positions of the face, chest, back, legs, arms, waist, hands, feet, and joints; images of common objects in the care environment are captured and recognized to assist the robot claw in fetching objects, and video is captured in real time to monitor the care environment;
    a multimedia display module, the multimedia display module being connected to the robot controller via a connector, detachable from the robot body, portable, communicating with the robot via Bluetooth or Wi-Fi, and used for multimedia playback, web display, Internet browsing, and communication;
    a robot arm module, the robot arm module being connected to the robot controller, the massage claw, and the robot claw; the massage claw is used to massage the upper limbs, waist, and lower limbs, and the robot claw is used to fetch objects;
    a robot claw retrieval module, the robot claw retrieval module being connected to the robot controller and the robot arm and used to fetch objects;
    a massage claw module, the massage claw module being connected to the robot controller and the robot arm; the massage claw and the robot arm have a detachable, direct plug-in, portable design and communicate with the robot arm via a Wi-Fi/Bluetooth module and a connecting plug; the massage claw comprises retractable soft-body thick-finger kneading massage heads, twin-peak sliding/peristaltic massage heads, an airbag hammer massage head, and a plush-material massage palm, used for hand-like pinch-and-knead massage, rolling peristaltic massage, hammer massage, and plush stroking massage;
    an EMG acquisition device, the EMG acquisition device being connected to the robot controller and used to collect muscle information, muscle state, and information on the sensory and motor nerves;
    a multi-sensor information acquisition module, the multi-sensor information acquisition module being connected to the robot controller and used to collect pressure, gravity, and other sensor information;
    an infrared module, the infrared module being connected to the robot controller and the robot claw and used to emit infrared light, serving as the infrared generator of the massage claw;
    a vibrator, the vibrator being connected to the robot controller and the robot claw and used for massage vibration, serving as the vibration generator of the massage claw;
    a radar autonomous-movement module, the radar autonomous-movement module being connected to the robot controller and the mobile base and used for autonomous positioning and navigation.
  2. A companion-care massage dual-arm multi-task parallel-processing robot device, characterized by a voice module connected to the robot controller, the voice module being used for interaction between the robot and the user, including speech recognition, speech-text conversion, voice guidance, voice commands, voice companionship, and medical voice question answering; care-environment video is accessed through the user's cloud client, and remote voice communication is supported by the voice module.
  3. A companion-care massage dual-arm multi-task parallel-processing robot device, characterized by a camera vision acquisition module connected to the robot controller, the camera vision acquisition module being used to capture face images, recognize the face, legs, arms, waist, and joints, return the body-part position information, and locate the legs, arms, waist, and joints; to capture object images and, with the object-recognition module, recognize objects near the care environment; through the setting of image parameters, persons, items, and equipment are classified by the combined color, shape, and contour features of an improved machine-learning method; using the improved machine-learning method and the deep-neural-network method within the vision module, images of common objects in the care environment are captured and recognized to assist the robot claw in fetching objects; colors, digits, letters, text, and special markings are recognized intelligently, persons, items, and equipment are recognized, and their position information is fed back; through the setting of image parameters, care-environment video is captured in real time, the care environment is monitored, and the video is accessed in real time through the user's cloud client.
  4. A companion-care massage dual-arm multi-task parallel-processing robot device, characterized by an EMG acquisition module connected to the robot controller and the robot arm and used to collect muscle information of the chest, waist, back, upper limbs, lower limbs, hands, and feet, including the muscle contraction mode, static and dynamic states, muscle fatigue state, sensory- and motor-nerve conduction, repetitive nerve stimulation, motor-unit number estimation, and sympathetic skin response; a deep-learning algorithm autonomously adjusts the training intensity, training period, and number of training sessions.
  4. A companion-care massage dual-arm multi-task parallel-processing robot device, characterized by a multi-sensor information acquisition module connected to the robot controller and the robot arm and used to collect gravity, pressure, and orientation information; based on the returned information, the robot controller communicates with the sensors and, according to the multi-sensor information it receives, adjusts the parameter values of the robot arm and the gravity device.
  5. A companion-care massage dual-arm multi-task parallel-processing robot device, characterized by a multimedia LCD module connected to the robot controller via a connector, detachable from the robot body, portable, and communicating with the robot via Bluetooth or Wi-Fi; through the user interface of the robot system it controls robot massage, music, voice dialogue, human-like interactive chat, and Internet-based online learning, and it is used to play multimedia and display network information.
  6. A companion-care massage dual-arm multi-task parallel-processing robot device, characterized in that the robot controller is connected to the radar, the camera, and the mobile base; information collected by the radar is sent to the robot main system via message- and service-based communication to build a map of the scene autonomously; the robot main system communicates with the mobile base, publishing the created map information and communicating with the mobile-base node, which receives the map information and performs autonomous navigation; the main system also communicates with the camera, whose image information is sent via service-based communication to the robot main-system client, which communicates with the robot arm to carry out motion planning.
  7. A companion-care massage dual-arm multi-task parallel-processing robot device, characterized by a massage device connected to the robot controller and the camera module, the massage device comprising a robot arm module and a massage claw module; the robot arm module is connected to the robot controller and the massage claw and is used to massage the upper limbs, waist, and lower limbs; the massage claw module comprises retractable soft-body thick-finger kneading massage heads, twin-peak sliding and peristaltic massage heads, a vibrating-airbag hammer massage head, and a plush-material massage palm, used for hand-like pinch-and-knead massage, rolling peristaltic massage, hammer massage, and plush stroking massage; the robot controller, the robot arm, and the vision module are connected; the camera publishes visual information of the body parts, returning the position information of the chest, waist, back, upper limbs, lower limbs, hands, feet, and joints; the main system and the massage device receive the visual information, detect the face and the body parts, and precisely locate each body position; the body massage range, covering the chest, waist, back, upper limbs, lower limbs, hands, feet, and joints, is selected through the user interface, the massage position is marked out, and the kneading, pinching, stroking, hammering, sliding, peristaltic, vibrating, or infrared massage mode is selected together with the massage parameters of frequency, intensity, and force; the device locates and moves to the body massage position and, following the series of planned motions, performs massage companionship, relieving fatigue, repairing muscle tissue, and reducing fat.
  8. A companion-care massage dual-arm multi-task parallel-processing robot device, characterized by a retrieval device connected to the robot controller and the camera module, the retrieval device comprising a robot arm module and a retrieval claw module; the robot arm module is connected to the robot controller and the retrieval claw and is used to fetch objects; an object in the care environment is selected for pick-and-place through the user interface or by voice; the camera publishes visual information including object information in the care environment and scene information; the robot main controller and the retrieval claw device receive the visual information, recognize the object, and return its position; following the position information, the robot arm moves to the position; through visual recognition of targets in the care environment, the retrieval claw acts to pick up and place objects; retrieval-claw motion planning configures the position and angle parameters of the robot arm and claw and plans the grasping, fetching, and placing parameters, so that the retrieval claw moves, places, tidies, and arranges items and equipment.
PCT/CN2021/122528 2020-10-09 2021-10-07 Companion-care massage dual-arm multi-task parallel-processing robot device WO2022073467A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202022230819.0 2020-10-09
CN202022230819 2020-10-09

Publications (1)

Publication Number Publication Date
WO2022073467A1 true WO2022073467A1 (zh) 2022-04-14

Family

ID=81125589

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/122528 WO2022073467A1 (zh) 2020-10-09 2021-10-07 Companion-care massage dual-arm multi-task parallel-processing robot device

Country Status (1)

Country Link
WO (1) WO2022073467A1 (zh)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113797088A (zh) * 2021-08-31 2021-12-17 中科尚易健康科技(北京)有限公司 Linkage control method and control system for a robotic arm and a bed
CN116308949A (zh) * 2023-02-21 2023-06-23 京大(北京)技术有限公司 Community- and home-based rehabilitation training robot

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20120066157A (ko) * 2010-12-14 2012-06-22 김현겸 Artificial-intelligence robot massage device
CN106821718A (zh) * 2017-03-24 2017-06-13 徐加国 Fully intelligent scanning-massage health-care method and robot for unblocking meridians
CN206748435U (zh) * 2017-01-17 2017-12-15 五邑大学 Intelligent companion robot
CN107582039A (zh) * 2017-08-28 2018-01-16 陕西舜洋电子科技有限公司 Intelligent health-care robot for the elderly
CN107693295A (zh) * 2017-09-26 2018-02-16 北京联合大学 Massage chair with an EMG control device
CN107752984A (zh) * 2017-11-15 2018-03-06 李玉东 Highly intelligent big-data-based general-practice medical robot
CN207415376U (zh) * 2017-10-20 2018-05-29 深圳市前海安测信息技术有限公司 Multi-functional health monitoring robot
CN110123623A (zh) * 2019-05-16 2019-08-16 湖北工业大学 Intelligent rehabilitation massage robot
US20200035237A1 (en) * 2019-07-09 2020-01-30 Lg Electronics Inc. Communication robot and method for operating the same
CN111343958A (zh) * 2017-11-29 2020-06-26 美的集团股份有限公司 Massage robot using machine vision
CN111437174A (zh) * 2020-04-16 2020-07-24 深圳瀚维智能医疗科技有限公司 Physiotherapy massage robot




Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21876985

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

NENP Non-entry into the national phase

Ref country code: JP

122 Ep: pct application non-entry in european phase

Ref document number: 21876985

Country of ref document: EP

Kind code of ref document: A1


32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 22/09/2023)
