CN111673745A - Robot control system and method based on somatosensory interaction - Google Patents

Robot control system and method based on somatosensory interaction

Info

Publication number
CN111673745A
CN111673745A (application CN202010462701.2A)
Authority
CN
China
Prior art keywords
robot
camera
interactive
data
arm
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010462701.2A
Other languages
Chinese (zh)
Inventor
Wang Haojie (王昊洁)
Hu Bo (胡博)
Shen Yan (沈燕)
Tan Rongsheng (谈荣胜)
Fan Quanzhi (范全枝)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Ruijie Network Science & Technology Co., Ltd.
Original Assignee
Shanghai Ruijie Network Science & Technology Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Ruijie Network Science & Technology Co., Ltd.
Priority to CN202010462701.2A
Publication of CN111673745A
Legal status: Pending

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1602 Programme controls characterised by the control system, structure, architecture
    • B25J9/1656 Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1664 Programme controls characterised by motion, path, trajectory planning
    • B25J9/1694 Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697 Vision controlled systems
    • B25J13/00 Controls for manipulators

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Manipulator (AREA)

Abstract

The invention discloses a robot control system and method based on somatosensory interaction, and relates to the technical field of somatosensory interaction. The system comprises a lifting mechanism fitted with a 3D camera, a control host and an interactive robot. A monitoring camera captures video of the actions executed by the interactive robot in real time and transmits it through the control host to a display screen beside the interacting person for display; the 3D camera, the monitoring camera, a first servo motor and the interactive robot are each electrically connected to the control host. The control host comprises a master controller, an arm recognition module, a memory, a servo motor controller and a robot control module. The invention solves the problem that a robot cannot be controlled directly through an interacting person's body movements: the interacting person simply enters the sensing range to interact with the robot, needs no marker or tag on the body, and never has to touch any robot-controller part to control the robot.

Description

Robot control system and method based on somatosensory interaction
Technical Field
The invention belongs to the technical field of somatosensory interaction, and particularly relates to a robot control system based on somatosensory interaction and a corresponding somatosensory-interaction-based robot control method.
Background
A robot is a machine that performs work automatically. It can accept human commands, run programs prepared in advance, and act on principles laid down by artificial-intelligence techniques. Its task is to assist or replace humans in work such as production, construction or dangerous operations.
Motion sensing technology lets people use body movements to interact directly with peripheral devices or environments, without any complex control equipment. A motion-sensing game, for example, lets the player control the game character through body movements alone, with no hand-held controller, so the user is immersed more genuinely in the game. Existing sensing technologies fall mainly into image sensing, such as Microsoft's colour- and depth-sensing Kinect camera, and distance sensing, such as infrared proximity sensors and ultrasonic distance sensors. In the prior art, almost all robots rely on hand-held controllers to perform their actions. Enabling a robot to execute the same actions as a person, to interact with that person, and to feed back action indications to them would be of great significance: such robots could be applied to emergency rescue and disaster relief, and combined with 5G technology for remote control.
Disclosure of Invention
The invention solves the problem that a robot cannot be controlled directly through an interacting person's body movements: the interacting person simply enters the sensing range to interact with the robot, needs no marker or tag on the body, and never has to touch any robot-controller part to control the robot.
In order to solve the technical problems, the invention is realized by the following technical scheme:
the invention discloses a robot control system based on somatosensory interaction, which comprises a lifting mechanism provided with a 3D camera, a projection wall vertically opposite to a projection surface of the 3D camera, a control host and an interactive robot, wherein the control host is connected with the interactive robot; the lifting mechanism comprises a vertical plate vertically arranged on the surface of the bottom plate, a first lifting screw rod arranged between the connecting seats at the top and the bottom of the vertical plate and a first servo motor arranged at the top of the vertical plate and used for driving the first lifting screw rod, and the 3D camera is arranged on the first lifting screw rod through a ball nut; the interactive robot adopts a six-degree-of-freedom mechanical arm robot and comprises a base, a support column vertically arranged on the surface of the base, a second servo motor arranged at the bottom of the support column, a second lifting screw rod vertically arranged on the support column, and a six-degree-of-freedom mechanical arm arranged on the second lifting screw rod through a ball nut;
a monitoring camera which is over against the six-degree-of-freedom mechanical arm to monitor in real time is installed at the top of the supporting column, and adjacent connecting rods in the six-degree-of-freedom mechanical arm are driven to be connected through a third servo motor to control actions;
the monitoring camera captures the video data of the execution action of the interactive robot in real time and transmits the video data to a display screen beside an interactive person through the control host for display, and the 3D camera, the monitoring camera, the first servo motor and the interactive robot are respectively and electrically connected with the control host;
the control host comprises a master controller, and an arm identification module, a memory, a servo motor controller and a robot control module which are respectively and electrically connected with the master controller; the control host is used for regulating and controlling a projection surface of the 3D camera, storing arm identification model data and controlling the interactive robot to execute arm interactive actions.
Furthermore, the 3D camera is a Kinect somatosensory camera which uses the depth-map imaging principle to acquire motion image data of the interacting person in front of it in real time, locates the person's arm, captures the motion data as continuous image data, and transmits it to the master controller.
Further, the arm recognition module is an ATK-PAJ7620 recognition module which detects the gesture images captured by the 3D camera and image-processed by the master controller, and compares them for recognition with the model-trained arm sample models stored in the memory.
Furthermore, the robot control module comprises an execution-action data acquisition card, a pulse regulator and a motor driver electrically connected in sequence, the pulse regulator being electrically connected to a PI controller.
Furthermore, the execution-action acquisition card is an NI USB-6008 data acquisition card, which acquires the angular-velocity data signal and issues a control signal to the motor driver to drive the second and third servo motors.
Further, the pulse regulator is a PWM pulse regulator that pulse-width-modulates the control signal, and the PI controller governs the PWM pulse regulator.
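To make the division of labour among these modules concrete, the following is a minimal structural sketch of the control host, assuming a simple callback-style master loop; the class and method names are illustrative, since the patent describes the modules only at block-diagram level.

```python
# A minimal structural sketch, assuming a callback-style master loop. The
# patent specifies the modules only as blocks, so every name here is
# illustrative rather than taken from the source.
from dataclasses import dataclass, field

@dataclass
class ControlHost:
    arm_models: dict = field(default_factory=dict)  # memory: trained arm samples

    def on_depth_frames(self, frames):
        """Master-controller loop: recognise the arm action, then fan out to
        the servo motor controller (camera height) and robot control module."""
        action = self.recognise(frames)
        if action is not None:
            self.adjust_camera_height(action)
            self.drive_robot(action)

    def recognise(self, frames): ...             # arm recognition module
    def adjust_camera_height(self, action): ...  # servo motor controller
    def drive_robot(self, action): ...           # robot control module
```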
A robot control method based on somatosensory interaction comprises the following steps:
S01, the interacting person stands between the 3D camera and the projection wall and performs body actions;
S02, the 3D camera starts continuous real-time infrared projection onto the interacting person using the depth-map imaging principle, and a CMOS imager inside the 3D camera captures the deformation of the projected pattern to determine depth information and receive the graphic data of the three-dimensional space;
S03, the master controller performs pixel-level evaluation, edge detection and noise reduction on the acquired three-dimensional graphic data, separates the human contour from the background using the depth map in a static environment, and locates twenty joints of the interacting person's body from the separated contour;
S04, the arm recognition module compares the twenty joints in the acquired human contour against the arm action model data in the memory for recognition, locates the coordinates of each part of the arm performing the action, and acquires image data carrying the coordinate data of the corresponding action;
S05, the servo motor controller performs a lifting action according to the image data carrying the coordinate data, adjusting the 3D camera so that it squarely faces the interacting person; the robot control module acquires the image data carrying the coordinate data and drives the interactive robot to execute the action corresponding to the interacting person's.
Furthermore, the interactive robot receives the image data carrying the coordinate data for drive control; the drive control is implemented with the visual programming tool LabVIEW, and the image data carrying the coordinate data are read directly through the SDK (software development kit) bundled with the interactive robot.
The invention has the following beneficial effects:
the robot control system and the method based on somatosensory interaction have the advantages that the problem that interaction cannot be generated on robot control directly through interaction human body actions is solved, an interdynamic person can directly enter a sensing range without any identification and sign on a human body, the interaction action on the robot can be implemented without directly contacting with a robot controller part to control the robot, the robot control system and the method based on somatosensory interaction are strong in universality, the robot can be controlled to be suitable for actions of various target models, different interaction actions can be set according to needs, accurate identification interaction operation can be achieved without directly contacting with any controller equipment, the depth recognition capability is realized, real-time updating operation and control actions and association between a training interaction human action model and a robot action operation command are carried out according to use needs, and high-accuracy and high-universality robot interaction experience is achieved.
Of course, it is not necessary for any product in which the invention is practiced to achieve all of the above-described advantages at the same time.
Drawings
In order to illustrate the technical solutions of the embodiments of the present invention more clearly, the drawings used in the description of the embodiments are briefly introduced below. The drawings described below are obviously only some embodiments of the present invention; those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a schematic diagram of the overall external structure of the robot control system based on somatosensory interaction according to the present invention;
FIG. 2 is a right side view of the interactive robot in FIG. 1;
FIG. 3 is a system block diagram of the robot control system based on somatosensory interaction according to the present invention;
FIG. 4 is a schematic diagram of the robot control module in FIG. 3;
FIG. 5 is a step diagram of the robot control method based on somatosensory interaction according to the present invention;
FIG. 6 is a simplified diagram of the interacting-person silhouette recognition and arm-interaction positioning regions of S03 and S04 in FIG. 5;
in the drawings, the components represented by the respective reference numerals are listed below:
1-lifting mechanism, 101-bottom plate, 2-3D camera, 3-first servo motor, 301-first lifting screw rod, 4-control host, 5-interactive robot, 501-base, 502-support column, 503-second lifting screw rod, 504-monitoring camera, 505-six-degree-of-freedom mechanical arm, 506-second servo motor, 507-third servo motor, 6-display screen and 7-projection wall.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
In the description of the present invention, it is to be understood that terms such as "vertical", "top" and "bottom" indicate orientations used merely for convenience and simplicity of description; they do not indicate or imply that the components or elements referred to must have, or be constructed and operated in, a particular orientation, and are therefore not to be construed as limiting the invention.
Referring to figs. 1-4, the robot control system based on somatosensory interaction of the invention comprises a lifting mechanism 1 fitted with a 3D camera 2, a projection wall 7 vertically facing the projection surface of the 3D camera 2, a control host 4 and an interactive robot 5. The lifting mechanism 1 comprises a vertical plate standing on the surface of a bottom plate 101, a first lifting screw 301 mounted between connecting seats at the top and bottom of the vertical plate, and a first servo motor 3 mounted at the top of the vertical plate to drive the first lifting screw 301; the 3D camera 2 is mounted on the first lifting screw 301 through a ball nut. The interactive robot 5 is a six-degree-of-freedom mechanical-arm robot comprising a base 501, a support column 502 standing on the surface of the base 501, a second servo motor 506 mounted at the bottom of the support column 502, a second lifting screw 503 mounted vertically on the support column 502, and a six-degree-of-freedom mechanical arm 505 mounted on the second lifting screw 503 through a ball nut.
A monitoring camera 504 squarely facing the six-degree-of-freedom mechanical arm 505 for real-time monitoring is mounted at the top of the support column 502, and adjacent links of the six-degree-of-freedom mechanical arm 505 are joined and driven by third servo motors 507 that control their motion.
The monitoring camera 504 captures video of the actions executed by the interactive robot 5 in real time and transmits it through the control host 4 to the display screen 6 beside the interacting person for display; the 3D camera 2, the monitoring camera 504, the first servo motor 3 and the interactive robot 5 are each electrically connected to the control host 4.
The control host 4 comprises a master controller and, each electrically connected to it, an arm recognition module, a memory, a servo motor controller and a robot control module; the control host 4 regulates the projection plane of the 3D camera 2, stores arm recognition model data, and controls the interactive robot 5 to execute the arm interaction actions.
The 3D camera 2 is a Kinect somatosensory camera which uses the depth-map imaging principle to acquire motion image data of the interacting person in front of it in real time, locates the person's arm, captures the motion data as continuous image data, and transmits it to the master controller.
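As a concrete illustration of this acquisition step, the following is a minimal sketch using the open-source libfreenect Python bindings for a first-generation Kinect; the patent does not name an SDK, so the library choice, the registered-depth format and the burst length are assumptions made for illustration only.

```python
# A minimal sketch, assuming libfreenect's Python bindings and a Kinect v1.
# A short burst of consecutive depth frames stands in for the "continuous
# image data" that the camera hands to the master controller.
import freenect
import numpy as np

def grab_depth_frame():
    """Return one 480x640 depth frame in millimetres, or None if no device."""
    frame = freenect.sync_get_depth(format=freenect.DEPTH_REGISTERED)
    if frame is None:
        return None
    depth, _timestamp = frame
    return np.asarray(depth, dtype=np.uint16)

def stream_to_master(n_frames=30):
    """Collect roughly one second of frames at the Kinect's 30 Hz rate."""
    frames = []
    for _ in range(n_frames):
        depth = grab_depth_frame()
        if depth is not None:
            frames.append(depth)
    return frames
```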
The arm recognition module is an ATK-PAJ7620 recognition module which detects the gesture images captured by the 3D camera 2 and image-processed by the master controller, and compares them for recognition with the model-trained arm sample models stored in the memory.
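A minimal sketch of this compare-and-recognise step follows; it matches an observed arm pose against stored, model-trained samples by nearest neighbour. The template layout, the normalisation and the distance threshold are assumptions, since the patent states only that captured gestures are compared with the trained arm sample models held in memory.

```python
# A minimal sketch, assuming poses are stored as four 3-D joint coordinates
# (shoulder, elbow, wrist, hand) and matched by nearest neighbour.
import numpy as np

def normalise_pose(joints):
    """Centre on the shoulder and scale by upper-arm length so the match is
    invariant to where the interacting person stands."""
    joints = np.asarray(joints, dtype=float)        # shape (4, 3)
    centred = joints - joints[0]
    scale = float(np.linalg.norm(centred[1])) or 1.0  # shoulder-to-elbow length
    return centred / scale

def recognise_arm_action(observed, templates, threshold=0.35):
    """Return the label of the closest trained sample, or None if no sample
    is close enough to count as a recognised interaction."""
    obs = normalise_pose(observed).ravel()
    best_label, best_dist = None, np.inf
    for label, sample in templates.items():
        dist = float(np.linalg.norm(obs - normalise_pose(sample).ravel()))
        if dist < best_dist:
            best_label, best_dist = label, dist
    return best_label if best_dist < threshold else None
```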
The robot control module comprises an execution-action data acquisition card, a pulse regulator and a motor driver electrically connected in sequence, the pulse regulator being electrically connected to a PI (proportional-integral) controller.
The execution-action acquisition card is an NI USB-6008 data acquisition card, which acquires the angular-velocity data signal and issues a control signal to the motor driver to drive the second servo motor 506 and the third servo motors 507.
The pulse regulator is a PWM pulse regulator that pulse-width-modulates the control signal, and the PI controller governs the PWM pulse regulator.
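The control chain just described can be sketched as follows: a PI controller regulates joint angular velocity and expresses its output as a PWM duty cycle for the motor driver. The gains, the saturation limit and the loop period are illustrative assumptions, not values taken from the patent.

```python
# A minimal sketch of the PI-plus-PWM chain, assuming illustrative gains and
# a 100 Hz control period; the patent gives the structure but no parameters.
class PIController:
    def __init__(self, kp=0.8, ki=0.2, dt=0.01, duty_limit=1.0):
        self.kp, self.ki, self.dt = kp, ki, dt
        self.duty_limit = duty_limit
        self.integral = 0.0

    def update(self, target_velocity, measured_velocity):
        """One control step: angular-velocity error in, PWM duty (-1..1) out."""
        error = target_velocity - measured_velocity
        self.integral += error * self.dt
        duty = self.kp * error + self.ki * self.integral
        # Anti-windup: stop integrating while the output saturates.
        if abs(duty) > self.duty_limit:
            self.integral -= error * self.dt
            duty = max(-self.duty_limit, min(self.duty_limit, duty))
        return duty
```

A caller would feed `update()` the target angular velocity derived from the recognised action and the measured velocity sampled by the acquisition card once per period, then hand the resulting duty cycle to the motor driver.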
As shown in figs. 5 and 6, the robot control method based on somatosensory interaction comprises the following steps:
s01, the interactive person stands between the 3D camera 2 and the projection wall 7 to perform human body action;
s02, the 3D camera 2 starts continuous infrared projection on the interactive people in real time by using a depth map imaging principle, the depth information is determined by capturing deformation through a CMOS imager in the 3D camera 2, and the graphic data of the three-dimensional space are received;
s03, after pixel-level evaluation, edge detection and noise reduction processing are carried out on the acquired three-dimensional space graphic data by the master controller, a human body outline and a background are separated by using a depth map in a static environment, and twenty joint parts of an interactive human body are positioned according to the separated human body outline, wherein the twenty joint parts comprise a head, a neck bottom, a right hand, a right elbow, a right wrist, a right shoulder, a left hand, a left elbow, a left wrist, a left shoulder, a spine, a hip, a left hip, a right hip, a left knee, a right knee, a left ankle, a right ankle, a left foot and a right foot;
s04, the arm recognition module compares and recognizes twenty joint parts in the acquired human body outline with arm motion model data in a memory, positions coordinates of each part of the arm part executing the motion, and acquires image data with coordinate data of the corresponding executing motion, wherein the arm motion model data is acquired by training a movable human arm model;
s05, the servo motor controller carries out lifting action according to the image data of the coordinate data, and the 3D camera 2 is adjusted and controlled to be right opposite to the interactive person; the robot control module acquires the image data with the coordinate data and drives the interactive robot 5 to execute the corresponding action of the interactive person.
The interactive robot 5 receives the image data carrying the coordinate data for drive control; the drive control is implemented with the visual programming tool LabVIEW, and the image data carrying the coordinate data are read directly through the SDK (software development kit) bundled with the interactive robot 5.
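Independently of the LabVIEW implementation, the geometric core of this drive step can be sketched as follows: the operator's tracked arm points are converted into joint-angle targets for the six-degree-of-freedom arm by plain vector geometry. The `robot_sdk.move_joints` call in the closing comment is hypothetical and stands in for whatever interface the robot's bundled SDK actually exposes, which the patent does not specify.

```python
# A minimal sketch mapping tracked arm points to joint-angle targets; the
# joint names and the SDK dispatch at the end are illustrative assumptions.
import numpy as np

def angle_between(a, b, c):
    """Interior angle at point b, in radians, for 3-D points a, b, c."""
    u, v = np.asarray(a, float) - b, np.asarray(c, float) - b
    cosang = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return float(np.arccos(np.clip(cosang, -1.0, 1.0)))

def arm_to_joint_targets(shoulder, elbow, wrist, hand):
    """Map the four tracked arm points onto a subset of the robot's joints;
    the remaining degrees of freedom would come from shoulder orientation."""
    return {
        "elbow_flex": angle_between(shoulder, elbow, wrist),
        "wrist_flex": angle_between(elbow, wrist, hand),
    }

# Hypothetical dispatch through the robot's own SDK:
# robot_sdk.move_joints(arm_to_joint_targets(s, e, w, h), speed=0.2)
```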
Advantageous effects:
the invention relates to a robot control system and a method based on somatosensory interaction, which solves the problem that the robot control cannot be interacted directly through the interaction human body action, enables an interdynamic person to directly enter a sensing range without any identification and sign on the human body, can implement the interaction action on a robot without directly contacting a robot controller part for controlling the robot, has strong universality, can control the action of the robot suitable for various target models, sets different interaction actions according to needs, can realize accurate identification interaction operation without directly contacting any controller equipment, has deep identification capability, performs real-time updating operation action according to the use needs and trains the association between an interaction human action model and a robot action operation command, the robot interaction experience with high accuracy and high universality is realized.
The preferred embodiments of the invention disclosed above are intended to be illustrative only. The preferred embodiments are not intended to be exhaustive or to limit the invention to the precise embodiments disclosed. Obviously, many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described in order to best explain the principles of the invention and the practical application, to thereby enable others skilled in the art to best utilize the invention. The invention is limited only by the claims and their full scope and equivalents.

Claims (8)

1. A robot control system based on somatosensory interaction, comprising a lifting mechanism (1) fitted with a 3D camera (2), a projection wall (7) vertically facing the projection surface of the 3D camera (2), a control host (4) and an interactive robot (5); the lifting mechanism (1) comprising a vertical plate standing on the surface of a bottom plate (101), a first lifting screw (301) mounted between connecting seats at the top and bottom of the vertical plate, and a first servo motor (3) mounted at the top of the vertical plate to drive the first lifting screw (301), the 3D camera (2) being mounted on the first lifting screw (301) through a ball nut; the interactive robot (5) being a six-degree-of-freedom mechanical-arm robot comprising a base (501), a support column (502) standing on the surface of the base (501), a second servo motor (506) mounted at the bottom of the support column (502), a second lifting screw (503) mounted vertically on the support column (502), and a six-degree-of-freedom mechanical arm (505) mounted on the second lifting screw (503) through a ball nut; characterized in that:
a monitoring camera (504) squarely facing the six-degree-of-freedom mechanical arm (505) for real-time monitoring is mounted at the top of the support column (502), and adjacent links of the six-degree-of-freedom mechanical arm (505) are joined and driven by third servo motors (507) that control their motion;
the monitoring camera (504) captures video of the actions executed by the interactive robot (5) in real time and transmits it through the control host (4) to a display screen (6) beside the interacting person for display, and the 3D camera (2), the monitoring camera (504), the first servo motor (3) and the interactive robot (5) are each electrically connected to the control host (4);
the control host (4) comprises a master controller and, each electrically connected to it, an arm recognition module, a memory, a servo motor controller and a robot control module; the control host (4) regulates the projection surface of the 3D camera (2), stores arm recognition model data, and controls the interactive robot (5) to execute arm interaction actions.
2. The robot control system based on somatosensory interaction of claim 1, wherein the 3D camera (2) is a Kinect somatosensory camera which uses the depth-map imaging principle to acquire motion image data of the interacting person in front of the 3D camera (2) in real time, locates the person's arm, captures the motion data as continuous image data, and transmits it to the master controller.
3. The robot control system based on somatosensory interaction of claim 1, wherein the arm recognition module is an ATK-PAJ7620 recognition module which detects the gesture images captured by the 3D camera (2) and image-processed by the master controller, and compares them for recognition with the model-trained arm sample models stored in the memory.
4. The robot control system based on somatosensory interaction of claim 1, wherein the robot control module comprises an execution-action data acquisition card, a pulse regulator and a motor driver electrically connected in sequence, the pulse regulator being electrically connected to a PI (proportional-integral) controller.
5. The robot control system based on somatosensory interaction of claim 4, wherein the execution-action acquisition card is an NI USB-6008 data acquisition card, which acquires the angular-velocity data signal and issues a control signal to the motor driver to drive the second servo motor (506) and the third servo motors (507).
6. The robot control system based on somatosensory interaction of claim 4, wherein the pulse regulator is a PWM (pulse-width modulation) pulse regulator for pulse-width-modulating the control signal, and the PI controller controls the PWM pulse regulator.
7. A robot control method based on somatosensory interaction using the system of any one of claims 1-6, comprising the following steps:
S01, the interacting person stands between the 3D camera (2) and the projection wall (7) and performs body actions;
S02, the 3D camera (2) starts continuous real-time infrared projection onto the interacting person using the depth-map imaging principle, and a CMOS imager inside the 3D camera (2) captures the deformation of the projected pattern to determine depth information and receive the graphic data of the three-dimensional space;
S03, the master controller performs pixel-level evaluation, edge detection and noise reduction on the acquired three-dimensional graphic data, separates the human contour from the background using the depth map in a static environment, and locates twenty joints of the interacting person's body from the separated contour;
S04, the arm recognition module compares the twenty joints in the acquired human contour against the arm action model data in the memory for recognition, locates the coordinates of each part of the arm performing the action, and acquires image data carrying the coordinate data of the corresponding action;
S05, the servo motor controller performs a lifting action according to the image data carrying the coordinate data, adjusting the 3D camera (2) so that it squarely faces the interacting person; the robot control module acquires the image data carrying the coordinate data and drives the interactive robot (5) to execute the action corresponding to the interacting person's.
8. The robot control method based on somatosensory interaction of claim 7, wherein the interactive robot (5) receives the image data carrying the coordinate data for drive control, the drive control being implemented with the visual programming tool LabVIEW, and the image data carrying the coordinate data being read directly through the SDK (software development kit) bundled with the interactive robot (5).
CN202010462701.2A 2020-05-27 2020-05-27 Robot control system and method based on somatosensory interaction Pending CN111673745A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010462701.2A CN111673745A (en) 2020-05-27 2020-05-27 Robot control system and method based on somatosensory interaction

Publications (1)

Publication Number Publication Date
CN111673745A 2020-09-18

Family

ID=72453566

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010462701.2A Pending CN111673745A (en) 2020-05-27 2020-05-27 Robot control system and method based on somatosensory interaction

Country Status (1)

Country Link
CN (1) CN111673745A (en)

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107214679A (en) * 2017-07-17 2017-09-29 武汉大学 Mechanical arm man-machine interactive system based on body-sensing sensor
WO2019024577A1 (en) * 2017-08-01 2019-02-07 东南大学 Natural human-computer interaction system based on multi-sensing data fusion

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
HUANG Zuliang et al., "Somatosensory Interactive Anthropomorphic Arm Robot", Light Industry Machinery (《轻工机械》) *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112936282A (en) * 2021-03-08 2021-06-11 常州刘国钧高等职业技术学校 Method and system for improving motion sensing control accuracy of industrial robot
CN112936282B (en) * 2021-03-08 2022-01-07 常州刘国钧高等职业技术学校 Method and system for improving motion sensing control accuracy of industrial robot
CN113498952A (en) * 2021-06-07 2021-10-15 西安理工大学 Model display device with human-computer interaction function

Similar Documents

Publication Publication Date Title
CN111055281B (en) ROS-based autonomous mobile grabbing system and method
Asfour et al. Toward humanoid manipulation in human-centred environments
US20210205986A1 (en) Teleoperating Of Robots With Tasks By Mapping To Human Operator Pose
CN110480634B (en) Arm guide motion control method for mechanical arm motion control
CN107891425B (en) Control method of intelligent double-arm safety cooperation man-machine co-fusion robot system
CN109571513B (en) Immersive mobile grabbing service robot system
CN110216674B (en) Visual servo obstacle avoidance system of redundant degree of freedom mechanical arm
US9008442B2 (en) Information processing apparatus, information processing method, and computer program
CN105252532A (en) Method of cooperative flexible attitude control for motion capture robot
US20230042756A1 (en) Autonomous mobile grabbing method for mechanical arm based on visual-haptic fusion under complex illumination condition
CN107756417A (en) The intelligent man-machine co-melting robot system of both arms security cooperation
CN103112007A (en) Human-machine interaction method based on mixing sensor
KR20100085297A (en) Mobile display apparatus, robot have mobile display apparatus and display method thereof
CN111673745A (en) Robot control system and method based on somatosensory interaction
US11422625B2 (en) Proxy controller suit with optional dual range kinematics
CN115741732A (en) Interactive path planning and motion control method of massage robot
JP2020019127A (en) Cooperative operation support device
Matsumoto et al. The essential components of human-friendly robot systems
CN207578422U (en) The intelligent man-machine co-melting robot system of both arms security cooperation
Wang et al. Design and implementation of humanoid robot behavior imitation system based on skeleton tracking
Barnes et al. Direction control for an active docking behaviour based on the rotational component of log-polar optic flow
JP2019202354A (en) Robot control device, robot control method, and robot control program
KR20190091870A (en) Robot control system using motion sensor and VR
KR20230100101A (en) Robot control system and method for robot setting and robot control using the same
Inaba et al. Vision-based multisensor integration in remote-brained robots

Legal Events

Code Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 2020-09-18)