CN107030692A - Perception-enhanced manipulator teleoperation method and system - Google Patents
Perception-enhanced manipulator teleoperation method and system
- Publication number
- Publication number: CN107030692A (Application CN201710192822.8A)
- Authority
- CN
- China
- Prior art keywords
- positional information
- manipulator
- information
- image
- human hand
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Links
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1602—Programme controls characterised by the control system, structure, architecture
- B25J9/161—Hardware, e.g. neural networks, fuzzy logic, interfaces, processor
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J13/00—Controls for manipulators
- B25J13/08—Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J13/00—Controls for manipulators
- B25J13/08—Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
- B25J13/085—Force or torque sensors
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1656—Programme controls characterised by programming, planning systems for manipulators
- B25J9/1671—Programme controls characterised by programming, planning systems for manipulators characterised by simulation, either to verify existing program or to create and verify new program, CAD/CAM oriented, graphic oriented programming systems
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/40—Robotics, robotics mapping to robotics vision
- G05B2219/40315—Simulation with boundary graphs
Landscapes
- Engineering & Computer Science (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Automation & Control Theory (AREA)
- Human Computer Interaction (AREA)
- Evolutionary Computation (AREA)
- Fuzzy Systems (AREA)
- Mathematical Physics (AREA)
- Software Systems (AREA)
- Physics & Mathematics (AREA)
- Artificial Intelligence (AREA)
- Manipulator (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The present invention relates to a perception-enhanced manipulator teleoperation method and system, belonging to the technical field of manipulator control. The method comprises: identifying the positions of the fingertips and palm center of a human hand during motion; mapping the position information to manipulator operation instructions; receiving pressure data from pressure sensors on the manipulator and controlling the vibration intensity of vibrators on the human hand according to that data; and receiving image information of the manipulator workbench acquired by two or more depth cameras arranged around the manipulator to build a real-time 3D operation scene. By combining depth cameras with hand markers, hand motion data is obtained quickly and conveniently, and the operator's perception of the teleoperation process is enhanced through pressure feedback and visual feedback, so that the operator obtains more information during remote control, helping to optimize the teleoperation effect.
Description
Technical field
The present invention relates to the field of manipulator control, and in particular to a perception-enhanced manipulator teleoperation method and system.
Background art
In recent years, with the development of manipulator control technology—in particular techniques that collect human hand motion data so that a manipulator imitates hand actions—manipulators have shown great application prospects in many industries. In such techniques, hand motion data is usually acquired through wearable devices such as exoskeletons and data gloves. Because this kind of acquisition relies on the sensors carried by the wearable device, such devices are expensive and difficult to put into practical, especially civilian, use.
In addition, with the rapid development of robotics—especially civilian service robots and humanoid robots—the demand for low-cost manipulation techniques, particularly teleoperation techniques that allow remote control of a manipulator, is also growing.
Summary of the invention
It is an object of the present invention to provide a perception-enhanced manipulator teleoperation method and system that realize remote operation of a manipulator at a lower cost.
To achieve this object, the present invention provides a perception-enhanced manipulator teleoperation method comprising an action recognition step, an action mapping step, a pressure feedback step and a visual feedback step. The action recognition step identifies the positions of the fingertips and palm center of a human hand during motion; the action mapping step maps the position information to manipulator operation instructions; the pressure feedback step receives pressure data from pressure sensors on the manipulator and controls the vibration intensity of vibrators on the human hand according to the pressure data; the visual feedback step receives image information of the manipulator workbench acquired by two or more depth cameras arranged around the manipulator and builds a real-time 3D operation scene.
In operation, the recognition step identifies the positions of the fingertips and palm center during hand motion. Because actions are digitized by collecting fingertip and palm-center positions, low-cost acquisition devices such as the Kinect can be used, reducing the cost of manipulator teleoperation. The action mapping step converts the processed motion data into corresponding manipulator operation instructions, realizing the mapping from human motion data to manipulator commands and thereby remote control of the manipulator. The pressure feedback step feeds the pressure sensed by the manipulator while imitating a human grasp back to the hand as vibration intensity, so that the operator perceives the manipulator's pressure. The visual feedback step uses the image information obtained by the depth cameras to build a real-time 3D scene of the manipulator's operation and feeds it back to the operator, realizing perception-enhanced remote control of the manipulator.
In a specific scheme, the step of identifying the positions of the fingertips and palm center of the hand during motion includes: receiving the color image, depth image and skeleton data of the hand acquired by a Kinect; obtaining the palm-center position from the skeleton data; and, according to the depth image, color image and palm-center position, identifying the color markers arranged at the fingertips to obtain fingertip position information including three-dimensional fingertip coordinates. By arranging color markers such as colored paper tape at the fingertips, the three-dimensional fingertip coordinates can be obtained from the Kinect's depth and image information.
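To illustrate how a fingertip's pixel coordinates and depth value yield a three-dimensional coordinate, the following sketch back-projects a depth pixel through a pinhole camera model. The intrinsic parameters (`FX`, `FY`, `CX`, `CY`) are placeholder assumptions, not values from the patent; a real Kinect supplies calibrated intrinsics through its SDK.

```python
import numpy as np

# Hypothetical depth-camera intrinsics (assumed values for illustration);
# a real device provides calibrated intrinsics.
FX, FY = 365.0, 365.0   # focal lengths in pixels
CX, CY = 256.0, 212.0   # principal point in pixels

def pixel_to_3d(u, v, depth_mm):
    """Back-project depth pixel (u, v) with depth in millimetres
    into a 3-D point in the camera frame, in metres."""
    z = depth_mm / 1000.0
    x = (u - CX) * z / FX
    y = (v - CY) * z / FY
    return np.array([x, y, z])
```

The distance between two fingertips is then simply the Euclidean distance between their back-projected points.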
In another specific scheme, the step of mapping the position information to manipulator operation instructions includes: extracting semantic information from the position information, and mapping the semantic information to manipulator operation instructions.
In a more specific scheme, the step of extracting semantic information includes: classifying the hand action into the four semantics of move, static, grab and release according to the continuously acquired position information. This effectively simplifies the algorithm that maps hand position information to manipulator operation instructions.
In another specific scheme, the step of receiving the image information of the manipulator workbench acquired by the two or more depth cameras arranged around the manipulator and building the real-time 3D operation scene includes: the depth cameras being Kinects; building 3D point clouds of the different directions from the depth and color images obtained by the Kinects; identifying the markers arranged on the workbench from those images; calibrating with the markers and fusing the point clouds of the different directions into a 3D image model; and optimizing the 3D image model with the ICP algorithm. This allows the operator to observe the real-time 3D operation scene through a screen or VR glasses, achieving perception enhancement.
To achieve the above further object, the present invention provides a perception-enhanced manipulator teleoperation system comprising an action recognition unit, a visual feedback unit, a pressure feedback unit and a control unit. The action recognition unit includes a Kinect for obtaining hand position information; the visual feedback unit includes two or more depth cameras for obtaining image information of the manipulator workbench; the pressure feedback unit includes pressure sensors for obtaining the manipulator's grip pressure and vibrators worn on the human hand; the control unit includes a processor communicatively connected with the Kinect, the depth cameras, the pressure sensors and the vibrators. Through the cooperation of this teleoperation system and the manipulator, remote control of the manipulator can be realized at relatively low cost.
In a specific scheme, the processor is configured to: receive the color image, depth image and skeleton data of the hand acquired by the Kinect; obtain the palm-center position from the skeleton data; identify the color markers at the fingertips according to the depth image, color image and palm-center position, obtaining fingertip position information including three-dimensional fingertip coordinates; classify the hand action into the four semantics of move, static, grab and release according to the continuously acquired palm-center and fingertip positions, and map the semantic information extracted from the position information to manipulator operation instructions; receive the pressure data from the pressure sensors and control the vibration intensity of the vibrators according to the pressure data; and receive the image information acquired by the depth cameras and build the real-time 3D operation scene.
The perception-enhanced manipulator teleoperation method and system of the present invention use image processing, sensing, pressure-vibration feedback and 3D visual feedback technologies: hand position information is converted into operation instructions so that the manipulator imitates the hand's motion, while the manipulator's pressure and a visual scene are fed back during operation. This not only makes remote control convenient for the operator but also, compared with the prior art that uses wearable acquisition devices, effectively reduces the implementation cost.
Brief description of the drawings
Fig. 1 is a structural block diagram of the perception-enhanced manipulator teleoperation system of the present invention;
Fig. 2 is a workflow diagram of the perception-enhanced manipulator teleoperation method of the present invention.
Embodiment
The present invention will be further described below with reference to the embodiments and the accompanying drawings.
Embodiment
Referring to Fig. 1, the perception-enhanced manipulator teleoperation system 1 of the present invention includes a manipulator 10, a control unit 11, an action recognition unit 12, a pressure feedback unit 13 and a visual feedback unit 14.
The manipulator 10 consists of a mechanical arm and a gripper; in this embodiment, the mechanical arm is an EPSON C4 and the gripper is a ROBOTIQ three-finger gripper.
The action recognition unit 12 includes a Kinect arranged in front of the user to obtain the operator's hand position information during operation.
The pressure feedback unit 13 includes pressure sensors on the gripper that detect the contact pressure with the object being operated while instructions are executed, and vibrators worn on the human hand that feed the pressure back through their vibration intensity. In this embodiment, the pressure sensors are FlexiForce sensors and the vibrators are 1027 flat coin motors.
The visual feedback unit 14 includes two or more depth cameras arranged in different directions around the remote manipulator workbench to obtain parallax images for building the real-time 3D operation scene, and a display device—a screen or VR glasses—for showing the scene. In this embodiment, the depth cameras are Kinects.
The control unit 11 includes a processor communicatively connected with the pressure sensors, the Kinects, the vibrators, the manipulator and the 3D real-time-scene display device. In this embodiment, the communication connection is a data link between the processor and those devices comprising one or more data circuits configured for transmitting data information, including but not limited to electrical lines, optical lines, radio links and combinations thereof.
Referring to Fig. 2, the perception-enhanced manipulator teleoperation method of the present invention includes an action recognition step S1, an action mapping step S2, a pressure feedback step S3 and a visual feedback step S4.
In action recognition step S1, the positions of the fingertips and palm center of the human hand during motion are identified.
(1) Receive the three kinds of data obtained by the Kinect: the skeleton data, depth image and color image of the operator's hand in motion. If any of the three fails to be acquired, return and reacquire; the skeleton data of the person closest to the Kinect is selected.
(2) Obtain and bound the hand region: map the right-hand coordinates in the skeleton data to a depth value and color-image pixel coordinates, then choose a depth threshold and an image-width threshold to select a rectangle in the color image that contains the whole hand.
(3) Perform color recognition: convert the RGB image within the rectangle to a YUV model image, find the pixel coordinates that satisfy the marker's YUV thresholds and the depth threshold, and convert them into a binary gray-scale image; if no marker color is found, return to step (1).
(4) Apply morphological processing to the binary image: first erode to remove noise, then dilate to restore the image, and finally apply a closing operation to refine the image edges.
(5) Obtain the fingertip coordinates: find the largest connected region in the binary image, compute its center, and convert it into Kinect 3D coordinates, i.e. the fingertip coordinates; the inter-finger distance is the distance between fingertip coordinates. The inter-finger distance and the hand coordinates from the skeleton data are sent to the remote end.
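Steps (3)–(5) above can be sketched as follows. This is a simplified, hypothetical illustration: the YUV conversion uses BT.601 coefficients, the morphology is a single 3×3 erosion implemented with array shifts, and the centroid of all surviving marker pixels stands in for the center of the largest connected region (adequate when a single marker is present).

```python
import numpy as np

def rgb_to_yuv(img):
    """BT.601 RGB -> YUV for a float image in [0, 1]."""
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    y = 0.299 * r + 0.587 * g + 0.114 * b
    u = -0.147 * r - 0.289 * g + 0.436 * b
    v = 0.615 * r - 0.515 * g - 0.100 * b
    return np.stack([y, u, v], axis=-1)

def marker_mask(img, depth, yuv_lo, yuv_hi, depth_max):
    """Binary mask of pixels inside the marker's YUV threshold box
    and within the hand's depth range."""
    yuv = rgb_to_yuv(img)
    in_color = np.all((yuv >= yuv_lo) & (yuv <= yuv_hi), axis=-1)
    return in_color & (depth < depth_max)

def erode(mask):
    """One 3x3 erosion step (noise removal) done with array shifts;
    borders wrap around, which is acceptable for a sketch."""
    m = mask.copy()
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            m &= np.roll(np.roll(mask, dy, axis=0), dx, axis=1)
    return m

def fingertip_pixel(mask):
    """Centroid (row, col) of the remaining marker pixels, or None."""
    ys, xs = np.nonzero(mask)
    if len(ys) == 0:
        return None
    return ys.mean(), xs.mean()
```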
In action mapping step S2, the position information is mapped to manipulator operation instructions.
Action mapping further processes the collected data to extract semantics: according to the current position information and its correlation with a series of previously acquired positions, the operational semantic of the current position—static, move, grab or release—is determined. When the hand moves, the Euclidean distance between the current and previous palm-center coordinates deviates significantly; when the hand is static, noise still causes this distance to fluctuate, but within a small range. By setting a threshold, the hand is judged to be moving when the distance exceeds the threshold and static when it does not. Grab and release are handled in the same way, with a finger-distance threshold distinguishing the two behaviors. The semantics are then converted into corresponding manipulator operation instructions; preferably, multiple groups of move, static, grab and release actions are recorded to determine the thresholds, after which the mapping algorithm converts the semantics into instructions sent to the manipulator. The mapping algorithm is:
(1) Initialize the manipulator: send initialization commands to the mechanical arm and the gripper so that both perform their initialization operations;
(2) obtain the inter-finger distance, process it further, and convert it into the gripper's grasp-width parameter;
(3) obtain the current coordinates of the mechanical arm and judge whether this is the first mapping; if so, record the current action data as the initial position data, have the manipulator perform the specified initial-position operation for calibration, and return to step (2); if not, continue with step (4);
(4) judge whether the last operation has completed, i.e. whether the current arm coordinates equal the target coordinates of the last executed operation; if not, return to step (2); if so, perform step (5);
(5) judge whether a grab or release is requested; if so, perform the grab/release operation (only the gripper acts, the mechanical arm does not) and return to step (2); if not, judge whether the remote hand is static (the turning points of the hand motion, which are static, are what is sent to the mechanical arm); if not, return to step (2); if so, perform step (6);
(6) coordinate conversion: using the initial position set in step (3), obtain the coordinates of the current position relative to the initial position and, combined with the configured initial coordinates of the mechanical arm, convert them into coordinates in the arm's coordinate system; perform the move operation, then return to step (2).
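The semantic-extraction rule described above (a palm-displacement threshold separating move from static, and a finger-distance threshold separating grab from release) might be sketched as follows. The threshold values are illustrative assumptions; the patent derives them from recorded action samples.

```python
import numpy as np

# Assumed thresholds for illustration; in the patent they are determined
# from multiple recorded groups of move/static and grab/release actions.
MOVE_THRESH = 0.02   # metres of palm travel between frames
GRAB_THRESH = 0.05   # inter-finger distance of a closed hand, metres

def classify(palm_prev, palm_now, finger_dist, was_grabbing):
    """Map consecutive palm positions and finger spacing to one of the
    four semantics: 'move', 'static', 'grab', 'release'."""
    if finger_dist < GRAB_THRESH and not was_grabbing:
        return "grab"
    if finger_dist >= GRAB_THRESH and was_grabbing:
        return "release"
    disp = np.linalg.norm(np.asarray(palm_now) - np.asarray(palm_prev))
    return "move" if disp > MOVE_THRESH else "static"
```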
In pressure feedback step S3, the pressure data from the sensors on the manipulator is received, and the vibration intensity of the vibrators on the human hand is controlled according to it.
Pressure feedback returns the force information sensed while the manipulator grasps to the operator, helping the operator remotely control the grasping action. Pressure sensors arranged at the fingertips of the gripper obtain the tactile information at the manipulator's end; this tactile information is fed back through vibration sensors worn on the operator's hand. The sensors are connected to two Arduino microcontrollers, which are connected to Bluetooth modules, and the data is transmitted over Bluetooth. The collected pressure data is divided into five intensity levels; the level data is sent to the receiving end, which sets the voltage of an Arduino analog port according to the level, driving the vibration sensors at different intensities and realizing the pressure feedback.
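The five-level quantisation and the analog-port setting could look like the following sketch. The ADC range, level boundaries and PWM mapping are assumptions for illustration, not values given in the patent.

```python
def pressure_level(raw, raw_max=1023, levels=5):
    """Quantise a raw ADC reading into one of `levels` intensity
    grades (0 = no vibration). 10-bit ADC range assumed."""
    raw = max(0, min(raw, raw_max))
    if raw == 0:
        return 0
    return min(levels, 1 + raw * levels // (raw_max + 1))

def level_to_pwm(level, levels=5, pwm_max=255):
    """Map an intensity grade to an Arduino analogWrite duty value."""
    return level * pwm_max // levels
```

Only the small level number needs to cross the Bluetooth link; the receiving end reconstructs the drive voltage locally.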
In visual feedback step S4, the image information of the manipulator workbench acquired by the two or more depth cameras arranged around the manipulator is received, and the real-time 3D operation scene is built.
Visual feedback returns the real-time 3D scene of the manipulator's motion to the operator to help observe and remotely control the manipulator. Two or more Kinects are arranged in different directions around the workbench and calibrated first: the markers in each Kinect's scene are identified to obtain their coordinates in that Kinect's coordinate system; from the relative pose of the Kinect and the marker, all points in the Kinect's coordinate system are converted into the marker's coordinate system, and then, using the markers' preset positions in the world coordinate system, into world coordinates. This fuses the point clouds of the multiple Kinects and constructs the real-time 3D scene. The ICP algorithm then iteratively computes a rotation-translation transformation matrix that minimizes the distance between homologous points of the different Kinects' point clouds, optimizing the constructed scene. To guarantee real-time feedback, the number of iterations is capped; when the cap is reached, the suboptimal result is returned. The raw Kinect data acquired at the manipulator end is transmitted directly to the remote end, which further processes it into point clouds; this avoids transmitting point clouds directly, whose larger data volume would impair the real-time performance of the visual feedback.
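The iteration-capped ICP refinement described above can be illustrated with the following sketch: point-to-point ICP with brute-force nearest neighbours and the Kabsch/SVD alignment. The marker-based pre-alignment is assumed to have already been applied, and `max_iters` plays the role of the patent's iteration limit.

```python
import numpy as np

def best_rigid_transform(P, Q):
    """Least-squares rotation R and translation t aligning matched
    Nx3 point sets P onto Q (Kabsch algorithm via SVD)."""
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:   # avoid reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = cq - R @ cp
    return R, t

def icp(src, dst, max_iters=10, tol=1e-6):
    """Point-to-point ICP with a capped iteration count, mirroring the
    patent's real-time iteration limit. Returns the accumulated
    rotation, translation, and the aligned copy of src."""
    P = src.copy()
    R_total, t_total = np.eye(3), np.zeros(3)
    prev_err = np.inf
    for _ in range(max_iters):
        # Brute-force nearest neighbour in dst for each point of P.
        d2 = ((P[:, None, :] - dst[None, :, :]) ** 2).sum(axis=2)
        Q = dst[d2.argmin(axis=1)]
        R, t = best_rigid_transform(P, Q)
        P = P @ R.T + t
        R_total, t_total = R @ R_total, R @ t_total + t
        # Mean NN distance before this iteration's alignment.
        err = np.sqrt(d2.min(axis=1)).mean()
        if abs(prev_err - err) < tol:
            break
        prev_err = err
    return R_total, t_total, P
```

A production system would use a k-d tree for the neighbour search; the O(N²) search here keeps the sketch self-contained.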
In use, the processor in the control unit 11 is configured to:
(1) receive the color image, depth image and skeleton data of the hand acquired by the Kinect; obtain the palm-center position from the skeleton data; identify the color markers at the fingertips according to the depth image, color image and palm-center position, obtaining fingertip position information including three-dimensional fingertip coordinates—thereby recognizing the positions of the fingertips and palm center during hand motion.
(2) classify the hand action into the four semantics of move, static, grab and release according to the continuously acquired palm-center and fingertip positions, map the semantic information extracted from the position information to manipulator operation instructions, and send the instructions to the gripper and the mechanical arm respectively.
(3) receive the pressure data from the pressure sensors, divide it into five levels, and control the vibration intensity of the vibrators according to the level data.
(4) build the 3D point clouds of the different directions from the depth and color images obtained by the Kinects; identify the markers arranged on the workbench from those images; calibrate with the markers and fuse the point clouds of the different directions into a 3D image model; and optimize the 3D image model with the ICP algorithm.
The specific operation is as described in the method steps above and is not repeated here.
The present invention acquires hand-motion data with a depth camera, converts the data into semantic information through the mapping algorithm, further converts it into corresponding manipulator operation instructions, and sends them to the remote manipulator; the remote manipulator, through the pressure sensors at its fingertips and the multiple Kinects arranged in different directions around it, sends the pressure data and the real-time 3D scene back to the operator in real time, thereby realizing perception-enhanced remote control.
Claims (7)
1. A perception-enhanced manipulator teleoperation method, characterized by comprising:
an action recognition step of identifying the positions of the fingertips and palm center of a human hand during motion;
an action mapping step of mapping the position information to manipulator operation instructions;
a pressure feedback step of receiving pressure data from pressure sensors on the manipulator and controlling the vibration intensity of vibrators on the human hand according to the pressure data;
a visual feedback step of receiving image information of the manipulator workbench acquired by two or more depth cameras arranged around the manipulator and building a real-time 3D operation scene.
2. The manipulator teleoperation method according to claim 1, characterized in that the step of identifying the positions of the fingertips and palm center of the human hand during motion comprises:
receiving the color image, depth image and skeleton data of the hand acquired by a Kinect;
obtaining the palm-center position from the skeleton data;
identifying the color markers arranged at the fingertips according to the depth image, color image and palm-center position, and obtaining fingertip position information including three-dimensional fingertip coordinates.
3. The manipulator teleoperation method according to claim 1, characterized in that the step of mapping the position information to manipulator operation instructions comprises:
extracting semantic information from the position information;
mapping the semantic information to manipulator operation instructions.
4. The manipulator teleoperation method according to claim 3, characterized in that the step of extracting semantic information comprises:
classifying the hand action into the four semantics of move, static, grab and release according to the continuously acquired position information.
5. The manipulator teleoperation method according to claim 1, characterized in that the step of receiving the image information of the manipulator workbench acquired by the two or more depth cameras arranged around the manipulator and building the real-time 3D operation scene comprises:
the depth cameras being Kinects;
building 3D point clouds of the different directions from the depth and color images obtained by the Kinects;
identifying the markers arranged on the manipulator workbench from the depth and color images obtained by the Kinects;
calibrating with the markers and fusing the point clouds of the different directions into a 3D image model;
optimizing the 3D image model with the ICP algorithm.
6. A perception-enhanced manipulator teleoperation system, characterized by comprising:
an action recognition unit including a Kinect for obtaining hand position information;
a visual feedback unit including two or more depth cameras for obtaining image information of the manipulator workbench;
a pressure feedback unit including pressure sensors for obtaining the manipulator's grip pressure and vibrators worn on the human hand;
a control unit including a processor communicatively connected with the Kinect, the depth cameras, the pressure sensors and the vibrators.
7. The manipulator teleoperation system according to claim 6, characterized in that the processor is configured to:
receive the color image, depth image and skeleton data of the hand acquired by the Kinect;
obtain the palm-center position from the skeleton data;
identify the color markers arranged at the fingertips according to the depth image, color image and palm-center position, and obtain fingertip position information including three-dimensional fingertip coordinates;
classify the hand action into the four semantics of move, static, grab and release according to the continuously acquired palm-center and fingertip positions, and map the semantic information extracted from the position information to manipulator operation instructions;
receive the pressure data from the pressure sensors and control the vibration intensity of the vibrators according to the pressure data;
receive the image information acquired by the depth cameras and build the real-time 3D operation scene.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710192822.8A CN107030692B (en) | 2017-03-28 | 2017-03-28 | Manipulator teleoperation method and system based on perception enhancement |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710192822.8A CN107030692B (en) | 2017-03-28 | 2017-03-28 | Manipulator teleoperation method and system based on perception enhancement |
Publications (2)
Publication Number | Publication Date |
---|---|
CN107030692A true CN107030692A (en) | 2017-08-11 |
CN107030692B CN107030692B (en) | 2020-01-07 |
Family
ID=59533742
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710192822.8A Active CN107030692B (en) | 2017-03-28 | 2017-03-28 | Manipulator teleoperation method and system based on perception enhancement |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107030692B (en) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102814814A (en) * | 2012-07-31 | 2012-12-12 | 华南理工大学 | Kinect-based man-machine interaction method for two-arm robot |
CN103302668A (en) * | 2013-05-22 | 2013-09-18 | 东南大学 | Kinect-based space teleoperation robot control system and method thereof |
CN104108097A (en) * | 2014-06-25 | 2014-10-22 | 陕西高华知本化工科技有限公司 | Feeding and discharging mechanical arm system based on gesture control |
CN104866097A (en) * | 2015-05-22 | 2015-08-26 | 厦门日辰科技有限公司 | Hand-held signal output apparatus and method for outputting signals from hand-held apparatus |
CN204740560U (en) * | 2015-05-22 | 2015-11-04 | 厦门日辰科技有限公司 | Handheld signal output device |
CN105877846A (en) * | 2016-03-30 | 2016-08-24 | 杨重骏 | Oral cavity diagnosis robot system and control method thereof |
2017-03-28: Application CN201710192822.8A filed in China; granted as CN107030692B (status: Active)
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107639620A (en) * | 2017-09-29 | 2018-01-30 | 西安交通大学 | Robot control method, somatosensory interaction device, and robot
CN107961078A (en) * | 2017-12-18 | 2018-04-27 | 微创(上海)医疗机器人有限公司 | Surgical robot system and surgical instruments thereof
CN107953338A (en) * | 2017-12-29 | 2018-04-24 | 深圳市越疆科技有限公司 | Method and apparatus for robot article sorting, and mechanical arm
CN108748139A (en) * | 2018-04-18 | 2018-11-06 | 四川文理学院 | Somatosensory-based robot control method and device
CN109483538A (en) * | 2018-11-16 | 2019-03-19 | 左志强 | VR motion projection robot system based on Kinect technology
CN110815258A (en) * | 2019-10-30 | 2020-02-21 | 华南理工大学 | Robot teleoperation system and method based on electromagnetic force feedback and augmented reality |
CN110853099A (en) * | 2019-11-19 | 2020-02-28 | 福州大学 | Man-machine interaction method and system based on double Kinect cameras |
CN110853099B (en) * | 2019-11-19 | 2023-04-14 | 福州大学 | Man-machine interaction method and system based on double Kinect cameras |
CN111160088A (en) * | 2019-11-22 | 2020-05-15 | 深圳壹账通智能科技有限公司 | VR (virtual reality) somatosensory data detection method and device, computer equipment and storage medium |
WO2021098147A1 (en) * | 2019-11-22 | 2021-05-27 | 深圳壹账通智能科技有限公司 | Vr motion sensing data detection method and apparatus, computer device, and storage medium |
CN115741701A (en) * | 2022-11-22 | 2023-03-07 | 上海智能制造功能平台有限公司 | Force-position hybrid robot trajectory and motion guidance system and method
Also Published As
Publication number | Publication date |
---|---|
CN107030692B (en) | 2020-01-07 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN107030692A (en) | Manipulator teleoperation method and system based on perception enhancement | |
CN110480634B (en) | Arm-guided motion control method for mechanical arm motion control | |
CN110405730B (en) | Human-computer interaction mechanical arm teaching system based on RGB-D image | |
CN104589356B (en) | Dexterous hand teleoperation control method based on Kinect human hand motion capture | |
JP5895569B2 (en) | Information processing apparatus, information processing method, and computer program | |
CN103135754B (en) | Method for realizing interaction using an interactive device | |
CN107662195A (en) | Master-slave heterogeneous manipulator teleoperation control system and control method with telepresence | |
CN108509026B (en) | Remote maintenance support system and method based on enhanced interaction mode | |
CN110728739B (en) | Virtual human control and interaction method based on video stream | |
CN109955254A (en) | Mobile robot control system and teleoperation control method for robot end pose | |
CN108828996A (en) | Mechanical arm remote control system and method based on visual information | |
CN113103230A (en) | Human-computer interaction system and method based on remote operation of treatment robot | |
CN107639620A (en) | Robot control method, somatosensory interaction device, and robot | |
CN104656893A (en) | Remote interaction control system and method for physical information space | |
US11422625B2 (en) | Proxy controller suit with optional dual range kinematics | |
CN106468917A (en) | Remote presentation and interaction method and system for touchable live real-time video images | |
JP6164319B2 (en) | Information processing apparatus, information processing method, and computer program | |
CN111179341B (en) | Registration method of augmented reality equipment and mobile robot | |
CN106502400A (en) | Virtual reality system and virtual reality system input method | |
CN109214295A (en) | Gesture recognition method based on data fusion of Kinect v2 and Leap Motion | |
JP7398227B2 (en) | Work support systems and programs | |
CN211180839U (en) | Motion teaching equipment and motion teaching system | |
CN112181135B (en) | 6-DOF visual and tactile interaction method based on augmented reality | |
CN112790760A (en) | Three-dimensional motion attitude capturing method, device, processing equipment and system | |
CN206178663U (en) | Gesture instruction determination device | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||