CN111409068A - Bionic manipulator control system and bionic manipulator - Google Patents
- Publication number: CN111409068A
- Application number: CN202010176478.5A
- Authority
- CN
- China
- Prior art keywords
- bionic manipulator
- manipulator
- control
- gesture
- bionic
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1602—Programme controls characterised by the control system, structure, architecture
- B25J9/161—Hardware, e.g. neural networks, fuzzy logic, interfaces, processor
- B25J13/00—Controls for manipulators
- B25J13/08—Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
Abstract
The invention discloses a bionic manipulator control system and a bionic manipulator, relating to the field of automatic control. The system comprises: a first sensor for detecting and collecting gesture information; a gesture recognition system upper computer for receiving the gesture information and converting it into a control driving signal for the bionic manipulator; a lower computer controller connected to the upper computer over an IPv6 network, which receives and forwards the control driving signal, encrypting and correspondingly decrypting it via the IPv6 protocol during transmission; and a manipulator controller which receives the control driving signal and drives the motors of the bionic manipulator accordingly. The IPv6 link improves the security of the transmission network, and because IPv6 uses smaller routing tables, the routing-table length in each router is reduced, raising the transmission efficiency of the control driving signal so that the bionic manipulator can be controlled in real time.
Description
Technical Field
The invention relates to the field of automatic control, in particular to a bionic manipulator control system and a bionic manipulator.
Background
At present, dangerous work in harsh environments still requires human operators: for example, digging out lives and property trapped under collapsed buildings after an earthquake, bomb disposal in dangerous battlefield environments, or extravehicular equipment maintenance in space. For reasons of both safety and efficiency, bionic manipulators are therefore finding ever wider application: they protect the life and property of the operator while efficiently and flexibly completing the necessary operations in hazardous environments.
However, most existing bionic manipulators transmit control signals over the IPv4 protocol. Transmission efficiency is low, so the control signals cannot drive the manipulator in real time, which may cause great loss of life and property; at the same time the transmission network is insecure, and control signals altered by interference may cause erroneous operations. A bionic manipulator control system that improves both the transmission efficiency and the security of the control driving signal is therefore needed.
Disclosure of Invention
The present invention is directed to solving at least one of the problems of the prior art. The invention therefore provides a bionic manipulator control system that improves the transmission efficiency and security of the bionic manipulator's control driving signal.
In a first aspect, an embodiment of the present invention provides: a bionic manipulator control system for controlling a bionic manipulator, comprising:
A first sensor: used for detecting and collecting gesture information;
A gesture recognition system upper computer: used for receiving the gesture information and converting the gesture information into a control driving signal of the bionic manipulator;
A lower computer controller: used for connecting with the gesture recognition system upper computer through an IPv6 network, receiving and forwarding the control driving signal, and encrypting and correspondingly decrypting the control driving signal via the IPv6 protocol when the control driving signal is transmitted;
A manipulator controller: used for receiving the control driving signal and driving the motors of the bionic manipulator according to the control driving signal, so as to control the bionic manipulator.
Further, the first sensor is a Leap Motion sensor, used to recognize gesture motions in the detection area and to collect the gesture information they contain; the gesture information includes position information and motion information.
Further, converting the gesture information into the control driving signal of the bionic manipulator specifically includes:
Obtaining the positional relation between each fingertip and the palm center of the bionic manipulator according to the mapping relation between the gesture information and the bionic manipulator;
Calculating the driving signal corresponding to each finger servo from that positional relation, the driving signal being a pulse width modulation (PWM) signal.
Further, connecting the gesture recognition system upper computer and the lower computer controller over the IPv6 network specifically includes: establishing a connection between a client on the upper computer and a server on the lower computer controller through socket programming; once the connection is successfully established, the control driving signal is transmitted using the TCP protocol.
Further, the system includes an image acquisition device connected to the manipulator controller, which captures action pictures of the bionic manipulator in real time and transmits them back to the gesture recognition system upper computer.
Further, the first sensor uses a right-handed Cartesian coordinate system whose origin is at the center of the first sensor, with the x-axis and z-axis in the sensor's horizontal plane and the y-axis perpendicular to that plane.
Further, the bionic manipulator includes a supporting platform at its base and a vertical support connected to the platform; the vertical support contains five mechanical-finger servos, each connected to one of five mechanical fingers.
Further, the lower computer controller is connected to the manipulator controller through a serial-port module, and the manipulator controller uses the control driving signal to set the rotation angle of each finger servo, thereby controlling the mechanical fingers.
Further, the system also comprises a face recognition module: the gesture recognition system upper computer performs face recognition to obtain the operator's identity, and only after identity verification succeeds does it receive the operator's gesture information.
In a second aspect, an embodiment of the present invention provides: a bionic manipulator applied to the bionic manipulator control system according to the first aspect.
The embodiments of the invention have the following beneficial effects:
The bionic manipulator control system of the embodiment of the invention is used to control a bionic manipulator and comprises: a first sensor for detecting and collecting gesture information; a gesture recognition system upper computer for receiving the gesture information and converting it into a control driving signal for the bionic manipulator; a lower computer controller connected to the upper computer over an IPv6 network, which receives and forwards the control driving signal and encrypts and correspondingly decrypts it via the IPv6 protocol during transmission; and a manipulator controller that receives the control driving signal and drives the motors of the bionic manipulator accordingly. Connecting over the IPv6 protocol and encrypting and decrypting the transmitted control driving signal improves the security of the transmission network; at the same time, because IPv6 uses smaller routing tables, the routing-table length in each router is reduced, which speeds up packet forwarding: the transmission efficiency of the control driving signal is improved and the bionic manipulator can be controlled in real time.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure. It is to be understood that the drawings in the following description are merely exemplary of the disclosure, and that other drawings may be derived from those drawings by one of ordinary skill in the art without the exercise of inventive faculty. In the drawings:
Fig. 1 is a block diagram of a bionic manipulator control system according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of a coordinate system of a bionic manipulator control system according to an embodiment of the present disclosure;
Fig. 3 is a schematic diagram of the fingertip coordinate system and palm coordinate system of a bionic manipulator control system according to an embodiment of the present invention.
Detailed Description
To illustrate the embodiments of the present invention and the technical solutions in the prior art more clearly, the following description refers to the accompanying drawings. Clearly, the drawings described below are only some examples of the invention; a person skilled in the art can derive other drawings and embodiments from them without inventive effort.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. The terminology used in the description of the invention herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention.
The flow charts shown in the drawings are merely illustrative and do not necessarily include all of the contents and operations/steps, nor do they necessarily have to be performed in the order described. For example, some operations/steps may be decomposed, and some operations/steps may be combined or partially combined, so that the actual execution sequence may be changed according to the actual situation.
The first embodiment is as follows:
A bionic manipulator is a universal end-effector device similar in appearance and size to a human hand; it can grasp and manipulate objects like a human hand, offers a high degree of freedom within a limited space, and is applied in many fields. The embodiment of the invention therefore provides a control system for such a bionic manipulator.
Fig. 1 is a block diagram of a bionic manipulator control system according to an embodiment of the present invention, and as shown in fig. 1, the system is configured to control a bionic manipulator 100, and includes:
First sensor 200: for detecting and collecting gesture information.
For example, in this embodiment the first sensor 200 may be a Leap Motion sensor for recognizing gesture motions in the detection area and collecting the gesture information they contain; the gesture information includes position information and motion information.
The Leap Motion sensor is a somatosensory controller for PC and Mac. It is connected to the gesture recognition system upper computer 300 through USB (for example, a PC may serve as the upper computer) and creates a working space, i.e. a detection area, of about 4 cubic feet. Within this area the sensor locates the hand and acquires finger-position data using two built-in high-frame-rate cameras and an infrared filter; pictures captured from different angles are used to detect and track hand, finger and arm positions and gesture motions. The gesture motion in real-world three-dimensional space is reconstructed using binocular-vision techniques, and the gesture information it contains is collected and sent to the upper computer 300 in the form of data frames.
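To illustrate the data-frame hand-off just described, the sketch below models a Leap Motion data frame as a plain Python dictionary and extracts the palm-center and fingertip coordinates on the upper-computer side. The field names and sample values are assumptions for illustration only, not the vendor SDK's actual API.

```python
# Hedged sketch: a gesture data frame modeled as a dict (field names assumed),
# parsed into the palm-center and fingertip coordinates used later for control.

def parse_frame(frame):
    """Extract palm-center and fingertip coordinates (millimeters) from one frame."""
    palm = tuple(frame["palm_position"])                       # (x, y, z)
    tips = {name: tuple(pos) for name, pos in frame["fingertips"].items()}
    return palm, tips

# Illustrative sample frame (coordinates are made-up values in mm).
frame = {
    "palm_position": [0.0, 180.0, 10.0],
    "fingertips": {
        "thumb":  [-45.0, 190.0, -20.0],
        "index":  [-20.0, 230.0, -55.0],
        "middle": [  0.0, 240.0, -60.0],
        "ring":   [ 20.0, 235.0, -55.0],
        "pinky":  [ 40.0, 220.0, -40.0],
    },
}
palm, tips = parse_frame(frame)
```

In a real deployment the dictionary would be replaced by the frame object delivered by the sensor's SDK; only the parsing step is shown here.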
Further, in this embodiment the Leap Motion sensor can detect and track hands, fingers, and finger-like tools. The acquired gesture information includes position information, gesture information, and motion information: position information records the moving position of a finger, by which the finger can be located; gesture information covers actions such as making a fist, opening the palm, and drawing digits with a gesture; motion information covers the running state of the palm, arm, and other parts, such as waving and stretching.
The upper computer 300 of the gesture recognition system: for receiving the gesture information and converting the gesture information into a control driving signal of the bionic manipulator 100.
When the first sensor is a Leap Motion sensor, a controller program corresponding to the sensor is installed on the gesture recognition system upper computer 300; it receives the data frames containing gesture information and converts that information into the control driving signal of the bionic manipulator 100, so that the manipulator can be controlled in real time.
The lower computer controller 400: used to connect to the gesture recognition system upper computer 300 over an IPv6 network, to receive and forward the control driving signal, and to encrypt and correspondingly decrypt the control driving signal via the IPv6 protocol during transmission.
The manipulator controller 500: used to receive the control driving signal and drive the motors of the bionic manipulator according to it. For example, an STM32F407ZE-series chip may be selected as the manipulator controller 500.
At present most bionic manipulators transmit control signals over the IPv4 protocol. On the one hand, transmission efficiency is low, so the control signals cannot drive the manipulator in real time, possibly causing great loss of life and property; on the other hand, the transmission network is insecure, so control signals altered by interference may cause erroneous operations. IPv4 addresses are, moreover, close to exhaustion. In this embodiment an IPv6 network is therefore used to transmit the control driving signal, reducing transmission delay and strengthening transmission security.
The IPv6 network has a much larger address space, and IPv6 uses smaller routing tables than IPv4, so the routing-table length in each router is markedly reduced and the speed at which routers forward the control-driving-signal packets is increased. IPv6 also supports automatic address configuration, which makes network management, including that of a local area network, more convenient; if several bionic manipulators must work in one scene, the network can be managed centrally and deployed remotely at scale.
Further, the IPv6 network offers higher security: the control driving signal transmitted at the network layer can be encrypted according to the IPv6 protocol's encryption mechanism while the IP packets are simultaneously authenticated, strengthening the security of network data transmission. It will be understood that when the gesture recognition system upper computer 300 sends the encrypted control driving signal to the lower computer controller 400, the lower computer controller 400 decrypts it with the corresponding method.
This embodiment completes the transmission of the control driving data over an IPv6 network, allocates IPv6 addresses according to the transmission requirements, and selects suitable communication and connection modes to complete the network settings, including IPv6 QoS, IPv6 multicast, and IPv6 security deployment. Compared with existing network connections, low-delay real-time remote control is achieved: the network delay of both image data and control data is reduced and kept at the millisecond level.
Connecting the gesture recognition system upper computer 300 and the lower computer controller over the IPv6 network specifically comprises: establishing a connection between the client on the upper computer 300 and the server on the lower computer controller 400 through socket programming; after the connection is successfully established, the control driving signal is transmitted over the TCP protocol.
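The socket link just described can be sketched as follows. The upper computer acts as TCP client and the lower computer as TCP server; here both run in one process over the IPv6 loopback address for demonstration, and the port number and payload format are assumptions.

```python
# Minimal sketch of the IPv6 TCP link: lower-computer server, upper-computer client.
import socket
import threading

PORT = 50007  # assumed port

def lower_computer_server(ready, result):
    """Server side (lower computer controller): accept one connection, read the signal."""
    srv = socket.socket(socket.AF_INET6, socket.SOCK_STREAM)
    srv.bind(("::1", PORT))            # IPv6 loopback for this demo
    srv.listen(1)
    ready.set()                        # signal that the server is listening
    conn, _ = srv.accept()
    result.append(conn.recv(1024))     # receive the control driving signal
    conn.close()
    srv.close()

def upper_computer_send(payload):
    """Client side (gesture recognition upper computer): connect and send."""
    cli = socket.socket(socket.AF_INET6, socket.SOCK_STREAM)
    cli.connect(("::1", PORT))
    cli.sendall(payload)
    cli.close()

ready, result = threading.Event(), []
t = threading.Thread(target=lower_computer_server, args=(ready, result))
t.start()
ready.wait()
upper_computer_send(b"P1=1660;P2=840")  # example PWM values encoded as bytes (assumed format)
t.join()
```

In the deployed system the client would connect to the Raspberry Pi's global IPv6 address instead of `::1`, and the payload would carry the five PWM values for the finger servos.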
A Raspberry Pi single-board computer may be selected as the lower computer controller 400. In actual use, the Raspberry Pi's network IP address and port are entered into the gesture recognition system upper computer 300 to establish the IPv6 connection, and communication then runs over the IPv6 network. The lower computer controller 400 and the manipulator controller 500 can be connected through a USB serial port; both can also be embedded into the bionic manipulator 100 to simplify the structure, so that the manipulator and its lower computer controller can be moved together to the required working position, which is convenient for practical use.
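The serial hop from the lower computer controller to the manipulator controller might look like the sketch below. The frame layout (header byte, servo channel, 16-bit pulse width, checksum) is a hypothetical protocol chosen for illustration; the actual STM32 firmware defines its own format, so the pyserial write is left commented out.

```python
# Hedged sketch: pack one servo command for the USB-serial link.
# Frame layout (assumed): 0xAA | channel | pulse width high byte | low byte | checksum.

def build_servo_frame(channel, pwm):
    """Build one command frame for a finger servo; checksum is the byte sum mod 256."""
    if not 1 <= channel <= 5:
        raise ValueError("five finger servos: channels 1-5")
    body = bytes([0xAA, channel, (pwm >> 8) & 0xFF, pwm & 0xFF])
    checksum = sum(body) & 0xFF
    return body + bytes([checksum])

frame = build_servo_frame(1, 1660)   # thumb servo, 1660 us pulse width

# On the Raspberry Pi the frame would then go out over the USB serial port:
# import serial                                  # pyserial
# port = serial.Serial("/dev/ttyUSB0", 115200)   # device path and baud rate assumed
# port.write(frame)
```

Keeping frame construction separate from the write makes the protocol testable without the hardware attached.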
In addition, the system further comprises an image acquisition device 600 connected to the manipulator controller 500. It captures action pictures of the bionic manipulator in real time and transmits them back through the lower computer controller 400 to the gesture recognition system upper computer 300 for display, so that the operator can remotely monitor the surroundings in real time and adjust the gesture motion in time. The image acquisition device 600 may be a high-definition camera.
To increase operational safety, a face recognition module 700 is also included: the gesture recognition system upper computer 300 performs face recognition to obtain the operator's identity, and only receives the operator's gesture information after identity verification succeeds.
In a specific embodiment, the gesture recognition system upper computer 300 can effectively control the bionic manipulator 100 only after analyzing and converting the acquired data frames. The process of converting the gesture information into the control driving signal specifically comprises:
According to the mapping relation between the gesture information and the bionic manipulator, the positional relation between each fingertip and the palm center is obtained using an inverse spatial solution; then the driving signal corresponding to each finger servo is calculated from that positional relation, the driving signal being a pulse width modulation (PWM) signal.
Fig. 2 is a schematic diagram of the coordinate system of this embodiment. The Leap Motion sensor uses a right-handed Cartesian coordinate system whose origin is at the center of the sensor, with the x-axis and z-axis in the sensor's horizontal plane and the y-axis perpendicular to that plane.
Each data frame contains the data lists used for control, e.g. Hands, Fingers, Tools, and Gestures data, with each gesture reported as beginning, in progress, or ended.
One possible structure of the bionic manipulator 100 in this embodiment is as follows: the manipulator comprises a supporting platform at the base; a vertical support is bolted to the platform and contains five mechanical-finger servos, each connected to one of five mechanical fingers. The vertical support can be understood as the arm; mechanical fingers with three degrees of freedom may be chosen, which guarantees operational flexibility and reproduces the operator's hand motions well. Metal connecting rods in the middle of the vertical support link the fingers to the corresponding servos; the manipulator controller 500 drives the five servos of the bionic manipulator 100, and by controlling the servos' rotation angles the connecting rods are driven and the motion control is completed.
In the Leap Motion model, the physical features of fingers and tools are abstracted as endpoint objects, i.e. fingers and tools are treated as a class of endpoint objects, and a fingertip coordinate system (Tip Position) and a palm coordinate system (Palm Position) are established. In the Leap Motion coordinate system, the palm-center coordinates are measured in millimeters.
When the operator makes gesture motions above the Leap Motion sensor, the coordinates of the palm center relative to the sensor origin are returned by the Palm Position() method and recorded as palm.x, palm.y, palm.z. The fingertip coordinates of the five fingers are then returned by the Tip Position() method and recorded as the vectors index, thumb, middle, ring, and pinky, representing the fingertips of the index finger, thumb, middle finger, ring finger, and little finger, respectively.
Taking the thumb as an example, thumb.x, thumb.y, and thumb.z are the thumb's position data in the coordinate system acquired by the Leap Motion sensor. The distance between each of the five fingertips and the palm center (i.e. the positional relation) is then calculated according to the mapping relation between the gesture information and the bionic manipulator. For the thumb:
X1 = sqrt((thumb.x - palm.x)^2 + (thumb.y - palm.y)^2 + (thumb.z - palm.z)^2)
The distances of the other four fingers from the palm center are obtained in the same way and recorded as X2 (index finger), X3 (middle finger), X4 (ring finger), and X5 (little finger).
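The fingertip-to-palm distance above is a Euclidean norm in the sensor's millimeter coordinate system. A minimal sketch, with illustrative coordinates:

```python
# Euclidean fingertip-to-palm distance X_i in the sensor coordinate system.
import math

def fingertip_distance(tip, palm):
    """Return the distance from one fingertip to the palm center (both (x, y, z) in mm)."""
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(tip, palm)))

palm  = (0.0, 180.0, 10.0)        # palm.x, palm.y, palm.z (illustrative)
thumb = (-45.0, 190.0, -20.0)     # thumb.x, thumb.y, thumb.z (illustrative)
x1 = fingertip_distance(thumb, palm)   # X1 for the thumb  → 55.0
```

The same function applied to the index, middle, ring, and little fingertips yields X2 through X5.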
After the fingertip-to-palm distances of the five fingers are calculated, the driving signal of each finger servo is computed from the positional relation; the driving signal is a pulse width modulation signal. For example, the thumb, index finger, middle finger, ring finger, and little finger of the bionic manipulator are controlled by the first, second, third, fourth, and fifth servos respectively; the PWM signal data are recorded as P1, P2, P3, P4, P5 and are calculated according to the structural parameters of the actual bionic manipulator.
The following is a calculation process of one embodiment.
P1 = 2500 - (X1 - 30) * 28
P2 = (X2 - 30) * 28
P3 = (X3 - 30) * 28
P4 = (X4 - 30) * 28
P5 = 2500 - (X5 - 30) * 28
In the above embodiment, the servos are controlled by pulses, so the values above are the waveform data sent to each servo: for a servo with a 180° rotation range, the calculated P value determines the pulse width of the control waveform. The constants 28 and 30 are a set of data obtained by a specific experiment; it will be understood that they may be adjusted in practice, and this embodiment does not specifically limit them.
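The five mapping formulas above can be written directly in code. Clamping the result to a servo pulse range is an added assumption (the patent leaves bounds unstated), kept as parameters alongside the experimentally determined constants 30 and 28:

```python
# The five PWM mapping formulas; clamping bounds lo/hi are an assumption.

def pwm_signals(x, offset=30, gain=28, lo=500, hi=2500):
    """Map distances (X1..X5) to pulse widths [P1..P5] per the formulas above."""
    x1, x2, x3, x4, x5 = x
    p = [
        2500 - (x1 - offset) * gain,   # P1: thumb (reversed servo direction)
        (x2 - offset) * gain,          # P2: index finger
        (x3 - offset) * gain,          # P3: middle finger
        (x4 - offset) * gain,          # P4: ring finger
        2500 - (x5 - offset) * gain,   # P5: little finger (reversed servo direction)
    ]
    return [min(hi, max(lo, v)) for v in p]

# Illustrative distances in mm → pulse widths.
pwms = pwm_signals((55.0, 60.0, 65.0, 60.0, 55.0))   # → [1800, 840, 980, 840, 1800]
```

Each resulting value would then be sent to the corresponding finger servo as its control pulse width.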
The gesture recognition system upper computer 300 sends the computed driving signals to the lower computer controller 400 to control the bionic manipulator 100 in real time.
The bionic manipulator control system of this embodiment encrypts and correspondingly decrypts the control driving signal via the IPv6 protocol; the manipulator controller receives the control driving signal and drives the motors of the bionic manipulator accordingly. The security of the transmission network is increased, and because IPv6 uses smaller routing tables, the routing-table length in each router is reduced; the speed at which routers forward data packets, i.e. the transmission efficiency of the control driving signal, is therefore improved, and the bionic manipulator can be controlled in real time.
Example two:
The present embodiment provides a bionic manipulator applied to the bionic manipulator control system according to any one of the above embodiments.
The above embodiments are intended only to illustrate the technical solutions of the present invention, not to limit them. Although the invention has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that the technical solutions described may still be modified, and some or all of the technical features may be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the present invention and should be construed as falling within the following claims and description.
Claims (10)
1. A bionic manipulator control system is used for controlling a bionic manipulator and is characterized by comprising:
A first sensor: used for detecting and collecting gesture information;
A gesture recognition system upper computer: used for receiving the gesture information and converting the gesture information into a control driving signal of the bionic manipulator;
A lower computer controller: used for connecting with the gesture recognition system upper computer through an IPv6 network, receiving and forwarding the control driving signal, and encrypting and correspondingly decrypting the control driving signal via the IPv6 protocol when the control driving signal is transmitted;
A manipulator controller: used for receiving the control driving signal and driving the motors of the bionic manipulator according to the control driving signal, so as to control the bionic manipulator.
2. The bionic manipulator control system according to claim 1, wherein the first sensor is a Leap Motion sensor used to recognize a gesture motion in a detection area and collect the gesture information contained in the gesture motion, the gesture information comprising: position information and motion information.
3. The bionic manipulator control system according to claim 1, wherein converting the gesture information into the control driving signal of the bionic manipulator specifically comprises:
Obtaining the positional relation between each fingertip and the palm center of the bionic manipulator according to the mapping relation between the gesture information and the bionic manipulator;
Calculating the driving signal corresponding to each finger servo from the positional relation, the driving signal being a pulse width modulation signal.
4. The bionic manipulator control system according to claim 1, wherein the connection between the gesture recognition system upper computer and the lower computer controller over an IPv6 network specifically comprises: establishing a connection between the client on the gesture recognition system upper computer and the server on the lower computer controller through socket programming, and, once the connection is successfully established, transmitting the control driving signal using the Transmission Control Protocol (TCP).
5. The bionic manipulator control system according to claim 1, further comprising an image acquisition device connected to the manipulator controller and configured to capture action pictures of the bionic manipulator in real time and transmit them back to the gesture recognition system upper computer.
6. The bionic manipulator control system according to claim 2, wherein the coordinate system of the first sensor is a right-handed Cartesian coordinate system whose origin lies at the centre of the first sensor, with the x-axis and z-axis lying in the horizontal plane of the first sensor and the y-axis perpendicular to that plane.
7. The bionic manipulator control system according to any one of claims 1 to 6, wherein the bionic manipulator comprises a support platform arranged at its bottom end, a vertical support connected to the support platform, and five mechanical-finger steering engines mounted on the vertical support, each respectively connected to one of five mechanical fingers.
8. The bionic manipulator control system according to claim 7, wherein the lower computer controller is connected to the manipulator controller through a serial-port module, and the manipulator controller uses the control driving signal to set the rotation angle of each finger steering engine, thereby realizing control over the mechanical fingers.
9. The bionic manipulator control system according to claim 7, further comprising a face recognition module, configured to perform face recognition to establish the identity of the operator; only after identity verification succeeds does the gesture recognition system upper computer accept the operator's gesture information.
10. A bionic manipulator, to which the bionic manipulator control system according to any one of claims 1 to 9 is applied.
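The transport described in claim 4 (a client/server socket connection over an IPv6 network, with the control driving signal carried by TCP once the connection is established) can be sketched as follows. This is a minimal illustration, not the patented implementation: the loopback address `::1` and port `9000` are placeholder values, and the IPv6 protocol encryption layer mentioned in claim 1 is omitted.

```python
import socket

# Placeholder endpoint for the lower-computer controller's server
# (the patent does not specify an address or port).
SERVER_ADDR = ("::1", 9000)

def send_drive_signal(payload: bytes) -> None:
    """Client side (gesture recognition system upper computer):
    connect over IPv6 and send one control driving signal via TCP."""
    with socket.socket(socket.AF_INET6, socket.SOCK_STREAM) as client:
        client.connect(SERVER_ADDR)
        client.sendall(payload)

def run_server() -> bytes:
    """Server side (lower computer controller):
    accept one connection and receive the signal."""
    with socket.socket(socket.AF_INET6, socket.SOCK_STREAM) as server:
        server.bind(SERVER_ADDR)
        server.listen(1)
        conn, _ = server.accept()
        with conn:
            return conn.recv(1024)
```

In a real deployment the server loop would run continuously on the lower computer controller and forward each received signal to the manipulator controller over the serial-port module of claim 8.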
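The per-finger calculation of claims 3 and 6 — measuring each fingertip's position relative to the palm centre in the sensor's right-handed Cartesian frame, then deriving a PWM driving signal for that finger's steering engine — can be illustrated with a hypothetical linear mapping. The distance and pulse-width ranges below are assumptions for the sketch; the patent gives no numeric calibration.

```python
import math

# Hypothetical calibration (assumed, not from the patent):
DIST_MIN, DIST_MAX = 30.0, 120.0    # fingertip-to-palm-centre distance, mm (fist .. open hand)
PULSE_MIN, PULSE_MAX = 500, 2500    # steering-engine pulse width, microseconds

def fingertip_distance(tip, palm):
    """Euclidean distance between a fingertip and the palm centre, both given
    as (x, y, z) points in the sensor's right-handed Cartesian frame (claim 6)."""
    return math.dist(tip, palm)

def distance_to_pulse(distance_mm):
    """Map a fingertip-to-palm distance onto a PWM pulse width for one finger's
    steering engine (one driving signal per finger, as in claim 3)."""
    d = min(max(distance_mm, DIST_MIN), DIST_MAX)     # clamp to calibrated range
    ratio = (d - DIST_MIN) / (DIST_MAX - DIST_MIN)    # 0.0 = fully bent, 1.0 = extended
    return round(PULSE_MIN + ratio * (PULSE_MAX - PULSE_MIN))
```

Running this for all five fingers yields the five PWM signals that the manipulator controller of claim 8 would use to set the steering-engine rotation angles.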
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010176478.5A CN111409068A (en) | 2020-03-13 | 2020-03-13 | Bionic manipulator control system and bionic manipulator |
Publications (1)
Publication Number | Publication Date |
---|---|
CN111409068A true CN111409068A (en) | 2020-07-14 |
Family
ID=71487605
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010176478.5A Pending CN111409068A (en) | 2020-03-13 | 2020-03-13 | Bionic manipulator control system and bionic manipulator |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111409068A (en) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101079904A (en) * | 2007-06-15 | 2007-11-28 | 中兴通讯股份有限公司 | Implementation method for IPV6 supported by Java virtual machine |
US20090073983A1 (en) * | 2007-09-13 | 2009-03-19 | Jin-Hyoung Kim | METHOD AND APPARATUS FOR PROVIDING GATEWAY TO TRANSMIT IPv6 PACKET IN A WIRELESS LOCAL AREA NETWORK SYSTEM |
CN107688390A (en) * | 2017-08-28 | 2018-02-13 | 武汉大学 | A kind of gesture recognition controller based on body feeling interaction equipment |
CN207223989U (en) * | 2017-10-09 | 2018-04-13 | 兰州大学 | A kind of remote mechanical arm control system based on IPv6 networks |
CN207752446U (en) * | 2018-05-03 | 2018-08-21 | 林潼 | A kind of gesture identification interaction systems based on Leap Motion equipment |
Non-Patent Citations (2)
Title |
---|
FENG Tao, GUO Xian: "Wireless Sensor Networks", 30 June 2017, Xidian University Press *
National New Curriculum Teaching Strategy Research Group, Kashgar Uyghur Press, Xinjiang Juvenile Press *
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113746833A (en) * | 2021-09-02 | 2021-12-03 | 上海商汤智能科技有限公司 | Communication method and apparatus, electronic device, and storage medium |
CN114986499A (en) * | 2022-05-23 | 2022-09-02 | 兰州大学 | Mechanical arm motion control method, system and equipment and readable storage medium |
CN114986499B (en) * | 2022-05-23 | 2023-03-28 | 兰州大学 | Mechanical arm motion control method, system and equipment and readable storage medium |
CN115816456A (en) * | 2022-12-09 | 2023-03-21 | 上海清芸机器人有限公司 | System and method for controlling dexterous hand of humanoid robot based on raspberry pie |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Whitney et al. | ROS Reality: a virtual reality framework using consumer-grade hardware for ROS-enabled robots | |
CN111409068A (en) | Bionic manipulator control system and bionic manipulator | |
Krupke et al. | Comparison of multimodal heading and pointing gestures for co-located mixed reality human-robot interaction | |
CN108453742B (en) | Kinect-based robot man-machine interaction system and method | |
CN105291138B (en) | It is a kind of to strengthen the visual feedback platform of virtual reality immersion sense | |
CN109955254A (en) | The remote operating control method of Mobile Robot Control System and robot end's pose | |
CN103398702A (en) | Mobile-robot remote control apparatus and control technology | |
CN102814814A (en) | Kinect-based man-machine interaction method for two-arm robot | |
CN109079794B (en) | Robot control and teaching method based on human body posture following | |
CN103192387A (en) | Robot and control method thereof | |
CN107030692B (en) | Manipulator teleoperation method and system based on perception enhancement | |
CN115469576B (en) | Teleoperation system based on human-mechanical arm heterogeneous motion space hybrid mapping | |
CN108828996A (en) | A kind of the mechanical arm remote control system and method for view-based access control model information | |
CN109968310A (en) | A kind of mechanical arm interaction control method and system | |
CN107856014A (en) | Mechanical arm pose control method based on gesture recognition | |
CN110977981A (en) | Robot virtual reality synchronization system and synchronization method | |
Dwivedi et al. | Combining electromyography and fiducial marker based tracking for intuitive telemanipulation with a robot arm hand system | |
JP3742879B2 (en) | Robot arm / hand operation control method, robot arm / hand operation control system | |
CN108115671B (en) | Double-arm robot control method and system based on 3D vision sensor | |
CN110539315B (en) | Construction robot based on virtual reality control | |
Capolei et al. | Positioning the laparoscopic camera with industrial robot arm | |
Siegele et al. | Optimizing collaborative robotic workspaces in industry by applying mixed reality | |
CN116160440A (en) | Remote operation system of double-arm intelligent robot based on MR remote control | |
Yang et al. | Design of EOD Robot Based on Hybrid Control of Brain Wave and Remote Network | |
CN114851200A (en) | Method and system for controlling mechanical arm to assemble blade based on gesture recognition |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| RJ01 | Rejection of invention patent application after publication | Application publication date: 2020-07-14 |