CN107639620A - Robot control method, somatosensory interaction device, and robot - Google Patents

Robot control method, somatosensory interaction device, and robot

Info

Publication number
CN107639620A
Authority
CN
China
Prior art keywords
robot
somatosensory interaction
information
interaction device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201710910785.XA
Other languages
Chinese (zh)
Inventor
郭文静
段楠
李述胜
袁铮
徐志瑞
赵宁馨儿
郭艳婕
尹昱东
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xian Jiaotong University
Original Assignee
Xian Jiaotong University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xian Jiaotong University
Priority to CN201710910785.XA
Publication of CN107639620A
Legal status: Pending


Abstract

The invention discloses a robot control method, a somatosensory interaction device, and a robot. The control method includes: the somatosensory interaction device collects the rotation angles of the limb joints of a target human body, computes on the collected angle information, and analyzes and recognizes the posture or action of the target body; the corresponding somatosensory interaction instruction is generated from the recognition result; the instruction is sent to the robot, and the robot executes it. Meanwhile, the robot collects information about its own surroundings, such as fingertip pressure information and surrounding image information, and feeds it back to the target body through the somatosensory interaction device. This approach not only goes beyond the traditional way of controlling a robot but also, through haptic feedback together with 3D image display, allows the user to obtain a largely immersive operating experience. The result is easier operation, a markedly lower operation error rate, and a more vivid operating experience for the user.

Description

Robot control method, somatosensory interaction device, and robot
【Technical field】
The present invention relates to the technical field of intelligent terminals and to the field of virtual reality, and in particular to a robot control method, a somatosensory interaction device, and a robot.
【Background technology】
With the emergence of virtual reality, augmented reality, haptic feedback, and intelligent robot technologies, people are encountering more and more entirely new visual and tactile experiences.
At present, however, robots are generally operated by remote control or by manual touch. This brings many inconveniences to the user, is not easy for beginners to master, and carries a high operation error rate, preventing the user from vividly experiencing the robot's world.
【The content of the invention】
The technical problem to be solved by the present invention is to provide a robot control method, a somatosensory interaction device, and a robot. The invention not only goes beyond the traditional way of controlling a robot but also, through haptic feedback together with 3D image display, allows the user to obtain an immersive operating experience to the greatest extent. The result is convenient operation, a markedly lower operation error rate, and a more vivid operating experience for the user.
To solve the above technical problem, one technical solution adopted by the present invention is:
A robot control method is provided, comprising the following steps:
Step 1: the somatosensory interaction device collects the limb joint angle information of the target human body and performs limb motion angle recognition on the target body according to the collected information;
Step 2: the somatosensory interaction device generates the corresponding somatosensory interaction instruction according to the recognized limb motion angles of the target body;
Step 3: the somatosensory interaction device sends the somatosensory interaction instruction to the robot, and the robot executes the instruction.
In Step 1, the collection of the limb joint motion angles and the limb motion angle recognition of the target body specifically include:
Step 1.1: collect the Euler angles of each limb joint of the target body;
Step 1.2: convert the Euler angles collected in Step 1.1 into rotation matrices, substitute the rotation matrices into the algorithm for resolving the relative human-body attitude, and compute the relative motion attitude angle of each human joint.
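The Euler-angle-to-rotation-matrix conversion of Step 1.2 can be sketched as follows; the patent does not name an Euler convention, so an intrinsic Z-Y-X (yaw-pitch-roll) composition is assumed here:

```python
import numpy as np

def euler_to_rotmat(roll, pitch, yaw):
    """Convert intrinsic Z-Y-X Euler angles (radians) to a 3x3 rotation matrix."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])   # yaw about z
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])   # pitch about y
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])   # roll about x
    return Rz @ Ry @ Rx

# Zero angles must give the identity rotation
assert np.allclose(euler_to_rotmat(0.0, 0.0, 0.0), np.eye(3))
```

Any other convention (e.g. Z-X-Y) would work the same way, as long as collection and resolution use it consistently.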
In Step 2, the somatosensory interaction instructions include execution of robot motion postures, adjustment of the robot's heading, adjustment of the robot's forward speed, switching of the robot's GPS autonomous driving mode, and collection of the robot's fingertip pressure data.
In Step 3, when the robot executes the somatosensory interaction instruction:
The robot obtains its fingertip pressure information in the current environment through the on-board fingertip pressure sensing unit and sends the pressure information to the somatosensory interaction device, so that the device converts it into graded vibration and feeds the vibration back to the target body;
The robot obtains first image data and second image data of the current environment through the on-board first camera and second camera respectively and sends both to the somatosensory interaction device, so that the device synthesizes them into a 3D image for display;
The robot obtains its real-time attitude information and power-supply information through the on-board GPS module, electronic compass, and real-time power monitoring device, and sends them to the somatosensory interaction device, so that the device integrates and displays them.
Data are transmitted between the somatosensory interaction device and the robot over a communication link.
A somatosensory interaction device includes a control unit, to which a limb joint angle collection unit, a haptic feedback unit, and a sending unit are connected;
Limb joint angle collection unit: collects the joint angle information of the target body;
Control unit: converts the joint angle information collected by the limb joint angle collection unit into somatosensory interaction instructions for robot motion, and converts the acquired fingertip pressure information of the robot into a signal that the haptic feedback unit can recognize;
Haptic feedback unit: converts the signal derived from the robot's fingertip pressure information into graded vibration positively correlated with the fingertip pressure, and feeds the graded vibration back to the target body;
Sending unit: sends the somatosensory interaction instructions generated by the control unit to the robot.
The somatosensory interaction device further includes a receiving unit and a display unit connected to the control unit;
Receiving unit: receives the information sent by the robot;
Display unit: displays the fingertip pressure information, the robot attitude information, and the robot power-supply information, and synthesizes the first image information and the second image information collected by the robot into a 3D image for display.
The display unit is a virtual-reality head-mounted display device.
A robot includes a control unit, to which a fingertip pressure sensing unit, a receiving unit, a sending unit, an image collection unit, an execution unit, and a monitoring unit are connected;
Receiving unit: receives the somatosensory interaction instructions sent by the somatosensory interaction device;
Fingertip pressure sensing unit: collects the robot's fingertip pressure information;
Image collection unit: collects the image data used to generate the 3D image;
Monitoring unit: collects the robot's own attitude information and power-supply information;
Control unit: transmits the information acquired by the fingertip pressure sensing unit, the image collection unit, and the monitoring unit to the somatosensory interaction device via the sending unit, and forwards the somatosensory interaction instructions received by the receiving unit to the execution unit;
Sending unit: sends the data collected by the robot to the somatosensory interaction device;
Execution unit: receives and executes the instructions from the control unit.
The image collection unit includes a first camera and a second camera, both connected to the control unit.
Compared with the prior art, the advantages of the invention are:
The robot control method of the invention collects the limb joint angle information of the target body through the somatosensory interaction device, performs limb motion angle recognition on the target body according to the collected information, generates the corresponding somatosensory interaction instruction from the recognition result, and sends the instruction to the robot, which executes it. The control method therefore goes beyond the traditional way of controlling a robot, makes the robot easy to operate, markedly reduces the operation error rate, and gives the user a more vivid operating experience.
Further, the invention introduces the concept of feedback into motion-sensing control: the robot's surroundings are collected by a set of sensors and fed back to the somatosensory interaction device, which delivers a 3D image and graded vibration to the target body. The control method of the invention thus lets the operator obtain an immersive robot-control experience through somatosensory interaction. It not only enriches existing ways of controlling a robot but also removes the need for additional control equipment, simplifies the hardware required by the robot control system while improving the user experience, reduces production cost, is simple to operate and easy to learn, and markedly lowers the operation error rate.
【Brief description of the drawings】
Fig. 1 is the conceptual model of the robot control method of the invention in operation;
Fig. 2 is a flow diagram of Embodiment 1 of the robot control method of the invention;
Fig. 3 is a flow diagram of another embodiment of the robot control method of the invention;
Fig. 4 is a structural diagram of the fingertip pressure collection and haptic feedback system of the robot of the invention;
Fig. 5 is a flow diagram of the fingertip pressure collection and haptic feedback system of the robot of the invention;
Fig. 6 is a flow diagram of the 3D visual feedback of the somatosensory interaction device of the invention;
Fig. 7 is an overall structural diagram of the somatosensory interaction device of the invention;
Fig. 8 is an overall structural diagram of the robot of the invention.
Reference numerals: 101, somatosensory interaction device; 102, robot; 401, robot; 402, smart glove.
【Embodiment】
The present invention is further described below with reference to the accompanying drawings and embodiments.
As shown in Fig. 1, the conceptual model of the robot control method of the invention in operation includes a somatosensory interaction device 101 and a robot 102, connected to each other wirelessly.
The somatosensory interaction device 101 and the robot 102 establish a one-to-one wireless connection by IP assignment, which also guarantees interchangeability and compatibility between like products.
To give the user an immersive operating experience and allow simpler, more accurate operation of the robot, the somatosensory interaction device 101 is coupled to the robot 102 over a communication link, and the robot is controlled by body motion, improving the user experience.
Specifically, the somatosensory interaction device 101 collects the limb joint angle information of the target body, analyzes and recognizes the limb motion angles of the target body, obtains the somatosensory interaction instruction corresponding to the recognition result, and sends the instruction to the robot 102;
The robot 102 receives and executes the somatosensory interaction instruction: through its fingertip pressure sensing unit, first camera and second camera, GPS module, electronic compass, and real-time power monitoring device, the robot 102 obtains the fingertip pressure data, the first image data and second image data, its own attitude data, and its power-supply information data respectively, and sends these data to the somatosensory interaction device 101;
The somatosensory interaction device 101 receives the fingertip pressure data, the first image data and second image data used to generate the 3D image of the surroundings, the robot's own attitude data, and the power-supply information data; it converts the fingertip pressure data into graded vibration fed back to the target body, synthesizes the first and second image data into a 3D image for display, and integrates and displays the robot's own attitude data and power-supply information.
To explain the above workflow more clearly, and further referring to Fig. 2 and Fig. 3, the robot control method of the invention comprises the following steps:
Step 201: collect the limb joint angle information of the target body through the somatosensory interaction control device worn by the operator, obtaining the Euler angles of each limb joint; during collection, electronic gyroscopes acquire the posture of the operator's four limbs;
Step 202: to realize somatosensory interaction operation, the somatosensory interaction device converts the collected Euler angles of the target body into rotation matrices and substitutes the rotation matrices into the algorithm for resolving the relative human-body attitude. The algorithm obtains each limb's attitude relative to the Earth from the sensors and represents it as a rotation matrix; pairwise matrix operations are then performed between the rotation matrices representing each limb's attitude, namely the matrix product of the inverse of one limb's rotation matrix with another limb's rotation matrix (for example, the inverse of limb B's rotation matrix with limb A's rotation matrix). The relative attitude between each pair of limbs is computed and converted into Euler-angle coordinates; the user's relative posture or action is determined from the Euler-angle coordinates, the computed data are converted back into Euler angles, and the angles to which the corresponding servos must lock are extracted;
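Under the same assumed Z-Y-X convention, the pairwise relative-attitude computation of Step 202 (the product of one limb's rotation-matrix inverse with another limb's rotation matrix, decomposed back into Euler angles) might look like this illustrative sketch:

```python
import numpy as np

def rotz(t):
    """Rotation by angle t (radians) about the vertical z axis."""
    c, s = np.cos(t), np.sin(t)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def relative_euler(R_ref, R_limb):
    """Attitude of one limb relative to another: R_rel = R_ref^-1 @ R_limb,
    decomposed back into Z-Y-X Euler angles for servo locking."""
    R = R_ref.T @ R_limb  # the inverse of a rotation matrix is its transpose
    yaw = np.arctan2(R[1, 0], R[0, 0])
    pitch = np.arcsin(np.clip(-R[2, 0], -1.0, 1.0))
    roll = np.arctan2(R[2, 1], R[2, 2])
    return yaw, pitch, roll

# Two limbs rotated about the vertical axis by 0.2 rad and 0.5 rad:
yaw, pitch, roll = relative_euler(rotz(0.2), rotz(0.5))
# the relative yaw is 0.3 rad; pitch and roll are zero
```

Using the transpose as the inverse is valid only because limb attitudes are proper rotation matrices.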
After the converted joint rotation angle data are obtained, it is also necessary to judge whether the angle data are valid, and erroneous data are rejected automatically to reduce the occurrence of accidental errors. For example, the user's own joint motion is limited, so the algorithm for resolving the relative human-body attitude described in Step 202 judges whether each datum lies within a reasonable range; if a datum is invalid, that collected sample is discarded immediately;
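A minimal sketch of the range check described above; the joint names and limits are hypothetical, since the patent states only that joint motion is limited:

```python
# Hypothetical per-joint limits in degrees; real limits depend on the joint
# and are not given in the patent.
JOINT_LIMITS = {"elbow": (0.0, 150.0), "knee": (0.0, 135.0)}

def is_valid_sample(joint, angle_deg):
    """Reject physiologically impossible readings before they reach the robot."""
    lo, hi = JOINT_LIMITS[joint]
    return lo <= angle_deg <= hi

samples = [("elbow", 42.0), ("elbow", 210.0), ("knee", 90.0)]
valid = [(j, a) for j, a in samples if is_valid_sample(j, a)]
# the 210-degree elbow reading is discarded as a glitch
```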
Step 203: obtain the somatosensory interaction instruction corresponding to the analysis and recognition result:
After obtaining the limb motion angle recognition result of the target body, the somatosensory interaction device generates the corresponding somatosensory interaction instruction according to the recognition result and the algorithm for resolving the relative human-body attitude described in Step 202;
To further improve the accuracy or compatibility of the obtained somatosensory interaction instructions, somatosensory control may also be combined with a remote controller to issue instructions: for example, the robot's fingers are controlled through somatosensory operation for complex manipulation, while a foot-operated remote controller controls the robot's forward speed;
Step 204: send the somatosensory interaction instruction generated in Step 203 to the robot, so that the robot executes it.
For example, if the instruction currently sent by the somatosensory interaction device is to change the heading, the robot, upon receiving the instruction, changes its current heading and adjusts to the direction the instruction requires. If the current instruction is an acceleration instruction, the robot, upon receiving it, accelerates from its current speed.
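As an illustration only (the instruction names and state fields below are invented for this sketch, not taken from the patent), executing heading and speed instructions on the robot side could look like:

```python
def apply_instruction(state, instruction):
    """Apply one somatosensory interaction instruction to a simple robot state."""
    if instruction["type"] == "set_heading":
        state["heading_deg"] = instruction["value"] % 360   # change heading
    elif instruction["type"] == "accelerate":
        state["speed"] = state["speed"] + instruction["value"]  # speed up
    elif instruction["type"] == "enable_autonav":
        state["autonav"] = True                              # GPS auto mode
    return state

state = {"heading_deg": 0, "speed": 1.0, "autonav": False}
state = apply_instruction(state, {"type": "set_heading", "value": 90})
state = apply_instruction(state, {"type": "accelerate", "value": 0.5})
# state is now: heading 90 degrees, speed 1.5, autonav still off
```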
In another embodiment, when the robot's destination is known, a somatosensory interaction instruction switches the robot into autonomous navigation mode so that it finds its way to the destination automatically. For example, the somatosensory interaction device is connected to a robot with a built-in navigation module; after the device and the robot enter the working state, the device recognizes a preset limb posture defined as the instruction for the robot's autonomous navigation mode, the destination is set through touch keys, and upon receiving the autonomous navigation instruction the robot enters autonomous navigation mode and finds its path automatically;
In another embodiment, to give the user the best immersive operating experience, visual feedback and haptic feedback let the user perceive the robot's surroundings through their own vision and touch to the greatest extent; accurate fingertip pressure feedback also makes the user's operation of the robot more precise, realizing an immersive operating experience.
Referring further to Fig. 3, the robot control method of the invention comprises the following steps:
Step 301: collect the joint angle information of the target body through the somatosensory interaction control device, and analyze and recognize the limb motion angles of the target body;
Step 302: the somatosensory interaction device generates the somatosensory interaction instruction corresponding to the recognition result;
Step 303: the somatosensory interaction device sends the somatosensory interaction instruction to the robot, and the robot executes it; after this step, the method further comprises the following steps:
Step 304: the somatosensory interaction device receives the fingertip pressure data, the first image data and second image data, and the robot's own characteristic information returned by the robot after executing the somatosensory interaction instruction. The fingertip pressure data are obtained by the robot's fingertip pressure sensing unit; the first image data by the robot's first camera; the second image data by the robot's second camera; and the robot's own characteristic information by the robot's on-board GPS module, electronic compass, and real-time power monitoring device;
Specifically, after receiving the somatosensory interaction instruction sent by the somatosensory interaction device, the robot collects its fingertip pressure data through the fingertip pressure sensing unit on the robot platform, photographs its surroundings through the first camera and second camera on the platform, collects its own position, attitude, and power-supply information data through the on-board GPS module, electronic compass, and real-time power monitoring module, and returns these data to the somatosensory interaction device through a wireless module;
The robot's acquisition of the fingertip pressure data through the fingertip pressure sensing unit, of the first and second image data through the first and second cameras, and of its own position, attitude, and power-supply information data through the GPS module, electronic compass, and real-time power monitoring module is synchronized in real time with the return of these data to the somatosensory interaction device through the wireless module;
Correspondingly, the somatosensory interaction device receives the fingertip pressure data, the first image data and second image data, and the robot's own position, attitude, and power-supply information data;
Step 305: the somatosensory interaction device converts the fingertip pressure data into graded vibration, synthesizes the first image data and second image data into a 3D image and displays it, and displays the robot's own characteristic information (including the robot's real-time position, real-time heading, spatial attitude, and battery information) in a consolidated view.
To reproduce the robot's tactile sense more faithfully and further provide the user with a more realistic immersive operating experience, the somatosensory interaction device converts the robot's fingertip pressure data, by algorithm, into a vibration frequency positively correlated with the fingertip pressure. The algorithm applies conditional statements and preset thresholds to the digital pressure information collected by the sensor, converts the pressure information into a positively correlated vibration frequency, and reproduces it through the vibration device on the somatosensory interaction device, stimulating the user's tactile nerves. The first image data and second image data are fused three-dimensionally into a 3D image and displayed through VR 3D glasses; the robot's own position, attitude, and power-supply information are rendered graphically on the interface of the VR glasses.
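The threshold-based pressure-to-vibration mapping described above can be sketched as follows; the pressure thresholds and frequencies are hypothetical, since the patent states only that the mapping is graded and positively correlated:

```python
# Hypothetical bands: (upper pressure threshold in newtons, vibration Hz).
# The patent specifies only a positively correlated, graded mapping.
BANDS = [(0.5, 0), (2.0, 60), (5.0, 120), (float("inf"), 200)]

def pressure_to_vibration_hz(pressure_n):
    """Map a fingertip pressure reading to a graded vibration frequency
    via successive threshold comparisons (conditional statements)."""
    for threshold, freq in BANDS:
        if pressure_n < threshold:
            return freq
    return BANDS[-1][1]

assert pressure_to_vibration_hz(0.1) == 0     # below contact threshold
assert pressure_to_vibration_hz(3.0) == 120   # firm grip
assert pressure_to_vibration_hz(9.0) == 200   # saturated
```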
The somatosensory interaction device uses a smart glove to produce the vibration that stimulates the target body's sense of touch, and uses VR glasses capable of displaying 3D image data and carrying a graphical interface to show the synthesized real-time 3D image, with floating interface frames displaying the robot's real-time position, attitude, and power-supply information data.
This implementation not only offers the user a simple and convenient way to operate the robot; with both visual feedback and haptic feedback present, it engages the user's senses to the greatest extent and gives the user an immersive operating experience. It can even be extended to hearing: a sound sensor on the robot platform collects audio data and returns it to the somatosensory interaction device, engaging the user's hearing and further improving the immersive operating experience.
Unlike the prior art, the somatosensory interaction device of this embodiment collects the rotation angles of the limb joints of the target body, computes on the collected angle information using rotation matrices, and analyzes and recognizes the posture or action of the target body; it obtains the somatosensory interaction instruction corresponding to the recognition result and sends the instruction to the robot, so that the robot correctly executes it. This way of controlling the robot not only goes beyond the traditional control methods but also, through haptic feedback together with 3D image display, largely gives the user an immersive operating experience: easier operation, a markedly lower operation error rate, and a more vivid operating experience.
In addition, the fingertip pressure data returned by the robot determine the vibration frequency fed back to the user's fingertips, so the user can vividly perceive the pressure on the robot's fingers through their own sense of touch and thus control the force of the robot's fingers more accurately. Meanwhile, the 3D image display lets the user intuitively see the world through the robot's eyes, heightening the sense of reality of immersive operation and further improving the user experience.
Referring to Fig. 4, the robot's fingertips are fitted with pressure sensors that accurately detect the fingertip pressure when the robot touches an object;
The haptic feedback unit of the somatosensory interaction device is mounted on the fingertips of the smart glove 402 shown in Fig. 4. The robot's fingertip pressure signal is returned to the somatosensory interaction device, which, after computation, feeds back to the user through the haptic feedback unit a vibration whose frequency is positively correlated with the fingertip pressure signal.
Referring to Fig. 5, the fingertip pressure collection and haptic feedback workflow of the invention comprises the following steps:
Step 501: when the robot touches a foreign object, the pressure sensors deployed on the manipulator fingertips collect the pressure information;
Step 502: the robot sends the fingertip pressure information to the somatosensory interaction device over the communication link;
Step 503: the somatosensory interaction device processes the fingertip pressure information collected by the robot, converts it into vibration information whose frequency is positively correlated with the pressure signal, and sends the vibration information to the haptic feedback unit of the somatosensory interaction device;
Step 504: the vibrators at the fingertips of the somatosensory interaction device worn by the user execute the haptic feedback vibration command, realizing mechanical tactile feedback.
When the user issues a grasp command to the robot through a somatosensory interaction instruction, the robot's grasping force is converted into vibration returned to the user via haptic feedback; from the fingertip vibration frequency, the user can perceive the magnitude of the robot's grasping force, producing a vivid, immersive operating experience.
Fig. 6 is a flow diagram of the 3D visual feedback of the somatosensory interaction device of the invention. The 3D visual feedback function requires the somatosensory interaction device and the robot to work together; the embodiment comprises the following steps:
Step 601: after executing the somatosensory interaction instruction, the robot collects the first image data and second image data through the first camera and second camera mounted on the robot platform, and returns them to the somatosensory interaction device in real time;
Step 602: the somatosensory interaction device synthesizes the first image data and second image data into a 3D image through stereoscopic image processing;
Step 603: the somatosensory interaction device transfers the processed 3D image to the VR glasses in real time, so that the user can vividly observe the robot's surroundings and enjoy a vivid visual experience.
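One simple way to realize Step 602 is to compose the two camera frames into a side-by-side stereo frame for the VR glasses; this sketch assumes equal-height frames and omits the rectification a real stereo pipeline would need:

```python
import numpy as np

def compose_sbs(left, right):
    """Compose two camera frames into a side-by-side stereo frame, the
    simplest layout a VR headset can present as a 3D image."""
    if left.shape[0] != right.shape[0]:
        raise ValueError("frames must share the same height")
    return np.concatenate([left, right], axis=1)

left = np.zeros((480, 640, 3), dtype=np.uint8)   # first camera frame
right = np.ones((480, 640, 3), dtype=np.uint8)   # second camera frame
sbs = compose_sbs(left, right)
# sbs.shape == (480, 1280, 3)
```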
Referring to Fig. 7, which is a structural diagram of the somatosensory interaction device of the invention: a limb joint angle collection unit, a haptic feedback unit, a sending unit, a receiving unit, and a display unit are connected to the control unit of the somatosensory interaction device;
Limb joint angle collection unit: collects the joint angle information of the target body;
Control unit: converts the joint angle information collected by the limb joint angle collection unit into somatosensory interaction instructions for robot motion, and converts the acquired fingertip pressure information of the robot into a signal that the haptic feedback unit can recognize;
Haptic feedback unit: converts the signal derived from the robot's fingertip pressure information into graded vibration positively correlated with the fingertip pressure, and feeds the graded vibration back to the target body;
Sending unit: sends the somatosensory interaction instructions generated by the control unit to the robot.
Receiving unit: receives the information sent by the robot;
Display unit: displays the fingertip pressure information, the robot attitude information, and the robot power-supply information, and synthesizes the first image information and the second image information collected by the robot into a 3D image for display.
The sending unit uses a communication link, so that the somatosensory interaction device is more flexible to use and more convenient for the user; the display unit is a virtual-reality head-mounted display device;
To realize somatosensory interaction control, the limb joint angle collection unit monitors and collects the user's limb joints in real time, mainly collecting the Euler angles of each limb joint of the target body (the three angle quantities: roll, yaw, and pitch). The Euler angles are substituted into the algorithm for resolving the relative human-body attitude described in Step 202: the Euler angles are converted into rotation matrices, the collected limb angle signals of the target body are computed on using rotation matrices, and the relative attitude angle of each human joint is calculated, finally yielding the somatosensory interaction instruction for the corresponding posture or action.
When user controls the robot to perform the order of crawl object using body feeling interaction instruction, robot crawl The dynamics of object will be converted into vibration by touch feedback and return to user, and user can perceive institute according to finger tip vibration frequency The size of robot crawl dynamics is stated, produces a kind of vivid immersion operating experience;Meanwhile after body feeling interaction device will be handled 3D rendering information be transferred in real time on VR glasses.Allow the user to it is vivid observe the robot context, Give the vivid visual experience of user.
Referring to Fig. 8, Fig. 8 is a schematic diagram of the overall structure of the robot of the present invention. The robot of the present invention includes a control unit, and a fingertip pressure sensing unit, a receiving unit, a transmitting unit, an image acquisition unit, an execution unit and a monitoring unit connected to the control unit;
Receiving unit: for receiving the body feeling interaction instruction sent by the body feeling interaction device;
Fingertip pressure sensing unit: for collecting fingertip pressure information of the robot;
Image acquisition unit: for collecting the image data used to generate the 3D image;
Monitoring unit: for collecting the robot's own attitude information and power-supply information;
Control unit: for transmitting the information acquired by the fingertip pressure sensing unit, the image acquisition unit and the monitoring unit to the body feeling interaction device via the transmitting unit, and for sending the body feeling interaction instruction received by the receiving unit to the execution unit;
Transmitting unit: for sending the data collected by the robot to the body feeling interaction device;
Execution unit: for receiving and executing the instructions of the control unit.
The image acquisition unit includes a first camera and a second camera, both connected to the control unit.
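Synthesizing the two camera streams into one 3D frame for the head-mounted display is, in the simplest case, a side-by-side stereo composition. The sketch below assumes equal-sized frames from the first (left) and second (right) cameras and the side-by-side format commonly accepted by VR displays; it is an illustration, not the patent's synthesis method.

```python
import numpy as np

def compose_sbs_frame(left_img, right_img):
    """Combine the first-camera (left) and second-camera (right) frames into a
    single side-by-side stereo image for a VR head-mounted display."""
    if left_img.shape != right_img.shape:
        raise ValueError("both cameras must deliver frames of the same size")
    return np.hstack([left_img, right_img])
```

The headset then presents the left half to the left eye and the right half to the right eye, and the horizontal disparity between the two camera views is what produces the depth impression.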
In addition, the body feeling interaction device determines the vibration frequency fed back to the user's fingertips from the fingertip pressure data returned by the robot, so the user can vividly perceive the pressure on the robot's fingers through his or her own sense of touch and can therefore control the force of the robot's fingers more accurately. Meanwhile, the 3D image display lets the user intuitively see the world through the robot's eyes, enhancing the sense of reality of the immersive operation and further improving the user experience, allowing the robot to become the user's true out-of-body avatar.
The foregoing is merely an embodiment of the present invention and is not intended to limit the scope of the invention. Any equivalent structural transformation made using the contents of the description and drawings of the present invention, whether applied directly or indirectly in other related technical fields, is likewise included within the patent protection scope of the present invention.

Claims (10)

1. A body feeling interaction device, characterized by comprising a control unit, and a human-limb joint-angle acquisition unit, a haptic feedback unit and a transmitting unit connected to the control unit;
Human-limb joint-angle acquisition unit: for collecting joint-angle information of a target body;
Control unit: for converting the target-body joint-angle information collected by the human-limb joint-angle acquisition unit into a body feeling interaction instruction for robot motion, and for converting acquired fingertip pressure information of the robot into a signal recognizable by the haptic feedback unit;
Haptic feedback unit: for converting the signal derived by the control unit from the robot's fingertip pressure information into graded vibration positively correlated with the fingertip pressure information, and feeding the graded vibration back to the target body;
Transmitting unit: for sending the body feeling interaction instruction generated by the control unit to the robot.
2. The body feeling interaction device according to claim 1, characterized in that the body feeling interaction device further comprises a receiving unit and a display unit connected to the control unit;
Receiving unit: for receiving information sent by the robot;
Display unit: for displaying the fingertip pressure information, the robot attitude information and the robot power-supply information, and for synthesizing the first image information and the second image information collected by the robot into a 3D image for display.
3. The body feeling interaction device according to claim 2, characterized in that the display unit is a virtual-reality head-mounted display device.
4. A robot adapted to the body feeling interaction device according to any one of claims 1-3, characterized by comprising a control unit, and a fingertip pressure sensing unit, a receiving unit, a transmitting unit, an image acquisition unit, an execution unit and a monitoring unit connected to the control unit;
Receiving unit: for receiving the body feeling interaction instruction sent by the body feeling interaction device;
Fingertip pressure sensing unit: for collecting fingertip pressure information of the robot;
Image acquisition unit: for collecting the image data used to generate the 3D image;
Monitoring unit: for collecting the robot's own attitude information and power-supply information;
Control unit: for transmitting the information acquired by the fingertip pressure sensing unit, the image acquisition unit and the monitoring unit to the body feeling interaction device via the transmitting unit, and for sending the body feeling interaction instruction received by the receiving unit to the execution unit;
Transmitting unit: for sending the data collected by the robot to the body feeling interaction device;
Execution unit: for receiving and executing the instructions of the control unit.
5. The robot according to claim 4, characterized in that the image acquisition unit includes a first camera and a second camera, both connected to the control unit.
6. A control method of a robot, characterized by comprising the following steps:
Step 1: the body feeling interaction device collects limb-joint angle information of a target body, and performs limb-motion angle recognition on the target body according to the collected limb-joint angle information;
Step 2: the body feeling interaction device generates the corresponding body feeling interaction instruction according to the recognized limb-motion angle result of the target body;
Step 3: the body feeling interaction device sends the body feeling interaction instruction to the robot, and the robot executes the body feeling interaction instruction;
wherein the body feeling interaction device is the body feeling interaction device according to any one of claims 1-3.
7. The control method of a robot according to claim 6, characterized in that in step 1, the step of the body feeling interaction device collecting the limb-joint movement angles of the target body and performing limb-motion angle recognition on the target body according to the collected limb-joint angle information specifically includes:
Step 1.1: collecting the Euler angles of each limb joint of the target body;
Step 1.2: converting the Euler angles collected in step 1.1 into rotation matrices, substituting the rotation matrices into the algorithm for resolving human-body relative attitude, and calculating the relative motion attitude angle of each joint of the human body.
8. The control method of a robot according to claim 6, characterized in that in step 2, the body feeling interaction instructions include execution of a robot motion posture, adjustment of the robot's direction of advance, adjustment of the robot's pace, switching of the robot's GPS automatic-driving mode, and collection of the robot's fingertip pressure data.
9. The control method of a robot according to claim 6, characterized in that in step 3, when the robot executes the body feeling interaction instruction:
the robot obtains the fingertip pressure information under the current environment through the fingertip pressure sensing unit on the robot, and sends the fingertip pressure information to the body feeling interaction device, so that the body feeling interaction device converts the fingertip pressure information into graded vibration and feeds the vibration back to the target body;
the robot obtains first image data and second image data under the current environment through the first camera and the second camera on the robot respectively, and sends the first image data and the second image data to the body feeling interaction device, so that the body feeling interaction device synthesizes the first image data and the second image data into a 3D image for display;
the robot obtains its real-time attitude information and power-supply information through the GPS module, the electronic compass and the power-supply real-time monitoring device on the robot, and sends the robot attitude information and power-supply information to the body feeling interaction device, so that the body feeling interaction device integrates the robot attitude information and power-supply information for display.
10. The control method of a robot according to claim 6, characterized in that data are transmitted between the body feeling interaction device and the robot via wireless communication.
CN201710910785.XA 2017-09-29 2017-09-29 A kind of control method of robot, body feeling interaction device and robot Pending CN107639620A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710910785.XA CN107639620A (en) 2017-09-29 2017-09-29 A kind of control method of robot, body feeling interaction device and robot


Publications (1)

Publication Number Publication Date
CN107639620A true CN107639620A (en) 2018-01-30

Family

ID=61123029

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710910785.XA Pending CN107639620A (en) 2017-09-29 2017-09-29 A kind of control method of robot, body feeling interaction device and robot

Country Status (1)

Country Link
CN (1) CN107639620A (en)


Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013140257A1 (en) * 2012-03-20 2013-09-26 Alexopoulos Llias Methods and systems for a gesture-controlled lottery terminal
KR20140051554A (en) * 2012-10-23 2014-05-02 이인택 Motion capture system for using ahrs
CN103955215A (en) * 2014-04-15 2014-07-30 桂林电子科技大学 Automatic obstacle avoidance trolley based on gesture recognition and control device and method of automatic obstacle avoidance trolley
CN104589356A (en) * 2014-11-27 2015-05-06 北京工业大学 Dexterous hand teleoperation control method based on Kinect human hand motion capturing
CN104771175A (en) * 2015-03-04 2015-07-15 上海交通大学 Wearable smart ring capable of capturing three-dimensional postures of four limbs of human body
WO2015188268A1 (en) * 2014-06-08 2015-12-17 Hsien-Hsiang Chiu Gestural interface with virtual control layers
CN107030692A (en) * 2017-03-28 2017-08-11 浙江大学 One kind is based on the enhanced manipulator teleoperation method of perception and system
CN107203192A (en) * 2017-06-28 2017-09-26 上海应用技术大学 A kind of mobile robot automatic ride control system based on electronic map


Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108527381A (en) * 2018-04-09 2018-09-14 上海方立数码科技有限公司 A kind of service humanoid robot based on gesture recognition
CN109003666A (en) * 2018-06-21 2018-12-14 珠海金山网络游戏科技有限公司 Long-range remote sensing based on motion capture is accompanied and attended to the methods, devices and systems of robot
CN108772839A (en) * 2018-06-25 2018-11-09 中国人民解放军第二军医大学 Master-slave operation and human-machine system
CN108772839B (en) * 2018-06-25 2021-07-20 中国人民解放军第二军医大学 Master-slave operation and man-machine integrated system
CN110209264A (en) * 2019-03-28 2019-09-06 钟炜凯 A kind of behavioral data processing system and method
CN110209264B (en) * 2019-03-28 2022-07-05 钟炜凯 Behavior data processing system and method
CN109941436A (en) * 2019-04-09 2019-06-28 国网黑龙江省电力有限公司电力科学研究院 It is a kind of can livewire work maintenance feeder line fault unmanned plane
CN111880643A (en) * 2019-06-26 2020-11-03 广州凡拓数字创意科技股份有限公司 Navigation method and device
CN110328648A (en) * 2019-08-06 2019-10-15 米召礼 A kind of man-machine working machine moved synchronously
CN111438673A (en) * 2020-03-24 2020-07-24 西安交通大学 High-altitude operation teleoperation method and system based on stereoscopic vision and gesture control
CN111687847A (en) * 2020-07-09 2020-09-22 深圳市多够机器人技术有限公司 Remote control device and control interaction mode of foot type robot
CN111687847B (en) * 2020-07-09 2024-02-02 广东鹏行智能有限公司 Remote control device and control interaction mode of foot robot


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20180130