CN106005086A - Leg-wheel composite robot based on Xtion equipment and gesture control method thereof - Google Patents


Info

Publication number
CN106005086A
CN106005086A (application CN201610389736.1A / CN201610389736A; publication CN 106005086 A)
Authority
CN
China
Prior art keywords
robot
servo
Xtion
gesture
leg
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201610389736.1A
Other languages
Chinese (zh)
Inventor
丁希仑
齐静
徐坤
彭赛金
杨帆
尹业成
郑羿
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beihang University
Original Assignee
Beihang University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beihang University filed Critical Beihang University
Priority to CN201610389736.1A
Publication of CN106005086A


Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B62LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
    • B62DMOTOR VEHICLES; TRAILERS
    • B62D57/00 Vehicles characterised by having other propulsion or other ground-engaging means than wheels or endless track, alone or in addition to wheels or endless track
    • B62D57/02 Vehicles characterised by having other propulsion or other ground-engaging means than wheels or endless track, alone or in addition to wheels or endless track with ground-engaging propulsion means, e.g. walking members
    • B62D57/032 Vehicles characterised by having other propulsion or other ground-engaging means than wheels or endless track, alone or in addition to wheels or endless track with ground-engaging propulsion means, e.g. walking members with alternately or sequentially lifted supporting base and legs; with alternately or sequentially lifted feet or skid
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures


Abstract

The invention discloses a leg-wheel composite robot based on an Xtion device and a gesture control method therefor. The robot comprises a robot body and six single-leg robot structures. A control system is mounted on the sensing layer, drive layer, control layer, and power layer inside the robot body. Gesture control is achieved by combining an Xtion PRO LIVE camera mounted on the sensing layer with a gesture recognition module and a motion control module mounted on the control layer. The gesture recognition module is divided into a dynamic gesture control submodule and a static gesture control submodule, realizing dynamic and static gesture control respectively. The advantages of the leg-wheel composite robot and its gesture control method are that more information is provided for subsequent image processing; map building and human-machine interaction functions can be realized; the naturalness of human-robot interaction is improved; the user can control the hexapod leg-wheel composite robot through gestures, facilitating its practical application; and the gesture recognition and control part of the robot is designed on the Robot Operating System (ROS), is highly portable, and can be applied to other robot control systems.

Description

A leg-wheel composite robot based on an Xtion device and a gesture control method therefor
Technical field
The invention belongs to the field of robotics, and specifically relates to a leg-wheel composite robot based on an Xtion device and a gesture control method and apparatus therefor.
Background technology
Robots have broad application prospects in both social life and industrial production. Industrial applications include planetary exploration and post-disaster rescue; applications in social life include assisting the elderly and the disabled. China is entering an aging society: the elderly need care, while working-age adults must work for a living and have little time to care for them. Robots can serve as a partial labor force, supplementing the shortage of labor, and have broad application prospects in caring for the elderly, children, and the infirm.
Legged robots can adapt to complex terrain but walk relatively slowly, while wheeled robots move quickly but can only travel on relatively flat ground and have poor obstacle-crossing ability. Combining the two locomotion modes, legs and wheels, offers new possibilities for robot motion; how to design a mechanism that combines wheels and legs has become a problem that currently needs to be solved.
Conventional robots typically require specific control instructions entered by trained operators before they can move; ordinary people without professional knowledge cannot operate them directly. This interaction mode constrains, to a certain extent, the wide application of robots.
Summary of the invention
In view of the above problems, the present invention provides a leg-wheel composite robot based on an Xtion device that enables natural human-robot interaction.
The leg-wheel composite robot based on an Xtion device of the present invention comprises a robot body and six single-leg robot structures, mounted uniformly around the circumference of the robot body. Each single-leg structure is a wheel-leg structure capable of both walking and wheeled travel, and has three leg segments and four drive servos. Let the three leg segments be the first, second, and third leg segments, and the four drive servos be the first, second, third, and fourth servos. One end of the first leg segment is fixedly mounted on the output shaft of the first servo, forming the hip joint; the axis of the first servo output shaft is perpendicular to the horizontal plane, and the first servo drives the first leg segment to swing laterally. The second servo is fixedly mounted at the other end of the first leg segment; its output shaft is fixed to one end of the second leg segment, forming the knee joint. The second servo output shaft axis is perpendicular to that of the first servo, and the second servo drives the second leg segment to swing longitudinally. The other end of the second leg segment is fixed to the output shaft of the third servo, forming the ankle joint; the third servo output shaft axis is parallel to that of the second servo, and the third servo drives the third leg segment to swing longitudinally. The fourth servo is located in the middle of the third leg segment; its output shaft axis is parallel to that of the third servo, and a wheel is coaxially fixed on the fourth servo output shaft by a spline, so the fourth servo drives the wheel to rotate. The other end of the third leg segment is provided with a foot-ground detection mechanism, which serves as the contact between the single-leg structure and the ground and can simultaneously detect the foot-ground contact state.
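The three joint axes described above (a vertical hip axis and two parallel horizontal knee/ankle axes) determine where the foot end sits for a given set of servo angles. A forward-kinematics sketch under assumed conventions: the patent gives no dimensions, so the segment lengths `l1`-`l3`, the function name, and the zero-angle pose (leg fully extended, horizontal) are all illustrative.

```python
import math

def leg_forward_kinematics(theta_hip, theta_knee, theta_ankle,
                           l1=0.05, l2=0.10, l3=0.12):
    """Foot position for one 3-DOF leg, body-fixed frame at the hip.

    theta_hip   : hip yaw (first servo, vertical axis, lateral swing)
    theta_knee  : knee pitch (second servo, horizontal axis)
    theta_ankle : ankle pitch (third servo, axis parallel to the knee)
    l1, l2, l3  : illustrative segment lengths in metres (not from the patent)
    """
    # Radial reach and height in the leg's vertical plane (two pitch joints)
    r = l1 + l2 * math.cos(theta_knee) + l3 * math.cos(theta_knee + theta_ankle)
    z = -(l2 * math.sin(theta_knee) + l3 * math.sin(theta_knee + theta_ankle))
    # Hip yaw rotates that plane about the vertical axis
    x = r * math.cos(theta_hip)
    y = r * math.sin(theta_hip)
    return x, y, z
```

With all angles zero the foot lies straight out at `l1 + l2 + l3` from the hip; gait generation would invert this map to place the foot at desired ground points.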
The interior of the robot body is divided into four levels by aluminum alloy plates with lightening holes: from top to bottom, a sensing layer, a drive layer, a control layer, and a power layer, used for mounting the control system. The sensing layer carries an Xtion PRO LIVE camera and an IMU. The Xtion PRO LIVE camera collects environmental information for simultaneous localization and mapping, and acquires information about people for human-machine interaction; the IMU obtains the robot's own attitude and assists localization and environment-map creation. Six servo driver boards are mounted on the drive layer, each controlling the servo motions of one of the six single-leg structures. A main control board is mounted on the control layer, realizing functions such as communication management, sensor data acquisition, data processing, and drive management. A gesture recognition module and a motion control module are also provided on the control layer to realize gesture control of the leg-wheel composite robot. A battery box holding the robot's supply batteries is mounted on the power layer.
The robot body communicates wirelessly with a remote control terminal, and the remote control terminal performs the control. The remote control terminal has a security authentication module, an audio/video playback module, an operation module, and an information display module. Each module is divided into a front end and a back end: the front end is the human-machine interaction interface, while the back end is responsible for network communication and data processing. The security authentication module obtains the user information entered by the user, packages it into a data packet via the back end, and sends it to the robot's main control board; the main control board feeds the login result back to the security authentication module for display. The audio/video playback module receives via the back end the sound returned by the Xtion PRO LIVE camera and the images it collects, and displays them in the module after back-end decoding. The information display module receives the six groups of robot sensing information returned by the IMU, such as attitude, joint angles, and joint torques, for displaying the hexapod robot's sensing information. The operation module consists of motion controls and can command robot motion; it can also set the robot's travel mode.
The gesture recognition method for the above leg-wheel composite robot based on an Xtion device installs a gesture recognition module and a motion control module on the control layer. The gesture recognition module includes a dynamic gesture recognition submodule and a static gesture recognition submodule, wherein:
The dynamic gesture recognition submodule realizes dynamic gesture recognition and has a steady detector, a circle detector, a push detector, and a swipe detector. The steady detector identifies whether the currently active hand is in a steady state; the circle detector identifies a circle-drawing motion of the active hand; the push detector identifies a forward pushing motion of the active hand; and the swipe detector identifies a swiping motion of the active hand.
The specific design of the above dynamic gesture recognition submodule is:
A. initialize the context using the Xtion SDK of the Xtion PRO LIVE camera;
B. establish the context environment, create an image generator, and set the generator's resolution and frame rate;
C. create, register, and initialize the session manager;
D. create and register the steady detector, circle detector, push detector, and swipe detector, add the listener of each detector to the session manager, and set the callback function of each detector.
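Steps C-D can be sketched with hypothetical stand-in classes. The real NITE API is C++ (session manager and detector classes) and differs in detail; this Python stub only mirrors the register-listener-and-set-callback pattern the steps describe, with all class and method names invented for illustration.

```python
class Detector:
    """Stand-in for one NITE gesture detector (steady/circle/push/swipe)."""
    def __init__(self, name):
        self.name = name
        self.callback = None

    def register_callback(self, fn):
        # Step D: set the detector's callback function
        self.callback = fn

    def fire(self):
        # Stands in for NITE's internal recognition event
        if self.callback:
            self.callback(self.name)

class SessionManager:
    """Stand-in for the session manager that owns the detector listeners."""
    def __init__(self):
        self.listeners = []

    def add_listener(self, detector):
        # Step D: add each detector's listener to the session manager
        self.listeners.append(detector)

def build_gesture_session(on_gesture):
    """Steps C-D: create the session manager, register the four detectors."""
    session = SessionManager()
    for name in ("steady", "circle", "push", "swipe"):
        det = Detector(name)
        det.register_callback(on_gesture)
        session.add_listener(det)
    return session
```

The point of the structure is that recognition results surface only through the callbacks, so the rest of the system never polls the detectors directly.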
With the dynamic gesture recognition module designed in the above manner, gesture recognition proceeds as follows: RGB image information is first collected through the Xtion PRO LIVE camera; the collected images are then processed based on the OpenNI software to extract the motion trajectory of the gesture, and the steady, circle, push, and swipe detectors recognize the hand-still, circle-drawing, pushing, and swiping gestures respectively. Each detector then outputs its recognition result in its respective callback function and sends the result to the motion control module. The motion control module, according to the recognition result and the mapping between the predefined dynamic gestures and robot motions in the dynamic gesture recognition module, controls the robot to complete the corresponding motion.
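The control path above relies on a predefined mapping from recognized dynamic gestures to robot motions, but the patent does not enumerate the pairs. A minimal sketch of such a lookup on the motion-control side, with assumed pairings:

```python
# Illustrative gesture-to-motion table; the actual pairings in the patent's
# predefined mapping are not disclosed, so these are assumptions.
GESTURE_TO_MOTION = {
    "steady": "stop",
    "circle": "turn",
    "push":   "advance",
    "swipe":  "retreat",
}

def dispatch(gesture):
    """Motion-control side: map a recognition result to a motion command."""
    motion = GESTURE_TO_MOTION.get(gesture)
    if motion is None:
        raise ValueError("unrecognized gesture: " + gesture)
    return motion
```

Keeping the table as data rather than branching logic makes it easy to re-bind gestures to motions without touching the recognition code.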
The static gesture recognition submodule recognizes static gestures based on color images. When performing gesture recognition through the static gesture recognition submodule, RGB image information is first collected through the Xtion PRO LIVE camera; this image information is delivered as messages on a topic in the robot operating system. Then, using the skin-color model prestored in the static gesture recognition submodule and HOG features, the face region is detected; according to human body structure features, the person's body region is then detected, and the hand region is segmented from the body region. Finally, image features of the segmented hand region are extracted for gesture training and recognition, and the recognition result is sent to the main control board, which, according to the result and the prestored mapping between static gestures and robot motions, controls the robot to complete the corresponding motion.
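The patent does not disclose the prestored skin-color model itself. A common illustrative choice is thresholding in YCrCb space; the conversion below is the standard RGB-to-YCrCb transform, but the Cr/Cb ranges are conventional textbook values, not the patent's.

```python
def is_skin_ycrcb(r, g, b):
    """Rough per-pixel skin test in YCrCb space (illustrative thresholds)."""
    cr = 128 + 0.5 * r - 0.418688 * g - 0.081312 * b
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5 * b
    return 133 <= cr <= 173 and 77 <= cb <= 127

def segment_hand(pixels):
    """Keep the coordinates of (x, y, r, g, b) pixels that pass the test.

    In the described pipeline this mask would be applied only inside the
    body region found via HOG/body-structure detection, which is why it
    also works for gloved hands there (a glove-colored model would be
    substituted); here we just filter a flat pixel list.
    """
    return [(x, y) for (x, y, r, g, b) in pixels if is_skin_ycrcb(r, g, b)]
```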
The advantages of the present invention are:
1. In the leg-wheel composite robot based on an Xtion device of the present invention, the Xtion PRO LIVE camera can obtain not only RGB and depth information but also the user's skeleton point information, providing more information for subsequent image processing and enabling map building and human-computer interaction functions;
2. The leg-wheel composite robot based on an Xtion device and its gesture control method are highly portable. The method is based on ROS; specifically, dynamic gesture recognition is designed and realized on the ROS and OpenNI frameworks, and static gesture recognition is based on ROS, so the method can be used in other robot control systems;
3. The leg-wheel composite robot and its gesture control method include both static and dynamic gesture control, and the user can select a suitable control mode as needed;
4. The leg-wheel composite robot and its gesture control method take the upper body of a person as the object of study and segment the gesture region based on human body structure, giving strong versatility. This segmentation method can segment not only a bare hand but also a gloved hand;
5. The leg-wheel composite robot and its gesture control method use gestures to control the robot, improving the naturalness of human-robot interaction and conforming to the development trend of human-computer interaction;
6. The leg-wheel composite robot based on an Xtion device and its gesture control method and apparatus apply static gesture recognition and OpenNI-based dynamic gesture recognition to a hexapod wheel-leg composite robot, enabling the user to control the robot through gestures and facilitating the practical application of hexapod wheel-leg composite robots.
Brief description of the drawings
Fig. 1 is a schematic diagram of the overall structure of the leg-wheel composite robot based on an Xtion device of the present invention;
Fig. 2 is a schematic diagram of a single-leg structure in the leg-wheel composite robot;
Fig. 3 is a schematic diagram of the mounting between a servo and a leg segment in the single-leg structure;
Fig. 4 is a schematic diagram of a single-leg structure when the leg-wheel composite robot stands;
Fig. 5 is a schematic diagram of a single-leg structure during wheeled travel;
Fig. 6 is a schematic diagram of the robot body structure;
Fig. 7 is a schematic diagram of the internal hierarchy of the robot body;
Fig. 8 is a schematic diagram of the positions of the charging interface and power switch;
Fig. 9 is a flow chart of the design method of the dynamic gesture recognition submodule;
Fig. 10 shows the gesture recognition method of the dynamic gesture recognition submodule;
Fig. 11 is a flow chart of the gesture recognition method of the static gesture recognition submodule.
In the figures:
1 - robot body; 2 - single-leg structure; 3 - servo;
4 - projecting shaft; 5 - bearing; 6 - wheel;
7 - manipulator connector; 8 - end effector; 9 - Xtion PRO LIVE camera;
10 - IMU; 11 - servo driver board; 12 - main control board;
13 - battery box; 14 - charging interface; 15 - power switch;
101 - shell; 102 - peripheral cover; 201 - first leg segment;
202 - second leg segment; 203 - third leg segment; 204 - first servo;
205 - second servo; 206 - third servo; 207 - fourth servo;
208 - foot-ground detection mechanism; 901 - base; 902 - inverted-U bracket
Detailed description of the invention
The present invention is described in detail below in conjunction with the accompanying drawings.
The leg-wheel composite robot based on an Xtion device of the present invention comprises a robot body 1 and six single-leg robot structures 2; as shown in Fig. 1, the six single-leg structures 2 are mounted uniformly around the circumference of the robot body 1.
To allow the foot end to move freely in space while a typical robot walks, a single-leg structure 2 needs at least 3 degrees of freedom. The more degrees of freedom a single leg has, the more flexible it is, but its design complexity, control difficulty, and mass all increase accordingly. In the present invention, each of the six single-leg structures 2 uses three servos to provide three degrees of freedom, realizing the basic walking function.
In the present invention, each single-leg structure is a wheel-leg structure capable of both walking and wheeled travel; as shown in Fig. 2, it has three leg segments and four drive servos. Let the three leg segments be the first leg segment 201, second leg segment 202, and third leg segment 203, and the four drive servos be the first servo 204, second servo 205, third servo 206, and fourth servo 207. One end of the first leg segment 201 is fixedly mounted on the output shaft of the first servo 204, forming the hip joint; the axis of the first servo 204 output shaft is perpendicular to the horizontal plane, and the first servo 204 drives the first leg segment 201 to swing laterally. The second servo 205 is fixedly mounted at the other end of the first leg segment 201; its output shaft is fixed to one end of the second leg segment 202, forming the knee joint. The second servo 205 output shaft axis is perpendicular to that of the first servo 204, and the second servo 205 drives the second leg segment 202 to swing longitudinally. The other end of the second leg segment 202 is fixed to the output shaft of the third servo 206, forming the ankle joint; the third servo 206 output shaft axis is parallel to that of the second servo 205, and the third servo 206 drives the third leg segment 203 to swing longitudinally.
The connection between a servo and the first leg segment 201, second leg segment 202, or third leg segment 203 is identical at the hip, knee, and ankle joints. As shown in Fig. 3, the servo 3 is mounted by screws and embedded in a locating groove designed on one side of the leg-segment end connection; a projecting shaft 4 is coaxially mounted on the servo output shaft, and the end of the projecting shaft 4 is connected to the other side of the leg-segment end through a bearing 5. With this connection the servo transfers power to the leg segment while reducing the number of bearings used, achieving the purpose of weight reduction.
The third leg segment 203 is designed to bend toward the robot body 1 at an angle of 140 degrees, placing the fourth servo 207 at the bend. The output shaft axis of the fourth servo 207 is parallel to that of the third servo 206, and the wheel 6 is coaxially fixed on the fourth servo 207 output shaft by a spline, so the fourth servo 207 drives the wheel 6 to rotate.
The other end of the third leg segment 203 is provided with a foot-ground detection mechanism 208, which serves as the contact between the single-leg structure 2 and the ground and can simultaneously detect the foot-ground contact state. The foot-ground detection mechanism 208 may adopt the foot-end contact detection mechanism for legged robots disclosed in the invention patent of publication No. CN104816766A; that mechanism is essentially a contact switch mounted at the foot end.
In the single-leg structure 2 described above, fixing the first servo 204 directly on the robot body 1 completes the mounting between the leg and the body. When the robot walks, the support configurations of the six single-leg structures 2 differ according to their joint positions; following bionic principles, the single-leg support configuration imitates that of a walking cockroach, as shown in Fig. 4. The first servo 204, second servo 205, and third servo 206 drive the hip, knee, and ankle joints respectively, adjusting the poses of the three joints so that the single-leg structure 2 swings back and forth or moves vertically, advancing according to different gaits. When the robot travels on wheels, the knee and ankle joints of at least three of the single-leg structures 2 are driven to change those legs' configuration so that their wheels 6 touch the ground, as shown in Fig. 5, realizing wheeled motion.
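The reconfiguration just described, folding the knee and ankle of at least three legs until the wheel lands, can be sketched as a mode switch over joint targets. All angle values and the choice of which legs carry the ground-contact wheels are illustrative assumptions, not figures from the patent.

```python
# Illustrative joint targets (degrees); the patent gives no numeric poses.
WALK_POSE  = {"knee": 45.0, "ankle": -90.0}   # standing, foot tip down
WHEEL_POSE = {"knee": 10.0, "ankle": -150.0}  # folded so the wheel lands

def set_travel_mode(legs, mode, wheel_legs=(0, 2, 4)):
    """Return new joint targets for the six legs.

    legs       : list of 6 dicts of current joint targets
    mode       : "walk" or "wheel"
    wheel_legs : indices of the (at least three) legs whose wheels land
    """
    if mode not in ("walk", "wheel"):
        raise ValueError(mode)
    out = []
    for i, leg in enumerate(legs):
        if mode == "wheel" and i in wheel_legs:
            out.append(dict(leg, **WHEEL_POSE))  # fold knee+ankle, wheel down
        else:
            out.append(dict(leg, **WALK_POSE))   # legged support configuration
    return out
```

Alternating legs 0, 2, 4 is chosen here only because a tripod of wheels is statically stable; the patent states the count ("at least three") but not the selection.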
The robot body 1 consists of a shell 101 and a peripheral cover 102, as shown in Fig. 6. Both are 3D printed from plastic, which is light, suitably hard, and quick to process, facilitating the shaping of the robot body 1. The peripheral cover 102 is mounted outside the shell 101, with mounting ports for the single-leg structures 2 reserved around the circumference; it protects the shell 101 and the control system inside it. The shell 101 is nearly cylindrical, and its interior is divided into four levels by aluminum alloy plates with lightening holes; as shown in Fig. 7, from top to bottom these are the sensing layer, drive layer, control layer, and power layer, used for mounting the control system.
The sensing layer carries the Xtion PRO LIVE camera 9 and an IMU (inertial measurement unit) 10. The Xtion PRO LIVE camera 9 is located outside the shell 101; its base 901 sits on the sensing layer and is fixed on an inverted-U bracket 902, which is fixed to the upper surface of the sensing layer. The IMU 10 is fixed on the sensing layer inside the inverted-U bracket 902, achieving a compact design and reducing the robot's volume. The Xtion PRO LIVE camera 9 serves two purposes: first, collecting environmental information to realize Simultaneous Localization and Mapping (SLAM); second, obtaining information about people for human-machine interaction. The Xtion PRO LIVE camera 9 contains two cameras, an RGB camera and a depth camera; the depth camera's working range is 0.8 m to 3.5 m. In addition, the microphones on both sides of the Xtion PRO LIVE camera 9 form a microphone array that can effectively capture sound in the environment and can be used for natural interaction modes such as voice control. The IMU 10 obtains the robot's own attitude and assists localization and environment-map creation. In the present invention, the IMU 10 is an MTi-300 AHRS, which outputs three-axis acceleration, three-axis angular rate, and three-axis inclination; it internally uses the breakthrough Xsens Estimation Engine (XEE) sensor fusion algorithm, overcoming to a certain extent the limitations of the Kalman filter, with performance close to an optical gyroscope.
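Since the depth camera is only effective between 0.8 m and 3.5 m, readings outside that window should be discarded before the depth map feeds SLAM or person detection. A trivial range filter under that stated specification (function and variable names are ours):

```python
DEPTH_MIN_M = 0.8   # Xtion PRO LIVE depth range given in the text
DEPTH_MAX_M = 3.5

def valid_depths(depths_m):
    """Drop no-return readings (0) and anything outside the working range."""
    return [d for d in depths_m if DEPTH_MIN_M <= d <= DEPTH_MAX_M]
```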
Six servo driver boards 11 are mounted on the drive layer, each controlling the servo motions of one of the six single-leg structures 2. The six servo driver boards 11 are arranged side by side on the upper surface of the drive layer, allowing parallel computation with good real-time performance.
A main control board 12 is mounted on the control layer. The main control board 12 is the robot's control center, realizing functions such as communication management, sensor data acquisition, data processing, and drive management. The main control board 12 is an MIO-2263 series embedded single-board computer, an industrial-grade board carrying an embedded Intel Celeron J1900 quad-core processor. The board uses the x86 architecture, is well compatible with various software and hardware, offers enough performance to meet heavy computation demands, and is developed using the ROS framework.
A battery box 13 is mounted on the power layer; it holds one 12 V lithium battery and one 7.4 V lithium battery, so the robot needs no external power supply and can operate independently, expanding its range of application.
A charging interface 14 and a power switch 15 are also designed on the control layer, as shown in Fig. 8. The two ends of the charging wire are connected to the charging interface 14 and to the two lithium batteries on the power layer respectively, so an external power supply can charge the batteries through the charging interface 14. The power switch 15 controls power-up and power-down of the servos in each single-leg structure 2 and of the main control board 12.
The present invention communicates wirelessly with a remote control terminal (a mobile phone or a tablet) in a WLAN wireless network environment, and the remote control terminal performs the control. The control terminal uses the Android system and is designed with a control module; the control module includes a security authentication module, an audio/video playback module, an operation module, and an information display module. Each module in the control terminal is divided into a front end and a back end. The front end is the human-machine interaction interface; the back end is responsible for network communication and data processing. The security authentication module obtains the user information entered by the user, packages it into a data packet via the back end, and sends it to the robot's main control board 12; the main control board feeds the login result back to the security authentication module for display. The audio/video playback module receives via the back end the sound returned by the Xtion PRO LIVE camera in the hexapod robot and the images the camera collects, and displays them in the module after back-end decoding. The information display module receives the six groups of robot sensing information returned by the IMU, such as attitude, joint angles, and joint torques, for displaying the hexapod robot's sensing information. The operation module consists of multiple controls and can command actions such as advancing, retreating, turning left, turning right, pausing, and stopping; it can also set the robot's travel mode, such as legged travel or wheeled travel.
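The security authentication flow above (back end packages user info into a packet, main control board returns a login result for display) might be sketched as follows. The JSON transport and all field names are assumptions; the patent does not specify the packet format.

```python
import json

def pack_login(username, password_hash):
    """Back-end side of the security authentication module: wrap the
    user info in a packet for the main control board (format assumed)."""
    return json.dumps({"type": "login",
                       "user": username,
                       "auth": password_hash}).encode("utf-8")

def unpack_reply(raw):
    """Decode the main control board's login result for the front end."""
    msg = json.loads(raw.decode("utf-8"))
    return msg.get("ok", False)
```

A real implementation would send the packet over the WLAN socket and hash credentials properly; the sketch only shows the package-send-display round trip the text describes.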
For the tablet terminal, since a tablet screen is larger, it can accommodate more functional modules. Compared with the mobile phone terminal, the present invention additionally provides a three-dimensional simulation module on the tablet. The 3D simulation module, according to the robot motion and environment signals returned by the Xtion PRO LIVE camera and the IMU, simultaneously displays the robot motion and the surrounding terrain. While the robot is running, the 3D simulation module drives the three-dimensional model to move synchronously with the real robot (with low delay) according to the hexapod robot's joint angle information.
For the leg-wheel composite robot of the above structure, the present invention also proposes a gesture control method: a gesture recognition module and a motion control module are added on the control layer to realize gesture control of the leg-wheel composite robot. The gesture recognition module in the present invention is designed on the Hydro version of the Robot Operating System (ROS) installed on Linux Ubuntu 12.04. The gesture recognition module and the motion control module are abstracted as different nodes in the robot operating system, so communication between the two modules is communication between nodes: the modules communicate in the form of ROS topics, and the content of the communication is the messages on a topic.
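The node-and-topic decoupling described above can be imitated in-process. Real ROS code would use `rospy.Publisher` and `rospy.Subscriber`; this self-contained stub only demonstrates why topic-based messaging keeps the gesture node and the motion node independent of one another.

```python
class TopicBus:
    """Minimal in-process imitation of ROS topic pub/sub (not real ROS)."""
    def __init__(self):
        self._subs = {}

    def subscribe(self, topic, callback):
        self._subs.setdefault(topic, []).append(callback)

    def publish(self, topic, message):
        for cb in self._subs.get(topic, []):
            cb(message)

bus = TopicBus()
received = []

# The motion-control "node" subscribes to a gesture topic (name assumed)...
bus.subscribe("/gesture", received.append)
# ...and the gesture-recognition "node" publishes a recognition result.
bus.publish("/gesture", "push")
```

Neither side holds a reference to the other; both know only the topic name, which is what makes the gesture recognition part portable to other robot control systems, as the patent claims.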
The above gesture recognition module includes a dynamic gesture recognition submodule and a static gesture recognition submodule, and the user can select a suitable gesture recognition method as needed. The dynamic gesture recognition submodule is designed based on the OpenNI (Open Natural Interaction) software and is used to realize dynamic gesture recognition. OpenNI provides developers with a multi-language, cross-platform framework together with API interfaces for natural interaction. The OpenNI API (Application Programming Interface) is a set of interfaces for writing natural-interaction applications; its main purpose is to form a standard API that bridges vision and audio sensors with vision and audio perception middleware. NITE is the middleware layer above OpenNI and is mainly used to realize gesture recognition. NITE works mainly through gesture detectors, which appear as classes in a NITE program and each recognize a specific gesture. The dynamic gesture recognition submodule in the present invention is designed on the basis of NITE; the gesture detectors used include the steady detector (SteadyDetector), the circle detector (CircleDetector), the push detector (PushDetector), and the swipe detector (SwipeDetector). The steady detector recognizes whether the currently active hand is in a steady state; the circle detector recognizes a circle-drawing motion of the active hand; the push detector recognizes a forward push of the active hand (i.e., the hand pushes forward perpendicular to the body); the swipe detector recognizes a swipe of the active hand (i.e., the hand moves in a plane in front of the body).
As shown in Figure 9, the specific design of the above dynamic gesture recognition submodule is as follows:
A. Initialize the context with an XML document (e.g. Sample-tracking.xml) using the Xtion SDK (the motion-sensing development kit) of the Xtion PRO LIVE camera 9;
B. Establish the context environment and create an image generator (ImageGenerator), and set the image generator's resolution, frame rate, etc.;
C. Create and register a session manager (SessionManager), and initialize the session manager;
D. Create and register the steady detector, circle detector, push detector, and swipe detector, and add the listener of each detector to the session manager; at the same time, set the callback functions of the steady detector, circle detector, push detector, and swipe detector.
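The session-manager / detector / callback structure of steps A-D can be sketched as follows. This is an illustrative Python stand-in: the class names echo NITE's (SessionManager, the four detectors), but the real OpenNI/NITE library is a C++ framework with a different API, so every name below is a simplified assumption.

```python
# Sketch of steps C-D: detectors with callbacks are registered as listeners
# on a session manager, which feeds each frame's input to all of them.

class Detector:
    def __init__(self, name):
        self.name = name
        self.callback = None  # set in step D

    def set_callback(self, fn):
        self.callback = fn

    def process(self, gesture):
        # Fire this detector's callback when its gesture is observed.
        if gesture == self.name and self.callback:
            self.callback(gesture)

class SessionManager:
    def __init__(self):
        self.listeners = []

    def add_listener(self, detector):
        self.listeners.append(detector)

    def update(self, gesture):
        # Each frame, every registered detector inspects the input.
        for d in self.listeners:
            d.process(gesture)

results = []
manager = SessionManager()                        # step C
for name in ("steady", "circle", "push", "swipe"):
    det = Detector(name)                          # step D: create detectors
    det.set_callback(lambda g: results.append(g)) # step D: set callbacks
    manager.add_listener(det)                     # step D: register listeners

manager.update("push")
print(results)  # ['push']
```

The design choice mirrored here is that recognition results leave the detectors only through their callbacks, which is why step D sets a callback for every detector.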
With the dynamic gesture recognition module designed in the above manner, gesture recognition proceeds as shown in Figure 10. The RGB image information of the dynamic gesture is first collected by the Xtion PRO LIVE camera 9; the collected images are then processed with the OpenNI software to extract the motion trajectory of the gesture, and the steady detector, circle detector, push detector, and swipe detector recognize the hand gestures of holding still, drawing a circle, pushing, and swiping. The detectors then output their recognition results in their respective callback functions, and the recognition results are sent to the motion control module in the form of messages in the robot operating system. The motion control module, according to the recognition result and the mapping predefined in the dynamic gesture recognition module between dynamic gestures and robot motions, controls the robot to complete the corresponding motion. The above dynamic recognition submodule has a relatively wide application range and can be used indoors even when the lighting is poor.
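The mapping step in the motion control module amounts to a dispatch table from recognized gestures to motion commands. The sketch below follows the gesture-to-motion pairs of Embodiment 1; the string identifiers are hypothetical, not the patent's actual message names.

```python
# Hypothetical dispatch table: dynamic gesture -> robot motion command,
# following Embodiment 1 (wave -> mark time, push -> stop, etc.).
DYNAMIC_GESTURE_TO_MOTION = {
    "wave": "mark_time",
    "move_up": "forward",
    "move_down": "backward",
    "circle_cw": "turn_clockwise",
    "push": "stop",
}

def motion_command(gesture):
    """Return the motion for a recognized gesture; hold position otherwise."""
    return DYNAMIC_GESTURE_TO_MOTION.get(gesture, "hold")

print(motion_command("push"))      # stop
print(motion_command("unknown"))   # hold
```

A table lookup keeps the mapping declarative, so redefining a gesture's meaning only changes one dictionary entry rather than control logic.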
The static gesture recognition submodule is based on color images and is used to recognize static gestures. When gesture recognition is performed with the static gesture recognition submodule, as shown in Figure 11, the RGB image information of the human body is first collected by the Xtion PRO LIVE camera 9 and sent to the static gesture recognition submodule through topic messages in the robot operating system. Then the face region is detected using the skin color model and HOG features pre-stored in the static gesture recognition submodule; next, according to the structural features of the human body, the body region of the person is detected and the hand region is segmented from it; finally, image features of the segmented hand region are extracted for gesture training and recognition, and the recognition result is sent to the motion control module, which, according to the recognition result and the pre-stored mapping between static gestures and robot motions, controls the robot to complete the corresponding motion. The above static gesture recognition submodule can be used indoors or outdoors when the lighting is good.
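A pre-stored skin color model of the kind mentioned above can be as simple as a per-pixel rule. The thresholds below are a common textbook RGB skin heuristic for uniform daylight, given purely for illustration; the patent does not specify its actual model.

```python
# Illustrative per-pixel skin classifier (textbook RGB thresholds under
# uniform daylight) -- an assumption, not the patent's pre-stored model.

def is_skin_rgb(r, g, b):
    """True if the pixel (r, g, b in 0-255) passes the RGB skin heuristic."""
    return (r > 95 and g > 40 and b > 20           # bright enough channels
            and max(r, g, b) - min(r, g, b) > 15   # enough color spread
            and abs(r - g) > 15 and r > g and r > b)  # red-dominant hue

print(is_skin_rgb(220, 170, 140))  # True  (typical skin tone)
print(is_skin_rgb(30, 60, 200))    # False (blue background pixel)
```

In practice such a rule only proposes candidate regions; the HOG features and trained classifier described above then confirm face and hand regions.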
Embodiment 1: controlling the robot motion with dynamic gestures
The user waves toward the Xtion PRO LIVE camera 9; after the robot recognizes the waving action, it marks time in place. When the user's hand moves vertically upward, the robot walks straight forward; when the user's hand moves vertically downward, the robot moves straight backward; when the user's hand draws a circle clockwise, the robot turns a full circle clockwise in place and then stays where it is; when the user's hand pushes forward, the robot stops moving.
Embodiment 2: controlling the robot motion with static gestures
The static gesture recognition module can predefine the mapping between gesture types and robot motions; for example, the user's one-hand digit gestures 1-9 are predefined as shown in Table 1:
Table 1: predefined static gestures
By default, the hexapod robot walks with the legged "3+3" gait. When the user makes the gesture for digit 1, the hexapod robot moves forward; for digit 3, it turns left; for digit 4, it turns right; for digit 5, it stops; for digit 6, it switches from legged walking to wheeled driving; for digit 2, it moves backward in wheeled mode; for digit 7, it switches from wheeled driving back to legged walking; for digit 4, it turns right in legged mode; for digit 5, it stops; for digit 8, it lifts one leg; for digit 5, it stops; for digit 9, it lifts two legs; for digit 5, it stops.
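The digit-gesture dispatch of Embodiment 2, including the mode switch between legged and wheeled locomotion (digits 6 and 7), can be sketched as a small state machine. The class and command names are hypothetical illustrations, not the patent's actual interfaces.

```python
# Sketch of the Table 1 / Embodiment 2 dispatch: digits 6 and 7 change the
# locomotion mode; the other digits map directly to motion commands.

class HexapodController:
    def __init__(self):
        self.mode = "legged"  # default: legged "3+3" gait

    def on_digit(self, d):
        if d == 6:
            self.mode = "wheeled"
            return "switch_to_wheeled"
        if d == 7:
            self.mode = "legged"
            return "switch_to_legged"
        commands = {1: "forward", 2: "backward", 3: "turn_left",
                    4: "turn_right", 5: "stop", 8: "lift_one_leg",
                    9: "lift_two_legs"}
        return commands.get(d, "ignore")

ctrl = HexapodController()
print(ctrl.on_digit(1))  # forward
print(ctrl.on_digit(6))  # switch_to_wheeled
print(ctrl.mode)         # wheeled
```

Keeping the mode as controller state means a directional gesture like digit 4 can be executed in whichever locomotion mode is currently active, as in the embodiment's turn-right-in-legged-mode step.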

Claims (8)

1. A leg-wheel composite robot based on an Xtion device, comprising a robot body and six robot single-leg structures, the six single-leg structures being uniformly mounted around the circumference of the robot body; each single-leg structure is a wheel-leg structure with both walking and wheeled-driving functions, and has three leg segments and four drive servos; the three leg segments are respectively a first leg segment, a second leg segment, and a third leg segment, and the four drive servos are respectively a first servo, a second servo, a third servo, and a fourth servo; wherein one end of the first leg segment is fixedly mounted on the output shaft of the first servo, forming the hip joint; the axis of the first servo's output shaft is perpendicular to the horizontal plane, and the first servo drives the first leg segment to swing laterally; the second servo is fixedly mounted at the other end of the first leg segment, and the second servo's output shaft is fixed to one end of the second leg segment, forming the knee joint; the axis of the second servo's output shaft is perpendicular to the axis of the first servo's output shaft, and the second servo drives the second leg segment to swing longitudinally; the other end of the second leg segment is fixed to the third servo's output shaft, forming the ankle joint; the axis of the third servo's output shaft is parallel to the axis of the second servo's output shaft, and the third servo drives the third leg segment to swing longitudinally; the fourth servo is located in the middle of the third leg segment, the axis of its output shaft is parallel to the axis of the third servo's output shaft, a wheel is coaxially mounted on the fourth servo's output shaft by a spline, and the fourth servo drives the wheel to rotate; the other end of the third leg segment is provided with a foot-ground detection mechanism for contact between the single-leg structure and the ground, which can simultaneously detect the foot-ground contact state;
characterized in that: the interior of the robot body is divided into four layers by aluminum alloy plates with lightening holes, which from top to bottom are a sensing layer, a drive layer, a control layer, and a power layer, used for installing the control system; wherein the sensing layer is provided with the Xtion PRO LIVE camera and an IMU; the Xtion PRO LIVE camera is used to collect environment information for simultaneous localization and mapping and to obtain information about people for human-robot interaction; the IMU is used to obtain the robot's own attitude and to assist in localization and environment map building; six servo drive boards are installed on the drive layer and are respectively used to control the motion of each servo in the six single-leg structures; a main control board is installed on the control layer, realizing functions such as communication management, sensor data acquisition, data processing, and drive management; a battery compartment is installed on the power layer, and the robot's power supply batteries are mounted in the battery compartment;
the robot body carries out wireless communication with a remote control terminal and is controlled by the remote control terminal; the remote control terminal has a security authentication module, an audio/video playback module, an operation module, and an information display module; each module is divided into a front end and a back end; the front end is the human-machine interaction interface, and the back end is responsible for network communication and data processing; wherein the security authentication module obtains the user information entered by the user, packages it into data packets through the back end, and sends them to the robot's main control board, and the login result fed back by the main control board is displayed in the security authentication module; the audio/video playback module receives, through the back end, the sound information returned by the Xtion PRO LIVE camera and the images collected by the camera, and displays them in the audio/video playback module after back-end decoding; the information display module receives the sensing information of the robot's six leg groups returned by the IMU, such as attitude, joint angle, and joint torque information, and displays the hexapod robot's sensing information; the operation module consists of motion controls and can control the robot's motion; the robot's locomotion mode can also be set.
2. The leg-wheel composite robot based on an Xtion device according to claim 1, characterized in that: the connection between the servos of the hip joint, knee joint, and ankle joint and the first, second, and third leg segments is identical in each case; a servo horn is mounted on the servo output shaft by screws, and the servo is embedded and positioned in a groove designed in the connecting portion on one side of the leg segment end; a protruding shaft is coaxially mounted on the servo's output shaft, and the end of the protruding shaft is connected through a bearing to the connecting portion on the other side of the leg segment end.
3. The leg-wheel composite robot based on an Xtion device according to claim 1, characterized in that: a peripheral shell is additionally mounted outside the robot body, with mounting openings for the single-leg structures reserved around the circumference of the shell; the shell protects the robot body.
4. The leg-wheel composite robot based on an Xtion device according to claim 1, characterized in that: the Xtion PRO LIVE camera is located outside the robot body, its base is located at the sensing layer and fixedly mounted on an inverted-U bracket, and the inverted-U bracket is fixed to the upper surface of the sensing layer; the IMU is located inside the inverted-U bracket.
5. The leg-wheel composite robot based on an Xtion device according to claim 1, characterized in that: the six servo drive boards are arranged side by side on the upper surface of the drive layer, enabling parallel computation with good real-time performance.
6. The leg-wheel composite robot based on an Xtion device according to claim 1, characterized in that: a charging port and a power switch are also provided on the control layer; the two ends of the charging wire are connected respectively to the charging port and to the two lithium batteries in the power layer, so that an external power supply can charge the lithium batteries through the charging port; the power switch is used to control the powering on and off of each servo in every single-leg structure and of the main control board.
7. The leg-wheel composite robot based on an Xtion device according to claim 1, characterized in that: the remote control terminal also has a three-dimensional simulation module; the 3D simulation module synchronously displays the robot motion and the surrounding terrain according to the robot motion and environment signals returned by the Xtion PRO LIVE camera and the IMU module; while the robot is running, the 3D simulation module drives the 3D model to move synchronously with the real robot according to the hexapod robot's joint angle information.
8. A gesture control method for the leg-wheel composite robot based on an Xtion device according to claim 1, characterized in that: a gesture recognition module and a motion control module are installed on the control layer; the gesture recognition module includes a dynamic gesture recognition submodule and a static gesture recognition submodule, wherein:
the dynamic gesture recognition submodule is used to realize dynamic gesture recognition and has a steady detector, a circle detector, a push detector, and a swipe detector; the steady detector recognizes whether the currently active hand is in a steady state, the circle detector recognizes a circle-drawing motion of the active hand, the push detector recognizes a forward push of the active hand, and the swipe detector recognizes a swipe of the active hand;
the specific design of the dynamic gesture recognition submodule is as follows:
A. initialize the context using the Xtion SDK of the Xtion PRO LIVE camera;
B. establish the context environment and create an image generator, and set the image generator's resolution and frame rate;
C. create and register a session manager, and initialize the session manager;
D. create and register the steady detector, circle detector, push detector, and swipe detector, and add the listener of each detector to the session manager; at the same time, set the callback functions of the steady detector, circle detector, push detector, and swipe detector;
with the dynamic gesture recognition module designed in the above manner, when gesture recognition is performed, the RGB image information is first collected by the Xtion PRO LIVE camera; the collected images are then processed with the OpenNI software to extract the motion trajectory of the gesture, and the steady detector, circle detector, push detector, and swipe detector recognize the hand gestures of holding still, drawing a circle, pushing, and swiping; the detectors then output their recognition results in their respective callback functions and send the results to the motion control module; the motion control module, according to the recognition result and the mapping predefined in the dynamic gesture recognition module between dynamic gestures and robot motions, controls the robot to complete the corresponding motion;
the static gesture recognition submodule is based on color images and is used to recognize static gestures; when gesture recognition is performed with the static gesture recognition submodule, the RGB image information is first collected by the Xtion PRO LIVE camera and transmitted through topic messages in the robot operating system; then the face region is detected using the pre-stored skin color model and HOG features; next, according to the structural features of the human body, the body region of the person is detected and the hand region is segmented from it; finally, image features of the segmented hand region are extracted for gesture training and recognition, and the recognition result is sent to the main control board; according to the recognition result and the pre-stored mapping between static gestures and robot motions, the robot is controlled to complete the corresponding motion.
CN201610389736.1A 2016-06-02 2016-06-02 Leg-wheel composite robot based on Xtion equipment and gesture control method thereof Pending CN106005086A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610389736.1A CN106005086A (en) 2016-06-02 2016-06-02 Leg-wheel composite robot based on Xtion equipment and gesture control method thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201610389736.1A CN106005086A (en) 2016-06-02 2016-06-02 Leg-wheel composite robot based on Xtion equipment and gesture control method thereof

Publications (1)

Publication Number Publication Date
CN106005086A true CN106005086A (en) 2016-10-12

Family

ID=57090510

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610389736.1A Pending CN106005086A (en) 2016-06-02 2016-06-02 Leg-wheel composite robot based on Xtion equipment and gesture control method thereof

Country Status (1)

Country Link
CN (1) CN106005086A (en)


Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4527650A (en) * 1983-03-18 1985-07-09 Odetics, Inc. Walking machine
CN101125564A (en) * 2007-09-28 2008-02-20 北京航空航天大学 Six-wheel/leg hemispherical outer casing detecting robot
CN101948011A (en) * 2010-09-09 2011-01-19 北京航空航天大学 Hexapod universal walking multifunctional moonshot robot
CN102063111A (en) * 2010-12-14 2011-05-18 广东雅达电子股份有限公司 Mobile terminal-based remote robot control system
CN104443105A (en) * 2014-10-29 2015-03-25 西南大学 Low-energy-loss six-foot robot


Non-Patent Citations (6)

* Cited by examiner, † Cited by third party
Title
JAN H. COCATRE-ZILGIEN: "Walking machine with backhoe legs", HTTP://COCATREZ.NET/EARTH/AUTONOMOUSWALKINGROBOTS/SIXHOEPROJECT/SIXHOEPATAPP.HTML *
PRIMESENSE INC.: "Prime Sensor™ NITE 1.3 Controls Programmer's Guide", 31 December 2010 *
ASUSTeK Computer Inc.: "Kinect Chapter 7. NITE Gestures", HTTP://FIVEDOTS.COE.PSU.AC.TH/~AD/NUI163/GESTURES.PDF *
SHI Zhe et al.: "Gesture and motion-sensing control of robots", Baidu Wenku *
YANG Qixi: "Notes on research into motion-sensing robot technology", Baidu Wenku *
CHEN Jingde et al.: "Kinect-based robot control system", Electronic Design Engineering *

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106363625A (en) * 2016-10-13 2017-02-01 杭州宇树科技有限公司 Quadruped robot tele-operation mode based on operator foot pose sensor
CN106363625B (en) * 2016-10-13 2019-03-05 杭州宇树科技有限公司 A kind of quadruped robot teleoperation method based on control staff's foot Position and attitude sensor
CN107085422A (en) * 2017-01-04 2017-08-22 北京航空航天大学 A kind of tele-control system of the multi-functional Hexapod Robot based on Xtion equipment
CN107688779A (en) * 2017-08-18 2018-02-13 北京航空航天大学 A kind of robot gesture interaction method and apparatus based on RGBD camera depth images
CN107765855A (en) * 2017-10-25 2018-03-06 电子科技大学 A kind of method and system based on gesture identification control machine people motion
CN107933733A (en) * 2018-01-03 2018-04-20 河南科技大学 A kind of imitative tortoise returns item pendulum shin coupling turning robot
CN107933733B (en) * 2018-01-03 2023-09-01 河南科技大学 Turtle-return-imitating swing-shank coupling overturning robot
CN110254554A (en) * 2019-06-24 2019-09-20 重庆大学 A kind of the elderly's care robot
CN111142523A (en) * 2019-12-26 2020-05-12 西北工业大学 Wheel-leg type mobile robot motion control system
CN111142523B (en) * 2019-12-26 2022-03-15 西北工业大学 Wheel-leg type mobile robot motion control system
CN112224304A (en) * 2020-10-28 2021-01-15 北京理工大学 Wheel step composite mobile platform and gesture and voice control method thereof

Similar Documents

Publication Publication Date Title
CN106005086A (en) Leg-wheel composite robot based on Xtion equipment and gesture control method thereof
CN106971028B (en) Method for applying virtual reality technology to assembly type building industry
CN106530894B (en) A kind of virtual head up display method and system of flight training device
CN107085422A (en) A kind of tele-control system of the multi-functional Hexapod Robot based on Xtion equipment
CN105966488A (en) Six-wheel-leg movable operation robot test platform
CN204465738U (en) A kind of disaster relief rescue visible system
CN203133746U (en) Integrated virtual landscape sightseeing device based on somatosensory interaction
CN106601060A (en) Virtual reality system for experiencing fire-fighting scene
CN103389699A (en) Robot monitoring and automatic mobile system operation method based on distributed intelligent monitoring controlling nodes
CN106217393A (en) Portable far-end is come personally interaction platform
CN110664593A (en) Hololens-based blind navigation system and method
CN108664121A (en) A kind of emulation combat system-of-systems drilling system
CN108214497A (en) A kind of family assiatant intelligent robot system
CN205585386U (en) Intelligent security cap based on augmented reality , spatial scanning and gesture recognition technology
CN107861629A (en) A kind of practice teaching method based on VR
JP2020116710A (en) Robot control system
CN110764623A (en) Virtual reality simulation system for pipe gallery space
CN105997447A (en) Blind guiding robot with wheel leg structure and utilization method for same
CN205325698U (en) Intelligence usher robot based on tall and erect system of ann
CN206326608U (en) A kind of family assiatant intelligent robot system
CN206574041U (en) A kind of AR map systems
CN105487449B (en) A kind of special motion controller for being used for four ropes traction cameras people
CN210639652U (en) Human-machine interactive type school museum display equipment based on virtual reality
CN206200977U (en) Portable distal end is come personally interaction platform
CN113194140A (en) Integrated remote monitoring system based on fire-fighting robot

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication
RJ01 Rejection of invention patent application after publication

Application publication date: 20161012