CN107247466A - Robot head gesture control method and system - Google Patents

Robot head gesture control method and system

Info

Publication number
CN107247466A
Authority
CN
China
Prior art keywords
gesture
gesture shape
recognition result
robot
shape recognition
Prior art date
Legal status
Granted
Application number
CN201710439682.XA
Other languages
Chinese (zh)
Other versions
CN107247466B (en)
Inventor
黄毅
Current Assignee
RESEARCH INSTITUTE OF BIT IN ZHONGSHAN
Zhongshan Changfeng Intelligent Automation Equipment Research Institute Co ltd
Original Assignee
Zhongshan Changfeng Intelligent Automation Equipment Research Institute Co ltd
Priority date
Filing date
Publication date
Application filed by Zhongshan Changfeng Intelligent Automation Equipment Research Institute Co ltd
Priority to CN201710439682.XA
Publication of CN107247466A
Application granted
Publication of CN107247466B
Status: Expired - Fee Related
Anticipated expiration


Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D3/00 Control of position or direction
    • G05D3/12 Control of position or direction using feedback

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a robot head gesture control method and system. The method comprises: recognizing the gesture shape of the hand of a subject (the person under test) to obtain a gesture shape recognition result; when the result is a first gesture shape, setting a tracking flag bit in the robot and triggering the robot to enter a tracking-ready state; when the result is a second gesture shape and the tracking flag bit is set, having the robot track the subject's hand motion and rotate its head accordingly; and when the result is a third gesture shape, clearing the tracking flag bit, stopping the head rotation, and fixing the robot head at the stop position. By mapping different gestures of the subject to corresponding robot actions, the method and system realistically simulate the actual interaction between doctor and patient and provide doctors with a good, effective platform for practising traditional Chinese medicine rotation manipulation.

Description

Robot head gesture control method and system
Technical field
The present invention relates to the field of machine binocular vision technology and the field of medical equipment technology, and more particularly to a robot head gesture control method and system.
Background technology
With the development of society, more and more people suffer from cervical spondylosis. Current treatment is mainly conservative, among which traditional Chinese medicine (TCM) rotation manipulation is widely adopted as a simple and fast-acting therapy. TCM rotation manipulation can be divided into four steps: self-positioning, preloading, quick action and recovery. Taking the right-side rotation manipulation as an example: the patient sits upright with the neck naturally relaxed; the doctor loosens the neck soft tissue for 5-10 min using pressing, kneading, rolling and similar techniques; the patient actively rotates the head horizontally to the extreme angle, then rotates again after maximum flexion until a fixed resistance is felt; the doctor supports the patient's jaw with the elbow and gently pulls upward for 3-5 s; the patient relaxes the muscles and the doctor quickly lifts the elbow upward with a short burst of force; one or more snapping sounds can be heard after a successful operation; finally the neck muscles are relaxed again with lifting, pressing and similar techniques.
In the self-positioning step of clinical manipulation, the patient first needs to follow the doctor's gesture and rotate the head to the angle the patient can physically endure; the TCM rotation manipulation is then performed at that angle. The patient's head and neck are fragile, so during TCM rotation manipulation the doctor must apply massage treatment with precisely controlled force and position, which requires a great deal of practice of the TCM rotation technique. Current training for TCM rotation manipulation uses a robot to simulate a cervical spondylosis patient and provide doctors with a practice platform, but the robot is mainly operated by remote control: the operator uses a remote controller to turn the robot's head to the required position. In actual clinical practice, however, the patient follows the doctor's gesture to rotate the head to the physically endurable angle, so turning the robot head to the required position by remote control cannot simulate the actual interaction between doctor and patient in clinical practice.
Summary of the invention
It is an object of the present invention to provide a robot head gesture control method and system that let the robot act according to the gestures of a subject, realistically simulating the actual interaction between doctor and patient and providing doctors with a practice platform.
To achieve the above object, the present invention provides the following scheme:
A robot head gesture control method, the method comprising:
recognizing the gesture shape of the subject's hand to obtain a gesture shape recognition result, the gesture shape recognition result including a first gesture shape, a second gesture shape and a third gesture shape;
when the gesture shape recognition result is the first gesture shape, setting a tracking flag bit in the robot and triggering the robot to enter a tracking-ready state, in which it prepares to start tracking the subject's hand motion;
when the gesture shape recognition result is the second gesture shape and the tracking flag bit is set, tracking the subject's hand motion with the robot and performing the head rotation motion;
when the gesture shape recognition result is the third gesture shape, clearing the tracking flag bit, stopping the head rotation motion of the robot, and fixing the robot head at the stop position.
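The three-gesture protocol above amounts to a small state machine gated by the tracking flag bit. The following minimal Python sketch illustrates the control flow; all names are assumptions for illustration, as the patent specifies no code:

```python
class HeadGestureController:
    """Minimal sketch of the three-gesture protocol; not the patent's implementation."""

    def __init__(self):
        self.tracking_flag = False  # the "tracking flag bit"

    def on_gesture(self, shape: int) -> str:
        if shape == 1:                      # first gesture: arm tracking
            self.tracking_flag = True
            return "ready"                  # tracking-ready state
        if shape == 2 and self.tracking_flag:
            return "track"                  # rotate the head to follow the hand
        if shape == 3:                      # third gesture: stop and hold position
            self.tracking_flag = False
            return "hold"
        return "idle"                       # e.g. second gesture with the flag cleared
```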
Optionally, recognizing the gesture shape of the subject's hand to obtain the gesture shape recognition result specifically includes:
obtaining a colour image and a depth image of the subject's hand;
obtaining a gesture foreground image from the colour image and the depth image;
identifying the gesture shape of the subject from the gesture foreground image to obtain the gesture shape recognition result.
Optionally, obtaining the gesture foreground image from the colour image and the depth image specifically includes:
processing the depth image with a threshold segmentation algorithm and extracting the image region whose grey values lie in a set range as the foreground region;
obtaining the colour image of the foreground region from the corresponding position of the foreground region in the colour image;
building a histogram from skin colour features;
converting the colour image of the foreground region into the corresponding colour space;
performing back projection in the colour space with the histogram to obtain a probability map;
denoising the probability map with a morphological erosion-dilation algorithm and a threshold segmentation algorithm to obtain the gesture foreground image.
Optionally, identifying the gesture shape of the subject from the gesture foreground image to obtain the gesture shape recognition result specifically includes:
computing the feature vector of the gesture foreground image;
classifying the feature vector with a support vector machine to obtain a gesture classification result;
identifying the gesture shape of the subject's hand from the gesture classification result to obtain the gesture shape recognition result.
Optionally, when the gesture shape recognition result is the second gesture shape and the tracking flag bit is set, tracking the subject's hand motion with the robot and performing the head rotation motion specifically includes:
determining the rotation direction of the robot head from the probability map;
computing the horizontal rotation speed and the vertical rotation speed of the robot head from the probability map;
controlling the robot head, according to the rotation direction, the horizontal rotation speed and the vertical rotation speed, to rotate horizontally in the determined direction at the horizontal rotation speed and vertically in the determined direction at the vertical rotation speed.
The invention also discloses a robot head gesture control system, the system including:
a gesture shape recognition result acquisition module for recognizing the gesture shape of the subject's hand and obtaining the gesture shape recognition result, the gesture shape recognition result including a first gesture shape, a second gesture shape and a third gesture shape;
a first gesture shape control module for, when the gesture shape recognition result is the first gesture shape, setting the tracking flag bit in the robot and triggering the robot to enter the tracking-ready state, in which it prepares to start tracking the subject's hand motion;
a second gesture shape control module for, when the gesture shape recognition result is the second gesture shape and the tracking flag bit is set, controlling the robot to track the subject's hand motion and perform the head rotation motion;
a third gesture shape control module for, when the gesture shape recognition result is the third gesture shape, clearing the tracking flag bit, stopping the head rotation motion of the robot, and fixing the robot head at the stop position.
Optionally, the gesture shape recognition result acquisition module specifically includes:
an image acquisition submodule for obtaining a colour image and a depth image of the subject's hand;
a gesture foreground image acquisition submodule for obtaining a gesture foreground image from the colour image and the depth image;
a gesture shape recognition result acquisition submodule for identifying the gesture shape of the subject from the gesture foreground image and obtaining the gesture shape recognition result.
Optionally, the gesture foreground image acquisition submodule specifically includes:
a foreground region extraction unit for processing the depth image with a threshold segmentation algorithm and extracting the image region whose grey values lie in a set range as the foreground region;
a foreground colour image acquisition unit for obtaining the colour image of the foreground region from the corresponding position of the foreground region in the colour image;
a histogram building unit for building a histogram from skin colour features;
an image conversion unit for converting the colour image of the foreground region into the corresponding colour space;
a probability map acquisition unit for performing back projection in the colour space with the histogram to obtain a probability map;
a gesture foreground image acquisition unit for denoising the probability map with a morphological erosion-dilation algorithm and a threshold segmentation algorithm to obtain the gesture foreground image.
Optionally, the gesture shape recognition result acquisition submodule specifically includes:
a feature vector computation unit for computing the feature vector of the gesture foreground image;
a gesture classification result acquisition unit for classifying the feature vector with a support vector machine and obtaining a gesture classification result;
a gesture shape recognition result acquisition unit for identifying the gesture shape of the subject's hand from the gesture classification result and obtaining the gesture shape recognition result.
Optionally, the second gesture shape control module specifically includes:
a rotation direction acquisition submodule for determining the rotation direction of the robot head from the probability map;
a rotation speed computation submodule for computing the horizontal rotation speed and the vertical rotation speed of the robot head from the probability map;
a rotary motion control submodule for controlling the robot head, according to the rotation direction, the horizontal rotation speed and the vertical rotation speed, to rotate horizontally in the determined direction at the horizontal rotation speed and vertically in the determined direction at the vertical rotation speed.
According to the specific embodiments provided by the present invention, the invention discloses the following technical effects:
The invention provides a robot head gesture control method and system. The method first recognizes the gesture shape of the subject's hand to obtain the gesture shape recognition result, which includes a first gesture shape, a second gesture shape and a third gesture shape. When the result is the first gesture shape, the tracking flag bit in the robot is set and the robot is triggered to enter the tracking-ready state, preparing to start tracking the subject's hand motion. When the result is the second gesture shape and the tracking flag bit is set, the robot tracks the subject's hand motion and performs the head rotation motion. When the result is the third gesture shape, the tracking flag bit is cleared, the head rotation motion stops, and the robot head is fixed at the stop position. By mapping different gestures of the subject to corresponding robot actions, the method and system realistically simulate the actual interaction between doctor and patient and provide doctors with a good, effective platform for practising traditional Chinese medicine (TCM) rotation manipulation.
Brief description of the drawings
To describe the technical solutions in the embodiments of the present invention or in the prior art more clearly, the accompanying drawings required in the embodiments are briefly introduced below. Apparently, the drawings in the following description show only some embodiments of the present invention, and a person of ordinary skill in the art may derive other drawings from them without creative effort.
Fig. 1 is a flow chart of the robot head gesture control method according to an embodiment of the present invention;
Fig. 2 is a schematic diagram of the gesture shape recognition results according to an embodiment of the present invention;
Fig. 3 is a schematic diagram of the state-space and steady-state-space coordinate system of the present invention;
Fig. 4 is a schematic diagram of controlling the motion of the robot head with the robot head gesture control method of the present invention;
Fig. 5 is a structural diagram of the robot head gesture control system according to an embodiment of the present invention.
Detailed description of the embodiments
The technical solutions in the embodiments of the present invention are described below clearly and completely with reference to the accompanying drawings. Apparently, the described embodiments are only some rather than all of the embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the present invention.
It is an object of the present invention to provide a robot head gesture control method and system.
To make the above objects, features and advantages of the present invention clearer and easier to understand, the present invention is described in further detail below with reference to the accompanying drawings and specific embodiments.
Fig. 1 is a flow chart of the robot head gesture control method according to an embodiment of the present invention.
Referring to Fig. 1, the robot head gesture control method includes:
Step 101: recognize the gesture shape of the subject's hand and obtain the gesture shape recognition result. The gesture shape recognition result includes a first gesture shape, a second gesture shape and a third gesture shape.
In step 101, recognizing the gesture shape of the subject's hand to obtain the gesture shape recognition result specifically includes:
Step 1011: obtain a colour image and a depth image of the subject's hand.
The colour image and the depth image are captured by an image sensor fixed on the robot head; in this embodiment the image sensor is a Microsoft Kinect.
Step 1012: obtain a gesture foreground image from the colour image and the depth image.
Step 1012 specifically includes:
Step (1): process the depth image with a threshold segmentation algorithm and extract the image region whose grey values lie in a set range as the foreground region.
The brightness of the depth image represents the distance between the lens and the object at each pixel. Suppose a cupboard stands 5 m from the lens and a person with a raised hand stands 3 m from it, the hand itself being 2.5 m away: the depth image then shows a dark cupboard-shaped patch and a brighter human-shaped patch, with an even brighter hand-shaped patch on it (because the hand is closer to the lens than the rest of the body). Objects at different distances can therefore be separated by setting thresholds on the grey value. In this embodiment the depth image is processed with a threshold segmentation algorithm, and the image region whose grey values lie in the set range is extracted as the foreground region, precisely in order to segment the hand region out of the image background. The specific method is to traverse the depth image, keep the brightness of pixels whose grey values lie in the set range, and set pixels outside that range to 0; the foreground region is thus segmented out of the whole depth image.
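As an illustration of this thresholding step, the sketch below operates on a 16-bit Kinect-style depth frame with NumPy; the depth range is an assumed example, since the patent only specifies "a set range":

```python
import numpy as np

def extract_foreground_mask(depth_mm: np.ndarray,
                            near_mm: int = 500, far_mm: int = 1200) -> np.ndarray:
    """Binary mask of pixels whose depth lies in [near_mm, far_mm].

    depth_mm: single-channel uint16 depth image in millimetres.
    The range is assumed to be chosen so that the raised hand is
    the only object falling inside it.
    """
    mask = ((depth_mm >= near_mm) & (depth_mm <= far_mm)).astype(np.uint8) * 255
    return mask  # 255 = foreground (hand), 0 = background
```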
Step (2): obtain the colour image of the foreground region from the corresponding position of the foreground region in the colour image.
The foreground region is exactly the image region of the subject's hand. After the region whose grey values lie in the set range has been extracted as the foreground region, the colour image of the foreground region is obtained by taking the same region of the colour image, according to the position of the foreground region in the depth image. The image is then further segmented by skin colour, removing objects that are not skin-coloured (for example clothing close to the hand), which yields the gesture foreground image.
Step (3): build a histogram from skin colour features.
Skin colour features are the characteristics of human skin colour and can be obtained from many sources. In the method of this embodiment, pictures of the subject's hand are chosen in advance and the skin colour of the hand is analysed statistically to obtain the skin colour features; these are then compared with similar features in the application scene and adjusted by calculation so that the two can be told apart, giving the specific skin colour features used.
The histogram in this embodiment is a Cr-Cb two-dimensional histogram. First a 50 × 50 two-dimensional histogram is created and the number of pixels falling into each bin is counted, building the two-dimensional histogram of the skin colour features. Similarly, the two-dimensional histogram of the current scene in the colour image of the foreground region is counted; comparing the scene histogram with the skin colour histogram, the more distinctive features of the skin colour histogram are kept and the features easily confused with the background are removed, yielding the final histogram. The histogram is then normalised so that its values fall between 0 and 255.
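A minimal OpenCV sketch of building and normalising such a 50 × 50 Cr-Cb histogram from sample hand images; the function and variable names are assumptions, not from the patent:

```python
import cv2
import numpy as np

def build_skin_histogram(bgr_samples: list) -> np.ndarray:
    """Accumulate a 50x50 Cr-Cb histogram over sample hand images, scaled to 0..255."""
    hist = np.zeros((50, 50), np.float32)
    for bgr in bgr_samples:
        ycrcb = cv2.cvtColor(bgr, cv2.COLOR_BGR2YCrCb)
        # channels 1 and 2 of YCrCb are Cr and Cb
        hist += cv2.calcHist([ycrcb], [1, 2], None, [50, 50], [0, 256, 0, 256])
    cv2.normalize(hist, hist, 0, 255, cv2.NORM_MINMAX)
    return hist
```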
Step (4): convert the colour image of the foreground region into the corresponding colour space.
The skin colour features take different forms of expression in different colour spaces; the colour space used in this embodiment is the YCrCb colour space.
Step (5): perform back projection in the colour space with the histogram to obtain a probability map.
In the histogram built above, for a given bin the abscissa is the Cr value and the ordinate is the Cb value, and the bin value represents the number of pixels with those Cr and Cb values (after normalisation this can be regarded as a frequency). The whole colour image is then traversed: for every pixel, the frequency corresponding to its Cr and Cb values is looked up in the histogram and used as the brightness of that pixel, which yields the probability map. The brightness of a pixel in the probability map represents the probability that the pixel belongs to the skin of the subject's hand: the brighter the pixel, the larger the probability.
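OpenCV implements exactly this per-pixel lookup as histogram back projection; a sketch under the same assumptions as above:

```python
import cv2
import numpy as np

def skin_probability_map(bgr_foreground: np.ndarray, skin_hist: np.ndarray) -> np.ndarray:
    """Back-project the Cr-Cb skin histogram onto the foreground colour image.

    Returns an 8-bit image whose brightness is the per-pixel skin probability.
    """
    ycrcb = cv2.cvtColor(bgr_foreground, cv2.COLOR_BGR2YCrCb)
    return cv2.calcBackProject([ycrcb], [1, 2], skin_hist, [0, 256, 0, 256], scale=1)
```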
Step (6): denoise the probability map with a morphological erosion-dilation algorithm and a threshold segmentation algorithm to obtain the gesture foreground image.
The probability map is processed with a morphological erosion-dilation algorithm and a threshold segmentation algorithm to remove the influence of noise, yielding the gesture foreground image, which is a black-and-white grey-scale image.
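The denoising step might look as follows; the threshold value and kernel size are assumptions, since the patent leaves them open:

```python
import cv2
import numpy as np

def denoise_probability_map(prob: np.ndarray, thresh: int = 40) -> np.ndarray:
    """Threshold the probability map, then erode and dilate to remove speckle noise."""
    _, binary = cv2.threshold(prob, thresh, 255, cv2.THRESH_BINARY)
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    binary = cv2.erode(binary, kernel)    # remove isolated noise pixels
    binary = cv2.dilate(binary, kernel)   # restore the hand silhouette
    return binary                         # the gesture foreground image
```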
Step 1013: identify the gesture shape of the subject from the gesture foreground image and obtain the gesture shape recognition result.
Step 1013 specifically includes:
Step 1: compute the feature vector of the gesture foreground image.
The geometric invariant moment (Hu moment) features of the gesture foreground image are computed, together with the number of fingertips of the hand in the gesture foreground image and the perimeter-to-area ratio of the gesture foreground image.
The Hu moment features, the fingertip count and the perimeter-to-area ratio are concatenated into one row vector as the feature vector of the current gesture foreground image. For example, if the computed Hu features are [0.8, 0.1, 0.01, 0, 0, 0, 0], the fingertip count is 3, and the perimeter-to-area ratio is 0.02, the concatenated feature vector is [0.8, 0.1, 0.01, 0, 0, 0, 0, 3, 0.02].
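A sketch of this feature extraction with OpenCV; counting fingertips from convexity defects is one common choice assumed here, as the patent does not fix the fingertip detector:

```python
import cv2
import numpy as np

def gesture_feature_vector(binary: np.ndarray) -> np.ndarray:
    """Hu moments + fingertip count + perimeter/area ratio of the largest contour."""
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    hand = max(contours, key=cv2.contourArea)
    hu = cv2.HuMoments(cv2.moments(hand)).flatten()      # 7 Hu invariant moments
    perimeter = cv2.arcLength(hand, True)
    area = max(cv2.contourArea(hand), 1e-6)
    hull = cv2.convexHull(hand, returnPoints=False)
    defects = cv2.convexityDefects(hand, hull)
    fingertips = 0
    if defects is not None:
        # deep convexity defects correspond to the valleys between fingers
        fingertips = sum(1 for i in range(defects.shape[0])
                         if defects[i, 0, 3] / 256.0 > 20.0)
    return np.hstack([hu, fingertips, perimeter / area]).astype(np.float32)
```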
Step 2: classify the feature vector with a support vector machine and obtain the gesture classification result.
The feature vector is classified with a classifier that has been trained beforehand: for example, a classifier is trained with the support vector machine algorithm, and the trained classifier is then used to classify the feature vector, yielding the gesture classification result.
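Using OpenCV's built-in SVM, training and prediction might look like this; the 0/1/2 label encoding for the three gesture shapes is an assumption:

```python
import cv2
import numpy as np

def train_gesture_svm(train_x: np.ndarray, train_y: np.ndarray):
    """Train a 3-class SVM on gesture feature vectors (one row per sample)."""
    svm = cv2.ml.SVM_create()
    svm.setType(cv2.ml.SVM_C_SVC)
    svm.setKernel(cv2.ml.SVM_RBF)
    svm.train(train_x.astype(np.float32), cv2.ml.ROW_SAMPLE,
              train_y.astype(np.int32).reshape(-1, 1))
    return svm

def classify_gesture(svm, feature: np.ndarray) -> int:
    _, pred = svm.predict(feature.reshape(1, -1).astype(np.float32))
    return int(pred[0, 0])   # 0 = first, 1 = second, 2 = third gesture shape
```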
Step 3: identify the gesture shape of the subject's hand from the gesture classification result and obtain the gesture shape recognition result.
Fig. 2 is a schematic diagram of the gesture shape recognition results according to an embodiment of the present invention. The gesture shape recognition result of the present invention covers three gesture shapes: the first gesture shape, the second gesture shape and the third gesture shape. The first gesture shape indicates that the robot should be triggered to prepare to start tracking; the second gesture shape indicates that the robot should start tracking the subject's gesture and perform the head rotation motion; and the third gesture shape indicates that tracking should stop. Referring to Fig. 2, in this embodiment the gesture shape shown in Fig. 2(a) is used as the first gesture shape, the gesture shape shown in Fig. 2(b) as the second gesture shape, and the gesture shape shown in Fig. 2(c) as the third gesture shape. In practical applications, different gesture shapes can be assigned as the first, second and third gesture shapes as required.
Step 102: when the gesture shape recognition result is the first gesture shape, set the tracking flag bit in the robot and trigger the robot to enter the tracking-ready state, preparing to start tracking the subject's hand motion.
When the gesture shape recognition result is the first gesture shape shown in Fig. 2(a), the tracking flag bit set in the robot is raised, triggering the robot to enter the tracking-ready state, in which it prepares to start tracking the subject's hand motion. The tracking flag bit is a protection mechanism for the robot's motion: before rotating, the robot always checks whether the tracking flag bit is set; if it is not, the robot does not execute movement instructions, i.e. it will not track the subject's hand motion.
Step 103: when the gesture shape recognition result is the second gesture shape and the tracking flag bit is set, track the subject's hand motion with the robot and perform the head rotation motion.
When the gesture shape recognition result is the second gesture shape shown in Fig. 2(b) and the tracking flag bit is set, the robot starts to track the subject's hand motion and performs the head rotation motion. From the probability map, the coordinate of the current gesture shape in the colour image can be computed, and from this image coordinate the movement speed of each robot joint can be obtained.
The robot is a training robot for TCM rotation manipulation practice, used to simulate a cervical spondylosis patient and so provide doctors with a practice platform. The head and neck of the training robot have two joints: the first joint can rotate horizontally and the second joint can rotate vertically, and a variable-stiffness structure simulates the human cervical spine.
Step 103 specifically includes:
Step 1031: determine the rotation direction of the robot head from the probability map.
First, let the coordinates of any point in the probability map be (x, y), and let the grey value at (x, y) be p(x, y). The (p+q)-order geometric moments of the probability map are

$M_{pq} = \sum_x \sum_y x^p y^q\, p(x, y)$ (1)

so that

$M_{00} = \sum p(x, y)$ (2)
$M_{10} = \sum x\, p(x, y)$ (3)
$M_{01} = \sum y\, p(x, y)$ (4)

The centre of gravity $P_c(x_c, y_c)$ of the second gesture shape in the probability map is

$x_c = M_{10} / M_{00}$ (5)
$y_c = M_{01} / M_{00}$ (6)

where $x_c$ is the x coordinate of the centre of gravity and $y_c$ its y coordinate.
The image plane of the current image is defined as the state space. The image plane is defined by the resolution of the image sensor, and the image and the image plane have the same resolution as the image sensor; for example, when the image sensor resolution is 1440 × 900, the resolutions of the image plane and the image are also 1440 × 900. The current state is then

$X = (x_c, y_c)^T$ (7)

A steady-state space $\Omega_s$ is defined inside the state space as the centred rectangle

$\Omega_s = \{(u, v) \mid \beta u_w \le u \le (1-\beta)u_w,\; \beta v_h \le v \le (1-\beta)v_h\}$ (8)

where $u_w$ is the width of the state space, $v_h$ is its height, and $\beta$ is a proportionality coefficient, a positive number smaller than one half.
The coordinate of the current second gesture shape in the state space is computed from the probability map. From the relative position between this coordinate and the boundary of the steady-state space, the rotation direction of the robot head is determined.
Fig. 3 is a schematic diagram of the state-space and steady-state-space coordinate system of the present invention. As shown in Fig. 3, $u_0, u_1, v_0, v_1$ denote the left, right, upper and lower boundaries of the steady-state space $\Omega_s$, respectively. When the horizontal coordinate of the second gesture shape in the state space lies to the left of the left boundary of the steady-state space, i.e. the horizontal coordinate is smaller than $u_0$, the robot head is determined to rotate clockwise; when the horizontal coordinate is larger than $u_1$, the robot head is determined to rotate counter-clockwise. The opposite convention may also be used: counter-clockwise when the horizontal coordinate is smaller than $u_0$ and clockwise when it is larger than $u_1$. Likewise, when the vertical coordinate of the mark in the state space is smaller than $v_0$, the robot head is determined to rotate in the bowing (head-down) direction, and when the vertical coordinate is larger than $v_1$, in the head-up direction; or, with the opposite convention, head-up when the vertical coordinate is smaller than $v_0$ and bowing when it is larger than $v_1$.
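A sketch combining equations (1)-(8) with the boundary test: the centroid of the probability map is computed and compared against the steady-state rectangle. The -1/0/+1 direction encoding is an assumption for illustration:

```python
import cv2
import numpy as np

def head_direction(prob: np.ndarray, beta: float = 0.25):
    """Centroid of the probability map and its position relative to Omega_s.

    Returns (du, dv): -1, 0 or +1 per axis, where 0 means the centroid
    lies inside the steady-state rectangle and the head should not move.
    """
    m = cv2.moments(prob)                      # M00, M10, M01 as in eqs. (1)-(4)
    if m["m00"] == 0:
        return 0, 0                            # no skin pixels detected
    xc, yc = m["m10"] / m["m00"], m["m01"] / m["m00"]   # eqs. (5)-(6)
    h, w = prob.shape[:2]
    u0, u1 = beta * w, (1 - beta) * w          # left / right boundaries of Omega_s
    v0, v1 = beta * h, (1 - beta) * h          # upper / lower boundaries
    du = -1 if xc < u0 else (1 if xc > u1 else 0)
    dv = -1 if yc < v0 else (1 if yc > v1 else 0)
    return du, dv
```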
Step 1032: compute the horizontal rotation speed and the vertical rotation speed of the robot head from the probability map.
The site error is computed from the coordinate position of the current second gesture shape in the state space and the position of the steady-state-space boundary: the site error is the difference between the current state of the second gesture shape and the closest boundary of the steady-state space.
According to the state space and the steady-state space, the current site error e is computed as

$e = RX - c$ (9)

where R is the conversion matrix onto the boundary of the steady-state space, whose entries a, b, c and d are chosen according to which boundary of $\Omega_s$ is closest to the current state; c is a column vector representing the boundaries of the steady-state space,

$c = (u_0, u_1, v_0, v_1)^T$

with $u_0$ and $u_1$ the left and right boundaries of $\Omega_s$ and $v_0$ and $v_1$ its upper and lower boundaries; and X is the current state, $X = (x_c, y_c)^T$.
From the site error e, the input $u_t$ controlling the rotation speed of the robot head is computed:

$u_t = k e$ (14)

where k is the scaling coefficient used to scale the site error, $k = \mathrm{diag}(\eta_u, \eta_v)$, and $\eta_u$ and $\eta_v$ are two constant proportionality coefficients.
To make the rotary motion of the robot smoother, the sign function of the site error is taken, i.e.

$\dot{u} = \eta_u\,\mathrm{sgn}(e_u), \quad \dot{v} = \eta_v\,\mathrm{sgn}(e_v)$ (15)

where $e_u$ is the component of the site error in the horizontal image direction, $e_v$ its component in the vertical image direction, $\dot{u}$ the speed in the horizontal image direction, and $\dot{v}$ the speed in the vertical image direction.
The running speeds $\omega_x$ and $\omega_y$ of the robot end are then obtained through the image Jacobian:

$(\omega_x, \omega_y)^T = J_s^{-1}\,(\dot{u}, \dot{v})^T$ (16)

where $\omega_x$ is the rotation speed of the robot head about the transverse image axis, that is, the vertical rotation speed of the robot head; $\omega_y$ is the rotation speed of the robot head about the longitudinal image axis, that is, the horizontal rotation speed of the robot head; $(u_p, v_p)$ is the principal point of the image coordinate system of the image sensor; $\lambda$ is the focal length of the image sensor converted into pixels; u and v are the coordinates of the second gesture shape on the probability map; and $J_s$ is the image Jacobian, which for a point feature under pure rotation takes the standard form

$J_s = \begin{pmatrix} \frac{(u-u_p)(v-v_p)}{\lambda} & -\lambda - \frac{(u-u_p)^2}{\lambda} \\ \lambda + \frac{(v-v_p)^2}{\lambda} & -\frac{(u-u_p)(v-v_p)}{\lambda} \end{pmatrix}$
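A sketch of equations (14)-(16): the sign-based image-plane command is mapped through the inverse image Jacobian to the two joint speeds. The $J_s$ entries follow the standard point-feature rotational form given above, which is an assumed reconstruction of the patent's formula; the gains are placeholder values:

```python
import numpy as np

def head_joint_speeds(e_u: float, e_v: float, u: float, v: float,
                      u_p: float, v_p: float, lam: float,
                      eta_u: float = 0.05, eta_v: float = 0.05):
    """Map the image-plane site error to head joint angular speeds (wx, wy)."""
    x, y = u - u_p, v - v_p                 # coordinates relative to the principal point
    udot = eta_u * np.sign(e_u)             # commanded image-plane speeds, eq. (15)
    vdot = eta_v * np.sign(e_v)
    J = np.array([[x * y / lam, -lam - x * x / lam],
                  [lam + y * y / lam, -x * y / lam]])
    wx, wy = np.linalg.solve(J, np.array([udot, vdot]))   # eq. (16)
    return wx, wy   # wx: vertical (pitch) speed, wy: horizontal (yaw) speed
```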
The robot is a training robot for TCM rotation manipulation practice, used to simulate a cervical spondylosis patient and so provide doctors with a practice platform. It is a two-joint robot: its head and neck have two joints, of which the first can rotate horizontally and the second can rotate vertically, and a variable-stiffness structure simulates the human cervical spine.
Step 1033: according to the rotation direction, the vertical rotation speed $\omega_x$ and the horizontal rotation speed $\omega_y$, control the robot head to rotate horizontally in the determined direction at the horizontal rotation speed and vertically in the determined direction at the vertical rotation speed. When the rotation direction indicates clockwise rotation, the first joint is controlled to rotate clockwise at the horizontal rotation speed; when it indicates counter-clockwise rotation, the first joint is controlled to rotate counter-clockwise at the horizontal rotation speed; this controls the left-right rotation of the robot head. Similarly, when the rotation direction indicates rotation in the bowing direction, the second joint is controlled to rotate in the bowing direction at the vertical rotation speed; when it indicates rotation in the head-up direction, the second joint is controlled to rotate in the head-up direction at the vertical rotation speed; this controls the up-down rotation of the robot head.
Step 104: when the gesture shape recognition result is the third gesture shape, clear the tracking flag bit, stop the head rotation motion of the robot, and fix the robot head at the stop position.
When the robot head has rotated to the position required for the exercise, the operator changes the gesture to the third gesture shape shown in Fig. 2(c). The gesture shape recognition result is now the third gesture shape, so the tracking flag bit is cleared, the head rotation motion of the robot stops, and the robot head is fixed at the stop position. That is, after tracking the second gesture shape to the required position, the robot head stays still, fixed at the position required for the exercise.
Fig. 4 is a schematic diagram of controlling the motion of the robot head with the robot head gesture control method of the present invention. As shown in Fig. 4, the subject's hand 401 is placed in front of the image sensor 402 and makes the three gesture shapes shown in Fig. 2 according to the required rotation position. The robot head has the first joint 403 and the second joint 404.
When the subject's hand makes the first gesture shape shown in Fig. 2(a), the gesture shape recognition result is the first gesture shape; the tracking flag bit set in the robot is raised, triggering the robot to enter the tracking-ready state, in which it prepares to start tracking the subject's hand motion.
Next, when the subject's hand makes the second gesture shape shown in Fig. 2(b), the gesture shape recognition result is the second gesture shape and the tracking flag bit is set, so the robot starts to track the subject's hand motion and performs the head rotation motion. According to the rotation direction and the computed vertical rotation speed $\omega_x$ and horizontal rotation speed $\omega_y$, the first joint 403 of the robot head is controlled to rotate horizontally: clockwise at the horizontal rotation speed when the rotation direction indicates clockwise rotation, and counter-clockwise at the horizontal rotation speed when it indicates counter-clockwise rotation, thereby controlling the left-right rotation of the robot head. At the same time the second joint 404 of the robot head is controlled to rotate vertically at the vertical rotation speed: towards the bowing direction when the rotation direction indicates bowing, and towards the head-up direction when it indicates head-up, thereby controlling the up-down rotation of the robot head.
When the robot head has rotated to the position required for the exercise, the subject (the operator) changes the gesture to the third gesture shape shown in Fig. 2(c). The gesture shape recognition result is now the third gesture shape; the tracking flag bit is cleared, the head rotation motion of the robot stops, and the robot head is fixed at the stop position. In this way the robot has tracked the movement of the subject's second gesture shape to the angle and position required for the exercise, realistically simulating the actual interaction between doctor and patient and providing doctors with a platform for practising the treatment technique.
Fig. 5 is a structural diagram of the robot head gesture control system according to an embodiment of the present invention.
As shown in Fig. 5, the robot head gesture control system includes:
a gesture shape recognition result acquisition module 501 for recognizing the gesture shape of the subject's hand and obtaining the gesture shape recognition result, the gesture shape recognition result including a first gesture shape, a second gesture shape and a third gesture shape;
a first gesture shape control module 502 for, when the gesture shape recognition result is the first gesture shape, setting the tracking flag bit in the robot and triggering the robot to enter the tracking-ready state, in which it prepares to start tracking the subject's hand motion;
a second gesture shape control module 503 for, when the gesture shape recognition result is the second gesture shape and the tracking flag bit is set, controlling the robot to track the subject's hand motion and perform the head rotation motion;
a third gesture shape control module 504 for, when the gesture shape recognition result is the third gesture shape, clearing the tracking flag bit, stopping the head rotation motion of the robot, and fixing the robot head at the stop position.
The gesture shape recognition result acquisition module 501 specifically includes:
an image acquisition submodule for obtaining a colour image and a depth image of the subject's hand;
a gesture foreground image acquisition submodule for obtaining a gesture foreground image from the colour image and the depth image;
a gesture shape recognition result acquisition submodule for identifying the gesture shape of the subject from the gesture foreground image and obtaining the gesture shape recognition result.
The gesture foreground image acquisition submodule specifically includes:
a foreground region extraction unit for processing the depth image with a threshold segmentation algorithm and extracting the image region whose grey values lie in a set range as the foreground region;
a foreground colour image acquisition unit for obtaining the colour image of the foreground region from the corresponding position of the foreground region in the colour image;
a histogram building unit for building a histogram from skin colour features;
an image conversion unit for converting the colour image of the foreground region into the corresponding colour space;
a probability map acquisition unit for performing back projection in the colour space with the histogram to obtain a probability map;
a gesture foreground image acquisition unit for denoising the probability map with a morphological erosion-dilation algorithm and a threshold segmentation algorithm to obtain the gesture foreground image.
The gesture shape recognition result acquisition submodule specifically includes:
a feature vector computation unit for computing the feature vector of the gesture foreground image;
a gesture classification result acquisition unit for classifying the feature vector with a support vector machine and obtaining a gesture classification result;
a gesture shape recognition result acquisition unit for identifying the gesture shape of the subject's hand from the gesture classification result and obtaining the gesture shape recognition result.
The second gesture shape control module 503 specifically includes:
a rotation direction acquisition submodule for determining the rotation direction of the robot head from the probability map;
a rotation speed computation submodule for computing the horizontal rotation speed and the vertical rotation speed of the robot head from the probability map;
a rotary motion control submodule for controlling the robot head, according to the rotation direction, the horizontal rotation speed and the vertical rotation speed, to rotate horizontally in the determined direction at the horizontal rotation speed and vertically in the determined direction at the vertical rotation speed.
With the robot head gesture control system of the present invention, the rotation and stopping of the robot head can be controlled by the gesture shapes of the subject's hand, moving the robot head to the angle and position required for the exercise. The system realistically simulates the actual interaction between doctor and patient and provides doctors with a platform for practising the treatment technique.
Specific examples are used herein to explain the principle and implementation of the present invention; the description of the above embodiments is only intended to help understand the method and its core idea. Meanwhile, a person of ordinary skill in the art may make changes to the specific implementation and the application scope in accordance with the idea of the present invention. In summary, the content of this specification shall not be construed as a limitation of the present invention.

Claims (10)

1. A robot head gesture control method, characterized in that the method comprises:
recognizing the gesture shape of a subject's hand to obtain a gesture shape recognition result, the gesture shape recognition result including a first gesture shape, a second gesture shape and a third gesture shape;
when the gesture shape recognition result is the first gesture shape, setting a tracking flag bit in the robot and triggering the robot to enter a tracking-ready state, in which it prepares to start tracking the subject's hand motion;
when the gesture shape recognition result is the second gesture shape and the tracking flag bit is set, tracking the subject's hand motion with the robot and performing the head rotation motion;
when the gesture shape recognition result is the third gesture shape, clearing the tracking flag bit, stopping the head rotation motion of the robot, and fixing the robot head at the stop position.
2. The method according to claim 1, characterized in that recognizing the gesture shape of the subject's hand to obtain the gesture shape recognition result specifically comprises:
obtaining a colour image and a depth image of the subject's hand;
obtaining a gesture foreground image from the colour image and the depth image;
identifying the gesture shape of the subject from the gesture foreground image to obtain the gesture shape recognition result.
3. The method according to claim 2, characterized in that obtaining the gesture foreground image from the colour image and the depth image specifically comprises:
processing the depth image with a threshold segmentation algorithm and extracting the image region whose grey values lie in a set range as the foreground region;
obtaining the colour image of the foreground region from the corresponding position of the foreground region in the colour image;
building a histogram from skin colour features;
converting the colour image of the foreground region into the corresponding colour space;
performing back projection in the colour space with the histogram to obtain a probability map;
denoising the probability map with a morphological erosion-dilation algorithm and a threshold segmentation algorithm to obtain the gesture foreground image.
4. The method according to claim 3, characterized in that identifying the gesture shape of the subject from the gesture foreground image to obtain the gesture shape recognition result specifically comprises:
computing the feature vector of the gesture foreground image;
classifying the feature vector with a support vector machine to obtain a gesture classification result;
identifying the gesture shape of the subject's hand from the gesture classification result to obtain the gesture shape recognition result.
5. The method according to claim 4, characterized in that when the gesture shape recognition result is the second gesture shape and the tracking flag bit is set, tracking the subject's hand motion with the robot and performing the head rotation motion specifically comprises:
determining the rotation direction of the robot head from the probability map;
computing the horizontal rotation speed and the vertical rotation speed of the robot head from the probability map;
controlling the robot head, according to the rotation direction, the horizontal rotation speed and the vertical rotation speed, to rotate horizontally in the determined direction at the horizontal rotation speed and vertically in the determined direction at the vertical rotation speed.
6. A robot head gesture control system, characterized in that the system comprises:
a gesture shape recognition result acquisition module for recognizing the gesture shape of a subject's hand and obtaining a gesture shape recognition result, the gesture shape recognition result including a first gesture shape, a second gesture shape and a third gesture shape;
a first gesture shape control module for, when the gesture shape recognition result is the first gesture shape, setting a tracking flag bit in the robot and triggering the robot to enter a tracking-ready state, in which it prepares to start tracking the subject's hand motion;
a second gesture shape control module for, when the gesture shape recognition result is the second gesture shape and the tracking flag bit is set, controlling the robot to track the subject's hand motion and perform the head rotation motion;
a third gesture shape control module for, when the gesture shape recognition result is the third gesture shape, clearing the tracking flag bit, stopping the head rotation motion of the robot, and fixing the robot head at the stop position.
7. The system according to claim 6, characterized in that the gesture shape recognition result acquisition module specifically comprises:
an image acquisition submodule for obtaining a colour image and a depth image of the subject's hand;
a gesture foreground image acquisition submodule for obtaining a gesture foreground image from the colour image and the depth image;
a gesture shape recognition result acquisition submodule for identifying the gesture shape of the subject from the gesture foreground image and obtaining the gesture shape recognition result.
8. The system according to claim 7, characterized in that the gesture foreground image acquisition submodule specifically comprises:
a foreground region extraction unit for processing the depth image with a threshold segmentation algorithm and extracting the image region whose grey values lie in a set range as the foreground region;
a foreground colour image acquisition unit for obtaining the colour image of the foreground region from the corresponding position of the foreground region in the colour image;
a histogram building unit for building a histogram from skin colour features;
an image conversion unit for converting the colour image of the foreground region into the corresponding colour space;
a probability map acquisition unit for performing back projection in the colour space with the histogram to obtain a probability map;
a gesture foreground image acquisition unit for denoising the probability map with a morphological erosion-dilation algorithm and a threshold segmentation algorithm to obtain the gesture foreground image.
9. The system according to claim 8, characterized in that the gesture shape recognition result acquisition submodule specifically comprises:
a feature vector computation unit for computing the feature vector of the gesture foreground image;
a gesture classification result acquisition unit for classifying the feature vector with a support vector machine and obtaining a gesture classification result;
a gesture shape recognition result acquisition unit for identifying the gesture shape of the subject's hand from the gesture classification result and obtaining the gesture shape recognition result.
10. The system according to claim 9, characterized in that the second gesture shape control module specifically comprises:
a rotation direction acquisition submodule for determining the rotation direction of the robot head from the probability map;
a rotation speed computation submodule for computing the horizontal rotation speed and the vertical rotation speed of the robot head from the probability map;
a rotary motion control submodule for controlling the robot head, according to the rotation direction, the horizontal rotation speed and the vertical rotation speed, to rotate horizontally in the determined direction at the horizontal rotation speed and vertically in the determined direction at the vertical rotation speed.
CN201710439682.XA 2017-06-12 2017-06-12 Robot head gesture control method and system Expired - Fee Related CN107247466B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710439682.XA CN107247466B (en) 2017-06-12 2017-06-12 Robot head gesture control method and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710439682.XA CN107247466B (en) 2017-06-12 2017-06-12 Robot head gesture control method and system

Publications (2)

Publication Number Publication Date
CN107247466A true CN107247466A (en) 2017-10-13
CN107247466B CN107247466B (en) 2020-10-20

Family

ID=60019058

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710439682.XA Expired - Fee Related CN107247466B (en) 2017-06-12 2017-06-12 Robot head gesture control method and system

Country Status (1)

Country Link
CN (1) CN107247466B (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108717553A (en) * 2018-05-18 2018-10-30 杭州艾米机器人有限公司 A kind of robot follows the method and system of human body
CN111968723A (en) * 2020-07-30 2020-11-20 宁波羽扬科技有限公司 Kinect-based upper limb active rehabilitation training method
CN111975765A (en) * 2019-05-24 2020-11-24 京瓷办公信息系统株式会社 Electronic device, robot system, and virtual area setting method
CN115145403A (en) * 2022-09-05 2022-10-04 北京理工大学 Hand marker tracking method and system based on gestures

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120290111A1 (en) * 2011-05-09 2012-11-15 Badavne Nilay C Robot
WO2013112504A1 (en) * 2012-01-25 2013-08-01 Chrysler Group Llc Automotive vehicle power window control using capacitive switches
CN103472920A (en) * 2013-09-13 2013-12-25 通号通信信息集团有限公司 Action-recognition-based medical image control method and system
KR20140022654A (en) * 2012-08-14 2014-02-25 (주)동부로봇 Cleaning robot for having gesture recignition function, and the contol method
CN103903011A (en) * 2014-04-02 2014-07-02 重庆邮电大学 Intelligent wheelchair gesture recognition control method based on image depth information
CN104636342A (en) * 2013-11-07 2015-05-20 大连东方之星信息技术有限公司 Archive display system with gesture control function
CN105787471A (en) * 2016-03-25 2016-07-20 南京邮电大学 Gesture identification method applied to control of mobile service robot for elder and disabled
CN106200395A (en) * 2016-08-05 2016-12-07 易晓阳 A kind of multidimensional identification appliance control method
CN106502272A (en) * 2016-10-21 2017-03-15 上海未来伙伴机器人有限公司 A kind of target following control method and device
CN106529432A (en) * 2016-11-01 2017-03-22 山东大学 Hand area segmentation method deeply integrating significance detection and prior knowledge
CN107204005A (en) * 2017-06-12 2017-09-26 北京理工大学 A kind of hand mark tracking and system

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120290111A1 (en) * 2011-05-09 2012-11-15 Badavne Nilay C Robot
WO2013112504A1 (en) * 2012-01-25 2013-08-01 Chrysler Group Llc Automotive vehicle power window control using capacitive switches
KR20140022654A (en) * 2012-08-14 2014-02-25 (주)동부로봇 Cleaning robot for having gesture recignition function, and the contol method
CN103472920A (en) * 2013-09-13 2013-12-25 通号通信信息集团有限公司 Action-recognition-based medical image control method and system
CN104636342A (en) * 2013-11-07 2015-05-20 大连东方之星信息技术有限公司 Archive display system with gesture control function
CN103903011A (en) * 2014-04-02 2014-07-02 重庆邮电大学 Intelligent wheelchair gesture recognition control method based on image depth information
CN105787471A (en) * 2016-03-25 2016-07-20 南京邮电大学 Gesture identification method applied to control of mobile service robot for elder and disabled
CN106200395A (en) * 2016-08-05 2016-12-07 易晓阳 A kind of multidimensional identification appliance control method
CN106502272A (en) * 2016-10-21 2017-03-15 上海未来伙伴机器人有限公司 A kind of target following control method and device
CN106529432A (en) * 2016-11-01 2017-03-22 山东大学 Hand area segmentation method deeply integrating significance detection and prior knowledge
CN107204005A (en) * 2017-06-12 2017-09-26 北京理工大学 A kind of hand mark tracking and system

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
MEI WANG et al.: "Hand gesture recognition using valley circle feature and Hu's moments technique for robot movement control", Measurement *
吴宇: "Gesture recognition simulation in robot visual communication" (《机器人视觉交流中的手势识别仿真》), Computer Simulation (《计算机仿真》) *
周凯: "Gesture recognition based on skin colour and SVM and its application research" (《基于肤色和SVM的手势识别及其应用研究》), China Master's Theses Full-text Database, Information Science and Technology *
王艳苓: "Research on cursor control interaction technology based on gesture recognition" (《基于手势识别的光标控制交互技术研究》), China Master's Theses Full-text Database, Information Science and Technology *

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108717553A (en) * 2018-05-18 2018-10-30 杭州艾米机器人有限公司 A kind of robot follows the method and system of human body
CN108717553B (en) * 2018-05-18 2020-08-18 杭州艾米机器人有限公司 Method and system for robot to follow human body
CN111975765A (en) * 2019-05-24 2020-11-24 京瓷办公信息系统株式会社 Electronic device, robot system, and virtual area setting method
CN111975765B (en) * 2019-05-24 2023-05-23 京瓷办公信息系统株式会社 Electronic device, robot system, and virtual area setting method
CN111968723A (en) * 2020-07-30 2020-11-20 宁波羽扬科技有限公司 Kinect-based upper limb active rehabilitation training method
CN115145403A (en) * 2022-09-05 2022-10-04 北京理工大学 Hand marker tracking method and system based on gestures

Also Published As

Publication number Publication date
CN107247466B (en) 2020-10-20

Similar Documents

Publication Publication Date Title
US10417775B2 (en) Method for implementing human skeleton tracking system based on depth data
WO2022121645A1 (en) Method for generating sense of reality of virtual object in teaching scene
WO2021129064A9 (en) Posture acquisition method and device, and key point coordinate positioning model training method and device
Joo et al. Panoptic studio: A massively multiview system for social motion capture
CN103941866B (en) Three-dimensional gesture recognizing method based on Kinect depth image
JP5837508B2 (en) Posture state estimation apparatus and posture state estimation method
CN107247466A (en) Robot head gesture control method and system
Prisacariu et al. 3D hand tracking for human computer interaction
CN101853399B (en) Method for realizing blind road and pedestrian crossing real-time detection by utilizing computer vision technology
US20070098250A1 (en) Man-machine interface based on 3-D positions of the human body
CN108961400B (en) A kind of historical relic is intelligent to assist bootstrap technique and guidance system
Davis et al. Determining 3-d hand motion
CN110561399B (en) Auxiliary shooting device for dyskinesia condition analysis, control method and device
CN108363973A (en) A kind of unconfined 3D expressions moving method
CN108154551A (en) The method and system of real-time dynamic reconstruction three-dimensional (3 D) manikin
CN107102736A (en) The method for realizing augmented reality
KR20220024494A (en) Method and system for human monocular depth estimation
CN106022211B (en) A method of utilizing gesture control multimedia equipment
CN103761011B (en) A kind of method of virtual touch screen, system and the equipment of calculating
Zou et al. Automatic reconstruction of 3D human motion pose from uncalibrated monocular video sequences based on markerless human motion tracking
JPH03252780A (en) Feature quantity extracting method
JPWO2019150431A1 (en) Information processing device
Kondori et al. Direct hand pose estimation for immersive gestural interaction
CN107204005B (en) Hand marker tracking method and system
CN115761901A (en) Horse riding posture detection and evaluation method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20180212

Address after: 528400 Guangdong province Zhongshan Torch Development Zone, Cheung Hing Road 6 No. 11 South Hebei trade building 1110 card

Applicant after: ZHONGSHAN CHANGFENG INTELLIGENT AUTOMATION EQUIPMENT RESEARCH INSTITUTE Co.,Ltd.

Applicant after: RESEARCH INSTITUTE OF BIT IN ZHONGSHAN

Address before: 528400 Guangdong province Zhongshan Torch Development Zone, Cheung Hing Road 6 No. 11 South Hebei trade building 1110 card

Applicant before: ZHONGSHAN CHANGFENG INTELLIGENT AUTOMATION EQUIPMENT RESEARCH INSTITUTE Co.,Ltd.

GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20201020