CN105867630A - Robot gesture recognition method and device and robot system - Google Patents
- Publication number
- CN105867630A (publication); CN201610252110.6A (application)
- Authority
- CN
- China
- Prior art keywords
- gesture
- information
- robot
- module
- image information
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/20—Movements or behaviour, e.g. gesture recognition
- G06V40/28—Recognition of hand or arm movements, e.g. recognition of deaf sign language
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/01—Indexing scheme relating to G06F3/01
- G06F2203/012—Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment
Abstract
The invention provides a robot system provided with a gesture recognition device. The gesture recognition device comprises at least a database storing information, an image acquisition module, a hand recognition module, a gesture command recognition module and a central control module. The image acquisition module acquires image information containing gesture information and sends it to the hand recognition module; the hand recognition module receives the image information sent by the image acquisition module and recognizes the hand region in it; the gesture command recognition module tracks the hand region and recognizes the gesture command in it according to the information stored in the database; and the central control module sends control information to the robot according to the gesture command transmitted by the gesture command recognition module. The robot system further comprises a central controller and an action executing mechanism: the central controller receives the gesture command information sent by the gesture recognition device and generates an action control command from it, and the action executing mechanism acts on the action control command sent by the central controller, performing the action corresponding to the gesture command.
Description
Technical field
The present invention relates to robotics, and in particular to a gesture recognition method and device for a robot.
Background art
With the rapid development of science and technology, robots have been widely applied in production across many fields in order to reduce costs and improve work efficiency. The robots currently deployed in production and daily life fall broadly into industrial robots and specialized robots. Industrial robots are the multi-joint manipulators and multi-degree-of-freedom robots used in manufacturing; specialized robots are the various sophisticated robots serving people outside manufacturing, including service robots, underwater robots, entertainment robots, military robots, agricultural robots and so on. The domestic robots now in wide use have largely realized functions such as children's education, travel control and everyday utilities, and robot control is mainly implemented through infrared remote control, Bluetooth communication or direct human operation, so moving the robot requires a remote controller.
In terms of human-machine interaction, the prevailing form is voice interaction: the robot receives the user's voice command and reacts accordingly, thereby realizing the interactive function. This mode of interaction, however, depends on the user's voice input and places certain requirements on that input. Differences in language and in users' pronunciation and intonation reduce interaction efficiency, and when input must be repeated the interaction becomes unfriendly, single-purpose and cumbersome. For humans, on the other hand, gestures have a certain universality regardless of language, so the efficiency of human-machine interaction could be greatly improved if human gestures could be recognized.
Summary of the invention
An object of the present invention is to provide a gesture recognition method and device for a robot, so that the robot can recognize human gestures during human-machine interaction and react accordingly to gesture commands, improving the efficiency of human-machine interaction.
The present invention solves the technical problem by adopting the following technical solutions.
The present invention provides a gesture recognition method for a robot, comprising at least the following steps:
A. acquiring image information containing gesture information;
B. performing color conversion on the acquired image information and segmenting out the hand region;
C. tracking and analyzing the hand region to determine the gesture feature vector of the gesture information;
D. after a valid gesture feature vector is determined, sending a command to the robot;
E. the robot executing the gesture command.
Step B comprises at least:
B11. segmenting the skin regions in the image information according to a set threshold;
B12. locating and segmenting the palm region using a configured skin color model;
B13. identifying the coordinates of the palm center and of each fingertip within the segmented palm region.
Step C comprises at least the following steps:
C11. tracking with an optical flow field to obtain the discrete vectors of the gesture features;
C12. computing the likelihood of each gesture with trained hidden Markov models;
C13. determining the gesture feature vector from the features obtained in steps C11 and C12.
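As described later in the embodiments, the optical-flow tracking of step C11 preprocesses each pair of frames with a 5×5-window median filter. A minimal pure-Python sketch of that preprocessing step, assuming grayscale frames stored as nested lists (the function name and border handling are illustrative, not specified by the patent):

```python
def median_filter_5x5(img):
    """Apply a 5x5 median filter to a grayscale image (list of rows).

    Border pixels are handled by clamping the window to the image,
    a common simplification of the preprocessing in step C.
    """
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            window = []
            for dy in range(-2, 3):
                for dx in range(-2, 3):
                    yy = min(max(y + dy, 0), h - 1)
                    xx = min(max(x + dx, 0), w - 1)
                    window.append(img[yy][xx])
            window.sort()
            out[y][x] = window[len(window) // 2]  # median of 25 samples
    return out

# A single bright outlier in a flat region is suppressed by the filter.
frame = [[10] * 8 for _ in range(8)]
frame[4][4] = 255  # salt noise
smoothed = median_filter_5x5(frame)
```

In a real pipeline this filtering would run on both frames of each adjacent pair before the flow field is computed.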
The present invention also proposes a gesture recognition device for a robot, used to enable the robot to recognize a user's gesture commands. The device comprises at least an image acquisition module, a hand recognition module, a gesture command recognition module, a central control module that exchanges signals with the other modules, and a database storing information:
the image acquisition module acquires image information containing gesture information and sends it to the hand recognition module;
the hand recognition module receives the image information transmitted by the image acquisition module and identifies the hand region in it;
the gesture command recognition module tracks the hand region according to the information stored in the database and identifies the gesture command in the gesture region;
the central control module sends control information to the robot according to the gesture command transmitted by the gesture command recognition module.
The image acquisition module is a camera.
The hand recognition module comprises at least:
a color conversion unit, which performs color conversion on the image information according to the information in the database;
a hand region tracking unit, which determines the hand region from the converted image information transmitted by the color conversion unit and tracks that region;
a gesture command confirmation unit, which determines the gesture command in the image information from the hand region provided by the hand region tracking unit and the information stored in the database.
The gesture recognition device is arranged at the information receiving end of the robot, and the gesture command information is received via socket communication.
A robot system equipped with the above gesture recognition device, the robot comprising at least:
a central controller, which receives the gesture command information sent by the gesture recognition device and generates an action control command from it;
an action executing mechanism, which acts according to the action control command sent by the central controller, performing the action corresponding to the gesture command.
The present invention has the following advantages:
the solution of the present invention enables the robot to recognize human gesture actions and to respond to them accordingly, greatly enhancing and extending the robot's interactive capability;
according to dynamic gestures such as up, down, left and right, the present invention makes the robot move forward, backward, left or right, or makes the robot's head turn up, down, left and right, dispensing with the remote controller and achieving human-machine separation; gestures control the robot, making human-machine interaction more flexible.
Brief description of the drawings
Fig. 1 is a flowchart of the gesture recognition method of the present invention;
Fig. 2 is a structural block diagram of the gesture recognition device of the present invention;
Fig. 3 is a flowchart of the robot system of the present invention.
Detailed description of the embodiments
The technical solutions of the present invention are further elaborated below in conjunction with the embodiments and the accompanying drawings.
The present invention provides a gesture recognition method for a robot. In one embodiment, referring to Fig. 1, the recognition method comprises at least the following steps: A. acquiring image information containing gesture information; B. performing color conversion on the acquired image information and segmenting out the hand region; C. tracking and analyzing the hand region to determine the gesture feature vector of the gesture information; D. after a valid gesture feature vector is determined, sending a command to the robot; E. the robot executing the gesture command.
Step B comprises at least: B11. segmenting the skin regions in the image information according to a set threshold; B12. locating and segmenting the palm region using a configured skin color model; B13. identifying the coordinates of the palm center and of each fingertip within the segmented palm region. Further, in this embodiment, step B13 may fit the contour of the human hand from the palm region, then build a hand coordinate model with graph parameters and coordinate-point thresholds to determine the fingertips and the palm center.
Step A is the acquisition of image information. In this embodiment, image information containing gesture information is collected from the user's actions; that is, the collected images mainly cover the user's upper body. Once this information is collected, step B, the gesture recognition algorithm, can proceed. The YCrCb color space separates chrominance from luminance, clusters skin color well, and is little affected by luminance changes, which makes it suitable for distinguishing skin regions, so the image is converted from RGB into YCrCb. According to the prior art, human skin color is distributed in the YCrCb chrominance space approximately as 77 ≤ Cb ≤ 127 and 133 ≤ Cr ≤ 173, and in this embodiment this range is chosen as the threshold for skin color segmentation. With these thresholds the skin is segmented out, the image is further binarized to locate the palm and identify the positions of the palm center and fingertips, and the palm is separated from the original image: the skin color model divides out the hand region.
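The threshold test above (77 ≤ Cb ≤ 127, 133 ≤ Cr ≤ 173) can be sketched in a few lines. The RGB-to-YCrCb conversion below uses the widely used BT.601 full-range formulas; the patent names only the color space, not the exact transform, so the constants are an assumption:

```python
def rgb_to_ycrcb(r, g, b):
    """Convert an 8-bit RGB pixel to (Y, Cr, Cb) with the common
    BT.601 full-range transform (an assumption; the patent does not
    specify the conversion formula)."""
    y = 0.299 * r + 0.587 * g + 0.114 * b
    cr = (r - y) * 0.713 + 128
    cb = (b - y) * 0.564 + 128
    return y, cr, cb

def is_skin(r, g, b):
    """Skin test with the chrominance thresholds given in the patent:
    77 <= Cb <= 127 and 133 <= Cr <= 173. Y is ignored, which is the
    point of using a luminance-separated space."""
    _, cr, cb = rgb_to_ycrcb(r, g, b)
    return 77 <= cb <= 127 and 133 <= cr <= 173

# A typical skin tone passes the test; saturated blue does not.
skin_pixel = is_skin(200, 150, 120)
blue_pixel = is_skin(0, 0, 255)
```

Applying `is_skin` per pixel yields exactly the binary mask that the embodiment then uses to locate the palm.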
After the hand region is determined, step C is performed. Step C comprises at least the following steps: C11. tracking with an optical flow field to obtain the discrete vectors of the gesture features; C12. computing the likelihood of each gesture with trained hidden Markov models; C13. determining the gesture feature vector from the features obtained in steps C11 and C12. The optical flow field is computed with the optical flow equation for non-conserved gray levels: two adjacent hand images are preprocessed with a 5×5-window median filter to obtain the image sequence, which is then tracked. The hidden Markov model (HMM) in this embodiment is a doubly stochastic process. In the present invention, HMM-based dynamic gesture recognition generally comprises three steps: (1) selecting the model for each gesture; (2) designing the discrimination method: an HMM is trained on the gestures of each form using the Baum-Welch algorithm, whose parameters are updated by continuous iteration, finally yielding seven HMMs, one per gesture motion type, representing the up, down, left, right, zoom-in, zoom-out and rotate postures; (3) training the HMMs with the sequence of variance feature vectors computed for a gesture, finally completing gesture recognition. In summary, this embodiment locates and tracks gestures by optical flow tracking and obtains the gesture feature vectors, thereby completing gesture analysis. Multiple cases of the gesture feature vector can be trained in advance and associated with robot actions; once a gesture command is collected, the command is sent to the robot, which acts according to the preset association.
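The likelihood computation of step C12 can be illustrated with the standard forward algorithm for discrete HMMs: each gesture HMM scores the observed feature sequence, and the gesture with the highest likelihood wins (the step C13 decision). The toy two-state models and two-symbol alphabet below are invented for illustration; real parameters would come from Baum-Welch training as the patent describes:

```python
def forward_likelihood(obs, pi, A, B):
    """P(obs | model) by the forward algorithm for a discrete HMM.
    pi: initial state probs, A: transition matrix, B: emission matrix."""
    alpha = [pi[s] * B[s][obs[0]] for s in range(len(pi))]
    for o in obs[1:]:
        alpha = [sum(alpha[s] * A[s][t] for s in range(len(pi))) * B[t][o]
                 for t in range(len(pi))]
    return sum(alpha)

def classify(obs, models):
    """Pick the gesture whose HMM gives the observation sequence the
    highest likelihood."""
    return max(models,
               key=lambda name: forward_likelihood(obs, *models[name]))

# Two toy gesture models over a 2-symbol alphabet: "up" emits mostly
# symbol 0, "down" emits mostly symbol 1 (illustrative values only).
models = {
    "up":   ([0.5, 0.5], [[0.9, 0.1], [0.1, 0.9]], [[0.9, 0.1], [0.8, 0.2]]),
    "down": ([0.5, 0.5], [[0.9, 0.1], [0.1, 0.9]], [[0.1, 0.9], [0.2, 0.8]]),
}
best = classify([0, 0, 1, 0], models)
```

In the patent's setting there would be seven such models (up, down, left, right, zoom-in, zoom-out, rotate) scoring the optical-flow feature sequence.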
In the method for the invention, it is possible to use Socket communication realizes control system and robot behavior system
Between binding and data transmit.
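The socket binding can be sketched with the standard library. The one-line text framing (`GESTURE:<name>\n`) is an invented convention for illustration; the patent specifies only that socket communication is used:

```python
import socket

def encode_command(gesture):
    """Frame a recognized gesture as one newline-terminated line
    (hypothetical wire format, not specified by the patent)."""
    return ("GESTURE:%s\n" % gesture).encode("ascii")

def decode_command(data):
    """Parse a framed command back into the gesture name."""
    line = data.decode("ascii").strip()
    if not line.startswith("GESTURE:"):
        raise ValueError("malformed command: %r" % line)
    return line[len("GESTURE:"):]

# Demonstrate the round trip over a local socket pair, standing in
# for the recognizer-to-robot-controller link.
a, b = socket.socketpair()
a.sendall(encode_command("up"))
received = decode_command(b.recv(64))
a.close(); b.close()
```

In deployment the recognizer side would connect to the robot's information receiving end with an ordinary TCP client socket instead of a local pair.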
Corresponding to the gesture recognition method of the present invention, the present invention also provides a gesture recognition device for a robot, used to enable the robot to recognize the user's gesture commands. Referring to Fig. 2, the device comprises at least an image acquisition module, a hand recognition module, a gesture command recognition module, a central control module exchanging signals with the other modules, and a database storing information. The image acquisition module acquires image information containing gesture information and sends it to the hand recognition module; the hand recognition module receives the image information transmitted by the image acquisition module and identifies the hand region in it; the gesture command recognition module tracks the hand region according to the information stored in the database and identifies the gesture command in the gesture region; the central control module sends control information to the robot according to the gesture command transmitted by the gesture command recognition module.
In an embodiment of the present invention, the image acquisition module is a camera. The hand recognition module comprises at least: a color conversion unit, which performs color conversion on the image information according to the information in the database; a hand region tracking unit, which determines the hand region from the converted image information transmitted by the color conversion unit and tracks that region; and a gesture command confirmation unit, which determines the gesture command in the image information from the hand region provided by the hand region tracking unit and the information stored in the database. The hand region tracking unit of the present invention locates and tracks gestures based on optical flow tracking and hidden Markov models and obtains the gesture feature vectors, thereby completing gesture analysis; the gesture command confirmation unit then confirms the gesture command.
In an embodiment of the present invention, the gesture recognition device is arranged at the information receiving end of the robot, and the gesture command information is received via socket communication.
An embodiment of the present invention also provides a robot system equipped with the above gesture recognition device, the robot comprising at least: a central controller, which receives the gesture command information sent by the gesture recognition device and generates an action control command from it; and an action executing mechanism, which acts according to the action control command sent by the central controller, performing the action corresponding to the gesture command.
The central controller of the robot mainly comprises walking drive control and head steering drive control, and the action executing mechanism comprises at least walking wheels and a rotatable head mechanism. In this embodiment, the walking wheels may comprise a rear wheel, a left wheel and a right wheel, each driven by its own servomotor; the head mechanism may be controlled by one servomotor that rotates to switch the head left and right, and the nodding may be controlled by another servomotor that rotates to realize the up-and-down motion of the head.
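The mapping from recognized gestures to the drive actions just described (three wheel servos, two head servos) can be sketched as a simple dispatch table. Servo names and speed values are illustrative; the patent does not define a concrete command set:

```python
# Hypothetical actuator command: (servo name, signed speed step).
GESTURE_ACTIONS = {
    "up":    [("rear_wheel", +1), ("left_wheel", +1), ("right_wheel", +1)],
    "down":  [("rear_wheel", -1), ("left_wheel", -1), ("right_wheel", -1)],
    "left":  [("left_wheel", -1), ("right_wheel", +1)],  # turn left
    "right": [("left_wheel", +1), ("right_wheel", -1)],  # turn right
    "nod":   [("head_pitch_servo", +1)],
    "shake": [("head_yaw_servo", +1)],
}

def dispatch(gesture):
    """Translate a recognized gesture into the list of servo commands
    the central controller would send to the action mechanism."""
    actions = GESTURE_ACTIONS.get(gesture)
    if actions is None:
        raise ValueError("unknown gesture: %r" % gesture)
    return actions

forward_cmds = dispatch("up")
```

This is the "preset association" between gesture feature vectors and robot actions that the embodiment mentions, reduced to a lookup.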
Operation of the gesture-sensing robot of the present invention: when a person makes a dynamic gesture in front of the camera, the gesture recognition system first segments out the hand region, then uses optical flow tracking to obtain the discrete vectors of the gesture features, and then uses the trained HMM algorithm to compute the likelihood of each gesture and decide the most probable one. If the recognition result is valid, a control command is sent to the robot controller via socket communication; the controller then sends the corresponding motor signals to the robot behavior drive, and the robot performs the corresponding action. If the gesture recognition result is invalid, the camera is directed to capture the gesture action again.
Referring to Fig. 3, the concrete implementation flow of the robot system of the present invention is as follows. The command issuer, that is, the user of the robot, makes a gesture command; the camera serving as the image acquisition module captures images of the gesture action; the color conversion unit of the hand recognition module performs color conversion on the image information according to the information in the database; the hand region tracking unit of the hand recognition module then determines the hand region from the converted image information transmitted by the color conversion unit and tracks that region using the gesture feature vectors computed from the optical flow field and the likelihoods obtained from the HMMs (the concrete computation is as described in the method above and is not repeated here); the gesture command confirmation unit then determines the gesture command in the image information from the hand region provided by the hand region tracking unit and the information stored in the database. Referring to Fig. 3, if the obtained gesture feature vector is invalid, control returns to the central control module, which directs the camera to reacquire the gesture action; if the obtained gesture feature vector is valid, the gesture command is sent via socket communication. Under walking drive control and head steering control, the robot then starts the respective servomotors and thereby moves forward, backward, left or right, or performs actions such as nodding and shaking its head.
The present invention captures gesture images with a camera, performs gesture segmentation in the YCrCb color space, extracts the palm region and identifies the coordinates of the palm center and fingertips, tracks and analyzes the gesture region in real time with the optical flow field and the hidden Markov model (HMM) to derive the corresponding gesture signal, and sends the derived gesture signal to the robot via socket communication, thereby controlling the robot's motion.
The numbering of the above embodiments is for ease of description only and does not indicate their relative merit.
Finally, it should be noted that the above embodiments only illustrate, and do not limit, the technical solutions of the present invention. Although the present invention has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some of their technical features may be replaced by equivalents, without such modifications or replacements departing in essence from the spirit and scope of the technical solutions of the embodiments of the present invention.
Claims (8)
1. A gesture recognition method for a robot, characterized in that it comprises at least the following steps:
A. acquiring image information containing gesture information;
B. performing color conversion on the acquired image information and segmenting out the hand region;
C. tracking and analyzing the hand region to determine the gesture feature vector of the gesture information;
D. after a valid gesture feature vector is determined, sending a command to the robot;
E. the robot executing the gesture command.
2. The gesture recognition method according to claim 1, characterized in that step B comprises at least:
B11. segmenting the skin regions in the image information according to a set threshold;
B12. locating and segmenting the palm region using a configured skin color model;
B13. identifying the coordinates of the palm center and of each fingertip within the segmented palm region.
3. The gesture recognition method according to claim 1, characterized in that step C comprises at least the following steps:
C11. tracking with an optical flow field to obtain the discrete vectors of the gesture features;
C12. computing the likelihood of each gesture with trained hidden Markov models;
C13. determining the gesture feature vector from the features obtained in steps C11 and C12.
4. A gesture recognition device for a robot, used to enable the robot to recognize a user's gesture commands, characterized in that it comprises at least an image acquisition module, a hand recognition module, a gesture command recognition module, a central control module exchanging signals with the other modules, and a database storing information, wherein:
the image acquisition module acquires image information containing gesture information and sends it to the hand recognition module;
the hand recognition module receives the image information transmitted by the image acquisition module and identifies the hand region in it;
the gesture command recognition module tracks the hand region according to the information stored in the database and identifies the gesture command in the gesture region;
the central control module sends control information to the robot according to the gesture command transmitted by the gesture command recognition module.
5. The gesture recognition device according to claim 4, characterized in that the image acquisition module is a camera.
6. The gesture recognition device according to claim 4, characterized in that the hand recognition module comprises at least:
a color conversion unit, which performs color conversion on the image information according to the information in the database;
a hand region tracking unit, which determines the hand region from the converted image information transmitted by the color conversion unit and tracks that region;
a gesture command confirmation unit, which determines the gesture command in the image information from the hand region provided by the hand region tracking unit and the information stored in the database.
7. The gesture recognition device according to claim 4, characterized in that the gesture recognition device is arranged at the information receiving end of the robot and receives the gesture command information via socket communication.
8. A robot system equipped with the gesture recognition device according to claim 4, characterized in that the robot comprises at least:
a central controller, which receives the gesture command information sent by the gesture recognition device and generates an action control command from it;
an action executing mechanism, which acts according to the action control command sent by the central controller, performing the action corresponding to the gesture command.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201610252110.6A CN105867630A (en) | 2016-04-21 | 2016-04-21 | Robot gesture recognition method and device and robot system |
Publications (1)
Publication Number | Publication Date |
---|---|
CN105867630A true CN105867630A (en) | 2016-08-17 |
Family
ID=56633687
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201610252110.6A Pending CN105867630A (en) | 2016-04-21 | 2016-04-21 | Robot gesture recognition method and device and robot system |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN105867630A (en) |
Cited By (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106363638A (en) * | 2016-10-19 | 2017-02-01 | 苏州大成电子科技有限公司 | Gesture command robot |
CN106502390A (en) * | 2016-10-08 | 2017-03-15 | 华南理工大学 | A kind of visual human's interactive system and method based on dynamic 3D Handwritten Digit Recognitions |
CN106681508A (en) * | 2016-12-29 | 2017-05-17 | 杭州电子科技大学 | System for remote robot control based on gestures and implementation method for same |
CN106791746A (en) * | 2017-01-23 | 2017-05-31 | 合肥优智领英智能科技有限公司 | A kind of Novel movable interactive mode ground projection structure and projecting method |
CN107688779A (en) * | 2017-08-18 | 2018-02-13 | 北京航空航天大学 | A kind of robot gesture interaction method and apparatus based on RGBD camera depth images |
CN108549490A (en) * | 2018-05-03 | 2018-09-18 | 林潼 | A kind of gesture identification interactive approach based on Leap Motion equipment |
CN108568820A (en) * | 2018-04-27 | 2018-09-25 | 深圳市商汤科技有限公司 | Robot control method and device, electronic equipment and storage medium |
WO2019029266A1 (en) * | 2017-08-07 | 2019-02-14 | 深圳市科迈爱康科技有限公司 | Body movement recognition method, robot and storage medium |
CN109409277A (en) * | 2018-10-18 | 2019-03-01 | 北京旷视科技有限公司 | Gesture identification method, device, intelligent terminal and computer storage medium |
CN110228065A (en) * | 2019-04-29 | 2019-09-13 | 北京云迹科技有限公司 | Motion planning and robot control method and device |
CN110434853A (en) * | 2019-08-05 | 2019-11-12 | 北京云迹科技有限公司 | A kind of robot control method, device and storage medium |
CN110925945A (en) * | 2019-11-27 | 2020-03-27 | 广东美的制冷设备有限公司 | Air conditioner robot control method and device based on gesture recognition |
CN112363538A (en) * | 2020-11-09 | 2021-02-12 | 哈尔滨工程大学 | AUV (autonomous underwater vehicle) area tracking control method under incomplete speed information |
CN113303708A (en) * | 2020-02-27 | 2021-08-27 | 佛山市云米电器科技有限公司 | Control method for maintenance device, and storage medium |
US20210331314A1 (en) * | 2019-03-08 | 2021-10-28 | Lg Electronics Inc. | Artificial intelligence cleaner |
CN114724243A (en) * | 2022-03-29 | 2022-07-08 | 赵新博 | Bionic action recognition system based on artificial intelligence |
CN117576787A (en) * | 2024-01-16 | 2024-02-20 | 北京大学深圳研究生院 | Method, device and equipment for handing over based on active tracking and self-adaptive gesture recognition |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101901350A (en) * | 2010-07-23 | 2010-12-01 | 北京航空航天大学 | Characteristic vector-based static gesture recognition method |
US20110001813A1 (en) * | 2009-07-03 | 2011-01-06 | Electronics And Telecommunications Research Institute | Gesture recognition apparatus, robot system including the same and gesture recognition method using the same |
CN104777775A (en) * | 2015-03-25 | 2015-07-15 | 北京工业大学 | Two-wheeled self-balancing robot control method based on Kinect device |
Non-Patent Citations (1)
Title |
---|
CHEN Yimin, ZHANG Yunhua: "Research on Human-Robot Interaction Technology Based on Gesture Recognition" (基于手势识别的机器人人机交互技术研究), 《机器人》 (Robot) *
Cited By (22)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106502390A (en) * | 2016-10-08 | 2017-03-15 | 华南理工大学 | A kind of visual human's interactive system and method based on dynamic 3D Handwritten Digit Recognitions |
CN106502390B (en) * | 2016-10-08 | 2019-05-14 | 华南理工大学 | A kind of visual human's interactive system and method based on dynamic 3D Handwritten Digit Recognition |
CN106363638A (en) * | 2016-10-19 | 2017-02-01 | 苏州大成电子科技有限公司 | Gesture command robot |
CN106681508A (en) * | 2016-12-29 | 2017-05-17 | 杭州电子科技大学 | System for remote robot control based on gestures and implementation method for same |
CN106791746A (en) * | 2017-01-23 | 2017-05-31 | 合肥优智领英智能科技有限公司 | Mobile interactive ground projection structure and projection method |
CN106791746B (en) * | 2017-01-23 | 2019-11-08 | 合肥虹慧达科技有限公司 | Mobile interactive ground projection structure and projection method |
WO2019029266A1 (en) * | 2017-08-07 | 2019-02-14 | 深圳市科迈爱康科技有限公司 | Body movement recognition method, robot and storage medium |
CN107688779A (en) * | 2017-08-18 | 2018-02-13 | 北京航空航天大学 | Robot gesture interaction method and apparatus based on RGBD camera depth images |
CN108568820A (en) * | 2018-04-27 | 2018-09-25 | 深圳市商汤科技有限公司 | Robot control method and device, electronic equipment and storage medium |
CN108549490A (en) * | 2018-05-03 | 2018-09-18 | 林潼 | Gesture recognition interaction method based on Leap Motion device |
CN109409277A (en) * | 2018-10-18 | 2019-03-01 | 北京旷视科技有限公司 | Gesture identification method, device, intelligent terminal and computer storage medium |
US20210331314A1 (en) * | 2019-03-08 | 2021-10-28 | Lg Electronics Inc. | Artificial intelligence cleaner |
CN110228065A (en) * | 2019-04-29 | 2019-09-13 | 北京云迹科技有限公司 | Motion planning and robot control method and device |
CN110434853B (en) * | 2019-08-05 | 2021-05-14 | 北京云迹科技有限公司 | Robot control method, device and storage medium |
CN110434853A (en) * | 2019-08-05 | 2019-11-12 | 北京云迹科技有限公司 | Robot control method, device and storage medium |
CN110925945A (en) * | 2019-11-27 | 2020-03-27 | 广东美的制冷设备有限公司 | Air conditioner robot control method and device based on gesture recognition |
CN113303708A (en) * | 2020-02-27 | 2021-08-27 | 佛山市云米电器科技有限公司 | Control method for maintenance device, and storage medium |
CN112363538A (en) * | 2020-11-09 | 2021-02-12 | 哈尔滨工程大学 | AUV (autonomous underwater vehicle) area tracking control method under incomplete speed information |
CN112363538B (en) * | 2020-11-09 | 2022-09-02 | 哈尔滨工程大学 | AUV (autonomous underwater vehicle) area tracking control method under incomplete speed information |
CN114724243A (en) * | 2022-03-29 | 2022-07-08 | 赵新博 | Bionic action recognition system based on artificial intelligence |
CN117576787A (en) * | 2024-01-16 | 2024-02-20 | 北京大学深圳研究生院 | Handover method, device and equipment based on active tracking and adaptive gesture recognition |
CN117576787B (en) * | 2024-01-16 | 2024-04-16 | 北京大学深圳研究生院 | Handover method, device and equipment based on active tracking and adaptive gesture recognition |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN105867630A (en) | Robot gesture recognition method and device and robot system | |
CN107139179B (en) | Intelligent service robot and working method | |
CN100360204C (en) | Control system of intelligent performance robot based on multi-processor cooperation | |
CN104589356B (en) | Dexterous hand teleoperation control method based on Kinect human hand motion capture | |
CN104407694B (en) | Human-computer interaction method and device combining face and gesture control | |
CN109044651B (en) | Intelligent wheelchair control method and system based on natural gesture instruction in unknown environment | |
CN105787471A (en) | Gesture recognition method applied to control of a mobile service robot for the elderly and disabled | |
Xu et al. | Real-time dynamic gesture recognition system based on depth perception for robot navigation | |
CN102999152A (en) | Method and system for gesture recognition | |
CN104777775A (en) | Two-wheeled self-balancing robot control method based on Kinect device | |
CN103353935A (en) | 3D dynamic gesture identification method for intelligent home system | |
CN110135249A (en) | Human behavior recognition method based on temporal attention mechanism and LSTM | |
CN107336243A (en) | Robot control system and control method based on intelligent mobile terminal | |
CN105500370B (en) | Robot offline teaching programming system and method based on motion-sensing technology | |
CN109199240B (en) | Gesture control-based sweeping robot control method and system | |
CN105159452B (en) | Control method and system based on human face modeling | |
CN103186230B (en) | Man-machine interaction method based on colour recognition with tracking | |
CN103605466A (en) | Terminal control method based on facial recognition | |
CN110807391A (en) | Human body posture instruction identification method for human-unmanned aerial vehicle interaction based on vision | |
Hueser et al. | Learning of demonstrated grasping skills by stereoscopic tracking of human head configuration | |
CN111444488A (en) | Identity authentication method based on dynamic gesture | |
CN112631173A (en) | Brain-controlled unmanned platform cooperative control system | |
CN110569775A (en) | Method, system, storage medium and electronic device for recognizing human body posture | |
Francis et al. | Significance of hand gesture recognition systems in vehicular automation-a survey | |
CN106502416B (en) | Driving simulation system with intelligent two-handed gesture recognition and its control method | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| C06 | Publication | |
| PB01 | Publication | |
| C10 | Entry into substantive examination | |
| SE01 | Entry into force of request for substantive examination | |
| RJ01 | Rejection of invention patent application after publication | Application publication date: 2016-08-17 |