CN111300402A - Robot control method based on gesture recognition - Google Patents

Robot control method based on gesture recognition

Info

Publication number
CN111300402A
Authority
CN
China
Prior art keywords: controller, gesture, determining, robot, palm position
Prior art date: 2019-11-26
Legal status: Pending
Application number
CN201911172292.6A
Other languages
Chinese (zh)
Inventor
胡佳文 (Hu Jiawen)
Current Assignee
Everest Shenzhen Technology Co., Ltd.
Original Assignee
Everest Shenzhen Technology Co., Ltd.
Priority date: 2019-11-26
Filing date: 2019-11-26
Publication date: 2020-06-19
Application filed by Everest Shenzhen Technology Co., Ltd.
Priority to CN201911172292.6A
Publication of CN111300402A


Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 9/00: Programme-controlled manipulators
    • B25J 9/16: Programme controls
    • B25J 9/1602: Programme controls characterised by the control system, structure, architecture
    • B25J 9/1694: Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J 9/1697: Vision controlled systems

Abstract

The invention discloses a robot control method and a robot control system based on gesture recognition. The method comprises the following steps: acquiring a depth image of the controller (the person controlling the robot); determining the controller's gesture based on the depth image and marking the start of a control command; determining the controller's real-time palm position based on the depth image; determining the controller's gesture movement trend and the corresponding control command based on the palm position; and controlling the robot to move based on the control command. The method avoids the problems that voice interaction is easily disturbed by noise and that touch-screen interaction is limited to close range, and it is simple to implement.

Description

Robot control method based on gesture recognition
Technical Field
The present application relates to the field of robot control, and more particularly, to a robot control method based on gesture recognition.
Background
Robots are increasingly used in a wide range of production and living environments. Traditionally they are controlled either by voice interaction through a speech recognition device on the robot or by touch-screen interaction at the robot's near end. However, when the robot's environment is particularly noisy, voice control usually performs poorly, and in some special environments the operator cannot reach the robot's near end to control it.
These drawbacks can be overcome by controlling the robot through recognition of the controller's hand.
However, recognizing the controller's gestures with a control glove is costly and cumbersome to use.
It is therefore desirable to provide an improved robot control method based on hand recognition.
Disclosure of Invention
The invention aims to provide a robot control method and a robot control system based on gesture recognition that are simple to implement and that avoid the problems of voice interaction being easily disturbed by noise and of touch-screen interaction being limited to close range.
According to an aspect of the present invention, there is provided a robot control method based on gesture recognition, including: acquiring a depth image of the controller; determining the controller's gesture based on the depth image and marking the start of a control command; determining the controller's real-time palm position based on the depth image; determining the controller's gesture movement trend and the control command based on the palm position; and controlling the robot to move based on the control command.
In the robot control method based on gesture recognition, the depth image is acquired with an Orbbec Astra Pro depth camera.
In the robot control method based on gesture recognition, the controller's gesture is determined with the Orbbec gesture recognition SDK module.
In the robot control method based on gesture recognition, the controller's real-time palm position is determined with the Orbbec gesture recognition SDK module.
In the robot control method based on gesture recognition, the controller's gesture movement trend is determined by comparing the three-dimensional coordinates of the palm position in a second image frame with those in a first image frame, where the timestamps of the first and second image frames are adjacent and the timestamp of the second image frame is later than that of the first.
According to another aspect of the present invention, there is provided a robot control system, including: a depth image acquisition module, a controller gesture and real-time palm position determination module, a controller control command start and gesture movement trend determination module, and a robot.
The depth image acquisition module acquires the controller's depth image and transmits it to the controller gesture and palm position determination module.
The controller gesture and palm position determination module determines the controller's gesture, the real-time palm position of each image frame, and the frame timestamp based on the depth image, and sends them to the controller control command start and gesture movement trend determination module.
The controller control command start and gesture movement trend determination module marks the control state as command-started, determines the controller's gesture movement trend, determines the control command, and sends the control command to the robot.
The robot moves in response to the control command.
In the robot control system, the depth image is acquired with an Orbbec Astra Pro depth camera.
The invention establishes a robot control system based on gesture recognition and provides a robot control method that avoids the problems of voice interaction being easily disturbed by noise and of touch-screen interaction being limited to close range, and that is simple to implement.
Drawings
Various other advantages and benefits of the present application will become apparent to those of ordinary skill in the art upon reading the following detailed description of the preferred embodiments. The drawings are only for purposes of illustrating the preferred embodiments and are not to be construed as limiting the application. It is obvious that the drawings described below are only some embodiments of the application, and that for a person skilled in the art, other drawings can be derived from them without inventive effort. Also, like parts are designated by like reference numerals throughout the drawings.
FIG. 1 is a flow diagram of a method for robot control based on gesture recognition according to one embodiment of the present invention;
fig. 2 is a block diagram of a robot control system according to an embodiment of the present invention.
Detailed Description
Hereinafter, example embodiments according to the present application will be described in detail with reference to the accompanying drawings. It should be understood that the described embodiments are only some embodiments of the present application and not all embodiments of the present application, and that the present application is not limited by the example embodiments described herein.
Exemplary method
Fig. 1 illustrates a flowchart of a robot control method based on gesture recognition according to an embodiment of the present invention.
As shown in fig. 1, the robot control method based on gesture recognition according to an embodiment of the present invention includes:
s11: acquiring a depth image of a controller;
the controller depth image may be acquired by a depth camera, for example, an Astra Pro depth camera in obit. Furthermore, the Astra Pro depth camera is insensitive to the light of the background in which the controller is located, the relative angle of the controller and the camera, and the like.
S12: determining a controller gesture based on the depth image and marking the start of a control command;
The controller's gesture is determined with the Orbbec gesture recognition SDK that accompanies the Astra Pro. After the gesture recognition SDK receives the depth image, it outputs the controller's gesture, such as a fist. Once the controller's gesture is determined, the control state is marked as command-started, and the start control command, such as a fist start command, is recorded.
S13: determining a real-time palm position of the controller based on the depth image;
the real-time palm position of the controller is determined based on the gesture recognition SDK module matched with the Astra Pro. And after the gesture recognition SDK module acquires the depth image frame, outputting a time stamp of the image frame and the palm position of the controller in real time, wherein the position is represented by a three-dimensional coordinate.
S14: determining the controller's gesture movement trend and the control command based on the palm position;
The palm position of the second image frame is compared with that of the first image frame, i.e., the three-dimensional coordinates corresponding to the two palm positions are compared, to determine whether the gesture movement trend is forward or backward, leftward or rightward, or upward or downward. The timestamps of the first and second image frames are adjacent, and the timestamp of the second image frame is later than that of the first. When the palm position of the second image frame has changed relative to the first, the controller's gesture movement trend is determined.
The control command is then determined based on the start control command and the gesture movement trend.
For example, when the start control command is a fist command and the gesture movement trend is leftward, the control command is defined as left, i.e., the robot is controlled to move to the left.
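A minimal sketch of this comparison follows, reusing the hypothetical PalmSample container above. The axis-to-direction mapping and the movement threshold are assumptions; the patent does not fix a coordinate convention:

```python
from typing import Optional

def movement_trend(prev: PalmSample, curr: PalmSample,
                   threshold: float = 20.0) -> Optional[str]:
    """Return the dominant direction of palm motion between two adjacent
    frames, or None if the palm moved less than `threshold` (e.g. mm)."""
    deltas = {"x": curr.x - prev.x, "y": curr.y - prev.y, "z": curr.z - prev.z}
    axis = max(deltas, key=lambda a: abs(deltas[a]))
    if abs(deltas[axis]) < threshold:
        return None
    return {"x": "right" if deltas[axis] > 0 else "left",
            "y": "up" if deltas[axis] > 0 else "down",
            "z": "backward" if deltas[axis] > 0 else "forward"}[axis]

def control_command(start_gesture: Optional[str],
                    trend: Optional[str]) -> Optional[str]:
    # The patent's example: a fist start command plus a leftward movement
    # trend yields a "left" command for the robot.
    if start_gesture == "fist" and trend is not None:
        return trend
    return None
```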
S15: and controlling the robot to move based on the control command.
The robot is controlled to move based on the control command: the robot is mobile and, after receiving the control command, moves according to a predefined motion pattern. For example, when the control command is left, the robot moves to the left.
Exemplary System
Fig. 2 illustrates a block diagram of a robot control system according to one embodiment of the present invention.
As shown in fig. 2, a robot control system 200 according to an embodiment of the present invention includes: a depth image acquisition module 210, a controller gesture and real-time palm position determination module 220, a controller control command start and gesture movement trend determination module 230, and a robot 240.
The depth image acquisition module 210 acquires the controller's depth image and transmits it to the controller gesture and palm position determination module 220. The depth image is acquired by a depth camera, for example an Orbbec Astra Pro depth camera. The depth camera may transmit the depth image to the controller gesture and palm position determination module 220 via a data bus; for example, the Astra Pro depth camera outputs the depth image through a USB interface.
The controller gesture and palm position determination module 220 determines the controller's gesture, the real-time palm position of each image frame, and the frame timestamp based on the depth image, and sends them to the controller control command start and gesture movement trend determination module 230.
For example, the controller gesture and real-time palm position determination module may be implemented with the gesture recognition SDK module that accompanies the Astra Pro camera. The SDK module receives the Astra Pro depth image over the USB bus and outputs the controller's gesture, such as a fist, together with the real-time palm position.
The controller control command start and gesture movement trend determination module 230 marks the control state as command-started, determines the controller's gesture movement trend, determines the control command, and sends the control command to the robot.
The gesture movement trend is determined as follows: the palm position of the second image frame is compared with that of the first image frame, i.e., the corresponding three-dimensional coordinates are compared, to determine whether the trend is forward or backward, leftward or rightward, or upward or downward. The timestamps of the first and second image frames are adjacent, and the timestamp of the second image frame is later than that of the first. When the palm position of the second image frame has changed relative to the first, the controller's gesture movement trend is determined.
The control command is then determined based on the start control command and the gesture movement trend.
For example, if the start control command is a fist and the movement trend is leftward, the control command is defined as left, and the left command is sent to the robot.
The robot 240 moves in response to the control command.
The robot 240 is mobile and, after receiving the control command, moves according to a predefined motion pattern; for example, when the control command is left, the robot moves to the left. The robot control system 200 controls the movement of the robot according to the robot control method described above.
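The modules of fig. 2 can be wired together as in the following sketch, which reuses the hypothetical ControlState, PalmSample, movement_trend, and control_command pieces above; the camera, sdk, and robot objects and their methods are assumed stand-ins, not APIs defined by the patent:

```python
from typing import Optional

def control_loop(camera, sdk, robot) -> None:
    state = ControlState()
    prev: Optional[PalmSample] = None
    while True:
        frame = camera.read_depth_frame()               # module 210
        state.on_gesture(sdk.recognize_gesture(frame))  # module 220
        curr = sdk.palm_sample(frame)                   # module 220
        if state.command_started and prev is not None:
            trend = movement_trend(prev, curr)          # module 230
            cmd = control_command(state.start_gesture, trend)
            if cmd is not None:
                robot.move(cmd)                         # robot 240
        prev = curr
```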
The foregoing describes the general principles of the present application in conjunction with specific embodiments, however, it is noted that the advantages, effects, etc. mentioned in the present application are merely examples and are not limiting, and they should not be considered essential to the various embodiments of the present application. Furthermore, the foregoing disclosure of specific details is for the purpose of illustration and description and is not intended to be limiting, since the foregoing disclosure is not intended to be exhaustive or to limit the disclosure to the precise details disclosed.
The block diagrams of devices, apparatuses, systems referred to in this application are only given as illustrative examples and are not intended to require or imply that the connections, arrangements, configurations, etc. must be made in the manner shown in the block diagrams. These devices, apparatuses, devices, systems may be connected, arranged, configured in any manner, as will be appreciated by those skilled in the art.
Words such as "including," "comprising," "having," and the like are open-ended words that mean "including, but not limited to," and are used interchangeably therewith. The word "or" as used herein means, and is used interchangeably with, "and/or," unless the context clearly dictates otherwise. The phrase "such as" is used herein to mean, and is used interchangeably with, "such as but not limited to."
It should also be noted that in the methods, apparatus and devices of the present application, the components or steps may be broken down and/or re-combined. These decompositions and/or recombinations are to be considered as equivalents of the present application.
The previous description of the disclosed aspects is provided to enable any person skilled in the art to make or use the present application. Various modifications to these aspects will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other aspects without departing from the scope of the application. Thus, the present application is not intended to be limited to the aspects shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
The foregoing description has been presented for purposes of illustration and description. Furthermore, the description is not intended to limit embodiments of the application to the form disclosed herein. While a number of example aspects and embodiments have been discussed above, those of skill in the art will recognize certain variations, modifications, alterations, additions and sub-combinations thereof.

Claims (7)

1. A robot control method based on gesture recognition is characterized by comprising the following steps:
acquiring a depth image of a controller;
determining a controller gesture based on the depth image and marking the start of a control command;
determining a real-time palm position of the controller based on the depth image;
determining a controller gesture movement trend and a controller control command based on the palm position;
and controlling the robot to move based on the control command.
2. The robot control method of claim 1, wherein the depth image is acquired with an Orbbec Astra Pro depth camera.
3. The robot control method of claim 2, wherein determining the controller gesture is based on the Orbbec gesture recognition SDK module.
4. The robot control method of claim 2, wherein determining the real-time palm position of the controller is based on the Orbbec gesture recognition SDK module.
5. The robot control method according to claim 2, wherein determining the gesture movement trend of the controller comprises comparing the three-dimensional coordinates of the palm position of a second image frame with those of a first image frame, wherein the timestamps of the first and second image frames are adjacent, and the timestamp of the second image frame is later than that of the first image frame.
6. A robotic control system, comprising:
the depth image acquisition module, for acquiring a depth image of the controller and transmitting the image to the controller gesture and palm position determination module;
the controller gesture and palm position determination module, for determining the controller gesture, the real-time palm position of each image frame, and the frame timestamp based on the depth image, and sending them to the controller control command start and gesture movement trend determination module;
the controller control command start and gesture movement trend determination module, for marking the control state as command-started, determining the controller gesture movement trend, determining a control command, and sending the control command to the robot;
and the robot, which moves in response to the control command.
7. The robotic control system of claim 6, wherein the depth image is acquired with an Orbbec Astra Pro depth camera.
CN201911172292.6A, filed 2019-11-26 (priority date 2019-11-26): Robot control method based on gesture recognition. Status: Pending. Published as CN111300402A.

Priority Applications (1)

CN201911172292.6A (published as CN111300402A), priority date 2019-11-26, filing date 2019-11-26: Robot control method based on gesture recognition


Publications (1)

CN111300402A, published 2020-06-19

Family

ID=71158020

Family Applications (1)

CN201911172292.6A (pending, published as CN111300402A), priority date 2019-11-26, filing date 2019-11-26: Robot control method based on gesture recognition

Country Status (1)

CN (1): CN111300402A

Citations (6)

* Cited by examiner, † Cited by third party
US20150217450A1 * (Quanta Storage Inc.; priority 2014-02-05, published 2015-08-06): Teaching device and method for robotic arm
CN105787471A * (Nanjing University of Posts and Telecommunications; priority 2016-03-25, published 2016-07-20): Gesture identification method applied to the control of a mobile service robot for the elderly and disabled
CN107688779A * (Beihang University; priority 2017-08-18, published 2018-02-13): Robot gesture interaction method and apparatus based on RGB-D camera depth images
CN107741781A * (Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences; priority 2017-09-01, published 2018-02-27): Flight control method and device for an unmanned aerial vehicle, unmanned aerial vehicle, and storage medium
CN108274448A * (Foshan Institute of Intelligent Equipment Technology; priority 2018-01-31, published 2018-07-13): Human-body-interactive robot teaching method and teaching system
CN109492578A * (Beijing Huajie Aimi Technology Co., Ltd.; priority 2018-11-08, published 2019-03-19): Gesture remote control method and device based on a depth camera


Similar Documents

Publication Publication Date Title
US9811168B2 (en) Apparatus for performing gesture recognition and control based on ultrasonic positioning
EP2093650B1 (en) User interface system based on pointing device
US10642372B2 (en) Apparatus and method for remote control using camera-based virtual touch
US8648808B2 (en) Three-dimensional human-computer interaction system that supports mouse operations through the motion of a finger and an operation method thereof
US9367138B2 (en) Remote manipulation device and method using a virtual touch of a three-dimensionally modeled electronic device
TWI540461B (en) Gesture input method and system
US10156938B2 (en) Information processing apparatus, method for controlling the same, and storage medium
US20130135199A1 (en) System and method for user interaction with projected content
WO2013067849A1 (en) Trigger and control method and system of human-computer interaction operation command and laser emission device
US10372223B2 (en) Method for providing user commands to an electronic processor and related processor program and electronic circuit
US9292106B2 (en) Interface apparatus using motion recognition, and method for controlling same
CN107894834B (en) Control gesture recognition method and system in augmented reality environment
US20140304736A1 (en) Display device and method of controlling the display device
CN111300402A (en) Robot control method based on gesture recognition
TW201310339A (en) System and method for controlling a robot
KR20070025138A (en) The space projection presentation system and the same method
CN104639865A (en) Video conference motion control method, terminal and system
TW201351977A (en) Image capturing method for image rcognition and system thereof
KR20090093220A (en) Used the infrared ray camera the space projection presentation system
CN111565898B (en) Operation guidance system
US20130162530A1 (en) Content reproducing device and content reproducing method
KR20130096073A (en) Virtual mouse driving method using hand motion recognition
TWI587175B (en) Dimensional pointing control and interaction system
US20120176339A1 (en) System and method for generating click input signal for optical finger navigation
CN111258427A (en) Blackboard control method and control system based on binocular camera gesture interaction

Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination
RJ01: Rejection of invention patent application after publication (application publication date: 2020-06-19)