WO2020133078A1 - Robot operation control method, gesture recognition device, and robot - Google Patents


Info

Publication number
WO2020133078A1
WO2020133078A1 · PCT/CN2018/124407 · CN2018124407W
Authority
WO
WIPO (PCT)
Prior art keywords
robot
palm
gesture
control module
gesture recognition
Prior art date
Application number
PCT/CN2018/124407
Other languages
English (en)
French (fr)
Inventor
熊友军
李亮
Original Assignee
深圳市优必选科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市优必选科技有限公司
Priority to PCT/CN2018/124407
Publication of WO2020133078A1

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer

Definitions

  • the present application relates to the technical field of robot manipulation, in particular to a robot operation control method, a gesture recognition device, and a robot.
  • at present, robot control can only be accomplished by touching virtual control buttons on the screen of a mobile terminal, or, where the control buttons are hardware, by a remote controller equipped with buttons.
  • with single-button control, on the one hand, the available control types are relatively limited, and auxiliary actions such as flipping cannot be implemented smoothly.
  • on the other hand, the user has a relatively poor experience when operating the model through the robot.
  • in the prior art, a touch-sensitive display is communicatively connected to a universal robot controller; through the user's different predetermined motion directions on the touch-sensitive display, the end position of the universal robot moves correspondingly in the XY plane or along the Z axis of three-dimensional space, and when the user's predetermined motion on the touch-sensitive display stops, the corresponding movement of the end position of the universal robot also ends.
  • however, the control completed through the touch-sensitive display involves only simple instructions and cannot drive the end of the universal robot to complete complex and delicate movements.
  • this application provides a robot operation control method and a robot that free the user from the constraints of a touch-sensitive display interface or a remote controller: an ultrasonic gesture recognition device collects multi-dimensional gesture information of one or both of the user's hands in real time, and the gesture information is used to control the robot to complete various actions, which not only makes gesture recognition sensitive and accurate and enhances the controllability of the robot, but also increases the user's interest in controlling the robot.
  • the technical solution provided by the embodiments of the present application is to provide a robot operation control method, including the following steps:
  • when the palm reaches into a first gesture recognition device provided with a number of ultrasonic sensors, the program is initialized and the original palm data are recorded;
  • the first control module acquires, at a set frequency, the distance data of the palm measured by the ultrasonic sensors;
  • the first control module determines, according to the current distance data of the palm and the distance data of the palm after at least a set time interval, the set of distance change values of all ultrasonic sensors relative to the palm, and determines the current gesture according to the set of distance change values;
  • the first control module determines the robot action according to the current gesture, and the robot action is sent to the connected robot to complete the first manipulation.
  • the first control module is connected to a second control module, and the second control module is provided on a second gesture recognition device,
  • when the other palm reaches into the second gesture recognition device provided with a number of ultrasonic sensors, the program is initialized and the original palm data are recorded;
  • the second control module acquires, at a set frequency, the distance data of the palm measured by the ultrasonic sensors;
  • the second control module determines, according to the current distance data of the palm and the distance data of the palm after at least a set time interval, the set of distance change values of all ultrasonic sensors relative to the palm, and determines the second gesture according to the set of distance change values;
  • the second control module determines the second robot action according to the second gesture, and the second robot action is sent to the connected robot to complete the second manipulation.
  • the step of determining the current gesture according to the set of distance change values further includes:
  • extracting the recognition features of the set of distance change values, and querying the mapping relationship library between recognition features and gestures to confirm the gesture corresponding to the current recognition feature.
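The two-step recognition above (feature extraction, then a lookup in the mapping relationship library) can be sketched in Python. The sign-pattern feature, the 5 mm threshold, the per-wall aggregation, and every entry of the library below are illustrative assumptions; the application does not specify a concrete encoding:

```python
# Hedged sketch: classify a palm gesture from per-sensor distance changes.
# The sign-pattern feature and the library entries are assumptions.

def extract_feature(distance_changes, threshold=5.0):
    """Reduce each distance change (mm) to -1 / 0 / +1."""
    def sign(delta):
        if delta > threshold:
            return 1      # palm moved away from this sensor
        if delta < -threshold:
            return -1     # palm moved toward this sensor
        return 0          # no significant change
    return tuple(sign(d) for d in distance_changes)

# Mapping relationship library: recognition feature -> gesture name.
# Keys assume one aggregate value per detection wall, in the order
# (left, right, upper, lower, rear).
GESTURE_LIBRARY = {
    (-1, 1, 0, 0, 0): "move_left",
    (1, -1, 0, 0, 0): "move_right",
    (0, 0, -1, 1, 0): "raise_palm",
    (0, 0, 1, -1, 0): "lower_palm",
    (0, 0, 0, 0, -1): "push_forward",
}

def recognize(distance_changes):
    """Query the library and confirm the gesture for the current feature."""
    return GESTURE_LIBRARY.get(extract_feature(distance_changes), "unknown")
```

For example, `recognize([-12.0, 11.5, 1.0, -2.0, 0.5])` resolves to `"move_left"` under these assumed thresholds.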
  • the technical solution provided by the embodiments of the present application is to provide a gesture recognition device, including a gesture recognition box and a control module disposed on the gesture recognition box; the gesture recognition box includes multiple detection walls enclosed into a box body and an operation port for hand insertion; each detection wall is provided with a number of ultrasonic sensors connected to the control module; and the control module includes a measurement unit and a gesture recognition unit,
  • the measurement unit is used to acquire, at a set frequency, the distance data of the palm measured by the ultrasonic sensors;
  • the gesture recognition unit is used to determine, according to the current distance data of the palm and the distance data of the palm after at least a set time interval, the set of distance change values of all ultrasonic sensors relative to the palm, and to determine the current gesture according to the set of distance change values.
  • the control module further includes an initialization unit, a docking unit, and an output interface,
  • the initialization unit is used to initialize the program and record the original palm data when the palm reaches into the first gesture recognition device provided with a number of ultrasonic sensors;
  • after the current gesture is determined, the docking unit is used to determine the robot action according to the current gesture;
  • the output interface is used to send the robot action to the connected robot to complete the first manipulation.
  • the output interface is connected to the robot, and the gesture recognition box includes a left detection wall, a right detection wall, an upper detection wall, a lower detection wall, and a rear detection wall, and each detection wall is provided with a number of ultrasonic sensors.
  • the technical solution provided by the embodiments of the present application is to provide a robot including a body and a first manipulator, where the body is provided with a main controller; the robot further includes a gesture recognition box and a first control module provided on the gesture recognition box,
  • the first control module is connected to the main controller; the gesture recognition box includes a plurality of detection walls and an operation port for hand insertion, and each detection wall is provided with a number of ultrasonic sensors connected to the control module,
  • the control module includes a measurement unit and a gesture recognition unit,
  • the measurement unit is used to acquire, at a set frequency, the distance data of the palm measured by the ultrasonic sensors;
  • the gesture recognition unit is used to determine, according to the current distance data of the palm and the distance data of the palm after at least a set time interval, the set of distance change values of all ultrasonic sensors relative to the palm, and to determine the current gesture according to the set of distance change values.
  • the control module of the robot further includes an initialization unit, a docking unit, and an output interface,
  • the initialization unit is used to initialize the program and record the original palm data when the palm reaches into the gesture recognition box;
  • after the current gesture is determined, the docking unit is used to determine the robot action according to the current gesture;
  • the output interface is used to send the robot action to the robot's main controller to complete the control of the first manipulator.
  • this embodiment also provides a two-handed control solution.
  • the robot further includes a second manipulator; the main controller is connected to a second gesture recognition box and a second control module provided on the second gesture recognition box,
  • the first control module is connected to the second control module
  • when the other palm reaches into the second gesture recognition box provided with a number of ultrasonic sensors, the program is initialized and the original palm data are recorded;
  • the second control module acquires, at a set frequency, the distance data of the palm measured by the ultrasonic sensors;
  • the second control module determines, according to the current distance data of the palm and the distance data of the palm after at least a set time interval, the set of distance change values of all ultrasonic sensors relative to the palm, and determines the second gesture according to the set of distance change values;
  • the second control module determines the second robot action according to the second gesture, and the second robot action is sent to the connected robot to complete the second manipulation.
  • the gesture recognition unit includes an extraction unit and a query unit:
  • the extraction unit is used to extract the identification features of the set of distance change values
  • the query unit is used to query and confirm the gesture corresponding to the current recognition feature according to the mapping relationship library between the recognition feature and the gesture.
  • the first control module and the second control module are connected to the main controller through a wireless network.
  • the robot actions include moving forward, moving backward, moving leftward, moving rightward, gripping, grasping, and flipping.
  • the technical solution provided by the embodiments of the present application is to provide a computer program product, the computer program product including a computer program stored on a non-volatile computer-readable storage medium, the computer program including program instructions which, when executed by a computer, cause the computer to execute the aforementioned method.
  • the robot operation control method and the robot of this embodiment collect multi-dimensional gesture information of one or both of a user's hands in real time through an ultrasonic gesture recognition device, and use the gesture information to control the robot to complete various actions; this not only makes gesture recognition sensitive and accurate and enhances the controllability of the robot, but also increases the user's interest in controlling the robot.
  • the robot operation control method and robot of this embodiment, by providing at least one gesture recognition device, use ultrasonic waves to recognize multi-dimensional palm information and track palm gestures in real time.
  • the recognized gestures can be used for connected robot operations and control.
  • the hand is put into the box body of the gesture recognition device, the five faces of which are covered with miniature ultrasonic sensors; palm movement, shaking, flipping, tilting, and even bending or flicking of the fingers can all be detected and captured, making gesture recognition very sensitive.
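As a rough Python illustration of such a five-wall sensor array being read at a set frequency: the wall names follow the detection walls described later in the text, while the per-wall sensor count and the `read_distance_mm` stub are assumptions standing in for real hardware:

```python
import time

# Hedged sketch: poll a five-wall ultrasonic array at a set frequency.
# SENSORS_PER_WALL and read_distance_mm are illustrative stand-ins.

WALLS = ("left", "right", "upper", "lower", "rear")
SENSORS_PER_WALL = 4  # assumed grid size per detection wall

def read_distance_mm(wall, index):
    """Placeholder for one ultrasonic time-of-flight measurement."""
    return 100.0  # stub: a fixed distance for demonstration

def snapshot():
    """One full reading: wall name -> list of per-sensor distances (mm)."""
    return {wall: [read_distance_mm(wall, i) for i in range(SENSORS_PER_WALL)]
            for wall in WALLS}

def poll(frequency_hz, n_frames):
    """Collect n_frames snapshots at the set frequency."""
    frames = []
    for _ in range(n_frames):
        frames.append(snapshot())
        time.sleep(1.0 / frequency_hz)
    return frames
```

Successive snapshots are what the control module later differences to form the set of distance change values.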
  • FIG. 1 is a main processing flowchart of a robot operation control method according to an embodiment of the present application
  • FIG. 3 is a schematic diagram of the arrangement of ultrasonic sensors in a gesture recognition device according to an embodiment of the present application
  • FIG. 4 is a diagram of a gesture recognition module of a two-handed robot according to an embodiment of the present application.
  • FIG. 5 is a schematic diagram of robot manipulation according to an embodiment of the present application.
  • FIG. 6 is a schematic diagram of the hardware architecture of the robot of the embodiment of the present application.
  • This application relates to a robot operation control method, a gesture recognition device, and a robot.
  • the robot operation control method and the robot of this embodiment adopt one or two ultrasonic gesture recognition devices to collect multi-dimensional ultrasonic gesture information of one or both of a user's hands in real time.
  • the robot tracks palm hand gestures in real time, and establishes a mapping relationship library between the recognized gestures and robot actions.
  • the recognized gesture can be used to operate or control the connected robot to complete the correspondingly indicated action.
  • the robot operation control method and robot of the present application use the gesture information to control the robot to complete various actions, which not only makes gesture recognition sensitive and accurate and enhances the controllability of the robot, but also increases the user's interest in controlling the robot.
  • the user inserts one hand into the box of a gesture recognition device, or inserts both hands into the boxes of the corresponding gesture recognition devices, and changes gestures to complete the operation and control of the robot.
  • each gesture recognition device has a box structure, and the five surfaces of the box body are covered with miniature ultrasonic sensors; palm movement, shaking, flipping, tilting, and even bending or flicking of the fingers can all be detected and captured, making gesture recognition very sensitive.
  • the gesture recognition device of this embodiment includes a gesture recognition box and a control module provided on the gesture recognition box.
  • the number of gesture recognition devices is set according to the needs of the specific application scenario.
  • the robot is connected to two gesture recognition devices.
  • the first gesture recognition device includes a first gesture recognition box 60 and a first control module 62 set on the gesture recognition box 60;
  • the second gesture recognition device includes a second gesture recognition box 70 and a second control module 72 provided on the second gesture recognition box 70.
  • the following description takes the first gesture recognition device as an example.
  • the gesture recognition box includes a left detection wall, a right detection wall, an upper detection wall, a lower detection wall, and a rear detection wall, and each detection wall is provided with a plurality of ultrasonic sensors.
  • the left detection wall 63 is provided with multiple rows of ultrasonic sensors 65. The more ultrasonic sensors 65 are provided, the more palm distance data are collected, and the more precisely the palm gesture state can be resolved, down to details as fine as the flexion and extension of a finger.
  • the five detection walls are enclosed to form a box body.
  • the gesture recognition box further includes an operation port, and the user's palm extends into the gesture recognition box through the operation port.
  • the ultrasonic sensors provided on each detection wall are connected to the corresponding control module of the gesture recognition box, for example, to the first control module 62.
  • the first control module 62 stores multiple software modules and collects distance data fed back by all the ultrasonic sensors.
  • the first control module 62 determines, according to the current distance data of the palm and the distance data of the palm after at least a set time interval, the distance change value set of all ultrasonic sensors relative to the palm, and determines the current gesture according to the distance change value set.
  • the first control module 62 includes an initialization unit 620, a measurement unit 622, a gesture recognition unit 624, a docking unit 626, and an output interface 628.
  • the second gesture recognition device includes a second gesture recognition box 70 and a second control module 72 disposed on the gesture recognition box 70.
  • the second control module 72 includes an initialization unit 720, a measurement unit 722, a gesture recognition unit 724, a docking unit 726, and an output interface 728.
  • the initialization unit (620 and 720) is used to initialize the program and record the original data of the palm when the palm of the hand extends into the first gesture recognition device provided with several ultrasonic sensors;
  • the measuring units (622 and 722) are used to acquire the distance data of the palm measured by the ultrasonic sensor at a set frequency.
  • the gesture recognition unit (624 and 724) is used to determine the set of distance change values of all ultrasonic sensors relative to the palm according to the current distance data of the palm and the distance data of the palm after at least a set time interval, and according to the The set of distance change values determines the current gesture.
  • the docking unit (626 and 726) is used to determine the robot motion according to the current gesture.
  • the output interfaces (628 and 728) are connected to the robot, and the output interface is used to send the robot motion to the connected robot to complete the first manipulation.
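The five units described above (initialization unit, measurement unit, gesture recognition unit, docking unit, output interface) can be sketched as one Python pipeline; the constructor arguments are assumed hooks supplied by the caller, not interfaces defined by the application:

```python
# Hedged sketch of the control module's five units as one pipeline.
# measure_fn, recognize_fn, gesture_to_action, and send_fn are
# caller-supplied assumptions standing in for hardware and the library.

class ControlModule:
    def __init__(self, measure_fn, recognize_fn, gesture_to_action, send_fn):
        self.measure_fn = measure_fn                  # measurement unit
        self.recognize_fn = recognize_fn              # gesture recognition unit
        self.gesture_to_action = gesture_to_action    # docking unit's table
        self.send_fn = send_fn                        # output interface
        self.original_palm_data = None

    def initialize(self):
        """Initialization unit: record original palm data on insertion."""
        self.original_palm_data = self.measure_fn()

    def step(self):
        """Measure, recognize, dock, and output one robot action."""
        current = self.measure_fn()
        changes = [c - o for c, o in zip(current, self.original_palm_data)]
        gesture = self.recognize_fn(changes)
        action = self.gesture_to_action.get(gesture, "idle")
        self.send_fn(action)
        return action
```

One instance of this class would correspond to the first control module 62; a second instance, to the second control module 72.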
  • the robot of this embodiment includes a body 10, a first arm 41, a second arm 42, a first foot 31, and a second foot 32.
  • the waist 20 of the body 10 is provided with a main controller 22 that can be connected to the first gesture recognition device and the second gesture recognition device by wire or wirelessly.
  • a first manipulator 51 is provided at the end of the first arm 41, and a second manipulator 52 is provided at the end of the second arm 42.
  • the first gesture recognition device includes a first gesture recognition box 60 and a first control module 62 disposed on the gesture recognition box 60; the second gesture recognition device includes a second gesture recognition box 70 and the second gesture recognition box Identify the second control module 72 on the box 70.
  • the following description takes the first gesture recognition device as an example.
  • the gesture recognition box includes a left detection wall, a right detection wall, an upper detection wall, a lower detection wall, and a rear detection wall, and each detection wall is provided with a plurality of ultrasonic sensors.
  • the left detection wall 63 is provided with multiple rows of ultrasonic sensors 65.
  • Each gesture recognition box is also provided with an operation port for the palm to enter.
  • the first control module 62 stores multiple software modules and collects distance data fed back by all the ultrasonic sensors.
  • the first control module 62 determines, according to the current distance data of the palm and the distance data of the palm after at least a set time interval, the distance change value set of all ultrasonic sensors relative to the palm, and determines the current gesture according to the distance change value set.
  • the first control module 62 includes an initialization unit 620, a measurement unit 622, a gesture recognition unit 624, a docking unit 626, and an output interface 628.
  • the second gesture recognition device includes a second gesture recognition box 70 and a second control module 72 disposed on the gesture recognition box 70.
  • the second control module 72 includes an initialization unit 720, a measurement unit 722, a gesture recognition unit 724, a docking unit 726, and an output interface 728.
  • the initialization unit (620 and 720) is used to initialize the program and record the original data of the palm when the palm of the hand extends into the first gesture recognition device provided with several ultrasonic sensors;
  • the measuring units (622 and 722) are used to acquire the distance data of the palm measured by the ultrasonic sensor at a set frequency.
  • the gesture recognition unit (624 and 724) is used to determine the set of distance change values of all ultrasonic sensors relative to the palm according to the current distance data of the palm and the distance data of the palm after at least a set time interval, and according to the The set of distance change values determines the current gesture.
  • the docking unit (626 and 726) is used to determine the robot motion according to the current gesture.
  • the output interfaces (628 and 728) are connected to the robot, and the output interface is used to send the robot motion to the main controller 22 of the robot to complete the manipulation of the first manipulator 51.
  • the robot of this embodiment can control the robot with one hand by one gesture recognition device, or can realize more delicate two-handed control of the robot by two gesture recognition devices.
  • the robot further includes a second manipulator 52; the main controller 22 is connected to a second gesture recognition box and a second control module provided on the second gesture recognition box,
  • the first control module 62 is connected to the second control module 72,
  • when the other palm reaches into the second gesture recognition box, the program is initialized and the original palm data are recorded;
  • the second control module 72 acquires the distance data of the palm measured by the ultrasonic sensor at a set frequency
  • the second control module 72 determines, according to the current distance data of the palm and the distance data of the palm after at least a set time interval, the distance change value set of all ultrasonic sensors relative to the palm, and determines the second gesture according to the distance change value set;
  • the second control module 72 determines the second robot action according to the second gesture;
  • the second robot action is sent to the connected robot to complete the manipulation of the second manipulator 52.
  • the gesture recognition units (624 and 724) also perform feature extraction and query processing; in terms of system modules, each gesture recognition unit (624 and 724) includes an extraction unit and a query unit.
  • the extraction unit is used to extract the identification features of the set of distance change values.
  • the query unit is used to query and confirm the gesture corresponding to the current recognition feature according to the mapping relationship library between the recognition feature and the gesture.
  • the first control module 62 and the second control module 72 are connected to the main controller 22 through a wired or wireless network.
  • the robot movements include forward, backward, left, right, grip, grab, and flip. With the support of a more capable processor, even more delicate palm and fingertip movements can also be captured and recognized.
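A docking-unit table pairing recognized gestures with the movements listed above might look as follows; the gesture names on the left are illustrative assumptions, and only the action names on the right come from the text:

```python
# Hedged sketch: docking-unit table pairing recognized gestures with the
# robot movements listed above. Gesture names are illustrative assumptions.
ACTION_TABLE = {
    "palm_push_forward": "forward",
    "palm_pull_back":    "backward",
    "palm_tilt_left":    "left",
    "palm_tilt_right":   "right",
    "fingers_curl":      "grip",
    "fingers_close":     "grab",
    "palm_flip":         "flip",
}

def dock(gesture):
    """Map a recognized gesture to a robot action; None if unmapped."""
    return ACTION_TABLE.get(gesture)
```

A richer mapping library, with more gestures per action or finer fingertip gestures, would extend this table rather than change its shape.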
  • the robot operation control method includes the following steps:
  • Step 101 When the palm reaches into the first gesture recognition device provided with a number of ultrasonic sensors, initialize the program and record the original palm data. After the original palm data are recorded, the palm can begin to move so that the ultrasonic sensors can collect distance data and the rate of change of the distance data can be calculated;
  • Step 102 After the original palm data are collected, the hand shape is changed according to the manipulation requirements of the robot, and the first control module acquires, at a set frequency, the distance data of the palm measured by the ultrasonic sensors;
  • Step 103 The first control module determines, according to the current distance data of the palm and the distance data of the palm after at least a set time interval, the set of distance change values of all ultrasonic sensors relative to the palm, and determines the current gesture according to the set of distance change values;
  • Step 104 The first control module determines the robot action according to the current gesture;
  • Step 105 Send the robot action to the connected robot to complete the first manipulation.
  • the above is the processing flow for manipulating the robot using the gesture recognition device with one hand.
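The one-hand flow of steps 101 to 105 can be condensed into a single Python loop; `acquire_distances`, `determine_gesture`, `gesture_to_action`, and `send_to_robot` are assumed hooks standing in for the sensor array, the recognition library, the docking unit, and the robot link:

```python
import time

# Hedged sketch of steps 101-105 as one loop over assumed hooks.

def control_loop(acquire_distances, determine_gesture, gesture_to_action,
                 send_to_robot, interval_s, n_steps):
    previous = acquire_distances()          # step 101: record original data
    actions = []
    for _ in range(n_steps):
        current = acquire_distances()       # step 102: measure at set frequency
        changes = [c - p for c, p in zip(current, previous)]  # step 103
        gesture = determine_gesture(changes)
        action = gesture_to_action(gesture) # step 104: determine robot action
        send_to_robot(action)               # step 105: first manipulation
        actions.append(action)
        previous = current
        time.sleep(interval_s)              # set time interval between readings
    return actions
```

The two-hand flow of steps 201 to 205 would run a second copy of this loop against the second gesture recognition device.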
  • the first control module 62 is connected to the second control module 72, and the first control module 62 or the second control module 72 alone completes the fusion of the actions.
  • the first control module 62 and the second control module 72 are both connected to the main controller 22, and the main controller completes the fusion and superposition of actions.
  • Step 201 When the other palm reaches into the second gesture recognition device provided with several ultrasonic sensors, initialize the program and record the original data of the palm;
  • Step 202 The second control module acquires the distance data of the palm measured by the ultrasonic sensor at a set frequency
  • Step 203 The second control module determines, according to the current distance data of the palm and the distance data of the palm after at least a set time interval, the set of distance change values of all ultrasonic sensors relative to the palm, and determines the second gesture according to the set of distance change values;
  • Step 204 The second control module determines the second robot motion according to the second gesture
  • Step 205 Send the second robot motion to the connected robot to complete the second manipulation.
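When both hands are used, the two action streams are fused and superimposed (by one control module alone, or by the main controller, as described above). Below is a minimal Python sketch of one possible merge rule; the rule itself (matching body motions pass, conflicting ones cancel, hand actions go to their own manipulators) is an assumption of this illustration, not specified by the application:

```python
# Hedged sketch: superimpose the first and second manipulations into one
# robot command. The merge rule here is an illustrative assumption.

LOCOMOTION = {"forward", "backward", "left", "right"}

def fuse_actions(first_action, second_action):
    """Combine the two hands' actions into one command dict."""
    if first_action in LOCOMOTION and second_action in LOCOMOTION:
        # Conflicting body motions cancel; matching ones pass through.
        return {"body": first_action} if first_action == second_action else {}
    command = {}
    if first_action:
        key = "body" if first_action in LOCOMOTION else "first_manipulator"
        command[key] = first_action
    if second_action:
        key = "body" if second_action in LOCOMOTION else "second_manipulator"
        command[key] = second_action
    return command
```

For instance, one hand commanding "forward" while the other commands "grip" yields a combined walk-and-grip command.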
  • For one-handed operation, when step 103 of gesture recognition is implemented, the following steps are also included:
  • Step 103-1 Extract the recognition features of the set of distance change values
  • Step 103-2 query and confirm the gesture corresponding to the current recognition feature according to the mapping relationship library between the recognition feature and the gesture.
  • the gesture recognition step 103 and the gesture recognition step 203 are performed simultaneously.
  • the gesture recognition step 203 is specifically implemented, the following steps are also included:
  • Step 203-1 Extract the recognition features of the set of distance change values
  • Step 203-2 query and confirm the gesture corresponding to the current recognition feature according to the mapping relationship library between the recognition feature and the gesture.
  • the robot operation control method and robot of this embodiment collect multi-dimensional gesture information of one or both of a user's hands in real time through an ultrasonic gesture recognition device, and use the gesture information to control the robot to complete various actions, which not only makes gesture recognition sensitive and accurate and enhances the controllability of the robot, but also increases the user's interest in controlling the robot.
  • the robot operation control method and robot of this embodiment, by providing at least one gesture recognition device, use ultrasonic waves to recognize multi-dimensional palm information and track palm gestures in real time.
  • the recognized gestures can be used for connected robot operations and control.
  • the hand is put into the box body of the gesture recognition device, the five faces of which are covered with miniature ultrasonic sensors; palm movement, shaking, flipping, tilting, and even bending or flicking of the fingers can all be detected and captured, making gesture recognition very sensitive.
  • ultrasonic sensors in this embodiment transmit the distance data of the detected gesture to the main controller 22 for storage and calculation.
  • the main controller 22 computes, from the current distance data of the palm and the distance data of the palm after at least a set time interval, the set of distance change values of all ultrasonic sensors relative to the palm, determines the current gesture according to the set of distance change values, and determines the corresponding robot operation command according to the current gesture.
  • the gesture recognition device of the present application can control the movement of the robot through movements such as one-hand or two-hand movement, flipping, flexing and extending fingers, and tilting the palm, which enhances the user's interest in operating the robot.
  • FIG. 6 is a schematic diagram of a hardware structure of a robot device 600 provided by an embodiment of the present application. As shown in FIG. 6, the device 600 includes:
  • one or more processors 610, a memory 620, and a communication component 650.
  • one processor 610 is taken as an example.
  • the memory 620 stores instructions executable by the at least one processor 610, that is, a computer program 640; when the instructions are executed by the at least one processor, a data channel is established through the communication component 650 so that the at least one processor can execute the robot operation control method.
  • the processor 610, the memory 620, and the communication component 650 may be connected through a bus or in other ways. In FIG. 6, the connection through a bus is used as an example.
  • the memory 620 is a non-volatile computer-readable storage medium, and can be used to store non-volatile software programs, non-volatile computer executable programs, and modules, such as programs corresponding to the robot operation control method in the embodiments of the present application. Instructions/modules.
  • the processor 610 executes various functional applications and data processing of the server by running non-volatile software programs, instructions, and modules stored in the memory 620, that is, implementing the robot operation control method in the foregoing method embodiments.
  • the memory 620 may include a storage program area and a storage data area, where the storage program area may store an operating system and application programs required for at least one function; the storage data area may store data created according to the use of the robot and the like.
  • the memory 620 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or other non-volatile solid-state storage devices.
  • the memory 620 may optionally include memories remotely provided with respect to the processor 610, and these remote memories may be connected to the robot through a network. Examples of the aforementioned network include, but are not limited to, the Internet, intranet, local area network, mobile communication network, and combinations thereof.
  • the one or more modules are stored in the memory 620 and, when executed by the one or more processors 610, execute the above-mentioned robot operation control method, for example, the method steps 101 to 105 in FIG. 1 described above, or the method steps 201 to 205 in FIG. 2, thereby realizing the functions of the first control module 62 and the second control module 72 in FIG. 4.
  • An embodiment of the present application provides a non-volatile computer-readable storage medium that stores computer-executable instructions which, when executed by one or more processors, perform, for example, the method steps 101 to 105 in FIG. 1 described above, or the method steps 201 to 205 in FIG. 2, and implement the functions of the first control module 62 and the second control module 72 of FIG. 4.


Abstract

A robot operation control method, a gesture recognition device, and a robot. The method comprises the following steps: when a palm reaches into a first gesture recognition device provided with a number of ultrasonic sensors, initializing the program and recording the palm's baseline data; a first control module acquiring, at a set frequency, the distance data of the palm measured by the ultrasonic sensors; the first control module determining, from the current distance data of the palm and the distance data of the palm after at least one set time interval, a set of distance change values of all ultrasonic sensors relative to the palm, and determining the current gesture from the set of distance change values; the first control module determining a robot action from the current gesture; and sending the robot action to the connected robot to complete a first control operation.

Description

Robot Operation Control Method, Gesture Recognition Device, and Robot

Technical Field
The present application relates to the technical field of robot manipulation, and in particular to a robot operation control method, a gesture recognition device, and a robot.
Background
With the development of artificial intelligence and computer hardware and software technology, robots have developed rapidly, acquired a degree of intelligence, and are now widely used in many fields.
At present, a robot such as a movable building-block model can only be controlled by touching virtual control buttons on the screen of a mobile terminal, or, where the control buttons are hardware, through a remote controller equipped with buttons.
Technical Problem
However, single-button control has two drawbacks: on the one hand, the available control types are rather limited, and auxiliary actions such as flipping cannot be performed smoothly; on the other hand, the user experience of operating a robot model is relatively poor.
In prior-art robot control, a touch-sensitive display communicates with a universal robot controller; depending on the direction of the user's predefined motion on the touch-sensitive display, the end position of the universal robot moves correspondingly in the XY plane or along the Z axis of three-dimensional space, and when the user's predefined motion on the touch-sensitive display stops, the corresponding motion of the robot end position in space also ends. However, control performed through a touch-sensitive display uses simple commands and cannot drive the robot end to complete complex, delicate actions.
Therefore, existing robot operation control technology still needs improvement and development.
Technical Solution
To address the above technical problems, the present application provides a robot operation control method and a robot that are free of the constraints of a touch-sensitive display interface or a remote controller: an ultrasonic gesture recognition device collects multi-dimensional gesture information of the user's one or two hands in real time, and this gesture information is used to control the robot to complete various actions. Gesture recognition is sensitive and accurate, controllability of the robot is enhanced, and operating the robot becomes more enjoyable for the user.
In a first aspect, the technical solution provided by the embodiments of the present application is a robot operation control method comprising the following steps:
when a palm reaches into a first gesture recognition device provided with a number of ultrasonic sensors, initializing the program and recording the palm's baseline data;
a first control module acquiring, at a set frequency, the distance data of the palm measured by the ultrasonic sensors;
the first control module determining, from the current distance data of the palm and the distance data of the palm after at least one set time interval, a set of distance change values of all ultrasonic sensors relative to the palm, and determining the current gesture from the set of distance change values;
the first control module determining a robot action from the current gesture; and
sending the robot action to the connected robot to complete a first control operation.
In this robot operation control method, the first control module is connected to a second control module, and the second control module is provided on a second gesture recognition device;
when another palm reaches into the second gesture recognition device provided with a number of ultrasonic sensors, the program is initialized and the palm's baseline data is recorded;
the second control module acquires, at a set frequency, the distance data of that palm measured by the ultrasonic sensors;
the second control module determines, from the current distance data of that palm and the distance data of that palm after at least one set time interval, a set of distance change values of all ultrasonic sensors relative to that palm, and determines a second gesture from the set of distance change values;
the second control module determines a second robot action from the second gesture; and
the second robot action is sent to the connected robot to complete a second control operation.
Preferably, the step of determining the current gesture from the set of distance change values further comprises:
extracting recognition features from the set of distance change values; and
querying a library that maps recognition features to gestures to confirm the gesture corresponding to the current recognition features.
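The feature-extraction and library-lookup sub-steps above can be sketched as follows. This is a minimal illustration only: the patent does not specify the feature set, the dead band, or the contents of the mapping library, so `extract_features`, `GESTURE_LIBRARY`, and all thresholds here are assumptions.

```python
def extract_features(delta):
    """Reduce the per-sensor distance-change set to a compact feature vector.

    `delta` maps a (wall, index) sensor id to its distance change in mm.
    The feature here is just the sign of the mean change per wall, with a
    small dead band; a real implementation would likely use richer features.
    """
    per_wall = {}
    for (wall, _idx), change in delta.items():
        per_wall.setdefault(wall, []).append(change)

    def sign(x, dead_band=2.0):  # 2 mm dead band is an assumption
        if abs(x) < dead_band:
            return 0
        return 1 if x > 0 else -1

    # Walls are ordered alphabetically: back, bottom, left, right, top.
    return tuple(sign(sum(v) / len(v)) for _, v in sorted(per_wall.items()))

# Hypothetical mapping library: feature vector -> gesture label
# (-1 means the palm moved toward that wall, +1 away from it).
GESTURE_LIBRARY = {
    (0, 0, -1, 1, 0): "move_left",
    (0, 0, 1, -1, 0): "move_right",
    (0, -1, 0, 0, 1): "move_down",
}

def lookup_gesture(delta):
    """Query the feature-to-gesture mapping library; unknown feature vectors
    fall through to a neutral label instead of a spurious command."""
    return GESTURE_LIBRARY.get(extract_features(delta), "unknown")
```

Returning a neutral label for unmapped feature vectors keeps an unrecognized hand shape from triggering an arbitrary robot motion.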
In a second aspect, the technical solution provided by the embodiments of the present application is a gesture recognition device comprising a gesture recognition box and a control module provided on the gesture recognition box. The gesture recognition box comprises a plurality of detection walls enclosing a box body and an operation opening through which a hand reaches in; a number of ultrasonic sensors connected to the control module are provided on each detection wall; and the control module comprises a measurement unit and a gesture recognition unit,
the measurement unit being configured to acquire, at a set frequency, the distance data of a palm measured by the ultrasonic sensors; and
the gesture recognition unit being configured to determine, from the current distance data of the palm and the distance data of the palm after at least one set time interval, a set of distance change values of all ultrasonic sensors relative to the palm, and to determine the current gesture from the set of distance change values.
Preferably, the control module further comprises an initialization unit, a docking unit, and an output interface,
the initialization unit being configured to initialize the program and record the palm's baseline data when a palm reaches into the first gesture recognition device provided with a number of ultrasonic sensors;
the docking unit being configured, after the current gesture is determined, to determine a robot action from the current gesture; and
the output interface being configured to send the robot action to the connected robot to complete the first control operation.
In a specific implementation, the output interface is connected to a robot, and the gesture recognition box comprises a left detection wall, a right detection wall, an upper detection wall, a lower detection wall, and a rear detection wall, each detection wall carrying a number of ultrasonic sensors.
In a third aspect, the technical solution provided by the embodiments of the present application is a robot comprising a body and a first manipulator, the body being provided with a main controller, and further comprising a gesture recognition box and a first control module provided on the gesture recognition box, the first control module being connected to the main controller. The gesture recognition box comprises a plurality of detection walls and an operation opening through which a hand reaches in; a number of ultrasonic sensors connected to the control module are provided on each detection wall; and the control module comprises a measurement unit and a gesture recognition unit,
the measurement unit being configured to acquire, at a set frequency, the distance data of a palm measured by the ultrasonic sensors; and
the gesture recognition unit being configured to determine, from the current distance data of the palm and the distance data of the palm after at least one set time interval, a set of distance change values of all ultrasonic sensors relative to the palm, and to determine the current gesture from the set of distance change values.
Preferably, the robot's control module further comprises an initialization unit, a docking unit, and an output interface,
the initialization unit being configured to initialize the program and record the palm's baseline data when a palm reaches into the gesture recognition box;
the docking unit being configured, after the current gesture is determined, to determine a robot action from the current gesture; and
the output interface being configured to send the robot action to the robot's main controller to complete control of the first manipulator.
In addition to one-handed control, this embodiment also provides a two-handed control scheme: the robot further comprises a second manipulator, and the main controller is connected to a second gesture recognition box and a second control module provided on that gesture recognition box,
the first control module being connected to the second control module;
when another palm reaches into the second gesture recognition device provided with a number of ultrasonic sensors, the program is initialized and the palm's baseline data is recorded;
the second control module acquires, at a set frequency, the distance data of that palm measured by the ultrasonic sensors;
the second control module determines, from the current distance data of that palm and the distance data of that palm after at least one set time interval, a set of distance change values of all ultrasonic sensors relative to that palm, and determines a second gesture from the set of distance change values;
the second control module determines a second robot action from the second gesture; and
the second robot action is sent to the connected robot to complete control of the second manipulator.
In a specific implementation, the gesture recognition unit comprises an extraction unit and a query unit:
the extraction unit is configured to extract recognition features from the set of distance change values; and
the query unit is configured to query a library that maps recognition features to gestures to confirm the gesture corresponding to the current recognition features.
Preferably, the first control module and the second control module are connected to the main controller through a wireless network.
In a specific implementation, the robot actions include moving forward, moving backward, moving left, moving right, grasping, picking up, and flipping.
In a fourth aspect, the technical solution provided by the embodiments of the present application is a computer program product comprising a computer program stored on a non-volatile computer-readable storage medium, the computer program comprising program instructions which, when executed by a computer, cause the computer to perform the aforementioned method.
Beneficial Effects
The beneficial effects of the embodiments of the present application are as follows: the robot operation control method and robot of this embodiment use an ultrasonic gesture recognition device to collect multi-dimensional gesture information of the user's one or two hands in real time and use this gesture information to control the robot to complete various actions; gesture recognition is sensitive and accurate, controllability of the robot is enhanced, and operating the robot becomes more enjoyable for the user.
The robot operation control method and robot of this embodiment provide at least one gesture recognition device that uses ultrasound to recognize multi-dimensional palm information and track the palm's gestures in real time; the recognized gestures can be used to operate and control the connected robot. In use, the hand is placed into the box body of a gesture recognition device, whose five inner faces are covered with miniature ultrasonic sensors. Small movements of the palm such as translation, clenching, flipping, tilting, and even bending or flicking fingers can all be detected and captured, so gesture recognition is very sensitive.
Brief Description of the Drawings
One or more embodiments are illustrated by the figures in the corresponding drawings; these illustrations do not limit the embodiments. Elements with the same reference numerals in the drawings denote similar elements, and unless otherwise stated, the figures are not drawn to scale.
Fig. 1 is the main processing flowchart of the robot operation control method of an embodiment of the present application;
Fig. 2 is the complete processing flowchart of the robot operation control method of an embodiment of the present application;
Fig. 3 is a schematic layout of the ultrasonic sensors in the gesture recognition device of an embodiment of the present application;
Fig. 4 is a gesture recognition module diagram of the two-handed robot of an embodiment of the present application;
Fig. 5 is a control schematic of the robot of an embodiment of the present application; and
Fig. 6 is a simplified hardware architecture diagram of the robot of an embodiment of the present application.
Embodiments of the Invention
To make the objectives, technical solutions, and advantages of the embodiments of the present application clearer, the embodiments are described in further detail below with reference to the drawings. The illustrative embodiments of the present application and their description are used to explain the present application and do not limit it.
Referring to Fig. 1 together with Fig. 5, the present application relates to a robot operation control method, a gesture recognition device, and a robot.
The robot operation control method and robot of this embodiment use one or two ultrasonic gesture recognition devices to collect multi-dimensional ultrasonic gesture information of the user's one or two hands in real time. The robot tracks the palm's gestures in real time, and a mapping library between recognized gestures and robot actions is established. A recognized gesture can be used to operate or control the connected robot to complete the correspondingly indicated action. The robot operation control method and robot of the present application use this gesture information to control the robot to complete various actions; gesture recognition is sensitive and accurate, controllability of the robot is enhanced, and operating the robot becomes more enjoyable for the user. In use, one hand is placed into the box body of one gesture recognition device, or both hands are placed into the box bodies of the corresponding gesture recognition devices, and changing the hand shape completes the operation and control of the robot.
In the present application, each gesture recognition device is constructed as a box body whose five inner faces are covered with miniature ultrasonic sensors. Small movements of the palm such as translation, clenching, flipping, tilting, and even bending or flicking fingers can all be detected and captured, so gesture recognition is very sensitive.
Embodiment 1
As shown in Figs. 3 and 5, the gesture recognition device of this embodiment comprises a gesture recognition box and a control module provided on the gesture recognition box. The number of gesture recognition devices is set according to the needs of the specific application scenario; for example, the robot is connected to two gesture recognition devices: the first gesture recognition device comprises a first gesture recognition box 60 and a first control module 62 provided on the gesture recognition box 60, and the second gesture recognition device comprises a second gesture recognition box 70 and a second control module 72 provided on the second gesture recognition box 70. The following description takes the first gesture recognition device as an example.
The gesture recognition box comprises a left detection wall, a right detection wall, an upper detection wall, a lower detection wall, and a rear detection wall, each carrying a number of ultrasonic sensors. Taking the left detection wall 63 as an example, multiple columns of ultrasonic sensors 65 are arranged on it; the more ultrasonic sensors 65 are provided, the more palm distance data is collected and the more precisely the palm's gesture state is reflected, down to the flexing, extension, and movement of individual fingers.
The five detection walls enclose the box body, and the gesture recognition box further comprises an operation opening through which the palm reaches in; the user reaches into the gesture recognition box through this opening. The ultrasonic sensors provided on each detection wall are all connected to the control module corresponding to that gesture recognition box, for example to the first control module 62.
The first control module 62 stores multiple software modules and collects the distance data fed back by all ultrasonic sensors. The first control module 62 determines, from the current distance data of the palm and the distance data of the palm after at least one set time interval, a set of distance change values of all ultrasonic sensors relative to the palm, and determines the current gesture from the set of distance change values.
The first control module 62 comprises an initialization unit 620, a measurement unit 622, a gesture recognition unit 624, a docking unit 626, and an output interface 628.
Likewise, for a robot controlled with both hands, a second gesture recognition device is provided for the other palm. The second gesture recognition device comprises a second gesture recognition box 70 and a second control module 72 provided on the gesture recognition box 70. The second control module 72 comprises an initialization unit 720, a measurement unit 722, a gesture recognition unit 724, a docking unit 726, and an output interface 728.
The initialization units (620 and 720) are configured to initialize the program and record the palm's baseline data when a palm reaches into a gesture recognition device provided with a number of ultrasonic sensors.
The measurement units (622 and 722) are configured to acquire, at a set frequency, the distance data of the palm measured by the ultrasonic sensors.
The gesture recognition units (624 and 724) are configured to determine, from the current distance data of the palm and the distance data of the palm after at least one set time interval, a set of distance change values of all ultrasonic sensors relative to the palm, and to determine the current gesture from the set of distance change values.
After the current gesture is determined, the docking units (626 and 726) are configured to determine a robot action from the current gesture.
The output interfaces (628 and 728) are connected to the robot and are configured to send the robot action to the connected robot to complete the first control operation.
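A minimal model of the five-wall sensor box with its baseline recording might look like the sketch below. The per-wall sensor count and the `read_sensor` interface are assumptions introduced for illustration; the patent only states that each wall carries several sensors and that the baseline is recorded when the hand first enters.

```python
WALLS = ("left", "right", "top", "bottom", "back")

class GestureBox:
    """Minimal model of the five-wall ultrasonic sensor box.

    `read_sensor` is an assumed callable (wall, idx) -> distance in mm;
    in hardware it would wrap the ultrasonic ranging driver.
    """

    def __init__(self, read_sensor, sensors_per_wall=16):
        self.read_sensor = read_sensor
        self.ids = [(w, i) for w in WALLS for i in range(sensors_per_wall)]
        self.baseline = None

    def record_baseline(self):
        """Initialization step: store the palm's original distances when the
        hand first enters the box."""
        self.baseline = {sid: self.read_sensor(*sid) for sid in self.ids}

    def distance_changes(self):
        """Set of distance-change values of all sensors relative to the palm
        (current reading minus the recorded baseline)."""
        return {sid: self.read_sensor(*sid) - self.baseline[sid]
                for sid in self.ids}
```

Keeping the change set keyed by (wall, index) preserves which wall the palm moved toward, which is exactly the information a later feature-extraction step needs.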
Embodiment 2
Referring again to Fig. 5, the robot of this embodiment comprises a body 10, a first arm 41, a second arm 42, a first foot 31, and a second foot 32. The waist 20 of the body 10 is provided with a main controller 22 that can be connected, by wire or wirelessly, to the first gesture recognition device and the second gesture recognition device. A first manipulator 51 is provided at the end of the first arm 41, and a second manipulator 52 at the end of the second arm 42.
The first gesture recognition device comprises a first gesture recognition box 60 and a first control module 62 provided on the gesture recognition box 60; the second gesture recognition device comprises a second gesture recognition box 70 and a second control module 72 provided on the second gesture recognition box 70. The following description takes the first gesture recognition device as an example.
The gesture recognition box comprises a left detection wall, a right detection wall, an upper detection wall, a lower detection wall, and a rear detection wall, each carrying a number of ultrasonic sensors. Taking the left detection wall 63 as an example, multiple columns of ultrasonic sensors 65 are arranged on it.
Each gesture recognition box is also provided with an operation opening through which the palm reaches in.
As shown in Fig. 4, the first control module 62 stores multiple software modules and collects the distance data fed back by all ultrasonic sensors. The first control module 62 determines, from the current distance data of the palm and the distance data of the palm after at least one set time interval, a set of distance change values of all ultrasonic sensors relative to the palm, and determines the current gesture from the set of distance change values.
The first control module 62 comprises an initialization unit 620, a measurement unit 622, a gesture recognition unit 624, a docking unit 626, and an output interface 628.
Likewise, for a robot controlled with both hands, a second gesture recognition device is provided for the other palm. The second gesture recognition device comprises a second gesture recognition box 70 and a second control module 72 provided on the gesture recognition box 70. The second control module 72 comprises an initialization unit 720, a measurement unit 722, a gesture recognition unit 724, a docking unit 726, and an output interface 728.
The initialization units (620 and 720) are configured to initialize the program and record the palm's baseline data when a palm reaches into a gesture recognition device provided with a number of ultrasonic sensors.
The measurement units (622 and 722) are configured to acquire, at a set frequency, the distance data of the palm measured by the ultrasonic sensors.
The gesture recognition units (624 and 724) are configured to determine, from the current distance data of the palm and the distance data of the palm after at least one set time interval, a set of distance change values of all ultrasonic sensors relative to the palm, and to determine the current gesture from the set of distance change values.
After the current gesture is determined, the docking units (626 and 726) are configured to determine a robot action from the current gesture.
The output interfaces (628 and 728) are connected to the robot and are configured to send the robot action to the robot's main controller 22 to complete control of the first manipulator 51.
The robot of this embodiment can be controlled with one hand through a single gesture recognition device, or with both hands through two gesture recognition devices for more delicate control.
In the two-handed control scheme of this embodiment, the robot further comprises a second manipulator 52, and the main controller 22 is connected to a second gesture recognition box and a second control module provided on that gesture recognition box,
the first control module 62 being connected to the second control module 72;
when another palm reaches into the second gesture recognition device provided with a number of ultrasonic sensors, the program is initialized and the palm's baseline data is recorded;
the second control module 72 acquires, at a set frequency, the distance data of that palm measured by the ultrasonic sensors;
the second control module 72 determines, from the current distance data of that palm and the distance data of that palm after at least one set time interval, a set of distance change values of all ultrasonic sensors relative to that palm, and determines a second gesture from the set of distance change values;
the second control module 72 determines a second robot action from the second gesture; and
the second robot action is sent to the connected robot to complete control of the second manipulator 52.
The gesture recognition units (624 and 724) also perform feature extraction and query processing. As system modules, the gesture recognition units (624 and 724) comprise an extraction unit and a query unit.
The extraction unit is configured to extract recognition features from the set of distance change values.
The query unit is configured to query a library that maps recognition features to gestures to confirm the gesture corresponding to the current recognition features.
In this embodiment, the first control module 62 and the second control module 72 are connected to the main controller 22 by wire or through a wireless network.
The robot actions include moving forward, moving backward, moving left, moving right, grasping, picking up, flipping, and the like; with a more capable processor, more delicate palm and fingertip movements can also be captured and recognized.
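A gesture-to-action mapping of the kind described could be as simple as a lookup table. Which gesture triggers which of the listed actions is not specified in the patent, so the gesture names and pairings below are purely illustrative.

```python
# Hypothetical gesture-to-action mapping; the action set comes from the
# text, the gesture labels and pairings are assumptions.
ACTION_MAP = {
    "palm_forward": "move_forward",
    "palm_back": "move_backward",
    "palm_left": "move_left",
    "palm_right": "move_right",
    "fist": "grasp",
    "fist_lift": "pick_up",
    "palm_flip": "flip",
}

def docking_unit(gesture):
    """Resolve a recognized gesture into a robot action; an unmapped gesture
    produces no command rather than an arbitrary motion."""
    return ACTION_MAP.get(gesture)
```

Returning `None` for an unrecognized gesture gives the output interface an explicit "do nothing" signal, which is safer for a physical robot than defaulting to any motion.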
Embodiment 3
Referring to Fig. 1, the robot operation control method of an embodiment of the present application comprises the following steps:
Step 101: when a palm reaches into the first gesture recognition device provided with a number of ultrasonic sensors, initialize the program and record the palm's baseline data; after the baseline data is recorded, the palm can begin to move so that the ultrasonic sensors can collect distance data and the rate of change of the distance data can be computed;
Step 102: after the palm's baseline data has been collected, change the hand shape according to the desired robot control; the first control module acquires, at a set frequency, the distance data of the palm measured by the ultrasonic sensors;
Step 103: the first control module determines, from the current distance data of the palm and the distance data of the palm after at least one set time interval, a set of distance change values of all ultrasonic sensors relative to the palm, and determines the current gesture from the set of distance change values;
Step 104: the first control module determines a robot action from the current gesture;
Step 105: send the robot action to the connected robot to complete the first control operation.
The above is the processing flow for controlling the robot with one hand using a single gesture recognition device.
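The one-hand flow of steps 101 to 105 can be sketched as a single sampling loop. The `box`, `recognize`, `to_action`, and `send_to_robot` interfaces are assumptions introduced for illustration; only the step structure comes from the method.

```python
import time

def control_loop(box, recognize, to_action, send_to_robot,
                 rate_hz=20.0, cycles=None):
    """Sketch of steps 101-105: `box` wraps the sensor array and exposes
    record_baseline()/distance_changes(), `recognize` implements step 103,
    `to_action` implements step 104, `send_to_robot` implements step 105.
    All four are assumed interfaces, not names from the patent."""
    box.record_baseline()                    # step 101: record baseline data
    period = 1.0 / rate_hz                   # step 102: set sampling frequency
    issued = []
    n = 0
    while cycles is None or n < cycles:
        delta = box.distance_changes()       # step 103: distance-change set
        gesture = recognize(delta)           # step 103: current gesture
        action = to_action(gesture)          # step 104: gesture -> action
        if action is not None:
            send_to_robot(action)            # step 105: dispatch to the robot
            issued.append(action)
        n += 1
        time.sleep(period)
    return issued
```

Passing `cycles=None` keeps the loop running for as long as the hand stays in the box; a bounded `cycles` is handy for testing.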
Building on one-handed control, the second gesture data input from another gesture recognition device can be fused into the robot's main controller 22, allowing the robot to be controlled to complete more complex actions.
Referring also to Fig. 2, in the two-handed robot operation control method, the first control module 62 is connected to the second control module 72, and the fusion and superposition of actions is completed by the first control module 62 or the second control module 72 alone; alternatively, both the first control module 62 and the second control module 72 are connected to the main controller 22, and the main controller completes the fusion and superposition of actions.
In the two-handed control process:
Step 201: when another palm reaches into the second gesture recognition device provided with a number of ultrasonic sensors, initialize the program and record the palm's baseline data;
Step 202: the second control module acquires, at a set frequency, the distance data of the palm measured by the ultrasonic sensors;
Step 203: the second control module determines, from the current distance data of the palm and the distance data of the palm after at least one set time interval, a set of distance change values of all ultrasonic sensors relative to the palm, and determines a second gesture from the set of distance change values;
Step 204: the second control module determines a second robot action from the second gesture;
Step 205: send the second robot action to the connected robot to complete the second control operation.
For one-handed operation, a specific implementation of the gesture recognition step 103 further comprises the following sub-steps:
Step 103-1: extract recognition features from the set of distance change values;
Step 103-2: query a library that maps recognition features to gestures to confirm the gesture corresponding to the current recognition features.
Likewise, for two-handed operation, the gesture recognition step 103 and the gesture recognition step 203 are performed simultaneously, and a specific implementation of the gesture recognition step 203 further comprises the following sub-steps:
Step 203-1: extract recognition features from the set of distance change values;
Step 203-2: query a library that maps recognition features to gestures to confirm the gesture corresponding to the current recognition features.
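The fusion of the two hands' actions might be sketched as below. The patent states that fusion happens in one of the control modules or in the main controller 22 but leaves the rule unspecified, so dispatching both pending actions once each hand has reported in the same control cycle is an assumption.

```python
class MainController:
    """Sketch of the main controller fusing both hands' actions per control
    tick. `robot` is an assumed interface exposing execute(hand, action);
    the wait-for-both fusion rule is an assumption."""

    def __init__(self, robot):
        self.robot = robot
        self.pending = {"first": None, "second": None}

    def receive(self, hand, action):
        """Called by the first or second control module with its latest
        recognized action; dispatches once both hands have reported."""
        self.pending[hand] = action
        if all(self.pending.values()):
            for hand_name, act in self.pending.items():
                self.robot.execute(hand_name, act)
            self.pending = {"first": None, "second": None}
```

Buffering until both modules have reported keeps the two manipulators' commands aligned to the same control cycle; a real controller would also need a timeout so one idle hand does not stall the other.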
The robot operation control method and robot of this embodiment use an ultrasonic gesture recognition device to collect multi-dimensional gesture information of the user's one or two hands in real time and use this gesture information to control the robot to complete various actions; gesture recognition is sensitive and accurate, controllability of the robot is enhanced, and operating the robot becomes more enjoyable for the user.
The robot operation control method and robot of this embodiment provide at least one gesture recognition device that uses ultrasound to recognize multi-dimensional palm information and track the palm's gestures in real time; the recognized gestures can be used to operate and control the connected robot. In use, the hand is placed into the box body of a gesture recognition device, whose five inner faces are covered with miniature ultrasonic sensors. Small movements of the palm such as translation, clenching, flipping, tilting, and even bending or flicking fingers can all be detected and captured, so gesture recognition is very sensitive.
The ultrasonic sensors in this embodiment transmit the detected gesture distance data to the main controller 22 for storage and computation. The main controller 22 computes the current palm distance data and the palm distance data after at least one set time interval, determines a set of distance change values of all ultrasonic sensors relative to the palm, determines the current gesture from that set, and determines the corresponding robot operation command from the current gesture. With the gesture recognition device of the present application, the robot's motion can be controlled through one- or two-handed movements such as translating, flipping, flexing and extending fingers, and tilting the palm, making robot operation more enjoyable for the user.
Fig. 6 is a schematic diagram of the hardware structure of the robot device 600 provided by an embodiment of the present application. As shown in Fig. 6, the device 600 comprises:
one or more processors 610, a memory 620, and a communication component 650; one processor 610 is taken as an example in Fig. 6. The memory 620 stores instructions executable by the at least one processor 610, namely a computer program 640; when the instructions are executed by the at least one processor, a data channel is established through the communication component 650 so that the at least one processor can perform the robot operation control method.
The processor 610, the memory 620, and the communication component 650 may be connected by a bus or in other ways; connection by a bus is taken as an example in Fig. 6.
As a non-volatile computer-readable storage medium, the memory 620 can be used to store non-volatile software programs, non-volatile computer-executable programs, and modules, such as the program instructions/modules corresponding to the robot operation control method in the embodiments of the present application. The processor 610 executes the various functional applications and data processing of the server by running the non-volatile software programs, instructions, and modules stored in the memory 620, thereby implementing the robot operation control method of the above method embodiments.
The memory 620 may comprise a program storage area and a data storage area, where the program storage area can store the operating system and the application programs required for at least one function, and the data storage area can store data created according to the use of the robot, and the like. In addition, the memory 620 may comprise high-speed random access memory and may also comprise non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device. In some embodiments, the memory 620 may optionally comprise memory provided remotely from the processor 610; such remote memory may be connected to the robot through a network. Examples of such networks include, but are not limited to, the Internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The one or more modules are stored in the memory 620 and, when executed by the one or more processors 610, perform the above robot operation control method, for example executing method steps 101 to 105 in Fig. 1 described above or method steps 201 to 205 in Fig. 2, and implementing the functions of the first control module 62 and the second control module 72 of Fig. 4.
The above product can perform the method provided by the embodiments of the present application and has the functional modules and beneficial effects corresponding to performing the method. For technical details not described exhaustively in this embodiment, refer to the method provided by the embodiments of the present application.
An embodiment of the present application provides a non-volatile computer-readable storage medium storing computer-executable instructions that are executed by one or more processors, for example executing method steps 101 to 105 in Fig. 1 described above or method steps 201 to 205 in Fig. 2, and implementing the functions of the first control module 62 and the second control module 72 of Fig. 4.
Finally, it should be noted that the above embodiments are only intended to illustrate the technical solutions of the present application, not to limit them. Within the scope of the ideas of the present application, the technical features of the above embodiments or of different embodiments may also be combined, the steps may be performed in any order, and many other variations of the different aspects of the present application as described above exist; for brevity, they are not provided in detail. Although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that they may still modify the technical solutions recorded in the foregoing embodiments or make equivalent substitutions for some of the technical features therein, and that such modifications or substitutions do not cause the essence of the corresponding technical solutions to depart from the scope of the technical solutions of the embodiments of the present application.

Claims (13)

  1. A robot operation control method, characterized by comprising the following steps:
    when a palm reaches into a first gesture recognition device provided with a number of ultrasonic sensors, initializing the program and recording the palm's baseline data;
    changing the hand shape, and a first control module acquiring, at a set frequency, the distance data of the palm measured by the ultrasonic sensors;
    the first control module determining, from the current distance data of the palm and the distance data of the palm after at least one set time interval, a set of distance change values of all ultrasonic sensors relative to the palm, and determining the current gesture from the set of distance change values;
    the first control module determining a robot action from the current gesture; and
    sending the robot action to the connected robot to complete a first control operation.
  2. The robot operation control method according to claim 1, characterized in that the first control module is connected to a second control module, and the second control module is provided on a second gesture recognition device;
    when another palm reaches into the second gesture recognition device provided with a number of ultrasonic sensors, the program is initialized and the palm's baseline data is recorded;
    the second control module acquires, at a set frequency, the distance data of that palm measured by the ultrasonic sensors;
    the second control module determines, from the current distance data of that palm and the distance data of that palm after at least one set time interval, a set of distance change values of all ultrasonic sensors relative to that palm, and determines a second gesture from the set of distance change values;
    the second control module determines a second robot action from the second gesture; and
    the second robot action is sent to the connected robot to complete a second control operation.
  3. The robot operation control method according to claim 1, characterized in that
    the step of determining the current gesture from the set of distance change values further comprises:
    extracting recognition features from the set of distance change values; and
    querying a library that maps recognition features to gestures to confirm the gesture corresponding to the current recognition features.
  4. A gesture recognition device, characterized by comprising a gesture recognition box and a control module provided on the gesture recognition box, the gesture recognition box comprising a plurality of detection walls enclosing a box body and an operation opening through which a hand reaches in, each detection wall being provided with a number of ultrasonic sensors connected to the control module, and the control module comprising a measurement unit and a gesture recognition unit,
    the measurement unit being configured to acquire, at a set frequency, the distance data of a palm measured by the ultrasonic sensors; and
    the gesture recognition unit being configured to determine, from the current distance data of the palm and the distance data of the palm after at least one set time interval, a set of distance change values of all ultrasonic sensors relative to the palm, and to determine the current gesture from the set of distance change values.
  5. The gesture recognition device according to claim 4, characterized in that the control module further comprises an initialization unit, a docking unit, and an output interface,
    the initialization unit being configured to initialize the program and record the palm's baseline data when a palm reaches into the first gesture recognition device provided with a number of ultrasonic sensors;
    the docking unit being configured, after the current gesture is determined, to determine a robot action from the current gesture; and
    the output interface being configured to send the robot action to the connected robot to complete the first control operation.
  6. The gesture recognition device according to claim 5, characterized in that the output interface is connected to a robot, and the gesture recognition box comprises a left detection wall, a right detection wall, an upper detection wall, a lower detection wall, and a rear detection wall, each detection wall carrying a number of ultrasonic sensors.
  7. A robot comprising a body and a first manipulator, the body being provided with a main controller, characterized by further comprising a gesture recognition box and a first control module provided on the gesture recognition box, the first control module being connected to the main controller, the gesture recognition box comprising a plurality of detection walls and an operation opening through which a hand reaches in, each detection wall being provided with a number of ultrasonic sensors connected to the control module, and the control module comprising a measurement unit and a gesture recognition unit,
    the measurement unit being configured to acquire, at a set frequency, the distance data of a palm measured by the ultrasonic sensors; and
    the gesture recognition unit being configured to determine, from the current distance data of the palm and the distance data of the palm after at least one set time interval, a set of distance change values of all ultrasonic sensors relative to the palm, and to determine the current gesture from the set of distance change values.
  8. The robot according to claim 7, characterized in that the control module further comprises an initialization unit, a docking unit, and an output interface,
    the initialization unit being configured to initialize the program and record the palm's baseline data when a palm reaches into the gesture recognition box;
    the docking unit being configured, after the current gesture is determined, to determine a robot action from the current gesture; and
    the output interface being configured to send the robot action to the robot's main controller to complete control of the first manipulator.
  9. The robot according to claim 8, characterized in that the robot further comprises a second manipulator, the main controller being connected to a second gesture recognition box and a second control module provided on that gesture recognition box,
    the first control module being connected to the second control module;
    when another palm reaches into the second gesture recognition device provided with a number of ultrasonic sensors, the program is initialized and the palm's baseline data is recorded;
    the second control module acquires, at a set frequency, the distance data of that palm measured by the ultrasonic sensors;
    the second control module determines, from the current distance data of that palm and the distance data of that palm after at least one set time interval, a set of distance change values of all ultrasonic sensors relative to that palm, and determines a second gesture from the set of distance change values;
    the second control module determines a second robot action from the second gesture; and
    the second robot action is sent to the connected robot to complete control of the second manipulator.
  10. The robot according to claim 9, characterized in that the gesture recognition unit comprises an extraction unit and a query unit:
    the extraction unit being configured to extract recognition features from the set of distance change values; and
    the query unit being configured to query a library that maps recognition features to gestures to confirm the gesture corresponding to the current recognition features.
  11. The robot according to any one of claims 7 to 10, characterized in that the first control module and the second control module are connected to the main controller through a wireless network.
  12. The robot according to claim 10, characterized in that the robot actions include moving forward, moving backward, moving left, moving right, grasping, picking up, and flipping.
  13. A computer program product, characterized in that the computer program product comprises a computer program stored on a non-volatile computer-readable storage medium, the computer program comprising program instructions which, when executed by a computer, cause the computer to perform the method according to any one of claims 1 to 3.
PCT/CN2018/124407 2018-12-27 2018-12-27 Robot operation control method, gesture recognition device, and robot WO2020133078A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2018/124407 WO2020133078A1 (zh) 2018-12-27 2018-12-27 Robot operation control method, gesture recognition device, and robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2018/124407 WO2020133078A1 (zh) 2018-12-27 2018-12-27 Robot operation control method, gesture recognition device, and robot

Publications (1)

Publication Number Publication Date
WO2020133078A1 true WO2020133078A1 (zh) 2020-07-02

Family

ID=71129416

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/124407 WO2020133078A1 (zh) 2018-12-27 2018-12-27 Robot operation control method, gesture recognition device, and robot

Country Status (1)

Country Link
WO (1) WO2020133078A1 (zh)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN201859393U (zh) * 2010-04-13 2011-06-08 任峰 Three-dimensional gesture recognition box
US20140033141A1 (en) * 2011-04-13 2014-01-30 Nokia Corporation Method, apparatus and computer program for user control of a state of an apparatus
CN104820491A (zh) * 2014-01-30 2015-08-05 霍尼韦尔国际公司 System and method for providing an ergonomic three-dimensional gesture multimodal interface for a cockpit
CN105955489A (zh) * 2016-05-26 2016-09-21 苏州活力旺机器人科技有限公司 Robot gesture recognition teaching device and method
CN206048251U (zh) * 2016-07-04 2017-03-29 浙江理工大学 Gesture-recognition humanoid manipulator system based on multi-sensor fusion



Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18944540

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18944540

Country of ref document: EP

Kind code of ref document: A1