CN108748139A - Robot control method and device based on body motion sensing - Google Patents
Robot control method and device based on body motion sensing Download PDF Info
- Publication number
- CN108748139A (application number CN201810347625.3A)
- Authority
- CN
- China
- Prior art keywords
- depth image
- robot
- user
- hand
- terminal device
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Links
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1602—Programme controls characterised by the control system, structure, architecture
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1694—Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
Abstract
The embodiments of the present invention relate to the field of human-computer interaction, and in particular to a robot control method and device based on body motion sensing. The method is applied to a terminal device that is connected to a Kinect sensor and to a robot. The terminal device receives, in real time, depth images of the user's hand acquired by the Kinect sensor, analyzes the depth images to obtain a recognition result, and then controls the robot to execute the user's hand motion according to the recognition result. By capturing the user's hand motions and controlling the robot accordingly, this scheme avoids complicated button operations and improves the robot's operating efficiency.
Description
Technical field
The present invention relates to the field of human-computer interaction, and in particular to a robot control method and device based on body motion sensing.
Background technology
In the field of robotics, some automated production lines have successfully used mechanical arms for simple, repetitive operations such as spray painting and welding. However, using a mechanical arm for more complex operations, such as bomb disposal or deep-sea exploration, requires building a control system on technologies such as image recognition and robot localization, which raises both the technical difficulty and the cost. For this reason, such complex operations are mostly performed by manual teleoperation rather than by autonomous arm motion. Yet manual operation requires familiarity with many buttons, and the more capable the robot, the more buttons its controller has, which increases the difficulty of operation and reduces efficiency. A more convenient way to control a robot is therefore needed.
Invention content
An object of the present invention is to provide a robot control method based on body motion sensing, so that the robot can be controlled through human motion and its operating efficiency improved.
Another object of the present invention is to provide a robot control device based on body motion sensing, likewise enabling the robot to be controlled through human motion and improving its operating efficiency.
To achieve the above objects, the embodiments of the present invention adopt the following technical solutions:
In a first aspect, an embodiment of the present invention provides a robot control method based on body motion sensing, applied to a terminal device that is connected to a Kinect sensor and to a robot. The method includes: receiving, in real time, depth images of the user's hand acquired by the Kinect sensor; recognizing the received depth images to obtain a recognition result; and controlling the robot to execute the user's hand motion according to the recognition result.
In a second aspect, an embodiment of the present invention further provides a robot control device based on body motion sensing, applied to a terminal device that is connected to a Kinect sensor and to a robot. The device includes: a receiving module for receiving, in real time, the depth images of the user's hand acquired by the Kinect sensor; an identification module for recognizing the received depth images to obtain a recognition result; and a control module for controlling the robot to execute the user's hand motion according to the recognition result.
In the robot control method and device based on body motion sensing provided by the embodiments of the present invention, the method is applied to a terminal device connected to a Kinect sensor and to a robot. The terminal device receives, in real time, the depth images of the user's hand acquired by the Kinect sensor, analyzes them to obtain a recognition result, and then controls the robot to execute the user's hand motion according to the recognition result. By capturing the user's hand motions and controlling the robot accordingly, this scheme avoids complicated button operations and improves the robot's operating efficiency.
To make the above objects, features, and advantages of the present invention clearer and easier to understand, preferred embodiments are described in detail below with reference to the accompanying drawings.
Description of the drawings
To explain the technical solutions of the embodiments of the present invention more clearly, the drawings required by the embodiments are briefly introduced below. It should be understood that the following drawings show only certain embodiments of the present invention and should not be regarded as limiting its scope; those of ordinary skill in the art may derive other related drawings from them without creative effort.
Fig. 1 is a schematic diagram of an application scenario of a robot control method based on body motion sensing provided by an embodiment of the present invention.
Fig. 2 is a schematic flowchart of a robot control method based on body motion sensing provided by an embodiment of the present invention.
Fig. 3 is a schematic diagram of the functional modules of a robot control device based on body motion sensing provided by an embodiment of the present invention.
Reference numerals: 100 - Kinect sensor; 200 - terminal device; 300 - wireless signal transmitter; 400 - robot; 210 - robot control device based on body motion sensing; 211 - receiving module; 212 - identification module; 213 - determining module; 214 - control module.
Specific implementation mode
The technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. The components of the embodiments, as generally described and illustrated in the drawings, may be arranged and designed in many different configurations. The following detailed description of the embodiments is therefore not intended to limit the claimed scope of the invention, but merely represents selected embodiments. All other embodiments obtained by those skilled in the art, based on the embodiments of the present invention and without creative effort, fall within the protection scope of the present invention.
It should be noted that similar reference numerals and letters denote similar items in the following drawings; once an item is defined in one drawing, it need not be further defined or explained in subsequent drawings. In addition, in the description of the present invention, the terms "first", "second", and the like are used only to distinguish descriptions and are not to be understood as indicating or implying relative importance.
With the continuous development of artificial intelligence and virtual reality, the design and implementation of contactless, natural human-computer interaction systems has become a research hotspot and offers new possibilities for improving the operating experience and control modes of robots. It is therefore desirable to introduce a control mode based on natural interaction, in which the mechanical arm is driven by imitating human motion, thereby improving the robot's operating experience as well as its operating efficiency and level of intelligence.
Referring to Fig. 1, a schematic diagram of an application scenario of a robot control method based on body motion sensing provided by an embodiment of the present invention is shown. The scenario includes a Kinect sensor 100, a terminal device 200, a wireless signal transmitter 300, and a robot 400.
The Kinect sensor 100 acquires depth images of the user's hand and sends them to the terminal device 200, which processes them further. Note that the user may make many different gestures, enabling diversified control of the robot 400; the gestures include clenching, opening, and moving left, right, up, or down. The Kinect sensor 100 acquires depth images of the user's hand in real time, i.e., images before and after each gesture change.
The terminal device 200 may be, but is not limited to, an intelligent electronic device such as a computer or a tablet. A depth-image processing driver is installed on the terminal device 200 to recognize the received depth images of the user's hand. The recognition method is as follows: a preset center point is configured on the terminal device 200 in advance, and all received depth images use this preset center point as the reference. That is, the center point of each received depth image is calculated, and the position of the depth image is determined from the difference between its calculated center point and the preset center point. If the center point of the next depth image has moved in a given direction relative to the preset center point, the user's hand has moved in that direction. For example, if the center point of the next depth image is offset to the left of the preset center point, the user's hand is moving to the left, and the terminal device 200 then controls the robot 400 to move in the same direction as the user's hand. The moving direction of the robot 400 thus matches that of the user's hand, allowing the terminal device 200 to control the robot 400 according to the user's gestures.
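The center-point comparison described above can be sketched in Python. This is an illustrative sketch, not the patented implementation: the frame size, the preset center point, the hand depth band, and the dead-zone radius are all assumed values.

```python
import numpy as np

PRESET_CENTER = np.array([320.0, 240.0])  # preset reference point (assumed, frame center of 640x480)
DEAD_ZONE = 15.0                          # pixels of tolerated jitter before a move is reported (assumed)

def hand_centroid(depth_frame, near_mm=400, far_mm=1200):
    """Return the (x, y) centroid of pixels inside the assumed hand depth band, or None."""
    mask = (depth_frame > near_mm) & (depth_frame < far_mm)
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None
    return np.array([xs.mean(), ys.mean()])

def direction_from_center(centroid):
    """Classify the displacement of the centroid relative to the preset center point."""
    dx, dy = centroid - PRESET_CENTER
    if np.hypot(dx, dy) < DEAD_ZONE:
        return "hold"                      # within jitter tolerance: no movement
    if abs(dx) >= abs(dy):
        return "left" if dx < 0 else "right"
    return "up" if dy < 0 else "down"
```

A leftward offset of the centroid thus maps to a "left" command for the robot, mirroring the example in the text.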
In addition, if the area of the next depth image shrinks, relative to the area of the current depth image, beyond a set threshold, the user's hand is in a clenched state, and the terminal device 200 controls the robot 400 to perform the matching action, i.e., a grasping action. If the area of the next depth image grows, relative to the area of the current depth image, beyond the set threshold, the user's hand is in an open state, and the terminal device 200 controls the robot 400 to perform the matching action, i.e., a releasing action.
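The area comparison above can be sketched as follows: the hand region shrinks when the fist closes and grows when it opens. The ratio thresholds and the depth band are assumptions for illustration, not values from the patent.

```python
import numpy as np

GRIP_RATIO = 0.7     # area shrank below 70% of the previous frame (assumed threshold)
RELEASE_RATIO = 1.3  # area grew above 130% of the previous frame (assumed threshold)

def hand_area(depth_frame, near_mm=400, far_mm=1200):
    """Count pixels inside the assumed hand depth band."""
    return int(((depth_frame > near_mm) & (depth_frame < far_mm)).sum())

def grip_state(prev_area, next_area):
    """Compare consecutive hand areas and report grasp, release, or no change."""
    if prev_area == 0:
        return "none"
    ratio = next_area / prev_area
    if ratio < GRIP_RATIO:
        return "grasp"      # hand closed: robot gripper closes
    if ratio > RELEASE_RATIO:
        return "release"    # hand opened: robot gripper opens
    return "none"
```

Using ratios rather than absolute pixel counts keeps the comparison stable as the hand moves slightly nearer or farther from the sensor.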
In summary, the Kinect sensor 100 captures the user's gestures, the terminal device 200 recognizes them, and the robot 400 is controlled to perform the corresponding actions, realizing a motion-sensing control mode for the robot 400 and improving the user experience. It should be noted that the embodiments of the present invention only illustrate control of the robot 400 through hand movement, clenching, and opening; it is easy to see that the user can also control the robot 400 through other gestures, for example rotating the wrist to make the robot 400 rotate.
It should be noted that the terminal device 200 may be connected to the robot 400 directly by wireless means, so that the robot 400 is controlled to perform the corresponding operation directly from the analysis of the depth images acquired by the Kinect sensor 100. Alternatively, the terminal device 200 may have a wired connection to the wireless signal transmitter 300, which relays the signal wirelessly to the robot 400 so that the robot 400 performs the corresponding action. Specifically, after analyzing the depth images acquired by the Kinect sensor 100, the terminal device 200 obtains a recognition result describing how the gesture changed and sends it as a pulse signal to the wireless signal transmitter 300. The transmitter 300 filters the received pulse signals, discards erroneous ones, and forwards the correct pulse signals to the robot 400 to control it to perform the corresponding operation.
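The relay step above can be sketched as a validate-and-forward filter: the transmitter checks each command frame before passing it to the robot and drops malformed ones. The command codes and the single-byte checksum framing are assumptions for illustration; the patent does not specify the signal format.

```python
VALID_COMMANDS = {"left", "right", "up", "down", "grasp", "release"}

def encode(command):
    """Pack a command string with a simple modular checksum byte (assumed framing)."""
    payload = command.encode("ascii")
    return payload + bytes([sum(payload) % 256])

def filter_and_forward(frames, send):
    """Drop frames with a bad checksum or unknown command; forward the rest via send()."""
    for frame in frames:
        payload, check = frame[:-1], frame[-1]
        if sum(payload) % 256 != check:
            continue                      # corrupted in transit: discard
        command = payload.decode("ascii", errors="replace")
        if command in VALID_COMMANDS:
            send(command)                 # correct signal: pass on to the robot
```

Restricting forwarded frames to a known command set plus a checksum is one simple way to realize the "filter out erroneous pulse signals" behavior the text describes.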
The scheme provided by the embodiments of the present invention controls the machine by imitating human motion. On the one hand, it avoids the tedium of operating the robot with an additional controller or buttons; on the other hand, it can be widely used in hazardous environments, such as toxic, harmful, or explosive ones, improving safety.
Referring to Fig. 2, a robot control method based on body motion sensing provided by an embodiment of the present invention is shown. The method is applied to the terminal device 200 and includes:
Step S110: receiving, in real time, the depth images of the user's hand acquired by the Kinect sensor.
The user may make many different gestures, enabling diversified control of the robot 400; the gestures include clenching, opening, and moving left, right, up, or down. The Kinect sensor 100 acquires depth images of the user's hand in real time, i.e., images before and after each gesture change, and sends them to the terminal device 200.
Step S120: recognizing the received depth images to obtain a recognition result.
Step S130: controlling the robot to execute the user's hand motion according to the recognition result.
Specifically, the center point of each received depth image is used as a reference coordinate point. If the center point of the next depth image has moved in a given direction relative to the preset center point, the user's hand has moved in that direction; for example, if the center point of the next depth image is offset to the left of the preset center point, the user's hand is moving to the left, and the terminal device 200 controls the robot 400 to move in the same direction as the user's hand. In addition, if the area of the next depth image shrinks, relative to the area of the current depth image, beyond the set threshold, the user's hand is in a clenched state, and the terminal device 200 controls the robot 400 to perform the matching action, i.e., a grasping action. If the area of the next depth image grows beyond the set threshold, the user's hand is in an open state, and the terminal device 200 controls the robot 400 to perform the matching action, i.e., a releasing action.
Referring to Fig. 3, a schematic diagram of the functional modules of a robot control device 210 based on body motion sensing provided by an embodiment of the present invention is shown. The device includes a receiving module 211, an identification module 212, a determining module 213, and a control module 214.
The receiving module 211 receives, in real time, the depth images of the user's hand acquired by the Kinect sensor. In the embodiments of the present invention, step S110 may be executed by the receiving module 211.
The identification module 212 recognizes the received depth images to obtain a recognition result. In the embodiments of the present invention, step S120 may be executed by the identification module 212.
The determining module 213 selects the center point of each received depth image as the reference coordinate point.
The control module 214 controls the robot to execute the user's hand motion according to the recognition result. In the embodiments of the present invention, step S130 may be executed by the control module 214.
Since these have already been described in the section on the robot control method based on body motion sensing, they are not repeated here.
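The module decomposition of Fig. 3 can be sketched as small cooperating components. The class and method names below are illustrative only; they show how a determining module (holding the preset center), an identification module, and a control module could be wired together, with a deliberately simplified horizontal-only recognition rule.

```python
class DeterminingModule:
    """Holds the preset center point and positions each frame's center relative to it."""
    def __init__(self, preset_center):
        self.preset_center = preset_center

    def offset(self, center):
        return (center[0] - self.preset_center[0],
                center[1] - self.preset_center[1])

class IdentificationModule:
    """Turns a center-point offset into a recognition result (toy left/right rule)."""
    def __init__(self, determining):
        self.determining = determining

    def recognise(self, center):
        dx, dy = self.determining.offset(center)
        if dx == 0 and dy == 0:
            return None                       # no displacement: nothing to do
        return "left" if dx < 0 else "right"  # simplified horizontal-only rule

class ControlModule:
    """Forwards a non-empty recognition result to the robot."""
    def __init__(self, robot_send):
        self.robot_send = robot_send

    def act(self, result):
        if result is not None:
            self.robot_send(result)
```

A receiving module would sit in front of this chain, feeding in each frame's computed center point.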
In conclusion a kind of robot control method and device based on human body temperature type provided in an embodiment of the present invention, the base
It is applied to terminal device in the robot control method of human body temperature type, which connect with Kinect sensor, which sets
It is standby also to be connect with robot.The depth image for user's hand that the terminal device acquires real-time reception Kinect sensor, and
The depth image is analyzed to obtain recognition result, and then dynamic according to the hand of recognition result control robot execution user
Make.This programme is operated by acquiring the hand motion of user and controlling robot according to the hand motion of user, is avoided
Complicated button operation, improves machine task efficiency.
In the several embodiments provided in this application, it should be understood that the disclosed device and method may also be implemented in other ways. The device embodiments described above are merely illustrative. For example, the flowcharts and block diagrams in the accompanying drawings show the possible architectures, functions, and operations of the devices, methods, and computer program products of multiple embodiments of the present invention. Each box in a flowchart or block diagram may represent a module, program segment, or portion of code containing one or more executable instructions for implementing the specified logical function. It should also be noted that in some alternative implementations, the functions marked in the boxes may occur in an order different from that marked in the drawings; for example, two consecutive boxes may in fact be executed substantially in parallel, or sometimes in the reverse order, depending on the functions involved. Each box, and each combination of boxes, in the block diagrams and/or flowcharts may be implemented by a dedicated hardware-based system that performs the specified function or action, or by a combination of dedicated hardware and computer instructions.
In addition, the functional modules in the embodiments of the present invention may be integrated into one independent part, or each module may exist separately, or two or more modules may be integrated into one independent part.
If implemented in the form of a software functional module and sold or used as an independent product, the function may be stored in a computer-readable storage medium. Based on this understanding, the essence of the technical solution of the present invention, or the part that contributes to the prior art, may be embodied in the form of a software product. This computer software product is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or some of the steps of the methods described in the embodiments of the present invention.
It should be noted that, herein, relational terms such as "first" and "second" are used only to distinguish one entity or operation from another and do not necessarily require or imply any actual relationship or order between them. Moreover, the terms "include", "comprise", and any other variants are intended to cover non-exclusive inclusion, so that a process, method, article, or device that includes a list of elements includes not only those elements, but also other elements not explicitly listed, or elements inherent to that process, method, article, or device. Without further limitation, an element defined by the phrase "including a ..." does not exclude the presence of other identical elements in the process, method, article, or device that includes it.
The above are only preferred embodiments of the present invention and are not intended to limit it; for those skilled in the art, the invention may be modified and varied in various ways. Any modification, equivalent replacement, improvement, or the like made within the spirit and principles of the present invention shall be included in its protection scope.
The above is merely a specific embodiment, but the protection scope of the present invention is not limited thereto. Any change or replacement readily conceivable by those familiar with the art within the technical scope disclosed by the present invention shall be covered by its protection scope. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.
Claims (10)
1. A robot control method based on body motion sensing, applied to a terminal device, characterized in that the terminal device is connected to a Kinect sensor and to a robot, the method comprising:
receiving, in real time, depth images of the user's hand acquired by the Kinect sensor;
recognizing the received depth images to obtain a recognition result;
controlling the robot to execute the user's hand motion according to the recognition result.
2. The method according to claim 1, characterized in that the method further comprises:
calculating the center point of the depth image;
determining the position of the depth image according to the difference between its center point and a preset center point.
3. The method according to claim 2, characterized in that the step of recognizing the received depth images to obtain a recognition result comprises:
if the center point of the depth image shifts in a given direction relative to the preset center point, determining that the user's hand has moved in that direction;
and the step of controlling the robot to execute the user's hand motion according to the recognition result comprises:
controlling the robot to move in the moving direction of the user's hand.
4. The method according to claim 2, characterized in that the step of recognizing the received depth images to obtain a recognition result comprises:
if the area of the next depth image shrinks, relative to the area of the current depth image, beyond a set threshold, determining that the user's hand is in a clenched state;
and the step of controlling the robot to execute the user's hand motion according to the recognition result comprises:
controlling the robot to perform a grasping action.
5. The method according to claim 2, characterized in that the step of recognizing the received depth images to obtain a recognition result comprises:
if the area of the next depth image grows, relative to the area of the current depth image, beyond a set threshold, determining that the user's hand is in an open state;
and the step of controlling the robot to execute the user's hand motion according to the recognition result comprises:
controlling the robot to perform a releasing operation.
6. The method according to claim 1, characterized in that the terminal device is also connected to a wireless signal transmitter, and the wireless signal transmitter is connected to the robot;
the terminal device sends the recognition result to the wireless signal transmitter;
the wireless signal transmitter filters out the correct recognition results and sends them to the robot, so as to control the robot to execute the user's hand motion according to the recognition result.
7. A robot control device based on body motion sensing, applied to a terminal device, characterized in that the terminal device is connected to a Kinect sensor and to a robot, the device comprising:
a receiving module for receiving, in real time, the depth images of the user's hand acquired by the Kinect sensor;
an identification module for recognizing the received depth images to obtain a recognition result;
a control module for controlling the robot to execute the user's hand motion according to the recognition result.
8. The device according to claim 7, characterized in that the device further comprises:
a determining module for calculating the center point of the depth image and determining the position of the depth image according to the difference between its center point and a preset center point.
9. The device according to claim 8, characterized in that the identification module is further configured to:
determine, if the center point of the depth image shifts in a given direction relative to the preset center point, that the user's hand has moved in that direction.
10. The device according to claim 8, characterized in that the identification module is further configured to:
determine, if the area of the next depth image shrinks relative to the area of the current depth image, that the user's hand is in a clenched state; and
determine, if the area of the next depth image grows relative to the area of the current depth image, that the user's hand is in an open state.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810347625.3A CN108748139A (en) | 2018-04-18 | 2018-04-18 | Robot control method and device based on body motion sensing |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810347625.3A CN108748139A (en) | 2018-04-18 | 2018-04-18 | Robot control method and device based on body motion sensing |
Publications (1)
Publication Number | Publication Date |
---|---|
CN108748139A CN108748139A (en) | 2018-11-06 |
Family
ID=64011183
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810347625.3A Pending CN108748139A (en) | 2018-04-18 | 2018-04-18 | Robot control method and device based on body motion sensing |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108748139A (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111694428A (en) * | 2020-05-25 | 2020-09-22 | 电子科技大学 | Gesture and track remote control robot system based on Kinect |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105825193A (en) * | 2016-03-25 | 2016-08-03 | 乐视控股(北京)有限公司 | Method and device for position location of center of palm, gesture recognition device and intelligent terminals |
KR101706864B1 (en) * | 2015-10-14 | 2017-02-17 | 세종대학교산학협력단 | Real-time finger and gesture recognition using motion sensing input devices |
CN106933340A (en) * | 2015-12-31 | 2017-07-07 | 北京体基科技有限公司 | Gesture motion recognition methods, control method and device and wrist equipment |
CN107030692A (en) * | 2017-03-28 | 2017-08-11 | 浙江大学 | One kind is based on the enhanced manipulator teleoperation method of perception and system |
CN107214679A (en) * | 2017-07-17 | 2017-09-29 | 武汉大学 | Mechanical arm man-machine interactive system based on body-sensing sensor |
CN107688779A (en) * | 2017-08-18 | 2018-02-13 | 北京航空航天大学 | A kind of robot gesture interaction method and apparatus based on RGBD camera depth images |
-
2018
- 2018-04-18 CN CN201810347625.3A patent/CN108748139A/en active Pending
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR101706864B1 (en) * | 2015-10-14 | 2017-02-17 | 세종대학교산학협력단 | Real-time finger and gesture recognition using motion sensing input devices |
CN106933340A (en) * | 2015-12-31 | 2017-07-07 | 北京体基科技有限公司 | Gesture motion recognition methods, control method and device and wrist equipment |
CN105825193A (en) * | 2016-03-25 | 2016-08-03 | 乐视控股(北京)有限公司 | Method and device for position location of center of palm, gesture recognition device and intelligent terminals |
CN107030692A (en) * | 2017-03-28 | 2017-08-11 | 浙江大学 | One kind is based on the enhanced manipulator teleoperation method of perception and system |
CN107214679A (en) * | 2017-07-17 | 2017-09-29 | 武汉大学 | Mechanical arm man-machine interactive system based on body-sensing sensor |
CN107688779A (en) * | 2017-08-18 | 2018-02-13 | 北京航空航天大学 | A kind of robot gesture interaction method and apparatus based on RGBD camera depth images |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111694428A (en) * | 2020-05-25 | 2020-09-22 | 电子科技大学 | Gesture and track remote control robot system based on Kinect |
CN111694428B (en) * | 2020-05-25 | 2021-09-24 | 电子科技大学 | Gesture and track remote control robot system based on Kinect |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10166673B2 (en) | Portable apparatus for controlling robot and method thereof | |
US10179407B2 (en) | Dynamic multi-sensor and multi-robot interface system | |
US9230329B2 (en) | Method, computer program and apparatus for determining a gripping location | |
US20160092504A1 (en) | Recognition of Free-form Gestures from Orientation Tracking of a Handheld or Wearable Device | |
CN103246290B (en) | A kind of cloud platform control method and system thereof | |
KR101860200B1 (en) | Selection of a device or an object by means of a camera | |
CN105930775B (en) | Facial orientation recognition methods based on sensitivity parameter | |
EP2422295A1 (en) | Object-learning robot and method | |
EP3007030A1 (en) | Portable device and control method via gestures | |
CN106514667A (en) | Human-computer cooperation system based on Kinect skeletal tracking and uncalibrated visual servo | |
KR20200068075A (en) | Remote guidance apparatus and method capable of handling hyper-motion step based on augmented reality and machine learning | |
CN109839827B (en) | Gesture recognition intelligent household control system based on full-space position information | |
CN109746914B (en) | Method of constructing robot, robot control apparatus, system, and storage medium | |
CN108972593A (en) | Control method and system under a kind of industrial robot system | |
CN107894834B (en) | Control gesture recognition method and system in augmented reality environment | |
CN108776444A (en) | Augmented reality man-machine interactive system suitable for CPS automatic control systems | |
CN109558004A (en) | A kind of control method and device of human body auxiliary robot | |
CN105808129B (en) | Method and device for quickly starting software function by using gesture | |
KR20180017074A (en) | Detection of the robot axial angles and selection of a robot by means of a camera | |
CN108748139A (en) | Robot control method and device based on body motion sensing | |
CN113601510A (en) | Robot movement control method, device, system and equipment based on binocular vision | |
CN113043268A (en) | Robot eye calibration method, device, terminal, system and storage medium | |
CN106951109B (en) | Method and device for acquiring hand gesture | |
US20210247758A1 (en) | Teleoperation with a wearable sensor system | |
CN111702759A (en) | Teaching system and robot teaching method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | ||
Application publication date: 20181106 |