CN112706148A - Robot operating device and method - Google Patents
Robot operating device and method
- Publication number
- CN112706148A CN112706148A CN202011564232.1A CN202011564232A CN112706148A CN 112706148 A CN112706148 A CN 112706148A CN 202011564232 A CN202011564232 A CN 202011564232A CN 112706148 A CN112706148 A CN 112706148A
- Authority
- CN
- China
- Prior art keywords
- dragging
- drag
- instruction
- touch
- touch panel
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J5/00—Manipulators mounted on wheels or on carriages
- B25J5/007—Manipulators mounted on wheels or on carriages mounted on wheels
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/08—Programme-controlled manipulators characterised by modular constructions
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1656—Programme controls characterised by programming, planning systems for manipulators
- B25J9/1661—Programme controls characterised by programming, planning systems for manipulators characterised by task planning, object-oriented languages
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1679—Programme controls characterised by the tasks executed
Landscapes
- Engineering & Computer Science (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Manipulator (AREA)
Abstract
The present invention provides a robot operating device, including: an instruction input module for receiving a touch operation input from a user, the input having at least a first dragging direction and a second dragging direction that intersect; drive shafts, including a first drive shaft associated with the first dragging direction and a second drive shaft associated with the second dragging direction; a touch operation detection module for detecting whether the touch operation is a dragging operation; and an action instruction generation module for receiving the operation instruction input from the user and selecting at least one drive shaft to execute an action instruction. The robot is thus operated along the gesture dragging direction through manually input instructions.
Description
Technical Field
The invention relates to the field of manual robot operation, and in particular to a robot operating device and method.
Background
For example, in an industrial robot system, the robot can be operated manually (manual operation). Such operation is used, for example, for teaching tasks (teaching). In that case, the user manually operates the robot using a teaching pendant (teach box) or the like connected to the controller that controls the robot. For this purpose, the pendant is provided with various operating keys (including keys with mechanical switches) dedicated to manual operation.
Disclosure of Invention
In order to solve the above technical problem, the invention provides a robot operating device.
The specific technical scheme adopted by the invention is as follows: a robot operating device, the operating device comprising:
an instruction input module for receiving a touch operation input from a user, the input having at least a first dragging direction and a second dragging direction that intersect;
drive shafts, including a first drive shaft associated with the first dragging direction and a second drive shaft associated with the second dragging direction;
a touch operation detection module for detecting whether the touch operation is a dragging operation;
an action instruction generation module for receiving the operation instruction input from the user and selecting at least one drive shaft to execute an action instruction;
wherein,
an action instruction for driving the first drive shaft and the second drive shaft is generated according to the detection result of the touch operation detection module;
if the operation is a dragging operation, it is judged whether the dragging operation is in the first dragging direction or the second dragging direction; if it is in the first dragging direction, the action instruction generation module generates an action instruction that drives the first drive shaft associated with the first dragging direction in accordance with the dragging operation; if it is in the second dragging direction, it generates an action instruction that drives the second drive shaft associated with the second dragging direction in accordance with the dragging operation.
In the robot operating device described above, the command input module is preferably a touch panel.
In the robot operating device according to the above aspect, it is preferable that the touch operation detection module is provided integrally with the touch panel or independently of the touch panel to detect an operation from a user.
In the robot operating device described above, preferably, the first dragging direction and the second dragging direction intersect to form a dragging coordinate system;
if the dragging operation falls in any quadrant of the coordinate system formed by the first and second dragging directions, the action instruction generation module drives the first drive shaft and the second drive shaft simultaneously according to the dragging direction within that quadrant, generating an action instruction so that the combined motion of the two drive shafts follows the dragging direction.
In the robot operating device according to the above aspect, preferably, the first dragging direction is a lateral direction on the touch panel, and the second dragging direction is a longitudinal direction on the touch panel.
In the robot operating device described above, preferably, the instruction input module further includes at least: a touch screen for accepting touch operation input from the user; a housing held by the user; and a plurality of buttons arranged outside the touch screen area for button operation;
the touch panel has a plurality of virtual buttons for button operations.
In the robot operating device described above, preferably, the touch operation detection module can also detect operations performed on the virtual buttons displayed on the touch panel or on the buttons outside the touch panel; such button operations have higher priority than the drag operation.
A method of robotic operation, the method comprising the steps of:
S1, the instruction input module accepts a touch operation input from a user, the input having at least a first dragging direction and a second dragging direction that intersect;
S2, associating a first drive shaft with the first dragging direction and a second drive shaft with the second dragging direction;
S3, detecting whether the touch operation is a drag operation;
S4, receiving the operation instruction input from the user and selecting at least one drive shaft to execute an action instruction;
wherein,
an action instruction for driving the first drive shaft and the second drive shaft is generated according to the detection result; if the operation is a dragging operation, it is judged whether the dragging operation is in the first dragging direction or the second dragging direction; if it is in the first dragging direction, the action instruction generation module generates an action instruction that drives the first drive shaft associated with the first dragging direction in accordance with the dragging operation; if it is in the second dragging direction, it generates an action instruction that drives the second drive shaft associated with the second dragging direction in accordance with the dragging operation.
In the robot operating method according to the present invention, the command input module is preferably a touch panel.
In the robot operation method as described above, it is preferable that the touch operation detection module is provided integrally with the touch panel or independently of the touch panel to detect an operation from a user.
In the robot operating method according to the above aspect, preferably, the first dragging direction is a lateral direction on the touch panel, and the second dragging direction is a longitudinal direction on the touch panel.
The beneficial technical effect is as follows: the robot is operated along the gesture dragging direction through manually input instructions.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this application, illustrate embodiments of the invention and, together with the description, serve to explain the invention and not to limit the invention.
Wherein:
fig. 1 is a wireframe diagram of a robot operating device provided in an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
In the description of the present invention, the terms "longitudinal", "lateral", "upper", "lower", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", and the like indicate orientations or positional relationships based on those shown in the drawings; they are used only for convenience of description and do not require that the invention be constructed or operated in a specific orientation, and thus should not be construed as limiting the invention. The terms "connected" and "coupled" used herein should be interpreted broadly: they may denote, for example, a fixed connection or a detachable connection, and the connection may be direct or indirect through intermediate members; the specific meanings of these terms will be understood by those skilled in the art as appropriate.
A robotic manipulation device, the manipulation device comprising:
an instruction input module for receiving a touch operation input from a user, the input having at least a first dragging direction and a second dragging direction that intersect;
drive shafts, including a first drive shaft associated with the first dragging direction and a second drive shaft associated with the second dragging direction;
a touch operation detection module for detecting whether the touch operation is a dragging operation;
an action instruction generation module for receiving the operation instruction input from the user and selecting at least one drive shaft to execute an action instruction;
wherein,
an action instruction for driving the first drive shaft and the second drive shaft is generated according to the detection result of the touch operation detection module;
if the operation is a dragging operation, it is judged whether the dragging operation is in the first dragging direction or the second dragging direction; if it is in the first dragging direction, the action instruction generation module generates an action instruction that drives the first drive shaft associated with the first dragging direction in accordance with the dragging operation; if it is in the second dragging direction, it generates an action instruction that drives the second drive shaft associated with the second dragging direction in accordance with the dragging operation.
The robot is operated along the gesture dragging direction through manually input instructions. As a finger is drawn across the touch screen, the touch operation detection module detects whether the movement is a drag before an action instruction is formed; this avoids misoperation and improves operating accuracy. At the same time, the action instruction is generated based on the dragging direction, so that the robot moves as it is dragged.
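The drag-detection and direction-classification behavior described above can be sketched as follows. This is a minimal illustration, not the patented implementation: the threshold value, the coordinate convention, and the name `classify_drag` are all assumptions.

```python
# Hypothetical sketch of the drag detection and direction classification
# described above. Threshold and names are illustrative assumptions.

DRAG_THRESHOLD = 10.0  # minimum travel (in pixels) to count as a drag, not a tap


def classify_drag(start, end):
    """Return 'first' (lateral), 'second' (longitudinal), or None for a non-drag."""
    dx = end[0] - start[0]
    dy = end[1] - start[1]
    # A very short movement is treated as a tap, which avoids misoperation.
    if (dx * dx + dy * dy) ** 0.5 < DRAG_THRESHOLD:
        return None
    # The dominant component decides which drive shaft the gesture maps to.
    return "first" if abs(dx) >= abs(dy) else "second"
```

Under these assumptions, a mostly lateral swipe selects the first drive shaft, a mostly longitudinal swipe selects the second, and a short jitter produces no action instruction.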
In some embodiments, the instruction input module is a touch panel. The touch panel is arranged as an instruction input module, so that the use is convenient.
In some embodiments, the touch operation detection module is provided integrally with the touch panel or independently of the touch panel to detect an operation from a user.
The invention also has the following implementation mode: the first dragging direction and the second dragging direction intersect to form a dragging coordinate system;
if the dragging operation falls in any quadrant of the coordinate system formed by the first and second dragging directions, the action instruction generation module drives the first drive shaft and the second drive shaft simultaneously according to the dragging direction within that quadrant, generating an action instruction so that the combined motion of the two drive shafts follows the dragging direction.
The first and second dragging directions form a coordinate system; when the drag path lies in any quadrant, the two drive shafts are driven in combination according to the position of the drag path within that quadrant, so that the driving direction matches the dragging direction.
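One way to realize this combined drive is to decompose the drag vector into its components along the two dragging axes. The sketch below assumes a normalized per-shaft command scale; the function name and signature are hypothetical.

```python
import math

# Illustrative decomposition of a quadrant drag into simultaneous commands
# for both drive shafts, so their combined motion follows the drag direction.

def combined_drive(dx, dy):
    """Map a drag vector (dx on the first axis, dy on the second) to unit-scaled shaft commands."""
    magnitude = math.hypot(dx, dy)
    if magnitude == 0:
        return 0.0, 0.0  # no drag, no motion
    # Each shaft is driven in proportion to the drag's component on its axis,
    # so the resultant drive direction equals the drag direction.
    return dx / magnitude, dy / magnitude
```

A drag of (3, 4), for instance, would command the two shafts in a 3:4 ratio, reproducing the drag angle.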
In some embodiments, the first drive shaft and the second drive shaft are each connected to a driving wheel, and driving in the dragging direction is achieved by controlling the difference between the rotation speeds of the two driving wheels.
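The differential mapping could look like the sketch below. The mixing formula and the normalized [-1, 1] command range are assumptions, not taken from the patent.

```python
# Assumed differential-drive mixing: each drive shaft turns one wheel, and
# steering along the drag direction comes from the difference between the
# two wheel speeds.

def wheel_speeds(forward, turn):
    """Mix a forward command and a turn command into (left, right) wheel speeds."""
    def clamp(v):
        # Keep each command inside the normalized range [-1, 1].
        return max(-1.0, min(1.0, v))
    # A positive turn speeds up the right wheel and slows the left one.
    return clamp(forward - turn), clamp(forward + turn)
```

With equal wheel speeds the platform drives straight; the larger the speed difference, the tighter the turn toward the dragged direction.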
In some embodiments, the drag direction is preset, specifically, the first drag direction is a lateral direction on the touch panel, and the second drag direction is a longitudinal direction on the touch panel.
The present invention also provides an embodiment in which the instruction input module includes at least: a touch screen for accepting touch operation input from the user; a housing held by the user; and a plurality of buttons arranged outside the touch screen area for button operation;
the touch panel has a plurality of virtual buttons for button operations. Providing both virtual buttons and physical buttons enables multiple modes of instruction input.
The present invention also has an embodiment in which the touch operation detection module can further detect operations performed on the virtual buttons displayed on the touch panel or on the buttons outside the touch panel; such button operations have higher priority than the drag operation. Since a button offers higher input accuracy than a drag on the touch panel, operating accuracy is ensured by giving button operations the highest priority.
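The stated priority rule — button operations, physical or virtual, preempt drag operations — can be sketched as a simple dispatcher. The event names and the event structure here are hypothetical.

```python
# Illustrative dispatcher for the priority rule described above: button
# events (physical or virtual) outrank drag events.

PRIORITY = {"button": 0, "virtual_button": 0, "drag": 1}  # lower value = higher priority


def dispatch(pending_events):
    """Return the pending input event that should be acted on first."""
    return min(pending_events, key=lambda event: PRIORITY[event["type"]])
```

So if a drag and a button press arrive together, the button press is acted on and the drag is deferred, matching the accuracy rationale above.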
A method of robotic operation, the method comprising the steps of:
S1, the instruction input module accepts a touch operation input from a user, the input having at least a first dragging direction and a second dragging direction that intersect;
S2, associating a first drive shaft with the first dragging direction and a second drive shaft with the second dragging direction;
S3, detecting whether the touch operation is a drag operation;
S4, receiving the operation instruction input from the user and selecting at least one drive shaft to execute an action instruction;
wherein,
an action instruction for driving the first drive shaft and the second drive shaft is generated according to the detection result; if the operation is a dragging operation, it is judged whether the dragging operation is in the first dragging direction or the second dragging direction; if it is in the first dragging direction, the action instruction generation module generates an action instruction that drives the first drive shaft associated with the first dragging direction in accordance with the dragging operation; if it is in the second dragging direction, it generates an action instruction that drives the second drive shaft associated with the second dragging direction in accordance with the dragging operation.
The robot is operated along the gesture dragging direction through manually input instructions. As a finger is drawn across the touch screen, the touch operation detection module detects whether the movement is a drag before an action instruction is formed; this avoids misoperation and improves operating accuracy. At the same time, the action instruction is generated based on the dragging direction, so that the robot moves as it is dragged.
In some embodiments, the instruction input module is a touch panel, and the touch operation detection module is provided either integrally with the touch panel or separately from it to detect operations from the user.
The first dragging direction is the lateral direction on the touch panel, and the second dragging direction is the longitudinal direction on the touch panel.
It should be noted that the embodiments in this specification are described in a progressive manner; identical or similar parts of the embodiments may be cross-referenced, and each embodiment focuses on its differences from the others. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of each embodiment. One of ordinary skill in the art can understand and implement this without inventive effort.
The above embodiments are only used for illustrating the embodiments of the present application, and not for limiting the embodiments of the present application, and those skilled in the relevant art can make various changes and modifications without departing from the spirit and scope of the embodiments of the present application, so that all equivalent technical solutions also belong to the scope of the embodiments of the present application, and the scope of the embodiments of the present application should be defined by the claims.
Claims (10)
1. A robotic manipulation device, the manipulation device comprising:
an instruction input module for receiving a touch operation input from a user, the input having at least a first dragging direction and a second dragging direction that intersect;
drive shafts, including a first drive shaft associated with the first dragging direction and a second drive shaft associated with the second dragging direction;
a touch operation detection module for detecting whether the touch operation is a dragging operation;
an action instruction generation module for receiving the operation instruction input from the user and selecting at least one drive shaft to execute an action instruction;
wherein,
an action instruction for driving the first drive shaft and the second drive shaft is generated according to the detection result of the touch operation detection module;
if the operation is a dragging operation, it is judged whether the dragging operation is in the first dragging direction or the second dragging direction; if it is in the first dragging direction, the action instruction generation module generates an action instruction that drives the first drive shaft associated with the first dragging direction in accordance with the dragging operation; if it is in the second dragging direction, it generates an action instruction that drives the second drive shaft associated with the second dragging direction in accordance with the dragging operation.
2. The robotic manipulation device of claim 1 wherein said command input module is a touch panel.
3. The robot operation device according to claim 2, wherein the touch operation detection module is provided integrally with the touch panel or independently of the touch panel to detect an operation from a user.
4. The robotic manipulation device of claim 1, wherein the first dragging direction and the second dragging direction intersect to form a dragging coordinate system;
if the dragging operation falls in any quadrant of the coordinate system formed by the first and second dragging directions, the action instruction generation module drives the first drive shaft and the second drive shaft simultaneously according to the dragging direction within that quadrant, generating an action instruction so that the combined motion of the two drive shafts follows the dragging direction.
5. The robotic manipulation device of claim 1,
the first drag direction is a lateral direction on the touch panel,
the second drag direction is a longitudinal direction on the touch panel.
6. The robotic manipulation device of claim 1, wherein said command input module has at least: a touch screen for accepting a touch operation input from a user;
a housing held by a user;
a plurality of buttons arranged outside the range of the touch screen for button operation;
the touch panel has a plurality of virtual buttons for button operations.
7. A method of robotic operation, the method comprising the steps of:
S1, an instruction input module accepts a touch operation input from a user, the input having at least a first dragging direction and a second dragging direction that intersect;
S2, associating a first drive shaft with the first dragging direction and a second drive shaft with the second dragging direction;
S3, detecting whether the touch operation is a drag operation;
S4, receiving the operation instruction input from the user and selecting at least one drive shaft to execute an action instruction; wherein,
an action instruction for driving the first drive shaft and the second drive shaft is generated according to the detection result; if the operation is a dragging operation, it is judged whether the dragging operation is in the first dragging direction or the second dragging direction; if it is in the first dragging direction, the action instruction generation module generates an action instruction that drives the first drive shaft associated with the first dragging direction in accordance with the dragging operation; if it is in the second dragging direction, it generates an action instruction that drives the second drive shaft associated with the second dragging direction in accordance with the dragging operation.
8. The robot operating method according to claim 7, wherein the instruction input module is a touch panel.
9. A robot operation method according to claim 8, wherein the touch operation detection module is provided integrally with the touch panel or independently of the touch panel to detect an operation from a user.
10. A robot operating method according to claim 7, wherein the first drag direction is a lateral direction on the touch panel,
the second drag direction is a longitudinal direction on the touch panel.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011564232.1A CN112706148A (en) | 2020-12-25 | 2020-12-25 | Robot operating device and method |
Publications (1)
Publication Number | Publication Date |
---|---|
CN112706148A true CN112706148A (en) | 2021-04-27 |
Family
ID=75546522
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202011564232.1A Pending CN112706148A (en) | 2020-12-25 | 2020-12-25 | Robot operating device and method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112706148A (en) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105190514A (en) * | 2013-03-11 | 2015-12-23 | 三星电子株式会社 | Apparatus and method for deleting an item on a touch screen display |
CN105479467A (en) * | 2014-10-01 | 2016-04-13 | 电装波动株式会社 | Robot operation apparatus, robot system, and robot operation program |
CN105722650A (en) * | 2013-09-20 | 2016-06-29 | 电装波动株式会社 | Robot maneuvering device, robot system, and robot maneuvering program |
JP2016175174A (en) * | 2015-03-19 | 2016-10-06 | 株式会社デンソーウェーブ | Robot operation device, and robot operation program |
KR20180063515A (en) * | 2016-12-02 | 2018-06-12 | 두산로보틱스 주식회사 | Teaching Device of a Robot and Teaching Method thereof |
- 2020: 2020-12-25 CN CN202011564232.1A patent/CN112706148A/en, active, Pending
Non-Patent Citations (1)
Title |
---|
[India] Lentin Joseph: "Robot Operating System (ROS) for Absolute Beginners" (Chinese edition, 《机器人操作系统(ROS)入门必备 机器人编程一学就会》), 29 February 2020, China Machine Press *
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN105479467B | Robot operation apparatus, robot system and robot operation program | |
KR102042115B1 (en) | Method for generating robot operation program, and device for generating robot operation program | |
JP6642054B2 (en) | Robot operation device and robot operation program | |
JPH11262883A (en) | Manual operation device for robot | |
CN105722650A (en) | Robot maneuvering device, robot system, and robot maneuvering program | |
WO2004085120A1 (en) | Robot simulation device, and robot simulation program | |
WO1997010931A1 (en) | Teach pendant | |
WO2012062374A1 (en) | A control system and an operating device for controlling an industrial robot comprising a touch -screen | |
JP4264029B2 (en) | Haptic input device | |
JPH068171A (en) | Operating device for micro-manipulator | |
JP2016175132A (en) | Robot operation device and robot operation method | |
JPH11262884A (en) | Manual operation device for robot | |
CN112706148A (en) | Robot operating device and method | |
JP6379902B2 (en) | Robot operation device, robot system, and robot operation program | |
KR20120027934A (en) | Robot system with robot controller combined with teach pedant | |
JP2017052031A (en) | Robot operation device and robot operation method | |
CN114952904A (en) | Control device and control system of robot | |
JP6435940B2 (en) | Robot operation device and robot operation program | |
JP6894659B2 (en) | Movement data setting device for industrial robots and movement axis confirmation method | |
JP2013058219A (en) | Touch panel type input device and screen display method for touch panel type input device | |
CN206021234U (en) | A kind of robot demonstrator with back button | |
CN217372379U (en) | Control device and control system of robot | |
JP2016175174A (en) | Robot operation device, and robot operation program | |
US11577381B2 (en) | Teaching apparatus, robot system, and teaching program | |
CN103909524A (en) | Directional position control method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | ||
Application publication date: 20210427 |