CN112847337A - Method for autonomous operation of application program by industrial robot
- Publication number
- CN112847337A (application number CN202011558035.9A)
- Authority
- CN
- China
- Prior art keywords
- autonomous operation
- application program
- behavior
- database
- industrial robot
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1656—Programme controls characterised by programming, planning systems for manipulators
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1628—Programme controls characterised by the control loop
- B25J9/163—Programme controls characterised by the control loop learning, adaptive, model based, rule based expert control
Abstract
The invention provides a method by which an industrial robot autonomously operates an application program, comprising the following steps: S1, collecting the operation behaviors of the application program under emergency conditions in different visual environments; S2, constructing an autonomous operation behavior database comprising: a visual environment information set, an operation step set, an emergency situation solving set, and an application program command execution set; S3, performing application program operation learning and autonomous operation behavior simulation under different visual environments; S4, comparing the execution result of the application program command under the simulated autonomous operation behavior with the database, and obtaining and storing the correct autonomous operation behavior; and S5, triggering an autonomous operation command and executing the autonomous operation behavior. By collecting the operation behaviors of the application program, establishing a database, performing application program operation learning and autonomous operation behavior simulation under different visual environments, and performing simulation comparison before executing the autonomous operation behavior, the accuracy of operation is improved.
Description
Technical Field
The application relates to the technical field of communication, in particular to a method for an industrial robot to autonomously operate an application program.
Background
Industrial robots have become essential equipment for production automation and are applied across many industries. Before being deployed on an automated production line, a robot system must learn the workpieces it is to grasp. In particular, an existing visual robot-arm system must be trained by technical personnel before being put into use: picture samples of the workpieces to be grasped are taken, and parameter conditions are set for learning. The more picture samples and the more learning iterations there are, the more accurately the visual robot arm grasps. During this process, the generated machine-learning program is stored in the robot arm's industrial personal computer system. Two problems exist in this learning process:
(1) each robot system put into production must be trained independently, and the machine-learning program stored in the robot's industrial personal computer system has poor universality and cannot be ported for reuse;
(2) repeatedly photographing and sampling pictures involves much repetitive work and is inefficient.
Disclosure of Invention
In order to solve the above problems, the present invention provides a robot capable of autonomous learning and operation, and designs a method for an industrial robot to autonomously operate an application program.
The specific technical scheme adopted by the invention is: a method for an industrial robot to autonomously operate an application program, comprising the following steps: S1, collecting the operation behaviors of the application program under emergency conditions in different visual environments;
S2, constructing an autonomous operation behavior database comprising: a visual environment information set, an operation step set, an emergency situation solving set, and an application program command execution set;
S3, performing application program operation learning and autonomous operation behavior simulation under different visual environments;
S4, comparing the execution result of the application program command under the simulated autonomous operation behavior with the database, and obtaining and storing the correct autonomous operation behavior;
and S5, triggering an autonomous operation command and executing the autonomous operation behavior.
In the method for autonomously operating an application program based on different visual environments as described above, preferably, the emergency situation solving set consists of decision data for the courses of behavior that cope with different emergency situations.
In the method for autonomously operating an application program based on different visual environments as described above, preferably, the autonomous operation behavior includes: an emergency handling behavior derived from the decision data; and a normal operating behavior based on the different visual environments; wherein the emergency handling behavior takes priority over the normal operating behavior.
In the method for autonomously operating an application program based on different visual environments as described above, preferably, the visual environment is acquired through sensors.
In the method for autonomously operating an application program based on different visual environments as described above, preferably, the sensors include: an image sensor for acquiring article information in a given visual environment;
a timer for acquiring the specific time at a given moment;
a temperature sensor for acquiring temperature data at a given moment;
and a light sensor for acquiring light data at a given moment.
In the method for autonomously operating an application program based on different visual environments as described above, preferably, in step S5 the visual environment is confirmed before the autonomous operation is performed.
In the method for autonomously operating an application program based on different visual environments as described above, preferably, in step S4, if the execution result of the application program command under the simulated autonomous operation behavior is inconsistent with the database, step S3 is repeated to correct the simulated autonomous operation behavior.
Preferably, a help-seeking interaction unit is provided; when the execution results of the application program command under repeated simulated autonomous operation behaviors remain inconsistent with the database, the control terminal sends a help-seeking signal.
The control terminal is a PC terminal.
Beneficial technical effects: by collecting the operation behaviors of the application program in different visual environments and under emergency conditions, establishing a database, performing application program operation learning and autonomous operation behavior simulation in different visual environments, and performing simulation comparison before executing the autonomous operation behavior, the accuracy of operation is improved.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this application, illustrate embodiments of the invention and, together with the description, serve to explain the invention and not to limit the invention.
Wherein:
fig. 1 is a block diagram of the method for autonomously operating an application program based on different visual environments provided in an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
In the description of the present invention, terms such as "longitudinal", "lateral", "upper", "lower", "front", "rear", "left", "right", "vertical", "horizontal", "top", and "bottom" indicate orientations or positional relationships based on those shown in the drawings. They are used only for convenience in describing the present invention and do not require that the present invention be constructed and operated in a specific orientation; they should therefore not be construed as limiting the present invention. The term "connected" used herein should be interpreted broadly: it may denote, for example, a fixed connection or a detachable connection, and the connection may be direct or indirect through intermediate members. The specific meanings of the above terms will be understood by those skilled in the art as appropriate.
A method for an industrial robot to autonomously operate an application program comprises the following steps: S1, collecting the operation behaviors of the application program under emergency conditions in different visual environments;
S2, constructing an autonomous operation behavior database comprising: a visual environment information set, an operation step set, an emergency situation solving set, and an application program command execution set;
S3, performing application program operation learning and autonomous operation behavior simulation under different visual environments;
S4, comparing the execution result of the application program command under the simulated autonomous operation behavior with the database, and obtaining and storing the correct autonomous operation behavior;
and S5, triggering an autonomous operation command and executing the autonomous operation behavior.
By collecting the operation behaviors of the application program in different visual environments and under emergency conditions, establishing a database, performing application program operation learning and autonomous operation behavior simulation in different visual environments, and performing simulation comparison before executing the autonomous operation behavior, the accuracy of operation is improved.
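The five steps above form a collect, store, simulate, verify, execute loop. The patent defines no code, so the Python sketch below is purely illustrative: every class and function name is invented, and "learning" is reduced to replaying stored steps.

```python
# Hypothetical sketch of the S1-S5 loop described above.
# All names are illustrative; the patent defines the method, not an API.

class BehaviorDatabase:
    """S2: holds the four sets named in the claims."""
    def __init__(self):
        self.visual_environments = []  # visual environment information set
        self.operation_steps = []      # operation step set
        self.emergency_solutions = []  # emergency situation solving set
        self.command_results = []      # application command execution set

    def matches(self, result):
        # S4: compare a simulated execution result against stored results
        return result in self.command_results

def collect(observations, db):
    # S1/S2: record observed operation behaviors per visual environment
    for env, steps, result in observations:
        db.visual_environments.append(env)
        db.operation_steps.append(steps)
        db.command_results.append(result)

def learn_and_simulate(env, db):
    # S3: stand-in "learning" -- replay the stored steps for this environment
    idx = db.visual_environments.index(env)
    return db.operation_steps[idx], db.command_results[idx]

def autonomous_operate(env, db):
    steps, simulated_result = learn_and_simulate(env, db)  # S3
    if db.matches(simulated_result):                       # S4
        return steps                                       # S5: execute
    return None                                            # would repeat S3

db = BehaviorDatabase()
collect([("bright", ["open_app", "grasp"], "ok")], db)
print(autonomous_operate("bright", db))  # -> ['open_app', 'grasp']
```

In a real system a trained model would replace the replay in `learn_and_simulate`; the control flow, however, follows the order of the five steps.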
In a further embodiment of the invention, the emergency situation solving set consists of decision data for the courses of behavior that cope with different emergency situations.
The emergency situation solving set is provided to deal with emergencies; it stores the decision data of the operation routes used to resolve them.
In a further embodiment, the autonomous operation behavior comprises: an emergency handling behavior derived from the decision data; and a normal operating behavior based on the different visual environments; wherein the emergency handling behavior takes priority over the normal operating behavior.
As between normal operation and resolving an emergency, the emergency handling behavior is prioritized because an emergency is unpredictably destructive; resolving it first avoids further losses.
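The priority rule above can be sketched as a simple dispatcher. This is an assumption about one way the rule might be realized; the queue and behavior names are invented for illustration:

```python
# Illustrative priority rule: emergency-handling behaviors always preempt
# normal operating behaviors (names are hypothetical, not from the patent).

def next_behavior(emergency_queue, normal_queue):
    """Return the behavior to run next, emergencies first."""
    if emergency_queue:
        return emergency_queue.pop(0)
    if normal_queue:
        return normal_queue.pop(0)
    return None

emergencies = ["stop_and_retract"]
normal = ["pick_part", "place_part"]

print(next_behavior(emergencies, normal))  # stop_and_retract
print(next_behavior(emergencies, normal))  # pick_part
```

The emergency queue is drained completely before any normal behavior runs, matching the stated priority ordering.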
The invention also has an embodiment in which the visual environment is acquired by a sensor.
In this embodiment, the sensors include: an image sensor for acquiring article information in a given visual environment;
a timer for acquiring the specific time at a given moment;
a temperature sensor for acquiring temperature data at a given moment;
and a light sensor for acquiring light data at a given moment.
The sensors take at least one of time, temperature, humidity, and light as part of the visual environment; this subdivision makes the autonomous operation behaviors more fine-grained and safeguards operation accuracy.
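A "visual environment" record that bundles the four sensor readings listed above might look like the following sketch; the field names and sample values are invented for illustration:

```python
# A "visual environment" record combining the four sensor readings the
# description lists (field names are illustrative, not from the patent).
from dataclasses import dataclass

@dataclass(frozen=True)
class VisualEnvironment:
    item_info: str        # from the image sensor
    timestamp: float      # from the timer
    temperature_c: float  # from the temperature sensor
    light_lux: float      # from the light sensor

env = VisualEnvironment("workpiece_A", 1608796800.0, 22.5, 450.0)
print(env.temperature_c)  # 22.5
```

A frozen dataclass gives value equality for free, which is convenient when the database compares a stored environment record against a freshly sensed one.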
Moreover, the database can be associated with a plurality of robots, so that each robot does not need to be trained individually; a newly added robot only needs a communication connection to the database.
In a further embodiment, in step S5 the visual environment is checked again before the autonomous operation behavior is executed. After the simulation comparison, the visual environment is reconfirmed to ensure operation precision; if the reconfirmation fails, the visual environment is deemed to have changed, and the autonomous operation instruction and the simulation are generated again.
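The re-confirmation step can be sketched as follows; the function and parameter names are hypothetical, and the environment comparison is reduced to simple equality for illustration:

```python
# Sketch of the re-confirmation step: before executing, re-read the
# environment; if it changed, regenerate the instruction (names hypothetical).

def execute_if_unchanged(planned_env, read_env, behavior, replan):
    if read_env == planned_env:
        return behavior()      # environment unchanged: execute the behavior
    return replan(read_env)    # changed: regenerate instruction and re-simulate

result = execute_if_unchanged(
    planned_env="bright",
    read_env="dim",
    behavior=lambda: "executed",
    replan=lambda env: f"replanned for {env}",
)
print(result)  # replanned for dim
```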
In step S4, when the execution result of the application program command under the simulated autonomous operation behavior is inconsistent with the database, step S3 is repeated to correct the simulated autonomous operation behavior. The database stores a large number of visual environment information sets, operation step sets, emergency situation solving sets, and application program command execution sets, which safeguards the accuracy of each autonomous operation behavior.
A help-seeking interaction unit is provided; when the execution result of the application program command under the simulated autonomous operation behavior, after multiple corrections, remains inconsistent with the database, the control terminal sends a help-seeking signal.
This prevents the robot from being stuck on an unlearned autonomous operation behavior. The number of corrections is preset; when the robot still cannot operate normally after that many corrections, the situation is judged unsolvable and help is sought from the control terminal. The control terminal is then operated by a user, who can input the autonomous operation behavior for that visual environment. The control terminal is a PC terminal.
A clock unit may also be provided: when, within a preset time, the execution result of the application program command under the corrected simulated autonomous operation behavior remains inconsistent with the database, the control terminal sends a help-seeking signal.
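The bounded-correction logic, with both the preset correction count and the clock unit's time limit, can be sketched as follows (function names and limit values are invented for illustration):

```python
import time

# Sketch of the bounded-correction logic: retry the simulation up to a preset
# count and within a preset time; if the result still disagrees with the
# database, emit a help signal to the control terminal. Names hypothetical.

def correct_with_help(simulate, db_match, max_corrections=3, time_limit_s=5.0):
    deadline = time.monotonic() + time_limit_s
    for attempt in range(max_corrections):
        result = simulate(attempt)
        if db_match(result):
            return ("execute", result)       # S4 passed: execute in S5
        if time.monotonic() > deadline:
            break                            # clock unit expired
    return ("help_signal", None)             # ask the PC control terminal

# Simulation agrees with the database on the third attempt:
status, _ = correct_with_help(lambda a: a, lambda r: r == 2)
print(status)  # execute
# Simulation never agrees with the database:
status, _ = correct_with_help(lambda a: -1, lambda r: r == 2)
print(status)  # help_signal
```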
In some embodiments, the application further provides a robotic arm through which the autonomous operation behavior is performed.
in the solution set of the emergency, an early warning parameter data set can be further set, namely, parameters of all items which can be obtained in a visual environment are set, after autonomous learning is carried out, when a certain monitored data exceeds a set value, the situation can be regarded as that the situation can not be solved autonomously, the autonomous operation behavior produced at the moment is not required to be corrected autonomously, a distress signal is directly given out, the visual parameters and the generated autonomous operation behavior under the situation are sent to a control terminal, the control terminal can judge the feasibility of the autonomous operation behavior according to the actual visual parameters, and the autonomous operation behavior is selected to be executed or a proper operation behavior is given out and input.
It should be noted that, in the present specification, each embodiment is described in a progressive manner, and the same and similar parts among the embodiments are referred to each other, and each embodiment focuses on the differences from the other embodiments. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment. One of ordinary skill in the art can understand and implement it without inventive effort.
The above embodiments are only used for illustrating the embodiments of the present application, and not for limiting the embodiments of the present application, and those skilled in the relevant art can make various changes and modifications without departing from the spirit and scope of the embodiments of the present application, so that all equivalent technical solutions also belong to the scope of the embodiments of the present application, and the scope of the embodiments of the present application should be defined by the claims.
Claims (9)
1. A method for an industrial robot to autonomously operate an application, characterized by the steps of:
s1, collecting the operation behaviors of the application program in different visual environments and under the emergency condition;
s2, constructing an autonomous operation behavior database, wherein the database comprises: a visual environment information set, an operation step set, an emergency situation solving set and an application program command executing set;
s3, performing application program operation learning and autonomous operation behavior simulation under different visual environments;
s4, comparing the execution result of the application program command under the simulated autonomous operation behavior with the database to obtain and store the correct autonomous operation behavior;
and S5, triggering an autonomous operation command and executing an autonomous operation behavior.
2. The method for an industrial robot to autonomously operate an application according to claim 1, characterized in that the emergency situation solving set consists of decision data for the courses of behavior that cope with different emergency situations.
3. The method for an industrial robot to autonomously operate an application according to claim 2, characterized in that the autonomous operation behavior comprises: an emergency handling behavior derived from the decision data; and a normal operating behavior based on the different visual environments; wherein the emergency handling behavior takes priority over the normal operating behavior.
4. A method for autonomous operation of an application by an industrial robot according to claim 1, characterized in that the visual environment is acquired by means of sensors.
5. The method for an industrial robot to autonomously operate an application according to claim 4, characterized in that the sensors comprise: an image sensor for acquiring article information in a given visual environment;
a timer for acquiring the specific time at a given moment;
a temperature sensor for acquiring temperature data at a given moment;
and a light sensor for acquiring light data at a given moment.
6. The method for autonomous operation of an application by an industrial robot according to claim 1, characterized in that in step S5 the visual environment needs to be confirmed before performing the autonomous operation.
7. The method for an industrial robot to autonomously operate an application according to any of claims 1-6, characterized in that, in step S4, if the execution result of the application program command under the simulated autonomous operation behavior is inconsistent with the database, step S3 is repeated to correct the simulated autonomous operation behavior.
8. The method according to claim 7, characterized in that a help-seeking interaction unit is provided, and the control terminal sends a help-seeking signal when the execution results of the application program command under repeated simulated autonomous operation behaviors remain inconsistent with the database.
9. Method for autonomous operation of an application by an industrial robot according to claim 8, characterized in that the control terminal is a PC-terminal.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011558035.9A CN112847337A (en) | 2020-12-24 | 2020-12-24 | Method for autonomous operation of application program by industrial robot |
Publications (1)
Publication Number | Publication Date |
---|---|
CN112847337A (en) | 2021-05-28
Family
ID=75996907
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202011558035.9A Pending CN112847337A (en) | 2020-12-24 | 2020-12-24 | Method for autonomous operation of application program by industrial robot |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112847337A (en) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH07210234A (en) * | 1994-01-20 | 1995-08-11 | Fujitsu Ltd | Operation controller and method therefor |
CN107943098A (en) * | 2018-01-01 | 2018-04-20 | 余绍祥 | A kind of intelligent O&M robot system based on machine learning |
CN109760050A (en) * | 2019-01-12 | 2019-05-17 | 鲁班嫡系机器人(深圳)有限公司 | Robot behavior training method, device, system, storage medium and equipment |
CN110058592A (en) * | 2019-04-25 | 2019-07-26 | 重庆大学 | A kind of mobile robot control method |
CN111618862A (en) * | 2020-06-12 | 2020-09-04 | 山东大学 | Robot operation skill learning system and method under guidance of priori knowledge |
CN111880897A (en) * | 2020-07-24 | 2020-11-03 | 哈尔滨工业大学(威海) | Windows window application program behavior simulation robot and working method thereof |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10722313B2 (en) | Robot system and method of operating the same | |
Steinmetz et al. | Razer—a hri for visual task-level programming and intuitive skill parameterization | |
US5555179A (en) | Control method and control apparatus of factory automation system | |
CN105094049B (en) | Learning path control | |
CN107414837B (en) | Method and system for safely and automatically returning to original position after abnormal shutdown of industrial robot | |
US20220176563A1 (en) | Systems and methods for distributed training and management of ai-powered robots using teleoperation via virtual spaces | |
EP0949552B1 (en) | Robot controller with teaching device with key for performing a series of instructions | |
CN102317044A (en) | Industrial robot system | |
US5475797A (en) | Menu driven system for controlling automated assembly of palletized elements | |
EP3828654B1 (en) | Control system, controller and control method | |
US11252010B2 (en) | System for controlling and monitoring adaptive cyberphysical systems | |
CN112847337A (en) | Method for autonomous operation of application program by industrial robot | |
KR101398215B1 (en) | Dual arm robot control apparatus and method with error recovery function | |
Filipescu et al. | Simulated Hybrid Model of an Autonomous Robotic System Integrated into Assembly/Disassembly Mechatronics Line | |
CN114789453B (en) | Mechanical arm dynamic PID control method and device, electronic equipment and storage medium | |
US11321102B2 (en) | Programmable display, display control method, and display control program | |
Malik | Robots and COVID-19: Challenges in integrating robots for collaborative automation | |
KR102093775B1 (en) | Automatic assembly apparatus and method based on process recipe | |
JP2021035697A (en) | Work machine operation control method | |
EP0477430A1 (en) | Off-line teaching method for industrial robot | |
EP0083502A1 (en) | Robot control method and arrangement | |
Brecher et al. | Design and implementation of a comprehensible cognitive assembly system | |
CN113043268A (en) | Robot eye calibration method, device, terminal, system and storage medium | |
CN111899629A (en) | Flexible robot teaching system and method | |
WO2022080007A1 (en) | Server, control method therefor, program, and system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | ||
Application publication date: 2021-05-28 |