CN116810804A - Full-automatic business flow method integrating software robot and humanoid robot - Google Patents
- Publication number
- CN116810804A (application number CN202311108964.3A)
- Authority
- CN
- China
- Prior art keywords
- layer
- robot
- component
- humanoid robot
- data
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1656—Programme controls characterised by programming, planning systems for manipulators
- B25J9/1671—Programme controls characterised by programming, planning systems for manipulators characterised by simulation, either to verify existing program or to create and verify new program, CAD/CAM oriented, graphic oriented programming systems
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1602—Programme controls characterised by the control system, structure, architecture
- B25J9/161—Hardware, e.g. neural networks, fuzzy logic, interfaces, processor
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1674—Programme controls characterised by safety, monitoring, diagnostic
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/20—Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
- G06F16/25—Integrating or interfacing systems involving database management systems
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F17/00—Digital computing or data processing equipment or methods, specially adapted for specific functions
- G06F17/10—Complex mathematical operations
- G06F17/16—Matrix or vector computation, e.g. matrix-matrix or matrix-vector multiplication, matrix factorization
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Mathematical Physics (AREA)
- Databases & Information Systems (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Data Mining & Analysis (AREA)
- Automation & Control Theory (AREA)
- Software Systems (AREA)
- General Engineering & Computer Science (AREA)
- Computational Mathematics (AREA)
- Mathematical Analysis (AREA)
- Mathematical Optimization (AREA)
- Pure & Applied Mathematics (AREA)
- Fuzzy Systems (AREA)
- Evolutionary Computation (AREA)
- Artificial Intelligence (AREA)
- Computing Systems (AREA)
- Algebra (AREA)
- Manipulator (AREA)
Abstract
The invention relates to a fully automatic business process method integrating a software robot and a humanoid robot, belonging to the technical field of software, and comprising the following steps: acquiring an input task requirement instruction through the software robot, and converting the task requirement instruction into an execution instruction for operating the humanoid robot. The software robot comprises a presentation layer, a business layer, a data layer and an application layer. The presentation layer provides an external interaction window and displays the running state of the humanoid robot; the business layer is connected with the presentation layer, responds to business requests sent by the presentation layer and distributes them downstream; the data layer is connected with the business layer and accesses and stores data; the application layer is connected with the data layer and outputs the execution instruction to the humanoid robot. The humanoid robot completes the operation based on the execution instruction and feeds its running state back to the software robot. By fusing the software robot and the humanoid robot, the method addresses the low degree of automation of traditional robots and effectively improves the automation of the operation flow.
Description
Technical Field
The invention relates to the technical field of software, and in particular to a fully automatic business process method integrating a software robot and a humanoid robot.
Background
When operating and using multiple software systems, many workflows follow simple rules and are highly repetitive, yet consume a great deal of time and labor. To simplify these manual flows and execute them automatically, Robotic Process Automation (RPA) technology emerged. For example, reference [1] designs an RPA software process automation robot that analyzes and compiles system data through a unified interface, enabling the robot to execute steps automatically according to a given flow. Reference [2] designs a robotic process automation cloud service system and implementation method that executes large numbers of user flows by dynamically allocating robots, easing robot resource limitations and lowering the cost of use for front-line business users. Reference [3] starts from the connotation of design morphology, combines bionic design theory with multidisciplinary knowledge, and assists the development of novel military biomimetic robots.
At present, traditional process-automation task management imposes a high learning cost on business personnel, consumes considerable manpower and material resources, and takes a long time; using RPA to replace manual software operation addresses this problem well. Meanwhile, the risk of robots injuring workers in traditional manufacturing is a key factor affecting production safety. In dynamic and uncertain human-robot interaction environments, using RPA together with artificial intelligence to automatically guide humanoid robots through business processes promotes the joint development of advanced artificial intelligence and human-robot interaction technologies, and has broad application prospects in human-robot fusion and safe collaboration, as discussed in reference [4].
The software robot faces the information system: it takes over flows previously handled by people and establishes automated services in the digital world. The humanoid robot has the advantage of generality: drawing on bionics, it replaces human beings in the physical world to complete work services. If the business processes of the software robot and the humanoid robot can be fused, the advantages of the two different robots can be combined to meet enterprises' future demand for fully automatic business processes.
References:
[1] Zhang Jiemin. RPA software process automation robot [D]. North China University of Technology, 2021.
[2] Chinese patent application, publication No. CN114816730A: A robot process automation cloud service system and implementation method; publication date: 2022-07-29.
[3] Ma Jin, Hu Jie, Zhu Guoniu, et al. Status and progress of military biomimetic robots based on design morphology [J]. Packaging Engineering (Art Edition), 2022, 43(4): 1-11.
[4] Zheng Pai, Li Chengxi, Yan Yue, et al. Augmented-reality-assisted mutual-cognitive human-robot safe interaction system [J]. Journal of Mechanical Engineering, 2023(06).
Disclosure of Invention
The invention aims to provide a fully automatic business process method integrating a software robot and a humanoid robot.
In order to achieve the above object, the present invention provides a fully automatic business process method integrating a software robot and a humanoid robot, comprising:
S1, acquiring an input task requirement instruction through a software robot, and converting it into an execution instruction for controlling the operation of the humanoid robot; wherein the software robot comprises: a presentation layer, a business layer, a data layer and an application layer;
the presentation layer provides an external interaction window for the software robot and visually displays the running state of the humanoid robot;
the business layer is connected with the presentation layer and is used for responding to business requests sent by the presentation layer and distributing them downstream;
the data layer is connected with the business layer and is used for accessing and storing data;
the application layer is connected with the data layer and is used for outputting the execution instruction to the humanoid robot;
S2, the humanoid robot completes the operation based on the execution instruction, and its running state is fed back to the software robot.
According to one aspect of the present invention, in step S1, the presentation layer comprises: a front-end UI framework and a visual display module;
the front-end UI framework is communicatively connected with the visual display module and the business layer respectively, wherein the front-end UI framework comprises: a communication mechanism, an external window and operation scripts for establishing data transmission;
the visual display module comprises: a visual input window for inputting task requirement instructions, a visual operation display window for displaying the running state of the humanoid robot, and a visual business planning window for displaying the business processes of the software robot and the humanoid robot.
According to one aspect of the invention, the business layer comprises: a logic conversion module, a business distribution module and a basic service module;
the logic conversion module comprises: a read-write instruction component, a picture capturing component and an external-device screen-casting component;
the business distribution module comprises: a logic flow presetting component for determining the execution order and flow of instructions;
the basic service module comprises: a plurality of general-purpose functional components for performing general functions of the software robot.
According to one aspect of the invention, the data layer comprises: a data access component, a data storage component and a picture management component;
the data access component is used for adding, deleting, modifying and querying database data;
the data storage component is used for persistent storage of data;
the picture management component is used for uniformly storing acquired pictures in a folder under a designated path.
According to one aspect of the invention, the application layer comprises: a business control component, a response processing component and a bionic form selection component;
the business control component is used for parsing the parameters of the distributed execution instruction and converting them into a data format readable by the humanoid robot;
the response processing component is used for receiving, in batches according to the flow, the parameters parsed by the business control component and converting them into a first action instruction for the humanoid robot;
the bionic form selection component is used for acquiring bionic form parameters of the humanoid robot and storing them in the database, receiving the first action instruction, comparing it with the bionic form parameters in the database, matching a bionic form for the humanoid robot, and outputting a second action instruction corresponding to that bionic form.
According to one aspect of the invention, the software robot further comprises: a feedback layer;
the feedback layer is connected with the application layer and the presentation layer respectively, and is used for collecting the running state of the humanoid robot and feeding it back to the presentation layer for visual display.
According to one aspect of the invention, the feedback layer comprises: an error mechanism module and a workflow generation module;
the error mechanism module comprises: a task recording component for recording the task completion probability and completion state of the humanoid robot, an anomaly detection component for performing anomaly analysis on the task records in the task recording component, and a big data analysis component for analyzing the anomalies detected by the anomaly detection component;
the workflow generation module is used for converting the task execution process of the humanoid robot into a flow data format.
According to one aspect of the present invention, the method further comprises:
S3, evaluating the degree of interaction engagement between the software robot, the humanoid robot and the user, and evaluating the application efficiency of the software robot's full business process, wherein at least one of the presentation layer, the business layer, the data layer, the application layer and the feedback layer is optimized based on the interaction engagement and the application efficiency.
According to one aspect of the present invention, in step S3, in the step of evaluating the interaction engagement between the software robot, the humanoid robot and the user, the engagement is obtained through a robot quotient formula defined by fuzzy comprehensive evaluation over the determinable criterion parameter set $U=\{u_1,u_2,u_3,u_4,u_5\}$, to which each factor $u_i$ belongs, and the fuzzy relation matrix from $U$ to the evaluation set;

in step S3, in the step of evaluating the application efficiency of the software robot's full business process, an application efficiency formula is used, expressed as

$$W = A \circ R,$$

wherein $A$ is the weight set, denoted $A=(a_1,a_2,\dots,a_5)$, and $R$ is the fuzzy relation matrix.
According to the scheme of the invention, through the software robot's flexible web-page capture and system screenshot functions, invoked from the front-end UI, operation visualization is realized quickly and conveniently, the presentation level of automated flow processing is improved, and the complexity of designing an automated flow is reduced.

According to the scheme of the invention, the screen-capture technique uses window handles to track changes of data addresses, and Windows provides related APIs for obtaining window handles, so that the corresponding functional modules of the application layer can read, delete and analyze the screenshot data; no underlying database or interface needs to be accessed, which makes the technique suitable for legacy systems that cannot be opened or accessed and gives it stronger universality.

According to the scheme of the invention, the software system is combined with the physical system: software operation is automated through RPA, the humanoid robot platform can be controlled, and through mobile-terminal screen casting the completion of services by the humanoid robot can be monitored, controlled and checked remotely at any time on the mobile terminal, improving the automation of multi-software linkage and of the operation flow.

According to the scheme of the invention, a fully automatic business process method centered on intelligent technology is designed, realizing automatic operation of the bionic robot to complete business in the information system; the humanoid robot replaces manpower to complete preset tasks under complex and dangerous conditions, combining human-computer interaction, bionics and computer technology and improving the innovation of automated business.

According to the scheme of the invention, the feedback mechanism provides visual display at the front end, so that an operator can respond to and handle emergencies in time according to the recorded workflow, adjust the business process, improve the accuracy of fully automatic business execution, and rely on a redundant standby scheme to adjust the parameters to be analyzed in time.

According to the scheme of the invention, the software robot permeates the whole business process, and the communication connection between the humanoid robot and the software robot enables different systems to cooperatively complete corresponding processes, activities or tasks, automatically record errors, and feed back delivery results after manual management, effectively improving the automation level and business process efficiency of the invention.
Drawings
FIG. 1 is a block diagram of steps of a fully automated business process method according to one embodiment of the present invention;
fig. 2 is a flow chart of a fully automated business process method according to one embodiment of the present invention.
Detailed Description
In order to more clearly illustrate the technical solution in the embodiments of the present invention, a further detailed description will be given below with reference to the accompanying drawings.
Referring to fig. 1 and 2, according to an embodiment of the present invention, a fully automatic business process method integrating a software robot and a humanoid robot comprises:
S1, acquiring an input task requirement instruction through a software robot, and converting it into an execution instruction for controlling the operation of the humanoid robot; wherein the software robot comprises: a presentation layer, a business layer, a data layer and an application layer;
the presentation layer provides an external interaction window for the software robot and visually displays the running state of the humanoid robot;
the business layer is connected with the presentation layer and is used for responding to business requests sent by the presentation layer and distributing them downstream;
the data layer is connected with the business layer and is used for accessing and storing data;
the application layer is connected with the data layer and is used for outputting the execution instruction to the humanoid robot;
S2, the humanoid robot completes the operation based on the execution instruction, and its running state is fed back to the software robot.
As shown in fig. 1 and 2, in step S1, the presentation layer comprises a front-end UI framework and a visual display module. In this embodiment, the presentation layer is mainly used for instruction input to the information system, and the front-end UI framework serves as the underlying module of the communication mechanism between modules, acting as the external window of the software robot.
In this embodiment, the front-end UI framework is communicatively connected with the visual display module and the business layer respectively, where the front-end UI framework comprises a communication mechanism, an external window and operation scripts for establishing data transmission. The visual display module comprises a visual input window for inputting task requirement instructions, a visual operation display window for displaying the running state of the humanoid robot, and a visual business planning window for displaying the business processes of the software robot and the humanoid robot. The visual display module facilitates overall planning and visual interpretation of business instructions.
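By way of illustration only, the following minimal sketch shows one possible shape of the task requirement instruction that the visual input window could hand to the business layer; the field names and values are assumptions of this description and are not taken from the original disclosure.

```python
# Minimal sketch (assumed data shape): a task requirement instruction as the
# presentation layer might pass it to the business layer. Field names are illustrative.
from dataclasses import dataclass, field, asdict
from typing import List
import json

@dataclass
class TaskRequirement:
    task_id: str
    description: str                      # free-text requirement typed into the visual input window
    target_system: str                    # e.g. a warehouse console the robot should operate
    steps: List[str] = field(default_factory=list)

req = TaskRequirement(
    task_id="T-0001",
    description="Sort incoming parcels by destination zone",
    target_system="Express Sorting Console",
    steps=["capture parcel label", "classify parcel", "grab and place"],
)
payload = json.dumps(asdict(req), ensure_ascii=False)   # serialized request sent downstream
```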
As shown in fig. 1 and 2, according to an embodiment of the present invention, the business layer is the bridge through which the front-end presentation layer interacts with the business applications, and is responsible for functions such as business request handling and distribution. In this embodiment, the business layer comprises a logic conversion module, a business distribution module and a basic service module. The logic conversion module converts task requirement instructions entered manually in the presentation layer into instructions a computer can understand; the software robot connects the business-planning front-end system with the database and stores the converted instructions, which is the basis for subsequent business calling, processing and response. In this embodiment, the logic conversion module comprises a read-write instruction component, a picture capturing component and an external-device screen-casting component. The read-write instruction component executes the manually entered task requirement instruction and converts it into a computer instruction so that the picture capturing component can perform the corresponding operation. In this embodiment, the external-device screen-casting component can be configured to mirror the screen of an Android device; specifically, it connects to the Android device through communication components such as WiFi or Bluetooth, and supports configuring multiple attributes of the mirroring window, such as size, rotation angle, keeping the window on top, computer control and full-screen display. Through the screen-casting function, picture information from the mobile-phone app can be captured, realizing automatic flow control on the mobile terminal. The external-device screen-casting component can also be adapted to connect to other types of mobile terminal device, facilitating access for different devices, and can further be configured to connect to multiple mobile terminal devices simultaneously.
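As a minimal sketch of how such an external-device screen-casting component might be driven, the following snippet launches a scrcpy-style Android mirroring tool over adb with some of the window attributes mentioned above; the choice of tool and the exact flag names are assumptions (they may differ between versions) and are not specified in the disclosure.

```python
# Minimal sketch (assumed tooling): cast an Android device's screen into a
# configurable window using adb plus a scrcpy-style mirroring tool.
import subprocess

def cast_android_screen(serial: str, title: str = "Humanoid Robot Console",
                        always_on_top: bool = True, fullscreen: bool = False,
                        max_size: int = 1024) -> subprocess.Popen:
    # Make sure the device is reachable before starting the mirroring window.
    subprocess.run(["adb", "-s", serial, "wait-for-device"], check=True)
    cmd = ["scrcpy", "--serial", serial, "--window-title", title,
           "--max-size", str(max_size)]
    if always_on_top:
        cmd.append("--always-on-top")
    if fullscreen:
        cmd.append("--fullscreen")
    return subprocess.Popen(cmd)   # the mirroring window the business layer can then capture

# Example: proc = cast_android_screen("emulator-5554")
```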
In this embodiment, the picture capturing component implements the web-page capture and system screenshot functions; specifically, a one-key capture effect can be achieved by configuring a shortcut key. The shortcut-key settings are stored in the front-end UI project, and the Windows screenshot tool is invoked through a function to perform the screenshot operation. In the task planning stage, if an issued task arrives in image format, the picture capturing component can automatically capture the text information and convert it into computer instructions, which facilitates control of the humanoid robot and greatly eases business planning and distribution.
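The following is a minimal sketch, under assumed library choices (pywin32, Pillow, pytesseract), of the kind of window-handle-based screenshot plus text extraction the picture capturing component is described as performing; the function name and window title are illustrative, not part of the disclosure.

```python
# Minimal sketch (assumed implementation): capture a target window on Windows via
# its handle and OCR the text, roughly as described for the picture capturing component.
import win32gui                      # Windows window-handle APIs
from PIL import ImageGrab            # system screenshot
import pytesseract                   # OCR (optical character recognition)

def capture_window_text(window_title: str, save_path: str = "capture.png") -> str:
    """Locate a window by title, grab its area and return the recognized text."""
    hwnd = win32gui.FindWindow(None, window_title)          # window handle lookup
    if hwnd == 0:
        raise RuntimeError(f"window not found: {window_title}")
    left, top, right, bottom = win32gui.GetWindowRect(hwnd)
    image = ImageGrab.grab(bbox=(left, top, right, bottom))
    image.save(save_path)            # picture management: store under a designated path
    return pytesseract.image_to_string(image)               # text to turn into computer instructions

# Example: text = capture_window_text("Express Sorting Console")
```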
In this embodiment, the business distribution module comprises a logic flow presetting component for determining the execution order and flow of instructions; the logic flow presetting component determines the order and flow based on a logic flowchart.
In this embodiment, the basic service module comprises a plurality of general-purpose functional components for performing general functions of the software robot, such as file upload and download. These general-purpose functional components are integrated into the basic service module by pre-packaging, which improves coding efficiency as well as the portability and reusability of the code.
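One possible way to organize such pre-packaged components is a simple registry, sketched below under assumed names; the component identifiers and endpoints are illustrative and do not appear in the disclosure.

```python
# Minimal sketch (assumed design): a registry of pre-packaged general-purpose
# components for the basic service module.
from typing import Callable, Dict

GENERAL_COMPONENTS: Dict[str, Callable] = {}

def component(name: str):
    """Decorator that registers a reusable function under the basic service module."""
    def register(func: Callable) -> Callable:
        GENERAL_COMPONENTS[name] = func
        return func
    return register

@component("file.download")
def download_file(url: str, dest: str) -> str:
    import urllib.request
    urllib.request.urlretrieve(url, dest)     # generic download function
    return dest

@component("file.upload")
def upload_file(path: str, endpoint: str) -> int:
    import urllib.request
    with open(path, "rb") as fh:
        req = urllib.request.Request(endpoint, data=fh.read(), method="POST")
        return urllib.request.urlopen(req).status

# The business distribution module can then invoke GENERAL_COMPONENTS["file.download"](...)
```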
Referring to fig. 1 and 2, according to an embodiment of the present invention, the data layer mainly accesses and stores data; its core function is to enter the spreadsheet files uploaded by the user into the database, define and compile SQL statements, add, delete, modify and query data, and download tables. In this embodiment, the data layer comprises a data access component, a data storage component and a picture management component. The data access component adds, deletes, modifies and queries database data; the data storage component persistently stores data. In this embodiment, the data storage component can store common instructions permanently and temporarily store new instructions entered in the presentation layer. The data storage component uses MySQL for persistent storage; data sources include MySQL and spreadsheet files in xls, xlsx and csv formats. The picture management component uniformly stores acquired pictures in a folder under a designated path, which makes them easy to read and operate on when business calls are processed.
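A minimal sketch of this ingest-and-query path follows, assuming pandas and SQLAlchemy with a MySQL driver; the connection string, table name and column name are illustrative assumptions.

```python
# Minimal sketch (assumed implementation): load a user-uploaded spreadsheet and
# persist it to MySQL (data storage component), then run a parameterized query
# (data access component).
import pandas as pd
from sqlalchemy import create_engine, text

engine = create_engine("mysql+pymysql://user:password@localhost:3306/robot_flow")

def ingest_spreadsheet(path: str, table: str = "task_instructions") -> int:
    """Read xls/xlsx/csv and append the rows to the database."""
    df = pd.read_csv(path) if path.endswith(".csv") else pd.read_excel(path)
    df.to_sql(table, engine, if_exists="append", index=False)
    return len(df)

def query_tasks(express_type: str) -> pd.DataFrame:
    """Example of the add/delete/modify/query operations: a parameterized SELECT."""
    sql = text("SELECT * FROM task_instructions WHERE express_type = :t")
    return pd.read_sql(sql, engine, params={"t": express_type})
```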
Referring to fig. 1 and 2, according to an embodiment of the present invention, the application layer outputs the business analysis produced by the preceding process to the humanoid robot, and disassembles, reads and executes the action flow, serving as the interaction bridge between the software robot and the humanoid robot. In this embodiment, the application layer comprises a business control component, a response processing component and a bionic form selection component. The business control component parses the parameters of the distributed business logic and converts them into a data format readable by the humanoid robot; the response processing component receives, in batches according to the flow, the parameters parsed by the business control component and converts them into a first action instruction for the humanoid robot; the bionic form selection component acquires the bionic form parameters of the humanoid robot and stores them in the database, receives the first action instruction, compares it with the bionic form parameters in the database, matches the optimal bionic form for the humanoid robot, and outputs a second action instruction corresponding to that bionic form. In this embodiment, the bionic form selection component obtains the bionic form parameters by drawing on design morphology in the bio-inspiration mode of the fusion bionics method: "state" elements such as behaviors, functions and principles are extracted from biological inspiration sources, and the underlying actions or motion mechanisms are analyzed and mined using methods such as mathematical modeling, yielding formulas, parameters and statistical probabilities (for example, the bending angle of the wrist when grabbing an object), thereby generating the bionic form parameters and storing them in the database.
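The matching step can be pictured with the minimal sketch below; the parameter names (e.g. wrist bend angle, grip force) and the nearest-parameter matching rule are illustrative assumptions, not the patented selection logic.

```python
# Minimal sketch (assumed logic): match a first action instruction against stored
# bionic form parameters and emit a second action instruction.
from dataclasses import dataclass
from typing import List

@dataclass
class BionicForm:
    name: str
    wrist_bend_deg: float     # example "state" parameter mined from biological data
    grip_force_n: float

@dataclass
class ActionInstruction:
    action: str
    wrist_bend_deg: float
    grip_force_n: float

def select_bionic_form(first: ActionInstruction, forms: List[BionicForm]) -> dict:
    """Pick the stored form whose parameters are closest to the requested action."""
    best = min(
        forms,
        key=lambda f: abs(f.wrist_bend_deg - first.wrist_bend_deg)
                      + abs(f.grip_force_n - first.grip_force_n),
    )
    # Second action instruction: the original action bound to the matched form.
    return {"action": first.action, "form": best.name,
            "wrist_bend_deg": best.wrist_bend_deg, "grip_force_n": best.grip_force_n}
```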
As shown in fig. 1 and 2, according to one embodiment of the present invention, the software robot further comprises a feedback layer. The feedback layer is connected with the application layer and the presentation layer respectively, collects the running state of the humanoid robot, and feeds it back to the presentation layer for visual display. In this embodiment, the feedback layer comprises an error mechanism module and a workflow generation module. The error mechanism module comprises a task recording component for recording the task completion probability and completion state of the humanoid robot, an anomaly detection component for performing anomaly analysis on the task records, and a big data analysis component for analyzing the anomalies detected by the anomaly detection component. The anomaly detection component analyzes the sudden and uncontrollable factors that cause errors in the fully automatic business process and feeds them back to the visual display module of the front-end presentation layer for visual analysis; the big data analysis component issues emergency prompts and handles the anomalies that trigger the error mechanism, improving the accuracy of fully automatic task execution.
In this embodiment, the workflow generation module converts the task execution process of the humanoid robot into a flow data format. Workflow technology is a set of technical solutions based on business process management theory; it controls and manages the automatic transfer of documents among multiple computers, and supports robot script development in both a "stream" mode and a "list" mode.
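As a minimal sketch of what a "stream"-mode flow record might look like, the snippet below serializes ordered execution records to JSON; the field names and file name are illustrative assumptions rather than the disclosed format.

```python
# Minimal sketch (assumed format): record one task execution step in "stream" mode
# and persist the ordered records as a flow data file.
import json, time
from typing import List

def stream_record(step: str, window_class: str, storage_path: str) -> dict:
    return {"timestamp": time.time(), "step": step,
            "window_class": window_class, "storage_path": storage_path}

def generate_workflow(records: List[dict], out_path: str = "workflow.json") -> None:
    """Persist the ordered stream of records so operators can review and replay it."""
    with open(out_path, "w", encoding="utf-8") as fh:
        json.dump({"mode": "stream", "records": records}, fh, ensure_ascii=False, indent=2)

# Example:
# generate_workflow([stream_record("grab_parcel", "SortingConsoleWnd", "D:/captures/0001.png")])
```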
The scheme is further explained by way of the following example.
In this embodiment, automatic sorting and grabbing of express parcels in a logistics warehouse is taken as the example. The control mode of a traditional sorting machine is mainly pulse-signal tracking, which is not described here; when the fully automatic business process method integrating the software robot and the humanoid robot is adopted, the flow is as follows:
(1) The presentation-layer screen of the software robot is cast to the mobile-terminal app through the external-device screen-casting component in the business layer, and the corresponding parcel information is displayed through the presentation layer's visualization;
(2) Based on the user's operation in the mobile-terminal app, the business layer captures the parcel information displayed by the presentation layer, acquiring it automatically through the picture-capture function (for example, by optical character recognition);
(3) The business layer automatically classifies the large volume of captured data in tabular form and writes it in xls form into the database of the data layer for storage; it then reads the relevant parcel data, determines information such as parcel type, weight, size and destination, generates corresponding execution instructions through the logic conversion module, and orders their execution through the business distribution module;
(4) The software robot receives the execution instructions issued by the business layer through the application layer, and automatically selects an action form according to the received instructions and the requirements to complete the sorting and grabbing process of the humanoid robot;
(5) Each time the humanoid robot completes a grab-and-sort, the feedback layer of the software robot records header information, window class names, storage paths and the like in "stream" mode, facilitating later manual management, checking of sorting status, and finding and fixing omissions. An end-to-end sketch of this flow is given below.
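The following self-contained sketch ties the steps above together; every function body is an illustrative stub (assumed names and data), intended only to show the order of capture, conversion, dispatch and feedback.

```python
# Minimal, self-contained sketch (assumed orchestration) of the sorting example:
# capture parcel info, convert it to execution instructions, dispatch them to the
# humanoid robot, and collect feedback.
from typing import List

def capture_parcel_text() -> str:
    return "type=fragile;weight=1.2kg;size=S;dest=Zone-A"   # stand-in for OCR capture

def to_execution_instructions(text: str) -> List[dict]:
    fields = dict(kv.split("=") for kv in text.split(";"))
    return [{"action": "grab", "params": fields}, {"action": "place", "params": fields}]

def dispatch(instr: dict) -> str:
    # stand-in for the application layer sending a robot-readable command
    return f"done:{instr['action']}"

def run_sorting_flow() -> List[str]:
    feedback = []
    for instr in to_execution_instructions(capture_parcel_text()):
        feedback.append(dispatch(instr))          # feedback layer records each step
    return feedback

print(run_sorting_flow())   # ['done:grab', 'done:place']
```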
Referring to fig. 1 and 2, according to an embodiment of the present invention, the fully automatic business process method integrating a software robot and a humanoid robot further comprises:
S3, evaluating the degree of interaction engagement between the software robot, the humanoid robot and the user, and evaluating the application efficiency of the software robot's full business process, and optimizing at least one of the presentation layer, the business layer, the data layer, the application layer and the feedback layer based on the interaction engagement and the application efficiency. In this embodiment, the evaluation mechanism (module) for interaction engagement can be integrated in the feedback layer; after the application layer outputs the bionic form instruction, it evaluates the instruction accuracy, task completion and so on of the humanoid robot through the defined robot quotient.
As shown in fig. 1 and 2, according to one embodiment of the present invention, the capability expressed when a human cooperates with a robot can be defined as the ability of the two to communicate, coordinate and cooperate effectively. Here, "human" refers to manual task planning and instruction input, and "robot" refers to the execution instruction finally output to the humanoid robot. The capability covers the following aspects:
(1) Communication ability (CA): humans and robots need to exchange information efficiently in a variety of forms, including speech, images and sound;
(2) Task assignment ability (TAA): humans and robots need to distribute tasks reasonably according to their respective capacities and the task requirements;
(3) Adaptive ability (AA): humans and robots need to adapt to different environments and task demands and flexibly adjust their own behavior;
(4) Mutual understanding ability (MUA): humans and robots need to understand each other's intent and behavior in order to cooperate better;
(5) Efficiency and effectiveness (EE): humans and robots need to perform tasks efficiently and achieve the desired results.
These five capabilities are the main aspects characterizing human-robot cooperation. Taking them as basic elements and combining the concepts of fuzzy mathematics, factors whose boundaries are hard to determine and hard to analyze quantitatively in a multi-constraint, multi-objective system are quantified so that the system can be judged comprehensively.
Further, in step S3, in the step of evaluating the interaction engagement between the software robot, the humanoid robot and the user, a robot quotient formula is used to reflect the capability expressed when a human interacts with the robot and to evaluate the degree of engagement (i.e., the interaction engagement) of the human-robot communication. The robot quotient is obtained by fuzzy comprehensive evaluation over the criterion set and the evaluation set:

the factors $u_i$ ($i=1,\dots,5$, corresponding to CA, TAA, AA, MUA and EE) belong to the determinable criterion parameter set $U=\{u_1,u_2,u_3,u_4,u_5\}$; the grades {good, typical, bad} form the evaluation parameter set (evaluation set) $V=\{v_1,v_2,v_3\}$. The fuzzy relation matrix $M$ from $U$ to $V$ can then be expressed as

$$M=\left(\mu_{ij}\right)_{5\times 3},$$

where $\mu_{ij}$ denotes the membership of factor $u_i$ relative to evaluation grade $v_j$.
In step S3, in the step of evaluating the application efficiency of the software robot's full business process, an application efficiency formula is used. Combining the characteristics of the software robot, the weighted-average operator is selected, the weight set is $A=(a_1,a_2,\dots,a_5)$ and the fuzzy relation matrix is $R$; the application efficiency $W$ of the full business-process automation system is then obtained by calculation as

$$W = A \circ R = (a_1,a_2,\dots,a_5)\circ R,$$

wherein $A$ is the weight set, $R$ is the fuzzy relation matrix, and $\circ$ denotes the weighted-average composition (the weight vector multiplied by the fuzzy relation matrix).
With this arrangement, the weighted-average operator multiplies the weight vector by the fuzzy relation matrix, so the mutual influence of all factors is reflected in the evaluation result and the effect of the weight coefficients is clearly visible; the information in $R$ is fully used, so the synthesis is realized in a true sense.
With this arrangement, the quantifiable robot quotient enables quantitative evaluation of how well the full business process is completed in the five aspects above, facilitating quantitative assessment of the business process.
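The weighted-average composition $W = A \cdot R$ described above can be illustrated numerically with the minimal sketch below; the weights and membership values are illustrative assumptions, not values from the disclosure.

```python
# Minimal sketch (assumed numbers): fuzzy comprehensive evaluation with the
# weighted-average operator over the five factors (CA, TAA, AA, MUA, EE) and the
# evaluation set {good, typical, bad}.
import numpy as np

A = np.array([0.30, 0.20, 0.15, 0.15, 0.20])       # weight set over the five factors

R = np.array([                                      # fuzzy relation matrix (5 x 3)
    [0.7, 0.2, 0.1],                                # CA  memberships to good/typical/bad
    [0.6, 0.3, 0.1],                                # TAA
    [0.5, 0.4, 0.1],                                # AA
    [0.6, 0.2, 0.2],                                # MUA
    [0.8, 0.1, 0.1],                                # EE
])

W = A @ R                                           # weighted-average composition W = A . R
print("evaluation vector W =", np.round(W, 3))      # memberships to {good, typical, bad}
print("overall grade:", ["good", "typical", "bad"][int(np.argmax(W))])
```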
The foregoing is merely exemplary of embodiments of the invention and, as regards devices and arrangements not explicitly described in this disclosure, it should be understood that this can be done by general purpose devices and methods known in the art.
The above description is only one embodiment of the present invention, and is not intended to limit the present invention, but various modifications and variations can be made to the present invention by those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present invention should be included in the protection scope of the present invention.
Claims (9)
1. A fully automatic business process method integrating a software robot and a humanoid robot, characterized by comprising the following steps:
S1, acquiring an input task requirement instruction through a software robot, and converting it into an execution instruction for controlling the operation of the humanoid robot; wherein the software robot comprises: a presentation layer, a business layer, a data layer and an application layer;
the presentation layer provides an external interaction window for the software robot and visually displays the running state of the humanoid robot;
the business layer is connected with the presentation layer and is used for responding to business requests sent by the presentation layer and distributing them downstream;
the data layer is connected with the business layer and is used for accessing and storing data;
the application layer is connected with the data layer and is used for outputting the execution instruction to the humanoid robot;
S2, the humanoid robot completes the operation based on the execution instruction, and its running state is fed back to the software robot.
2. The fully automatic business process method according to claim 1, wherein, in step S1, the presentation layer comprises: a front-end UI framework and a visual display module;
the front-end UI framework is communicatively connected with the visual display module and the business layer respectively, wherein the front-end UI framework comprises: a communication mechanism, an external window and operation scripts for establishing data transmission;
the visual display module comprises: a visual input window for inputting task requirement instructions, a visual operation display window for displaying the running state of the humanoid robot, and a visual business planning window for displaying the business processes of the software robot and the humanoid robot.
3. The fully automatic business process method according to claim 2, wherein the business layer comprises: a logic conversion module, a business distribution module and a basic service module;
the logic conversion module comprises: a read-write instruction component, a picture capturing component and an external-device screen-casting component;
the business distribution module comprises: a logic flow presetting component for determining the execution order and flow of instructions;
the basic service module comprises: a plurality of general-purpose functional components for performing general functions of the software robot.
4. The fully automatic business process method according to claim 3, wherein the data layer comprises: a data access component, a data storage component and a picture management component;
the data access component is used for adding, deleting, modifying and querying database data;
the data storage component is used for persistent storage of data;
the picture management component is used for uniformly storing acquired pictures in a folder under a designated path.
5. The fully automatic business process method according to claim 4, wherein the application layer comprises: a business control component, a response processing component and a bionic form selection component;
the business control component is used for parsing the parameters of the distributed execution instruction and converting them into a data format readable by the humanoid robot;
the response processing component is used for receiving, in batches according to the flow, the parameters parsed by the business control component and converting them into a first action instruction for the humanoid robot;
the bionic form selection component is used for acquiring bionic form parameters of the humanoid robot and storing them in the database, receiving the first action instruction, comparing it with the bionic form parameters in the database, matching a bionic form for the humanoid robot, and outputting a second action instruction corresponding to that bionic form.
6. The fully automatic business process method according to claim 5, wherein the software robot further comprises: a feedback layer;
the feedback layer is connected with the application layer and the presentation layer respectively, and is used for collecting the running state of the humanoid robot and feeding it back to the presentation layer for visual display.
7. The fully automatic business process method according to claim 6, wherein the feedback layer comprises: an error mechanism module and a workflow generation module;
the error mechanism module comprises: a task recording component for recording the task completion probability and completion state of the humanoid robot, an anomaly detection component for performing anomaly analysis on the task records in the task recording component, and a big data analysis component for analyzing the anomalies detected by the anomaly detection component;
the workflow generation module is used for converting the task execution process of the humanoid robot into a flow data format.
8. The fully automatic business process method according to claim 7, further comprising:
S3, evaluating the degree of interaction engagement between the software robot, the humanoid robot and the user, and evaluating the application efficiency of the software robot's full business process, wherein at least one of the presentation layer, the business layer, the data layer, the application layer and the feedback layer is optimized based on the interaction engagement and the application efficiency.
9. The fully automatic business process method according to claim 8, wherein, in step S3, in the step of evaluating the interaction engagement between the software robot, the humanoid robot and the user, the engagement is obtained through a robot quotient formula defined by fuzzy comprehensive evaluation over the determinable criterion parameter set $U=\{u_1,u_2,u_3,u_4,u_5\}$, to which each factor $u_i$ belongs, and the fuzzy relation matrix from $U$ to the evaluation set;

in step S3, in the step of evaluating the application efficiency of the software robot's full business process, an application efficiency formula is used, expressed as

$$W = A \circ R,$$

wherein $A$ is the weight set, denoted $A=(a_1,a_2,\dots,a_5)$, and $R$ is the fuzzy relation matrix.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202311108964.3A CN116810804B (en) | 2023-08-31 | 2023-08-31 | Full-automatic business flow method integrating software robot and humanoid robot |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202311108964.3A CN116810804B (en) | 2023-08-31 | 2023-08-31 | Full-automatic business flow method integrating software robot and humanoid robot |
Publications (2)
Publication Number | Publication Date |
---|---|
CN116810804A true CN116810804A (en) | 2023-09-29 |
CN116810804B CN116810804B (en) | 2023-12-12 |
Family
ID=88117061
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202311108964.3A Active CN116810804B (en) | 2023-08-31 | 2023-08-31 | Full-automatic business flow method integrating software robot and humanoid robot |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN116810804B (en) |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2007081519A2 (en) * | 2005-12-30 | 2007-07-19 | Steven Kays | Genius adaptive design |
CN106157130A (en) * | 2016-08-05 | 2016-11-23 | 青岛海尔软件有限公司 | Business administration application system and software for budget to people |
US20170214701A1 (en) * | 2016-01-24 | 2017-07-27 | Syed Kamran Hasan | Computer security based on artificial intelligence |
CN109807900A (en) * | 2019-03-19 | 2019-05-28 | 西北工业大学 | A kind of software architecture of industrial robot component networked control systems |
CN110694256A (en) * | 2019-09-18 | 2020-01-17 | 徐磊 | Novel emergency computer war game deduction system and method |
CN112099844A (en) * | 2020-09-16 | 2020-12-18 | 国网甘肃省电力公司信息通信公司 | Multi-kernel compatible intelligent browsing system for state network service system |
CN113824764A (en) * | 2021-08-11 | 2021-12-21 | 朱建定 | Robot process automation system based on cloud native architecture and operation method |
CN114816730A (en) * | 2022-03-07 | 2022-07-29 | 达而观信息科技(上海)有限公司 | Robot process automation cloud service system and implementation method |
CN116051319A (en) * | 2022-09-27 | 2023-05-02 | 重庆平戎科技有限公司 | Tactical collaborative combat simulation training system and tactical collaborative combat simulation training method |
- 2023-08-31 CN CN202311108964.3A patent/CN116810804B/en active Active
Also Published As
Publication number | Publication date |
---|---|
CN116810804B (en) | 2023-12-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10572360B2 (en) | Functional behaviour test system and method | |
US11693650B2 (en) | Intelligent software agent to facilitate software development and operations | |
CN112651126B (en) | BIM-based construction engineering construction optimization system, method, terminal and storage medium | |
EP3112965A1 (en) | Robotic process automation | |
US20090210860A1 (en) | Tagging and logical grouping of items in source code change lists | |
CN109871256B (en) | Method and equipment for integrating audit system and heterogeneous system based on RPA | |
CN111309609B (en) | software processing system | |
KR102543064B1 (en) | System for providing manufacturing environment monitoring service based on robotic process automation | |
US11106569B1 (en) | Requirements to test system and method | |
EP4124948A1 (en) | Process assembly line with robotic process automation | |
Larek et al. | Industry 4.0 in manual assembly processes–a concept for real time production steering and decision making | |
JPWO2013190714A1 (en) | Automatic operation system and operation automation method | |
CN115310420A (en) | Simulation analysis report generation method, device, equipment and storage medium | |
CN111538608A (en) | Method for preventing terminal equipment from being down, terminal equipment and storage medium | |
CN116810804B (en) | Full-automatic business flow method integrating software robot and humanoid robot | |
US20240255925A1 (en) | Multi-sensor system for operation status monitoring | |
CN117149266A (en) | Task processing method and device, storage medium and electronic equipment | |
CN117252726A (en) | Intelligent workshop safety risk identification system and method based on Yun Bian cooperation | |
Kim et al. | Smart connected worker edge platform for smart manufacturing: Part 1—Architecture and platform design | |
Roth et al. | The Augmented Workforce: A Systematic Review of Operator Assistance Systems | |
CN107346472A (en) | A kind of method and device that operation/maintenance data is handled by line modeling | |
CN115237425A (en) | Code viewing method and device, computer equipment and storage medium | |
US11100449B1 (en) | Systems and methods for efficiency management | |
US9965131B1 (en) | System and processes to capture, edit, and publish problem solving techniques | |
Quaum et al. | Method of Robotic Process Automation in Invoice Processing and Mailing: Implementation |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||