EP3112965A1 - Robotic process automation - Google Patents

Robotic process automation

Info

Publication number
EP3112965A1
Authority
EP
European Patent Office
Prior art keywords
images
display
computer
output data
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP15290172.4A
Other languages
German (de)
English (en)
Inventor
Cyrille Bataller
Adrien Jacquot
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Accenture Global Services Ltd
Original Assignee
Accenture Global Services Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Accenture Global Services Ltd filed Critical Accenture Global Services Ltd
Priority to EP15290172.4A priority Critical patent/EP3112965A1/fr
Priority to US15/094,063 priority patent/US9555544B2/en
Priority to PCT/EP2016/065305 priority patent/WO2017001560A1/fr
Priority to JP2017567664A priority patent/JP7089879B2/ja
Priority to AU2016286308A priority patent/AU2016286308B2/en
Priority to EP16734340.9A priority patent/EP3215900B1/fr
Priority to CN201680031217.6A priority patent/CN107666987B/zh
Publication of EP3112965A1 publication Critical patent/EP3112965A1/fr

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1628 Programme controls characterised by the control loop
    • B25J9/163 Programme controls characterised by the control loop learning, adaptive, model based, rule based expert control
    • B25J9/1679 Programme controls characterised by the tasks executed
    • B25J9/1694 Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697 Vision controlled systems
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00 Programme-control systems
    • G05B19/02 Programme-control systems electric
    • G05B19/04 Programme control other than numerical control, i.e. in sequence controllers or logic controllers
    • G05B19/042 Programme control other than numerical control, i.e. in sequence controllers or logic controllers using digital processors
    • G05B19/0423 Input/output
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00 General purpose image data processing
    • G06T1/0014 Image feed-back for automatic industrial control, e.g. robot with camera
    • G05B2219/00 Program-control systems
    • G05B2219/30 Nc systems
    • G05B2219/36 Nc in input of data, input key till input tape
    • G05B2219/36184 Record actions of human expert, teach by showing
    • G05B2219/40 Robotics, robotics mapping to robotics vision
    • G05B2219/40116 Learn by operator observation, symbiosis, show, watch

Definitions

  • RPA: robotic process automation
  • Robotic process automation is a technology that makes it possible to automate the execution of repetitive, manually intensive activities.
  • a computer system or robot may mimic the actions of a human being in order to perform a computer-based task.
  • RPA can be used to interact with application software (or application, for short) through its user interface, as a human being would do. It is therefore not necessary to integrate RPA with the existing applications at a programming level, thereby eliminating the difficulties inherent in integration, namely bringing together diverse components.
  • Blue Prism provides a plurality of connectors, i.e. libraries for communicating with a particular kind of application user interface.
  • Blue Prism provides an HTML connector, which allows working with HTML, JavaScript and other common components of a browser interface.
  • Blue Prism may retrieve the source code of a web page and use it e.g. to identify and recognize different components of the web page such as the search button.
  • Blue Prism comprises several modules, each one being specific to an application user interface.
  • universal RPA: a robot is capable of interacting with a variety of user interfaces without the need for application-specific modules.
  • universal RPA is environment independent: it can automate the execution of operations and/or commands independently of the technical operation of a program and/or program module, for instance when using a remote desktop connection technology that sends an image stream without knowledge of the underlying applications.
  • a computer system may comprise a memory and a processor configured to: acquire information relating to images and/or a stream of images shown on a display of a computer screen; analyze the information relating to the images to detect activities on the display; record activity information about the detected activities; and generate output data comprising a combination of the information relating to the images and the activity information.
  • This aspect of the invention is directed to recording a procedure as executed by a user, in the context of RPA.
  • the user may exemplarily perform a task by interacting with the user interface of an application.
  • the user interface may be shown on a display of a computer screen and the processor may acquire information relating to what is shown on the display. What is shown on the display may be monitored in the form of images which are distinct, single entities, or in the form of a stream of images, as in a sequence of combined images or a video stream.
  • the term "images" may be used to refer both to single images and to one or more images in the stream of images.
  • the display of the computer screen may, in one example, be comprised in or form part of the computer system configured to carry out the information acquisition.
  • the display may be part of a different computer system which is accessed e.g. via a remote desktop connection technology and/or by means of a virtual machine (which may be running on the same computer system or a different computer system).
  • the images may depict exemplarily the user interface and additional elements representing the interaction of the user with the user interface.
  • the acquisition of the information relating to the images may be achieved by directly capturing the images as they appear on the display (e.g. take screenshots) and/or by collecting information about the images on the display from sources other than the display.
  • information may particularly be acquired by computer vision, e.g. by tracking mouse movements, the mouse pointer turning into a blinking character prompt in a text field, characters appearing on the screen, or buttons showing visual alteration when pressed.
  • the capturing of the images may be performed in a (at least partly) continuous manner or only at particular moments during the execution of the task. For example, the acquisition may take place at regular intervals of time and/or subsequent to a trigger event. In particular, images may be acquired every time an activity on the display is detected, e.g. the user performing an action, such as clicking on a button of the user interface, or an application doing something, for instance a browser loading a web page or a dialog box popping up.
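A minimal sketch of such interval- and trigger-based capture, assuming the Pillow library; the interval length and the frame-difference trigger are illustrative choices, not taken from the description:

```python
import time
from PIL import ImageGrab  # Windows/macOS; Linux needs an X11/Wayland backend

def capture_loop(interval_s=0.5, max_frames=100):
    """Take a screenshot at regular intervals; keep frames only when they change."""
    frames, previous = [], None
    for _ in range(max_frames):
        shot = ImageGrab.grab()                          # screenshot of the whole display
        if previous is None or shot.tobytes() != previous.tobytes():
            frames.append(shot)                          # an activity was detected
        previous = shot
        time.sleep(interval_s)                           # acquisition at regular intervals
    return frames
```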
  • the capturing of the images may be performed internally, i.e. by tapping the stream of images sent to the display (e.g. accessing the feed of output images from the graphics card to the display).
  • the capturing may be conducted by an external image acquisition device, such as a camera focused on the display.
  • the processor may be configured to gather information about the images on the display from other sources.
  • the additional elements shown in the images may comprise a pointer of a pointing device (e.g. a mouse).
  • the processor may acquire information on an additional element such as the pointer from e.g. the drivers controlling the peripheral input devices, including the pointing device, or from the underlying operating system.
  • the acquired information may comprise a location of the pointer and/or an action performed by the pointer, such as selecting an area of the image shown on the display.
  • the different steps of the procedure may correspond to different activities displayed on the computer screen.
  • the beginning of the procedure may coincide with the launching of an application, which implies a new running process.
  • Other applications may run at the same time and/or following the first one, each constituting a process. Therefore a process change may occur when launching a new application or switching from one application to another.
  • the procedure may comprise a user input to the user interface e.g. by means of a peripheral input device such as a keyboard or a mouse.
  • the user input may, thus, comprise typing, i.e. inputting text by pressing keys on the keyboard, or using a pointer of a pointing device (e.g. a mouse) for performing actions such as clicking, dragging and/or scrolling.
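A sketch of logging such user input from the driver/OS side, with the third-party pynput library standing in for the peripheral-device drivers mentioned above; the log schema is an illustrative assumption:

```python
from pynput import keyboard, mouse

log = []  # recorded activity information ("user input" activities)

def on_click(x, y, button, pressed):
    if pressed:
        log.append({"type": "click", "button": str(button), "pos": (x, y)})

def on_press(key):
    log.append({"type": "key", "key": str(key)})

# Listeners run in background threads and report clicks and key presses.
mouse.Listener(on_click=on_click).start()
keyboard.Listener(on_press=on_press).start()
```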
  • interacting with the user interface may provoke a scene change, i.e. a change in the images shown on the display, e.g. when clicking on a link in a web page to access another web page.
  • a scene change may also be referred to as a screen change.
  • the acquired information relating to the images may be analyzed to detect activities on the display.
  • the information captured in the form of screenshots may be analyzed to determine whether a scene change has occurred, i.e. whether there is a change between two subsequent screenshots.
  • This analysis may be performed by means of computer vision techniques.
  • Computer vision refers to a field of technology concerned with the analysis and interpretation of visual information by a computer.
  • computer vision may duplicate the abilities of human vision by electronically perceiving and understanding an image. The use of computer vision may thus make it possible to analyze the information relating to the images to detect activities on the display.
  • the information about the images on the display obtained from sources other than the display may be analyzed to detect activities on the display. If, for example, this information comprises an indication of an action performed e.g. by a mouse pointer, the occurrence of an activity of the "user input" type may be detected.
  • the collected information may be recorded by the processor.
  • the information relative to the activities, or activity information, and the information relating to the images may be stored in a log file in the memory.
  • the processor may be configured to generate output data comprising a combination of the information relating to the images and the activity information.
  • a screenshot may be superimposed with or enclose the activity information in a human-readable format (e.g. natural text language).
  • a screenshot showing the mouse pointer on a button may be modified to contain also a written instruction describing the step performed at the moment in which the screenshot was taken, such as "Left Click" to indicate a click with the left key of the mouse.
  • a set of computer instructions to reproduce the activity in an automated fashion may be superimposed on or enclosed with a screenshot.
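A sketch of superimposing the activity information on a screenshot with Pillow; the position, color and wording are illustrative assumptions:

```python
from PIL import Image, ImageDraw

def annotate(screenshot: Image.Image, instruction: str, pos=(10, 10)) -> Image.Image:
    """Return a copy of the screenshot with a human-readable instruction drawn on it."""
    annotated = screenshot.copy()
    ImageDraw.Draw(annotated).text(pos, instruction, fill="red")  # e.g. "Left Click"
    return annotated
```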
  • the processor may be further configured to: read the output data; and reproduce, on the basis of the output data, the activities on the display so as to execute the procedure automatically.
  • a second aspect of the invention is directed to automatically executing a procedure for performing a task.
  • the computer system may reproduce the previously recorded activities.
  • the output data comprise a combination of the information relating to the images and the activity information and may be read by the processor as a set of instructions. Again, computer vision may be employed to interpret the output data.
  • the processor is further configured to use the output data for image matching.
  • An image may characterize a specific area of the display.
  • the processor may take a screenshot of the display and search through the screenshot for a match with the image.
  • the image may be of a button that should be clicked on according to the output data.
  • the image matching may make it possible to identify the position of the button on the display and, subsequently, the processor may reproduce e.g. the activity of clicking.
  • the processor is further configured to use optical character recognition (OCR) to retrieve information from the acquired images in the output data.
  • the recorded information may be superimposed as written text on the acquired images, and OCR may convert the image of the typewritten text into machine-encoded text.
  • the processor may interpret a recorded piece of information such as "Left Click" and reproduce the activity of clicking with the left key of the mouse.
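A sketch of this OCR path, assuming the pytesseract and pyautogui packages (and a local Tesseract installation); the recognized phrase and the click coordinates are illustrative:

```python
import pytesseract
import pyautogui
from PIL import Image

def replay_instruction(annotated: Image.Image, x: int, y: int) -> None:
    """Convert the typewritten instruction in the image to text and reproduce it."""
    text = pytesseract.image_to_string(annotated)    # machine-encoded text
    if "Left Click" in text:
        pyautogui.click(x, y, button="left")         # reproduce the recorded activity
```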
  • a computer-implemented method may comprise: acquiring information relating to images and/or a stream of images shown on a display of a computer screen; analyzing the information relating to the images to detect activities on the display; recording activity information about the detected activities; and generating output data comprising a combination of the information relating to the images and the activity information.
  • a computer program product may comprise computer-readable instructions which, when loaded and executed on a suitable system, perform the steps of a method according to the previous aspect.
  • RPA in the present description particularly comprises two aspects: the training of the computer system or robot that has to perform a given task, and the subsequent automated execution of the task by the robot.
  • the robot needs to be shown how a human being or user would perform the task.
  • the procedure followed by the human being when executing the task must be recorded and rendered in a form that allows the robot to replicate the actions made by the user.
  • computer vision is employed both when recording a procedure and when executing it.
  • unlike existing RPA solutions, which rely on specific technologies or protocols (e.g. HTML) to interpret the user's actions, the present description leverages computer vision to construe the operations performed by the user.
  • such RPA may be termed cognitive RPA, meaning that a computer system is capable of sensing, comprehending and acting. Cognitive RPA is also flexible and adaptive to variations in the procedure, as explained in more detail below.
  • FIG. 1 shows an example of an architecture for RPA according to the present invention.
  • the RPA computer system may comprise a monitoring agent 100, an executing agent 110, a display 200, a memory and a processor (not shown).
  • the monitoring agent 100 and the executing agent 110 may be two different entities. In another example, the monitoring agent 100 and the executing agent 110 may coincide.
  • the monitoring agent 100 may be configured to acquire information relating to images and/or a stream of images shown on the display 200 while a user 130 is performing operations in the computer system, wherein the computing operations have a direct counterpart on the display 200. It should be understood that the information acquired from the display may be based on images which are distinct, single entities, or on a (continuous or discrete) stream of images, such as in a sequence of combined images or a video stream. In the following, the term "images" may be used to refer both to single images and to one or more images in the stream of images.
  • the monitoring agent 100 may analyze the information relating to the images to detect activities on the display 200, as explained with reference to Figures 2 and 3 below. The monitoring agent may then record activity information about the detected activities on the display 200 and generate output data 260 comprising a combination of the information relating to the images and the activity information.
  • the output data 260 may be used by the executing agent 110 when the executing agent 110 interfaces with the display 200 to autonomously and automatically repeat the operations previously performed by the user 130.
  • the executing agent 110 may read the output data 260 and, on the basis of the output data 260, reproduce the activities on the display 200 that correspond to the steps necessary to complete a task. Two examples of how the executing agent 110 reproduces the activities are given below with reference to Figures 4 and 5 .
  • the monitoring agent 100 may monitor a display 200 of a computer on which a user is performing a task, for example sending an email. Particularly, the monitoring agent 100 may acquire information about the images by capturing the images being sent to the display 200, e.g. from a stream of images sent thereto and/or in the form of screen shots. Alternatively or additionally, the monitoring agent 100 may capture the images shown on the display 200 by a camera (such as a CCD camera) recording the images displayed on the display 200. Specifically, the display 200 may show the steps performed by e.g. a user sitting in front of the display 200 to carry out the task, e.g. using a pointing device such as a mouse and/or an input device such as a keyboard, a gesture recognition device, a touchpad, a scanner, an eye tracker or the like.
  • the camera may be part of or included in smart glasses (such as Google glasses). For instance, when an email is sent, one or more of the following activities may be triggered or input e.g. by a user and, thus, displayed or shown on the display 200:
  • the above activities may be identified by the monitoring agent 100 by virtue of the captured images and/or of information obtained from other sources, as previously illustrated.
  • the above activities may be classified by the monitoring agent 100 into three main groups: process change 210, scene change 220 (also referred to as screen change) and user input 230.
  • a process change 210 may refer to a new instance of an application being executed. At least one other process may have been running prior to the process change, or the computer may have been idle. Based on a change of the image displayed on the display 200, the monitoring agent 100 can determine that a process change 210 has been triggered e.g. by a user. Particularly, a new process may be launched by clicking on an icon shown on the display 200, e.g. relating to an email messaging application (e.g. Outlook).
  • a scene change 220 may refer to the modification of the scene shown on the display 200, wherein the scene is given by the ensemble of the images shown on the display (e.g. windows, desktop background).
  • An exemplary scene change may be e.g. the opening or generation on the display 200 of a new window relating to an application being newly started.
  • a user input 230 may be given via a peripheral input device such as a keyboard, touchpad or a mouse.
  • the user input 230 may, thus, comprise typing, i.e. inputting text by pressing keys on a keyboard, or using a pointer of a pointing device (e.g. a mouse) for performing actions such as clicking, dragging and/or scrolling.
  • Activity b) above may be considered a process change 210.
  • Activity d) may be considered a scene change 220.
  • Activities a), c) and e) to k) may be considered user inputs 230.
  • the activity of selection may correspond to a left click of the user on a mouse and the appearance of text may correspond to typing of the user on a keyboard.
  • the operations 300 carried out by the monitoring agent 100 may, thus, comprise checking for the occurrence of any activity on the display 200, in order to extract information relating to performing a task and document the procedure necessary for performing a task.
  • the monitoring agent 100 may monitor the display and look for instances of scene change 301, process change 302 and/or user input 303.
  • When an activity takes place (i.e. instances of scene change 301, process change 302 and/or user input 303 are detected by the monitoring agent 100), the monitoring agent 100 particularly causes information about the activity to be recorded or stored in an activities database 240.
  • the monitoring agent 100 may log 304 the new image and attempt to identify it.
  • the occurrence of the scene change 220 may be detected by the monitoring agent 100 for small modifications (e.g. a pop-up window and/or a message/icon appears) or for more substantial modifications (e.g. a complete change of the whole image shown on the display 200).
  • the initial image shown on the display 200 may be a dark screen (e.g. a desktop background), and the scene change may be due to the launching of the email application, which has a light GUI, e.g. a window where a bright color (e.g. white) is a predominant color.
  • the average intensity of the pixels in the two images may be computed and the difference between the intensities may be an indicator of the scene change. If the difference is higher than a specified (predetermined or predeterminable) threshold, the monitoring agent 100 may identify a scene change and log the new image and/or cause information about the detected activity to be recorded or stored in an activities database 240.
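A minimal sketch of this intensity-based scene-change test, assuming grayscale screenshots held as NumPy arrays; the threshold is an arbitrary illustrative value:

```python
import numpy as np

def scene_changed(before: np.ndarray, after: np.ndarray, threshold: float = 25.0) -> bool:
    """Compare average pixel intensities; a large difference indicates a scene change."""
    diff = abs(float(before.mean()) - float(after.mean()))
    return diff > threshold  # e.g. dark desktop -> predominantly white email window
```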
  • the monitoring agent 100 may determine and log 305 the name of the process.
  • the name of the process may be determined by the monitoring agent 100 by querying the processes operated on the computer.
  • the monitoring agent 100 may retrieve the name of the process by accessing e.g. a task manager, i.e. a system monitor program providing information about the process(es) and/or program(s) running on a computer and/or the general status of the computer.
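A sketch of querying the processes operated on the computer, with the third-party psutil package standing in for such a task manager; polling and set-difference logging are illustrative choices:

```python
import psutil

def running_process_names() -> set:
    """Names of all processes currently running, as a task manager would list them."""
    return {p.info["name"] for p in psutil.process_iter(["name"])}

# A process change 210 can then be logged as the difference between two polls:
# new_processes = running_process_names() - previously_seen_names
```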
  • the monitoring agent 100 particularly may retrieve the name of the process via computer vision, for example by using OCR on an acquired image showing the GUI of the application running the process, wherein the GUI bears the name of the process e.g. in the title bar.
  • the name of the process may be inferred by recognizing the general appearance of the GUI, e.g. using image matching as explained in more detail below.
  • Computer vision particularly is a field that includes methods for acquiring, processing, analyzing, and/or understanding or interpreting images and, in general, high-dimensional data from the real world in order to produce numerical or symbolic information thereof, e.g., in the form of one or more decisions.
  • computer vision may duplicate the abilities of human vision by electronically perceiving and understanding an image, wherein, to achieve image understanding, symbolic information may be disentangled from image data using models constructed by means of geometry, physics, statistics, and/or learning theory.
  • the monitoring agent 100 may go back 307 to the start of the operations 300 and check (or continue monitoring) for other or further activities.
  • when a user input 230 takes place, the monitoring agent 100 may detect it on the display 200 and log 306 the type of action that occurred (e.g. executed by the user) and the area of the display 200 where the action occurred or was performed.
  • This information particularly may be obtained via computer vision and/or from external sources.
  • the location on the display 200 where the action is performed may be identified (particularly via computer vision), by recognizing e.g. the pointer hovering on the button "Send".
  • the information about the type of action, i.e. the left click, may be obtained directly from the mouse driver. In another example, the location of the pointer may also be retrieved directly from the mouse driver.
  • the monitoring agent 100 may take 308 a screenshot of (at least a part of) the display 200.
  • the screenshot(s) may be stored in the activities database 240 along with the recorded activity information.
  • An activity handler 250 of the monitoring agent 100 may access the activities database 240 and combine the recorded activity information with screenshots concerning that specific activity.
  • the output data 260 given by the combination of the information relating to the images (e.g. in the form of screenshots) and the recorded activity information may be presented in the form of an output document, such as a Word document particularly including the screen shot(s) and the recorded information (e.g. the natural language text) in the same document. Accordingly, any activities detected by the monitoring agent 100 on the display 200 are recorded and collated into the output data for subsequent usage by a computer and/or user.
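A sketch of assembling such an output document, assuming the third-party python-docx package; the file name and step wording are illustrative:

```python
from docx import Document

def write_output(steps, path="procedure.docx"):
    """steps: iterable of (screenshot_file, description) pairs in recorded order."""
    doc = Document()
    for screenshot_file, description in steps:
        doc.add_paragraph(description)    # e.g. 'Left Click on the "Send" button'
        doc.add_picture(screenshot_file)  # the screenshot taken for this activity
    doc.save(path)
```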
  • the monitoring agent 100, therefore, makes it possible to collect information on different processes automatically and seamlessly.
  • the monitoring agent 100 may monitor processes on a computer without interfering with the computer's normal functioning.
  • the documentation operations carried out by the monitoring agent 100 can also help detect process anomalies or discrepancies between users, and ensure that the final documentation created illustrates the optimal way of executing a process.
  • the monitoring agent 100 is independent of the actual environment operating on the computer, and interference with that environment, which could potentially lead to a disruption (e.g. crash) of a computer operation, is avoided.
  • the output data 260 can be generated by the monitoring agent 100 independently of specific protocols (e.g. HTML) and/or technical processes (e.g. computer applications) running on the computer.
  • the output data provided by the monitoring agent 100 may be used by the executing agent 110 to at least partly automatically carry out a task.
  • a task may comprise a series of operations.
  • Figures 4 and 5 show two examples of operations performed by the executing agent 110.
  • the operations to be performed are moving a pointer and selecting a button to mimic the actions of a user moving a mouse and clicking on a button.
  • Figure 4 shows how the first operation, i.e. moving a pointer, may be carried out.
  • the output data 420 may comprise the location of the pointer, the type of action performed (e.g. left click) and a screenshot of the display, showing the pattern of the button actioned (i.e. its visual characteristics) and the background.
  • the executing agent 110 may, thus, use the coordinates (x_od, y_od) provided by the output data 420 in order to correctly move the pointer.
  • the coordinates (x_od, y_od) may be relative coordinates, depending e.g. on the display resolution and/or whether the window constituting the GUI of the currently running application is maximized or not. For example, depending on a specific computer setting and/or display setting, a window may have a different size and/or location on the display 200.
  • the executing agent 110 may perform a double check and use part of the output data 420, e.g. the pattern of the button actioned (also called template), for image matching 400.
  • the executing agent 110 may take a screenshot of what is currently displayed and try to find where the template is most likely located in the screenshot.
  • One possible technique is to slide (or gradually displace) the template across the screenshot, wherein sliding means moving the template a specified pixel amount (e.g. one pixel) at a time (left to right, top to bottom).
  • at each location, a metric is calculated that represents how "good" or "bad" the match at that location is (or how similar the template is to that particular area of the source image).
  • the image matching 400 may yield a second set of coordinates (x_im, y_im) indicating a likely location of the button to be clicked.
  • the executing agent 110 may perform some algebraic operations 440 to combine the information from the output data 420 and from the image matching 400 in order to obtain the most reliable indication (x, y) of where to move the pointer. For example, the executing agent 110 may choose (x, y) as the midpoint of the segment joining the two points (x_od, y_od) and (x_im, y_im). Once the location (x, y) to which the pointer has to be moved is determined, the executing agent 110 may finally proceed to move the pointer 460.
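A sketch of the image matching 400 and the algebraic combination 440 with OpenCV; normalized cross-correlation is one possible choice of metric, and the midpoint rule follows the example above:

```python
import cv2
import numpy as np

def locate(template: np.ndarray, screenshot: np.ndarray, x_od: int, y_od: int):
    """Slide the template over the screenshot, then average with the recorded coordinates."""
    result = cv2.matchTemplate(screenshot, template, cv2.TM_CCOEFF_NORMED)
    _, _, _, max_loc = cv2.minMaxLoc(result)               # best match, top-left corner
    h, w = template.shape[:2]
    x_im, y_im = max_loc[0] + w // 2, max_loc[1] + h // 2  # centre of the matched area
    # midpoint of the segment joining (x_od, y_od) and (x_im, y_im)
    return (x_od + x_im) // 2, (y_od + y_im) // 2
```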
  • Figure 5 shows how an operation entailing a scene change 220, such as the second operation (i.e. selecting a button), may be carried out.
  • a first screenshot 520 may be taken before the action 500 is made.
  • the action 500 may be performed by the executing agent 110 according to the information in the output data (e.g. left click).
  • a second screenshot 510 may be taken after the action has been performed. If a scene change 220 was recorded in the output data, a first check 540 may be to compare the first pre-action screenshot 520 and the second post-action screenshot 510 and see if a scene change has actually occurred.
  • a second check 550 may consist in comparing the second post-action screenshot 510 with a screenshot 530 present in the output data, depicting how the display should look after performing the action 500.
  • a logical operator 560 may be used to combine the results of the two checks. For example, if both checks are required, an AND operator may be used. If only one check is considered sufficient, an OR operator may be used.
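A sketch of combining the two checks, reusing the scene_changed helper from the earlier sketch; exposing the AND/OR choice as a parameter is an illustrative design:

```python
import numpy as np

def action_succeeded(pre: np.ndarray, post: np.ndarray, expected: np.ndarray,
                     require_both: bool = True) -> bool:
    check1 = scene_changed(pre, post)           # did a scene change actually occur?
    check2 = not scene_changed(post, expected)  # does the display match the output data?
    return (check1 and check2) if require_both else (check1 or check2)
```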
  • the use of screenshots and/or particularly of computer vision by the executing agent 110 makes the automation of a procedure adaptive to possible variations, such as a change in the window size as mentioned above. Furthermore, the executing agent 110 is capable of performing a task even when remote desktop operations need to be performed, since it relies on the images on the display.
  • Figure 6 shows another example of an architecture for RPA.
  • the monitoring agent 100 may provide output data 260 in the form of description for macros.
  • a macro may be a set of rules that specifies how a certain input sequence should be mapped to a replacement output sequence according to a defined procedure.
  • the macros may contain instructions on how the sequence of operations performed by a user may be mapped to a sequence of operations performed by a robot.
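A sketch of what a macro definition might look like in a simple scripting form, with pyautogui used for replay; the operation schema and the coordinates are illustrative assumptions, not the patent's format:

```python
import pyautogui

# Recorded user operations mapped to replayable robot operations.
macro = [
    {"op": "click", "pos": (412, 318)},        # e.g. the "New Email" button
    {"op": "type",  "text": "Status report"},  # e.g. the subject line
]

def run_macro(steps):
    for step in steps:
        if step["op"] == "click":
            pyautogui.click(*step["pos"])
        elif step["op"] == "type":
            pyautogui.typewrite(step["text"])
```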
  • the description for the macros may be inspected by a developer 600, who may be a human being or a computer, e.g. in order to choose an optimized description for the macro among a plurality of descriptions that result from the monitoring agent 100 monitoring different machines and/or users.
  • different descriptions may be inspected for similarities using technologies such as machine learning to recognize and group similar procedures into a categorization system.
  • the description may be transformed, via a macro writer user interface 610, into actual macro definitions, e.g. given in a scripting language.
  • the macro definitions may be stored in a macro definitions database 620.
  • the executing agent 110 may then receive the macro definitions from the macro definitions database 620 in order to implement the macros on display 200.
  • the executing agent 110 may interact with a macro orders database 640 storing macro orders.
  • Macro orders may be an additional set of rules built from the macro definitions deriving from the monitoring agent 100.
  • the executing agent 110 may interact with the macro orders database 640 directly or through a macro mediator 630, which may be an interface between the two.
  • the macro orders database 640 may also be accessed by a support team 650, which may provide debugging and/or updating functionalities.
  • FIG. 7 shows an exemplary system for implementing the invention including a general purpose computing device in the form of a conventional computing environment 920 (e.g. a personal computer).
  • the conventional computing environment includes a processing unit 922, a system memory 924, and a system bus 926.
  • the system bus couples various system components including the system memory 924 to the processing unit 922.
  • the processing unit 922 may perform arithmetic, logic and/or control operations by accessing the system memory 924.
  • the system memory 924 may store information and/or instructions for use in combination with the processing unit 922.
  • the system memory 924 may include volatile and non-volatile memory, such as a random access memory (RAM) 928 and a read only memory (ROM) 930.
  • the system bus 926 may be any of several types of bus structures including a memory bus or memory controller, a peripheral bus, and a local bus using any of a variety of bus architectures.
  • the personal computer 920 may further include a hard disk drive 932 for reading from and writing to a hard disk (not shown), and an external disk drive 934 for reading from or writing to a removable disk 936.
  • the removable disk may be a magnetic disk for a magnetic disk driver or an optical disk such as a CD ROM for an optical disk drive.
  • the hard disk drive 932 and the external disk drive 934 are connected to the system bus 926 by a hard disk drive interface 938 and an external disk drive interface 940, respectively.
  • the drives and their associated computer-readable media provide nonvolatile storage of computer readable instructions, data structures, program modules and other data for the personal computer 920.
  • the data structures may include relevant data for the implementation of the method as described above.
  • the relevant data may be organized in a database, for example a relational database management system or an object-oriented database management system.
  • a number of program modules may be stored on the hard disk, external disk 936, ROM 930 or RAM 928, including an operating system (not shown), one or more application programs 944, other program modules (not shown), and program data 946.
  • the application programs may include at least a part of the functionality as depicted in figures 2 to 6 .
  • a user may enter commands and information, as discussed below, into the personal computer 920 through input devices such as keyboard 948 and mouse 950.
  • Other input devices may include a microphone (or other sensors), joystick, game pad, scanner, or the like.
  • These and other input devices may be connected to the processing unit 922 through a serial port interface 952 that is coupled to the system bus 926, or may be connected by other interfaces, such as a parallel port interface 954, game port or a universal serial bus (USB). Further, information may be printed using printer 956.
  • the printer 956, and other parallel input/output devices may be connected to the processing unit 922 through parallel port interface 954.
  • a monitor 958 or other type of display device is also connected to the system bus 926 via an interface, such as a video input/output 960.
  • computing environment 920 may include other peripheral output devices (not shown), such as speakers or other audible output.
  • the computing environment 920 may communicate with other electronic devices such as a computer, telephone (wired or wireless), personal digital assistant, television, or the like. To communicate, the computer environment 920 may operate in a networked environment using connections to one or more electronic devices.
  • Figure 7 depicts the computing environment networked with the remote computer 962.
  • the remote computer 962 may be another computing environment such as a server, a router, a network PC, a peer device or other common network node, and may include many or all of the elements described above relative to the computing environment 920.
  • the logical connections depicted in figure 7 include a local area network (LAN) 964 and a wide area network (WAN) 966.
  • Such networking environments are commonplace in offices, enterprise-wide computer networks, intranets and the Internet and may particularly be encrypted.
  • When used in a LAN networking environment, the computing environment 920 may be connected to the LAN 964 through a network I/O 968. When used in a WAN networking environment, the computing environment 920 may include a modem 970 or other means for establishing communications over the WAN 966. The modem 970, which may be internal or external to the computing environment 920, is connected to the system bus 926 via the serial port interface 952. In a networked environment, program modules depicted relative to the computing environment 920, or portions thereof, may be stored in a remote memory storage device resident on or accessible to the remote computer 962. Furthermore, other data relevant to the method for RPA described above may be resident on or accessible via the remote computer 962. It will be appreciated that the network connections shown are exemplary and other means of establishing a communications link between the electronic devices may be used.
  • the above-described computing system is only one example of the type of computing system that may be used to implement the above method for RPA.

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)
  • Manipulator (AREA)
  • Programmable Controllers (AREA)
EP15290172.4A 2015-07-02 2015-07-02 Robotic process automation Withdrawn EP3112965A1 (fr)

Priority Applications (7)

Application Number Priority Date Filing Date Title
EP15290172.4A EP3112965A1 (fr) 2015-07-02 2015-07-02 Robotic process automation
US15/094,063 US9555544B2 (en) 2015-07-02 2016-04-08 Robotic process automation
PCT/EP2016/065305 WO2017001560A1 (fr) 2015-07-02 2016-06-30 Robotic process automation
JP2017567664A JP7089879B2 (ja) 2015-07-02 2016-06-30 Robotic process automation
AU2016286308A AU2016286308B2 (en) 2015-07-02 2016-06-30 Robotic process automation
EP16734340.9A EP3215900B1 (fr) 2015-07-02 2016-06-30 Robotic process automation
CN201680031217.6A CN107666987B (zh) 2015-07-02 2016-06-30 Robotic process automation

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
EP15290172.4A EP3112965A1 (fr) 2015-07-02 2015-07-02 Robotic process automation

Publications (1)

Publication Number Publication Date
EP3112965A1 true EP3112965A1 (fr) 2017-01-04

Family

ID=53794164

Family Applications (2)

Application Number Title Priority Date Filing Date
EP15290172.4A Withdrawn EP3112965A1 (fr) 2015-07-02 2015-07-02 Robotic process automation
EP16734340.9A Active EP3215900B1 (fr) 2015-07-02 2016-06-30 Robotic process automation

Family Applications After (1)

Application Number Title Priority Date Filing Date
EP16734340.9A Active EP3215900B1 (fr) 2015-07-02 2016-06-30 Robotic process automation

Country Status (5)

Country Link
US (1) US9555544B2 (fr)
EP (2) EP3112965A1 (fr)
JP (1) JP7089879B2 (fr)
CN (1) CN107666987B (fr)
AU (1) AU2016286308B2 (fr)

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110248164A (zh) * 2019-06-27 2019-09-17 四川中电启明星信息技术有限公司 PC automation operation monitoring method and system
US10482232B2 (en) 2017-08-16 2019-11-19 Bank Of America Corporation Robotic process automation using controller execution model
EP3675008A1 2018-12-31 2020-07-01 Kofax, Inc. Systems and methods for identifying processes for robotic automation and building models therefor
US10817314B1 (en) 2019-10-01 2020-10-27 NTT DATA Services, LLC Augmented shareable video files for robotic process automation
US10970109B1 (en) * 2017-11-09 2021-04-06 Amdocs Development Limited System, method, and computer program for managing a plurality of heterogeneous software robots to automate business processes
US10970064B1 (en) 2020-07-28 2021-04-06 Bank Of America Corporation Dynamically updating a software program to resolve errors
WO2021083480A1 2019-10-28 2021-05-06 Siemens Aktiengesellschaft Method and device for supporting robotic process automation
WO2021219234A1 * 2020-05-01 2021-11-04 Blue Prism Limited System and methods for robotic process automation
EP3909722A1 * 2020-05-11 2021-11-17 UiPath, Inc. Graphical element search technique selection, fuzzy logic selection of anchors and targets, and/or hierarchical graphical element identification for robotic process automation
WO2022010516A1 * 2020-07-07 2022-01-13 UiPath, Inc. User interface (UI) descriptors, UI object libraries, UI object repositories, and UI object browsers for robotic process automation
TWI767590B (zh) * 2021-03-02 2022-06-11 伊斯酷軟體科技股份有限公司 用於多部電子計算裝置的機器人流程自動化裝置及機器人流程自動化方法
EP3948722A4 * 2019-09-19 2023-01-04 UiPath, Inc. Process understanding for robotic process automation (RPA) using sequence extraction
US20230008220A1 (en) * 2021-07-09 2023-01-12 Bank Of America Corporation Intelligent robotic process automation bot development using convolutional neural networks
EP4242848A1 2022-03-09 2023-09-13 Universitatea "Lucian Blaga" Method and computer system for capturing and analyzing repetitive actions generated by employee-computer interaction

Families Citing this family (79)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI528218B (zh) * 2013-11-29 2016-04-01 財團法人資訊工業策進會 Sensitive data identification method and data leakage prevention system using the same
WO2018017214A1 (fr) * 2016-07-20 2018-01-25 Hewlett-Packard Development Company, L.P. Création de travailleurs numériques dans des organisations
AU2018200877A1 (en) * 2017-03-30 2018-10-18 Accenture Global Solutions Limited Closed loop nodal analysis
US10802453B2 (en) 2017-06-02 2020-10-13 Bank Of America Corporation Robotics process automation macro bot
US10449670B2 (en) 2017-07-17 2019-10-22 Bank Of America Corporation Event processing using robotic entities
US10740726B2 (en) * 2017-10-05 2020-08-11 Servicenow, Inc. Systems and methods for providing message templates in an enterprise system
JP2019074889A (ja) * 2017-10-13 2019-05-16 BizteX株式会社 System, method and program for automating business processes involving web browser operations
US10616280B2 (en) 2017-10-25 2020-04-07 Bank Of America Corporation Network security system with cognitive engine for dynamic automation
US10659482B2 (en) 2017-10-25 2020-05-19 Bank Of America Corporation Robotic process automation resource insulation system
US10437984B2 (en) 2017-10-26 2019-10-08 Bank Of America Corporation Authentication protocol elevation triggering system
US10705948B2 (en) 2017-10-30 2020-07-07 Bank Of America Corporation Robotic process automation simulation of environment access for application migration
US10503627B2 (en) 2017-10-30 2019-12-10 Bank Of America Corporation Robotic process automation enabled file dissection for error diagnosis and correction
US10686684B2 (en) 2017-11-02 2020-06-16 Bank Of America Corporation Individual application flow isotope tagging within a network infrastructure
US10575231B2 (en) * 2017-11-03 2020-02-25 Bank Of America Corporation System for connection channel adaption using robotic automation
US10474755B2 (en) 2017-11-03 2019-11-12 Bank Of America Corporation Robotics assisted production support utility
US10120656B1 (en) 2017-11-07 2018-11-06 Bank Of America Corporation Robotic process automation system for functional evaluation and improvement of back end instructional constructs
US10606687B2 (en) 2017-12-04 2020-03-31 Bank Of America Corporation Process automation action repository and assembler
US10452674B2 (en) * 2017-12-07 2019-10-22 Accenture Global Solutions Limited Artificial intelligence and robotic process automation for automated data management
US11075935B2 (en) 2017-12-22 2021-07-27 Kpmg Llp System and method for identifying cybersecurity threats
US11693923B1 (en) * 2018-05-13 2023-07-04 Automation Anywhere, Inc. Robotic process automation system with hybrid workflows
CN108732971B (zh) * 2018-05-29 2021-08-13 广州亿程交通信息集团有限公司 Environmental data acquisition system based on Internet of Vehicles
JP6763914B2 (ja) * 2018-06-08 2020-09-30 ファナック株式会社 Robot system and robot system control method
US10802889B1 (en) 2018-07-18 2020-10-13 NTT DATA Services, LLC Systems and methods of virtual resource monitoring for robotic processes
JP6452882B1 (ja) * 2018-07-28 2019-01-16 BizteX株式会社 System, method and program for automating business processes involving web browser operations
US10878531B2 (en) * 2018-08-17 2020-12-29 Accenture Global Solutions Limited Robotic process automation
CN113227964A (zh) * 2018-09-28 2021-08-06 艾利文Ai有限公司 Context-based recommendations for robotic process automation design
US11613008B2 (en) 2019-01-14 2023-03-28 International Business Machines Corporation Automating a process using robotic process automation code
US11693757B2 (en) * 2019-02-01 2023-07-04 Virtusa Corporation Requirement gathering in process automation
AU2020267490A1 (en) * 2019-05-06 2021-12-23 Strong Force Iot Portfolio 2016, Llc Platform for facilitating development of intelligence in an industrial internet of things system
US10970097B2 (en) 2019-06-19 2021-04-06 Sap Se Adaptive web-based robotic process automation
JP2021056794A (ja) * 2019-09-30 2021-04-08 富士通株式会社 Control program, control device and control method
US11507772B2 (en) * 2019-10-02 2022-11-22 UiPath, Inc. Sequence extraction using screenshot images
US11150882B2 (en) * 2019-10-14 2021-10-19 UiPath Inc. Naming robotic process automation activities according to automatically detected target labels
US11488015B2 (en) 2019-10-15 2022-11-01 UiPath, Inc. Artificial intelligence layer-based process extraction for robotic process automation
US20210109503A1 (en) 2019-10-15 2021-04-15 UiPath, Inc. Human-in-the-loop robot training for robotic process automation
US11440201B2 (en) 2019-10-15 2022-09-13 UiPath, Inc. Artificial intelligence-based process identification, extraction, and automation for robotic process automation
CN115699050A (zh) * 2019-11-05 2023-02-03 强力价值链网络投资组合2019有限公司 Value chain network control tower and enterprise management platform
US11642783B2 (en) * 2019-12-02 2023-05-09 International Business Machines Corporation Automated generation of robotic computer program code
US11829795B2 (en) * 2019-12-30 2023-11-28 UiPath, Inc. Trigger service management for robotic process automation (RPA)
US11453131B2 (en) * 2019-12-30 2022-09-27 UiPath, Inc. Method and apparatus for remote native automation decoupling
US11233861B2 (en) 2020-02-18 2022-01-25 UiPath, Inc. Inter-session automation for robotic process automation (RPA) robots
US10654166B1 (en) 2020-02-18 2020-05-19 UiPath, Inc. Automation windows for robotic process automation
WO2021176523A1 2020-03-02 2021-09-10 日本電信電話株式会社 Screen recognition device and method, and program
US11436830B2 (en) 2020-03-11 2022-09-06 Bank Of America Corporation Cognitive robotic process automation architecture
US20210294303A1 (en) * 2020-03-17 2021-09-23 UiPath, Inc. In-process trigger management for robotic process automation (rpa)
US11443241B2 (en) 2020-03-26 2022-09-13 Wipro Limited Method and system for automating repetitive task on user interface
US20210342736A1 (en) * 2020-04-30 2021-11-04 UiPath, Inc. Machine learning model retraining pipeline for robotic process automation
US11461164B2 (en) 2020-05-01 2022-10-04 UiPath, Inc. Screen response validation of robot execution for robotic process automation
US11080548B1 (en) 2020-05-01 2021-08-03 UiPath, Inc. Text detection, caret tracking, and active element detection
US11200441B2 (en) 2020-05-01 2021-12-14 UiPath, Inc. Text detection, caret tracking, and active element detection
US11367008B2 (en) 2020-05-01 2022-06-21 Cognitive Ops Inc. Artificial intelligence techniques for improving efficiency
KR102297355B1 (ko) * 2020-05-01 2021-09-01 유아이패스, 인크. Text detection, caret tracking, and active element detection
US20210349430A1 (en) * 2020-05-11 2021-11-11 UiPath, Inc. Graphical element search technique selection, fuzzy logic selection of anchors and targets, and/or hierarchical graphical element identification for robotic process automation
US11494203B2 (en) 2020-05-13 2022-11-08 UiPath, Inc. Application integration for robotic process automation
CN111638879B (zh) * 2020-05-15 2023-10-31 民生科技有限责任公司 System, method, device and readable storage medium for overcoming pixel positioning limitation
JP7174014B2 (ja) * 2020-07-06 2022-11-17 株式会社東芝 Operation system, processing system, operation method, and program
US11157339B1 (en) 2020-07-09 2021-10-26 UiPath, Inc. Automation of a process running in a first session via a robotic process automation robot running in a second session
US11392477B2 (en) 2020-07-09 2022-07-19 UiPath, Inc. Automation of a process running in a first session via a robotic process automation robot running in a second session
US20220084306A1 (en) * 2020-07-14 2022-03-17 Kalpit Jain Method and system of guiding a user on a graphical interface with computer vision
US11775321B2 (en) * 2020-08-03 2023-10-03 Automation Anywhere, Inc. Robotic process automation with resilient playback capabilities
US11759950B2 (en) * 2020-09-08 2023-09-19 UiPath, Inc. Localized configurations of distributed-packaged robotic processes
US11507259B2 (en) * 2020-09-08 2022-11-22 UiPath, Inc. Graphical element detection using a combined series and delayed parallel execution unified target technique, a default graphical element detection technique, or both
US11232170B1 (en) 2020-09-08 2022-01-25 UiPath, Inc. Application-specific graphical element detection
US11385777B2 (en) * 2020-09-14 2022-07-12 UiPath, Inc. User interface (UI) mapper for robotic process automation
US20220092607A1 (en) * 2020-09-24 2022-03-24 The Toronto-Dominion Bank Management of programmatic and compliance workflows using robotic process automation
US11592804B2 (en) * 2020-10-14 2023-02-28 UiPath, Inc. Task automation by support robots for robotic process automation (RPA)
US11301269B1 (en) 2020-10-14 2022-04-12 UiPath, Inc. Determining sequences of interactions, process extraction, and robot generation using artificial intelligence / machine learning models
US11921608B2 (en) 2020-10-30 2024-03-05 Accenture Global Solutions Limited Identifying a process and generating a process diagram
US11833661B2 (en) * 2020-10-31 2023-12-05 Google Llc Utilizing past contact physics in robotic manipulation (e.g., pushing) of an object
US20220188697A1 (en) * 2020-12-11 2022-06-16 UiPath, Inc. Supplementing artificial intelligence (ai) / machine learning (ml) models via action center, ai/ml model retraining hardware control, and ai/ml model settings management
TW202228953A (zh) * 2021-01-29 2022-08-01 日商發那科股份有限公司 Robot control device and numerical control system
US11618160B2 (en) 2021-03-26 2023-04-04 UiPath, Inc. Integrating robotic process automations into operating and software systems
US11934416B2 (en) 2021-04-13 2024-03-19 UiPath, Inc. Task and process mining by robotic process automations across a computing environment
US11522364B2 (en) 2021-04-21 2022-12-06 Peak Power, Inc. Building load modification responsive to utility grid events using robotic process automation
CN113221866B (zh) * 2021-04-30 2022-11-15 东方蓝天钛金科技有限公司 Equipment data acquisition system and method based on image recognition
US11829284B2 (en) 2021-06-07 2023-11-28 International Business Machines Corporation Autonomous testing of software robots
WO2023276875A1 (fr) * 2021-06-28 2023-01-05 株式会社 東芝 Operation system, processing system, method for constructing a processing system, computer, operation method, program, and storage medium
US11794348B2 (en) * 2021-07-28 2023-10-24 Sap Se Process assembly line with robotic process automation
US20230037297A1 (en) * 2021-08-06 2023-02-09 Bank Of America Corporation Robotics Process Automation Automatic Enhancement System

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030052859A1 (en) * 2001-07-05 2003-03-20 Finley Michael Cain Laser and digital camera computer pointer device system
US20140118239A1 (en) * 2012-10-25 2014-05-01 PixiOnCloud, Inc. Visual-symbolic control of remote devices having display-based user interfaces
US20150103131A1 (en) * 2013-10-11 2015-04-16 Fuji Xerox Co., Ltd. Systems and methods for real-time efficient navigation of video streams

Family Cites Families (44)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4825394A (en) * 1985-05-07 1989-04-25 General Dynamics Corporation Vision metrology system
US8330812B2 (en) * 1995-05-30 2012-12-11 Simulated Percepts, Llc Method and apparatus for producing and storing, on a resultant non-transitory storage medium, computer generated (CG) video in correspondence with images acquired by an image acquisition device tracked in motion with respect to a 3D reference frame
US6181371B1 (en) * 1995-05-30 2001-01-30 Francis J Maguire, Jr. Apparatus for inducing attitudinal head movements for passive virtual reality
US7453451B1 (en) * 1999-03-16 2008-11-18 Maguire Francis J Jr Moveable headrest for viewing images from different directions
JP2001159903A (ja) * 1999-12-01 2001-06-12 Yamaha Motor Co Ltd Optimization device for unit devices for combined finished products
JP2001277163A (ja) * 2000-04-03 2001-10-09 Sony Corp Robot control device and control method
JP2005515910A (ja) * 2002-01-31 2005-06-02 ブレインテック カナダ インコーポレイテッド Method and apparatus for single-camera 3D vision guided robotics
US7505604B2 (en) * 2002-05-20 2009-03-17 Simmonds Precision Prodcuts, Inc. Method for detection and recognition of fog presence within an aircraft compartment using video images
SE0203908D0 (sv) * 2002-12-30 2002-12-30 Abb Research Ltd An augmented reality system and method
US7773799B2 (en) * 2004-04-02 2010-08-10 The Boeing Company Method for automatic stereo measurement of a point of interest in a scene
WO2006016866A2 (fr) * 2004-07-08 2006-02-16 Microsoft Corporation Automatic image capture for content production
JP4137862B2 (ja) * 2004-10-05 2008-08-20 ファナック株式会社 Measuring device and robot control device
US7626569B2 (en) * 2004-10-25 2009-12-01 Graphics Properties Holdings, Inc. Movable audio/video communication interface system
CA2615659A1 (fr) * 2005-07-22 2007-05-10 Yogesh Chunilal Rathod Universal knowledge management and desktop search system
JP2007061983A (ja) * 2005-09-01 2007-03-15 Fanuc Ltd Robot monitoring system
JP2007279991A (ja) 2006-04-05 2007-10-25 It System Corp Log management program and recording medium
JP2007280174A (ja) 2006-04-10 2007-10-25 Hitachi Electronics Service Co Ltd Operation history recording device
EP2140316B1 (fr) * 2007-03-29 2011-12-28 iRobot Corporation Robot operator control unit configuration system and method
JP5228716B2 (ja) * 2007-10-04 2013-07-03 日産自動車株式会社 Information presentation system
WO2009092164A1 (fr) * 2008-01-25 2009-07-30 Mcmaster University Surgical guidance utilizing tissue feedback
US8559699B2 (en) * 2008-10-10 2013-10-15 Roboticvisiontech Llc Methods and apparatus to facilitate operations in image based systems
FR2954518B1 (fr) * 2009-12-18 2012-03-23 Aripa Service Innovation Ind Anti-collision system for moving an object in a cluttered environment
US8825183B2 (en) * 2010-03-22 2014-09-02 Fisher-Rosemount Systems, Inc. Methods for a data driven interface based on relationships between process control tags
US9104202B2 (en) * 2010-05-11 2015-08-11 Irobot Corporation Remote vehicle missions and systems for supporting remote vehicle missions
KR102068216B1 (ko) * 2011-01-28 2020-01-20 InTouch Technologies, Inc. Interfacing with a mobile telepresence robot
US9400778B2 (en) * 2011-02-01 2016-07-26 Accenture Global Services Limited System for identifying textual relationships
JP5622647B2 (ja) * 2011-04-11 2014-11-12 Toshiba Corp Scenario generation device and scenario generation program
WO2012150602A1 (fr) * 2011-05-03 2012-11-08 Yogesh Chunilal Rathod A system and method for dynamically monitoring, recording, processing and attaching dynamic, contextual and accessible active links, and for presenting physical or digital activities, actions, locations, logs, lifestreams, behavior and status
US8793578B2 (en) * 2011-07-11 2014-07-29 International Business Machines Corporation Automating execution of arbitrary graphical interface applications
CN102270139B (zh) * 2011-08-16 2013-09-11 Pan Tianhua A screenshot method
US10176725B2 (en) * 2011-08-29 2019-01-08 Worcester Polytechnic Institute System and method of pervasive developmental disorder interventions
US9679215B2 (en) * 2012-01-17 2017-06-13 Leap Motion, Inc. Systems and methods for machine control
GB201202344D0 (en) * 2012-02-10 2012-03-28 Isis Innovation Method of locating a sensor and related apparatus
US20130335405A1 (en) * 2012-06-18 2013-12-19 Michael J. Scavezze Virtual object generation within a virtual environment
US9092698B2 (en) * 2012-06-21 2015-07-28 Rethink Robotics, Inc. Vision-guided robots and methods of training them
US9607787B2 (en) * 2012-09-21 2017-03-28 Google Inc. Tactile feedback button for a hazard detector and fabrication method thereof
WO2014121262A2 (fr) * 2013-02-04 2014-08-07 Children's National Medical Center Système chirurgical robotisé à commande hybride
US8868241B2 (en) * 2013-03-14 2014-10-21 GM Global Technology Operations LLC Robot task commander with extensible programming environment
JP2014235699A (ja) 2013-06-05 2014-12-15 Hitachi Systems Ltd Information processing device, device setting system, device setting device, device setting method, and program
JP5931806B2 (ja) * 2013-06-24 2016-06-08 Nippon Telegraph & Telephone Corp Automatic operation device by image recognition, and method and program therefor
JP2015028765A (ja) 2013-06-25 2015-02-12 Panasonic Intellectual Property Corporation of America Input control method and input control device
US9672728B2 (en) * 2014-04-07 2017-06-06 Google Inc. Smart hazard detector drills
CN104238418A (zh) * 2014-07-02 2014-12-24 Beijing Institute of Technology Interactive reality system and method
US9704043B2 (en) * 2014-12-16 2017-07-11 Irobot Corporation Systems and methods for capturing images and annotating the captured images with information

Cited By (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10482232B2 (en) 2017-08-16 2019-11-19 Bank Of America Corporation Robotic process automation using controller execution model
US10783229B2 (en) 2017-08-16 2020-09-22 Bank Of America Corporation Robotic process automation using controller execution model
US10970109B1 (en) * 2017-11-09 2021-04-06 Amdocs Development Limited System, method, and computer program for managing a plurality of heterogeneous software robots to automate business processes
US11281936B2 (en) 2018-12-31 2022-03-22 Kofax, Inc. Systems and methods for identifying processes for robotic automation and building models therefor
EP3675008A1 (fr) 2018-12-31 2020-07-01 Kofax, Inc. Systems and methods for identifying processes for robotic automation and building models therefor
US11836662B2 (en) 2018-12-31 2023-12-05 Kofax, Inc. Systems and methods for identifying processes for robotic automation and building models therefor
DE202019005843U1 (de) 2018-12-31 2022-07-01 Kofax, Inc. Systems and computer program products for identifying processes for robotic automation and building models therefor
CN110248164A (zh) * 2019-06-27 2019-09-17 四川中电启明星信息技术有限公司 PC automated job monitoring method and system
EP3948722A4 (fr) * 2019-09-19 2023-01-04 UiPath, Inc. Process understanding for robotic process automation (RPA) using sequence extraction
US10817314B1 (en) 2019-10-01 2020-10-27 NTT DATA Services, LLC Augmented shareable video files for robotic process automation
WO2021083480A1 (fr) 2019-10-28 2021-05-06 Siemens Aktiengesellschaft Method and device for supporting robotic process automation
WO2021219234A1 (fr) * 2020-05-01 2021-11-04 Blue Prism Limited System and methods for robotic process automation
EP3909722A1 (fr) * 2020-05-11 2021-11-17 UiPath, Inc. Graphical element search technique selection, fuzzy logic selection of anchors and targets, and/or hierarchical graphical element identification for robotic process automation
US11748069B2 (en) 2020-07-07 2023-09-05 UiPath, Inc. User interface (UI) descriptors, UI object libraries, UI object repositories, and UI object browsers for robotic process automation
CN116057504A (zh) * 2020-07-07 2023-05-02 UiPath, Inc. User interface (UI) descriptors, UI object libraries, UI object repositories, and UI object browsers for robotic process automation
WO2022010517A1 (fr) * 2020-07-07 2022-01-13 UiPath, Inc. User interface (UI) descriptors, UI object libraries, UI object repositories, and UI object browsers for robotic process automation
US11809846B2 (en) 2020-07-07 2023-11-07 UiPath, Inc. User interface (UI) descriptors, UI object libraries, UI object repositories, and UI object browsers for robotic process automation
WO2022010516A1 (fr) * 2020-07-07 2022-01-13 UiPath, Inc. User interface (UI) descriptors, UI object libraries, UI object repositories, and UI object browsers for robotic process automation
US10970064B1 (en) 2020-07-28 2021-04-06 Bank Of America Corporation Dynamically updating a software program to resolve errors
TWI767590B (zh) * 2021-03-02 2022-06-11 Iscoollab Co., Ltd. Device and method for robotic process automation of multiple electronic computing devices
US11748053B2 (en) 2021-03-02 2023-09-05 Iscoollab Co., Ltd. Device and method for robotic process automation of multiple electronic computing devices
US20230008220A1 (en) * 2021-07-09 2023-01-12 Bank Of America Corporation Intelligent robotic process automation bot development using convolutional neural networks
EP4242848A1 (fr) 2022-03-09 2023-09-13 Universitatea "Lucian Blaga" Method and computer system for capturing and analyzing repetitive actions generated by employee-computer interaction

Also Published As

Publication number Publication date
US9555544B2 (en) 2017-01-31
CN107666987B (zh) 2020-10-16
EP3215900B1 (fr) 2019-11-27
AU2016286308B2 (en) 2018-11-29
JP7089879B2 (ja) 2022-06-23
US20170001308A1 (en) 2017-01-05
EP3215900A1 (fr) 2017-09-13
CN107666987A (zh) 2018-02-06
JP2018535459A (ja) 2018-11-29
AU2016286308A1 (en) 2018-01-04

Similar Documents

Publication Publication Date Title
EP3112965A1 (fr) Robotic process automation
Brown et al. Finding Waldo: Learning about users from their interactions
CN106844217B (zh) Method and apparatus for instrumenting application controls with tracking points, and readable storage medium
JP6893606B2 (ja) Image tagging method, apparatus and electronic device
Zhao et al. ActionNet: Vision-based workflow action recognition from programming screencasts
CN106779088A (zh) Method and system for executing a machine learning process
CN102844795A (zh) Image processing device, image processing method, and program
Tehranchi et al. Modeling visual search in interactive graphic interfaces: Adding visual pattern matching algorithms to ACT-R
EP4018399A1 (fr) Modeling human behavior in work environments using neural networks
CN115525563A (zh) Test method and apparatus, computer device, and storage medium
US11625608B2 (en) Methods and systems for operating applications through user interfaces
Bernal-Cárdenas et al. Translating video recordings of complex mobile app UI gestures into replayable scenarios
Jaganeshwari et al. An Automated Testing Tool Based on Graphical User Interface with Exploratory Behavioural Analysis
US11216656B1 (en) System and method for management and evaluation of one or more human activities
CN113703637A (zh) Inspection task codification method and apparatus, electronic device, and computer storage medium
Simko et al. Screen recording segmentation to scenes for eye-tracking analysis
Yu et al. Universally Adaptive Cross-Platform Reinforcement Learning Testing via GUI Image Understanding
Suay et al. A comparison of two algorithms for robot learning from demonstration
US20230169399A1 (en) System and methods for robotic process automation
JP7255619B2 (ja) Information processing device, information processing method, information processing program, and information processing system
US20230289535A1 (en) Visual language processing modeling framework via an attention-on-attention mechanism
US20240184692A1 (en) Software testing
Forsgren et al. REGTEST: An Automatic & Adaptive GUI Regression Testing Tool.
Heinerud et al. Automatic testing of graphical user interfaces
CN117853841A (zh) Method and apparatus for outputting model inference results, electronic device, and storage medium

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

17P Request for examination filed

Effective date: 20170608

RBV Designated contracting states (corrected)

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

17Q First examination report despatched

Effective date: 20190606

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20191017