WO2021230876A1 - Skill logic controller (slc) - Google Patents

Skill Logic Controller (SLC)

Info

Publication number
WO2021230876A1
WO2021230876A1 (PCT/US2020/033052)
Authority
WO
WIPO (PCT)
Prior art keywords
skill
slc
skills
programmed
smart
Prior art date
Application number
PCT/US2020/033052
Other languages
French (fr)
Inventor
Lingyun Wang
Arquimedes Martinez Canedo
Original Assignee
Siemens Aktiengesellschaft
Siemens Industry, Inc.
Priority date
Filing date
Publication date
Application filed by Siemens Aktiengesellschaft, Siemens Industry, Inc. filed Critical Siemens Aktiengesellschaft
Priority to PCT/US2020/033052 priority Critical patent/WO2021230876A1/en
Publication of WO2021230876A1 publication Critical patent/WO2021230876A1/en

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00Programme-control systems
    • G05B19/02Programme-control systems electric
    • G05B19/04Programme control other than numerical control, i.e. in sequence controllers or logic controllers
    • G05B19/05Programmable logic controllers, e.g. simulating logic interconnections of signals according to ladder diagrams or function charts
    • G05B19/056Programming the PLC

Definitions

  • aspects of the present invention generally relate to a Skill Logic Controller (SLC) having an automation main component as a skill for controlling an automation process by integrating the SLC with other automation devices.
  • SLC Skill Logic Controller
  • PLC Programmable Logic Controller
  • Wikipedia defines a Programmable Logic Controller (PLC) as “an industrial digital computer which has been ruggedized and adapted for the control of manufacturing processes, such as assembly lines, or robotic devices, or any activity that requires high reliability control and ease of programming and process fault diagnosis.”
  • the PLC hardware is a ruggedized and dedicated box, supplied by various controller vendors such as Siemens and Rockwell.
  • the programming languages of a PLC, such as Ladder Logic and Structured Text, are based on the IEC 61131-3 standard; this programming paradigm has not fundamentally changed since the PLC was invented in the 1950s and 1960s during the second industrial revolution, more than half a century ago. The world has since entered the fourth industrial revolution.
  • aspects of the present invention relate to a revolutionary Skill Logic Controller (SLC) to replace today’s Programmable Logic Controller (PLC).
  • a Skill Logic Controller (SLC) has an automation main component as a skill for controlling an automation process by integrating the SLC with other automation devices. Integration of smart sensors and tools such as intelligent robots and AI-based cameras on the factory floor with a SLC is not a daunting task.
  • the automation main component in a SLC is a Skill.
  • a skill is a self-contained algorithm that takes inputs from the environment (e.g. sensors) and gives commands in sequence to machines to accomplish a task. The skill can also interact with its environment (sensors, machines) or other skills by asking questions and seeking clarifications.
  • Advantages include more flexibility to create reusable automation programs compared to the traditional PLC, better interoperability with state-of-the-art automation equipment and a new skills editor to compose skill programs faster.
  • Technical features include a combination of programmed and learned skills, discovery of skills from automation devices, on-the-fly composition of skill programs using microservices, and a skill interface as opposed to a Function Block (FB) interface.
  • FB Function Block
  • a Skill Logic Controller for controlling an automation process.
  • the SLC comprises a Central Processor Unit (CPU) module including a processor and an accessible memory storing an automation main component as a programmed skill.
  • the programmed skill is a self-contained algorithm that takes inputs from an environment such as smart sensors and gives commands in sequence to smart machines to accomplish a task, such that the programmed skill is configured to interact with its environment, such as the smart sensors and the smart machines, or with other programmed skills by asking questions and seeking clarifications.
  • the accessible memory stores a SLC program comprising software instructions that when executed by the processor are configured to provide combinatorial logic and sequential control for the automation process.
  • the combinatorial logic performs logical operations on parameters that have binary states and operates devices with binary states.
  • the sequential control determines an order in which actions follow one another.
  • a method for controlling an automation process.
  • the method comprises a step of providing a Skill Logic Controller (SLC) comprising a Central Processor Unit (CPU) module including a processor and an accessible memory storing an automation main component as a programmed skill.
  • the programmed skill is a self-contained algorithm that takes inputs from an environment such as smart sensors and gives commands in sequence to smart machines to accomplish a task, such that the programmed skill is configured to interact with its environment, such as the smart sensors and the smart machines, or with other programmed skills by asking questions and seeking clarifications.
  • the accessible memory stores a SLC program comprising software instructions that when executed by the processor are configured to provide combinatorial logic and sequential control for the automation process.
  • the combinatorial logic performs logical operations on parameters that have binary states and operates devices with binary states.
  • the sequential control determines an order in which actions follow one another.
  • FIG. 1 illustrates a block diagram of a Skill Logic Controller (SLC) in accordance with an exemplary embodiment of the present invention.
  • FIG. 2 illustrates a block diagram of a program cycle of the SLC in accordance with an exemplary embodiment of the present invention.
  • FIG. 3 illustrates a block diagram of a skill in accordance with an exemplary embodiment of the present invention.
  • FIG. 4 illustrates a block diagram of a skill-to-skill interaction in accordance with an exemplary embodiment of the present invention.
  • FIG. 5 illustrates a block diagram of an automation system in accordance with an exemplary embodiment of the present invention.
  • FIG. 6 illustrates an architecture and interactions among a Skill Logic Controller, Skill Engineering Environment, a smart sensor and a smart machine in accordance with an exemplary embodiment of the present invention.
  • FIG. 7 illustrates an example in which a SLC works with skills of a camera/inspection station, a robot station and a conveyor system in accordance with an exemplary embodiment of the present invention.
  • FIG. 8 illustrates a system architecture for SLC in accordance with an exemplary embodiment of the present invention.
  • FIGs. 9-11 illustrate schematic views of wiring skills on-the-fly in accordance with an exemplary embodiment of the present invention.
  • FIG. 12 illustrates a schematic view of a flow chart of a method for providing a Skill Logic Controller (SLC) to control an automation process in accordance with an exemplary embodiment of the present invention.
  • FIG. 13 shows an example of a computing environment within which embodiments of the disclosure may be implemented.
  • FIG. 1 represents a block diagram of a Skill Logic Controller (SLC) 105 for controlling an automation process 106 in accordance with an exemplary embodiment of the present invention.
  • the SLC 105 comprises a Central Processor Unit (CPU) module 107 including a processor 110.
  • the SLC 105 further comprises an accessible memory 112 for storing an automation main component (AMC) 115 as one or more programmed skills 115(1-m).
  • AMC automation main component
  • the programmed skill 115(1) is a self-contained algorithm that takes inputs 117 from an environment such as smart sensors 118(1) and gives commands in sequence to smart machines 118(2) to accomplish a task, such that the programmed skill 115(1) is configured to interact with its environment, such as the smart sensors 118(1) and the smart machines 118(2), or with other programmed skills by asking questions and seeking clarifications.
  • the accessible memory 112 further stores a SLC program 120 comprising software instructions 122 that when executed by the processor 110 are configured to provide combinatorial logic 125(1) and sequential control 125(2) for the automation process 106.
  • the combinatorial logic 125(1) performs logical operations on parameters that have binary states and operates devices with binary states.
  • the sequential control 125(2) determines an order in which actions follow one another.
  • the SLC 105 further comprises a plurality of input modules 127(1-n) to receive input signals 130(1-2) at the SLC 105 such that the input signals 130(1-2) are conditioned and converted to digital input data values 132(1) compatible with the Central Processor Unit (CPU) module 107.
  • the SLC 105 further comprises a first buffer 135(1) to store an input image 137(1) formed by the digital input data values 132(1).
  • the SLC program 120 determines a logical operation performed on a state of the digital input data values 132(1).
  • the SLC 105 further comprises an input skill interface 140(1) to provide an interface between an engineering system 142(1), a skill library 142(2), a skill composer 142(3) and the first buffer 135(1).
  • the engineering system 142(1) or an engineering environment is configured to populate the skill library 142(2) of skills by querying available programmed and learned skills from the smart sensors 118(1), the smart machines 118(2) and other SLCs.
  • the skill composer 142(3) provides a graphical editor for an automation engineer to program the SLC 105 with skills.
  • the SLC 105 further comprises learned skills 145(1-m) learned using artificial intelligence, which enables the SLC 105 to learn during runtime a new and more efficient way to perform a task and change the wiring of skills on-the-fly.
  • the SLC 105 further comprises a second buffer 135(2) to store an output image 137(2) formed by digital output data values 132(2).
  • the digital input data values 132(1) and the SLC program 120 determine how the processor 110 is to set the digital output data values 132(2).
  • the SLC 105 further comprises an output skill interface 140(2) to provide an interface between the engineering system 142(1), the skill library 142(2), the skill composer 142(3) and the second buffer 135(2).
  • the SLC 105 further comprises a plurality of output modules 150(1-n) to convert the output image 137(2) into electrical control signals 152(1-2) that operate the smart sensors 118(1) and the smart machines 118(2) under control.
  • the smart sensors 118(1) and the smart machines 118(2) host one or more programmed and learned skills that specify how to operate themselves.
  • FIG. 2 illustrates a block diagram of a program cycle 205 of the SLC 105 in accordance with an exemplary embodiment of the present invention.
  • the SLC 105 might control a machine or a process in real-time by continuously looping a five-step sequence shown by an arrow 207.
  • FIG. 2 describes this process as a "read-execute-write" cycle.
  • skills are input via a skill interface in step 210.
  • inputs are provided via input modules in step 212.
  • output modules provide outputs in step 215.
  • Skills are written in step 217 and provided further by a skill interface in step 220.
  • the "read-execute-write" cycle of the program cycle 205 includes a step 222 of reading input image. Next step is a step 225 of reading skill interface. At step 227, a control or user program is executed. Then, at step 230, skills are executed. Output image is written next is step 232.
  • the control or user program is executed in its entirety from start to finish using the data obtained from the input image and the skill interface.
  • the results obtained by the executed program are saved as an 'output image', having a one-to-one mapping to the external outputs.
  • the output image is written to the SLC’s physical output interfaces including the skill interface.
  • the whole process is then repeated in a continuous cycle known as the "scan cycle”.
  • the cyclic program conforms to the definition of "classical sequential programming" where "actions are strictly ordered as a time sequence".
  • the SLC 105 processes the control or user program in a strict unaltered sequence and all actions are carried out: the SLC 105 reads the inputs and applies the entire fixed length program, then sets the outputs accordingly.
  • FIG. 3 illustrates a block diagram of a skill 305 in accordance with an exemplary embodiment of the present invention.
  • An automation main component in the SLC 105 is the skill 305.
  • the skill 305 is a self-contained algorithm that takes inputs from the environment (e.g. sensors) and gives commands in sequence to smart machines to accomplish a task.
  • the skill 305 can also interact with its environment (sensors, machines) or other skills by asking questions and seeking clarifications.
  • FIG. 4 illustrates a block diagram of a skill-to-skill interaction in accordance with an exemplary embodiment of the present invention.
  • One skill 405(1) can also be linked to another skill 405(2).
  • the outputs of one skill are the inputs of another skill.
  • FIG. 5 illustrates a block diagram of an automation system 505 in accordance with an exemplary embodiment of the present invention.
  • the automation system 505 includes a programming environment 507, a SLC 510, smart machines 512 and smart sensors 515.
  • the programming environment 507 includes a skill composer 517, a skill library 520 and a skill deployment 522.
  • the SLC 510 includes a plurality of programmed skills 525(1-n) and a plurality of learned skills 527(1-n).
  • the SLC 510 hosts multiple programmed skills 525(1-n), which are programmed with the skill composer 517.
  • the SLC 510 interacts with the smart sensors 515 and the smart machines 512 to accomplish a task specified by the skills 525(1-n), 527(1-n). Not all the skills 525(1-n), 527(1-n) are available during programming. Some skills can be learned as the learned skills 527(1-n) using artificial intelligence. Regardless of whether it is a learned or a programmed skill, an interface between a skill 525(1), or 527(1), and the smart machines 512 and the smart sensors 515 remains the same.
  • Skills 525(1-n), 527(1-n) are microservices that can compose themselves at engineering time and at runtime as long as their inputs and outputs have compatible types. This enables the ability of the SLC 510 to "learn" during runtime a new and more efficient way to perform a task and change the wiring of the skills 525(1-n), 527(1-n) on-the-fly.
  • FIG. 6 illustrates an architecture and interactions among a SLC 605, a Skill Engineering Environment 607, a smart sensor 610 and a smart machine 612 in accordance with an exemplary embodiment of the present invention.
  • the Skill Engineering Environment 607 includes a skill composer 617, a skill library 620 and a skill deployment component 622.
  • the skill composer 617 links a sensor skill (S-Skill) 625 with a skill 627 and a learned skill 630 from another SLC and triggers a machine skill (M-Skill) 632 to complete an operation.
  • S-Skill sensor skill
  • M-Skill machine skill
  • the skill library 620 includes programmed skills 627 and 635(1), learned skills 637(1-3) and 630, the sensor skill (S-Skill) 625 and the machine skill (M-Skill) 632.
  • the skill deployment component 622 is configured to deploy skill programs to the SLC 605 once engineering is completed.
  • the Skill Engineering Environment 607 is the interface with automation engineers.
  • the environment will populate a library of skills by querying the available skills from sensors, machines and other SLCs.
  • the skill composer 617 provides a graphical editor for the automation engineer to program the SLC 605 with skills. In the above example, it links a sensor skill (S-Skill) with a skill and a learned skill from another SLC, and eventually triggers a machine skill (M-Skill) to complete the operation.
  • the SLC 605 further comprises an onboard native skill library 640 including native skills 642(1-2) present since manufacturing and learned skills 645(1-2) acquired over a period of SLC operation.
  • the SLC 605 further comprises a runtime environment 650 that holds skill programs engineered and deployed from the Skill Engineering Environment 607 such that during a runtime execution the SLC 605 interacts with the smart sensor 610 and the smart machine 612 to complete an operational task.
  • the smart sensor 610 includes learned skills 652(1), 637(1), a sensor skill (S-Skill) 655(1) and the sensor skill (S-Skill) 625.
  • the smart machine 612 includes learned skills 657(1), 637(2), a machine skill (M-Skill) 660(1) and the machine skill (M-Skill) 632.
  • the smart sensor 610 and the smart machine 612 host the skills that specify how to operate themselves. For instance, a smart camera will have a skill to take a photo with high resolution. A smart milling machine can expose a skill for a milling operation with input parameters.
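  • As a purely illustrative, hypothetical sketch (not part of the patent disclosure), the skill discovery described above, in which the engineering environment populates a skill library by querying devices for the skills they host, might be expressed as follows; the class and method names are assumptions made here.

```python
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class SkillDescriptor:
    """Metadata a device reports for one hosted skill."""
    name: str                 # e.g. "take_photo"
    inputs: Dict[str, str]    # parameter name -> type, e.g. {"fps": "int"}
    outputs: Dict[str, str]   # result name -> type, e.g. {"image": "Image"}
    origin: str               # device that hosts the skill


class Device:
    """A smart sensor, smart machine or another SLC that can be queried."""

    def __init__(self, name: str, skills: List[SkillDescriptor]):
        self.name = name
        self._skills = skills

    def query_skills(self) -> List[SkillDescriptor]:
        # In a real system this query would go over the network;
        # here the device simply returns its local catalogue.
        return list(self._skills)


@dataclass
class SkillLibrary:
    """Library populated by the Skill Engineering Environment."""
    entries: Dict[str, SkillDescriptor] = field(default_factory=dict)

    def populate(self, devices: List[Device]) -> None:
        for device in devices:
            for skill in device.query_skills():
                self.entries[f"{device.name}.{skill.name}"] = skill


# Example: a smart camera exposes a sensor skill, a milling machine a machine skill.
camera = Device("camera", [SkillDescriptor("take_photo", {"fps": "int"}, {"image": "Image"}, "camera")])
mill = Device("mill", [SkillDescriptor("mill_part", {"program": "str"}, {"done": "bool"}, "mill")])
library = SkillLibrary()
library.populate([camera, mill])
print(sorted(library.entries))  # ['camera.take_photo', 'mill.mill_part']
```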
  • FIG. 7 illustrates an example in which a SLC 705 works with skills 707(1-5) of a camera/inspection station 710, a robot station 712 and a conveyor system 715 in accordance with an exemplary embodiment of the present invention.
  • FIG. 7 shows a concrete example using the SLC 705 and smart sensors/machines.
  • the camera/inspection station 710 can offer two skills: (i) a skill 707(1) for taking a photo at a specific frame rate, e.g. 30, 60 or 120 frames per second (fps); (ii) a skill 707(2) to detect a defective product.
  • the robot station 712 can also offer two skills: (i) a skill 707(3) to pick the object from a tray on the conveyor; (ii) a skill 707(4) to drop the object at the table of the robot station 712.
  • the conveyor system 715 can offer a move skill 707(5), which moves the tray on the conveyor from one station (e.g. inspection station) to another station (e.g. robot station).
  • a skill program 725 can be composed and deployed to the SLC 705. As shown in the diagram in the SLC 705, an engineer can simply wire among different skills 707(1-5) to compose the skill program 725 that: (1) takes a photo, (2) detects a defective product, (3) moves a tray to another station, (4) picks the tray, and (5) places the tray in a buffer.
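  • The FIG. 7 composition above can be pictured, as an assumed illustration only, as a sequential wiring of the five skills in which each skill reads and extends a shared context; the function names below are hypothetical and do not come from the patent.

```python
from typing import Any, Callable, Dict, List

Skill = Callable[[Dict[str, Any]], Dict[str, Any]]

# Hypothetical stand-ins for the skills offered by the stations in FIG. 7.
def take_photo(ctx):      return {**ctx, "image": f"frame@{ctx.get('fps', 30)}fps"}
def detect_defect(ctx):   return {**ctx, "defective": "scratch" in ctx["image"]}
def move_tray(ctx):       return {**ctx, "tray_at": "robot_station"}
def pick_object(ctx):     return {**ctx, "held": True}
def place_in_buffer(ctx): return {**ctx, "held": False, "placed": "buffer"}

def run_skill_program(skills: List[Skill], ctx: Dict[str, Any]) -> Dict[str, Any]:
    """Execute skills in sequence; each skill's outputs become the next skill's inputs."""
    for skill in skills:
        ctx = skill(ctx)
    return ctx

program = [take_photo, detect_defect, move_tray, pick_object, place_in_buffer]
print(run_skill_program(program, {"fps": 60}))
```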
  • FIG. 8 illustrates a system architecture 805 for a SLC in accordance with an exemplary embodiment of the present invention.
  • the system architecture 805 comprises a bottom layer 807 for communication, a top layer 810 for interaction and a middle layer 822 including a scheduler 812, tasks 815, a process image 817 and services 820.
  • FIGs. 9-11 illustrate schematic views of wiring skills on-the-fly in accordance with an exemplary embodiment of the present invention.
  • a high level skill such as "move object” is mapped to lower level skills such as “conveyor move” and "pick_and_place”. Skill matching is performed via type checking.
  • the "conveyor move” and “pick_and_place” map to the "move object” skill because of the input/output type matching.
  • the "pick_and_place” skill can be decomposed into compositions of lower level skills provided by the machines to the engineering system.
  • Machine 1 may realize the "pick_and_place" skill via <grasp, lift, move, lift, lift> composition of skills.
  • Machine 2 may realize the "pick_and_place" via a <pick, lift, move, lift, lift> composition of skills that applies a concentric force on the object with the arm's gripper. Notice that Machine 2 requires a robotic "arm" to perform the action whereas Machine 1 does not require any additional equipment.
  • Machine 3 requires two robotic "arms" to "push" the object from opposite sides and is realized via a <(push
  • the machine-specific skills require a "box” as an input and the higher-level skill “pick_and_place” specifies an "object” as an input.
  • the system supports the specialization of inputs and outputs, meaning that "box" is a subtype of "object".
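  • The subtype-based matching can be sketched as follows, under the assumption of a simple inheritance hierarchy for the input/output types; this is an illustration only, not the patent's actual type system.

```python
class ObjectType:
    """Root of the illustrative input/output type hierarchy."""


class Box(ObjectType):
    """'box' is a subtype of 'object', so a skill that declares a Box input
    still matches a higher-level skill slot that specifies an object."""


def can_wire(provided_type: type, required_input_type: type) -> bool:
    # A lower-level skill matches a higher-level skill slot when its type is
    # the required type or a specialization (subtype) of it.
    return issubclass(provided_type, required_input_type)


# High-level "move object" requires an object; the machine-specific
# "pick_and_place" skills declare a "box", which still matches.
print(can_wire(Box, ObjectType))   # True: a box is accepted where an object is expected
print(can_wire(ObjectType, Box))   # False: an arbitrary object is not necessarily a box
```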
  • the on-the-fly wiring of skills comes into play when a task must be performed, but the environmental conditions change. For example, if during normal operation Machine 2 goes offline, then the rewiring of the skill program consists of: (1) finding an execution alternative, and (2) switching the execution to the alternative. In this case, Machine 3 or Machine 1 can take over that task assuming all the inputs and conditions are satisfied and the hardware is available. The system can re-wire the skills on-the-fly while a machine is in malfunction/maintenance, or the plant owner wants to balance the load of machines, or simply to save energy.
  • this is where the learned skills come into play.
  • the system can learn to use a robotic arm in combination with a conveyor and compose <pick_and_place, start conveyor, stop conveyor, pick_and_place>.
  • Another alternative is to use two robotic arms to throw and catch the object, thus composing a <throw, catch> program during operation that is further decomposed into a <pick_for_throw, catch_from_throw> skill program.
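  • A hedged sketch of the rewiring decision described above: when the machine currently executing a skill goes offline, the runtime looks for an alternative realization whose machine is online and whose required hardware is available, then switches to it. The data model and values below are assumptions made for illustration.

```python
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class Realization:
    machine: str
    steps: List[str]               # machine-specific skill composition (illustrative)
    required_hardware: List[str]


def rewire(current: Realization,
           alternatives: List[Realization],
           online_machines: set,
           available_hardware: set) -> Optional[Realization]:
    """Pick a replacement realization when the current machine is unavailable."""
    if current.machine in online_machines:
        return current                           # nothing to do
    for alt in alternatives:
        if alt.machine in online_machines and \
           all(hw in available_hardware for hw in alt.required_hardware):
            return alt                           # switch execution to the alternative
    return None                                  # no feasible rewiring found


m1 = Realization("Machine1", ["grasp", "lift", "move", "lift", "lift"], [])
m2 = Realization("Machine2", ["pick", "lift", "move", "lift", "lift"], ["arm"])
m3 = Realization("Machine3", ["push", "lift", "move", "lift", "lift"], ["arm", "arm"])

# Machine 2 goes offline; Machine 1 is tried first, needs no extra hardware, and takes over.
print(rewire(m2, [m1, m3],
             online_machines={"Machine1", "Machine3"},
             available_hardware={"arm"}))
```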
  • the whole skill software may be downloaded from the devices into the engineering system (and, e.g., the complete code may later be transferred to the SLC 105). Downloading the entire skill software from the device to the engineering system has the advantage of allowing changes to the skill. This can be used to adapt skills, or to change their specification. Simply downloading the skill-access-information may be inflexible. In the present invention, having the ability to change skill code gives flexibility to fix bugs, improve performance, and make changes.
  • a skill is a software component that does not belong to a fixed asset or equipment (e.g. a machine or a controller). With this concept, the skill can be programmed, improved, patched, and upgraded or downgraded anywhere, at any time.
  • a smart machine or a smart sensor brings in its initial skills from the OEM. But later on, the skills can be improved during engineering, or during operation via learning. The improved skills can be redeployed back to the engineering environment or the smart machine if needed. If a new security vulnerability is discovered later, the security patch can be applied even during runtime.
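  • Purely as an illustration of the lifecycle described above (initial OEM skill, improvement, redeployment, runtime patching), a skill could be modeled as a versioned software package; the fields and helper functions are assumptions made here, not the patent's implementation.

```python
from dataclasses import dataclass, replace


@dataclass(frozen=True)
class SkillPackage:
    name: str
    version: str
    code: str          # the skill's algorithm, shipped as data, not tied to a device


def patch(pkg: SkillPackage, new_code: str, new_version: str) -> SkillPackage:
    """Produce an improved or security-patched revision of a skill."""
    return replace(pkg, code=new_code, version=new_version)


def deploy(pkg: SkillPackage, target: str) -> str:
    """Deploy the package to an SLC, smart machine or engineering environment."""
    return f"deployed {pkg.name} {pkg.version} to {target}"


oem = SkillPackage("pick_and_place", "1.0", "<OEM algorithm>")
patched = patch(oem, "<algorithm with security fix>", "1.1")
print(deploy(patched, "SLC-105"))   # redeployment can happen even during runtime
```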
  • FIG. 12 illustrates a schematic view of a flow chart of a method 1200 for providing the SLC 105 to control an automation process in accordance with an exemplary embodiment of the present invention.
  • the method 1200 performed by an automation system comprises a step 1205 of providing the SLC 105 comprising a Central Processor Unit (CPU) module including a processor and an accessible memory.
  • the accessible memory storing an automation main component as a programmed skill.
  • CPU Central Processor Unit
  • the programmed skill is a self-contained algorithm that takes inputs from an environment such as smart sensors and gives commands in sequence to machines to accomplish a task, such that the programmed skill is configured to interact with its environment, such as the smart sensors and machines, or with other programmed skills by asking questions and seeking clarifications.
  • the accessible memory further storing a SLC program comprising software instructions that when executed by the processor are configured to provide combinatorial logic and sequential control for the automation process.
  • the combinatorial logic performs logical operations on parameters that have binary states and operates devices with binary states.
  • the sequential control determines an order in which actions follow one another.
  • while a Skill Logic Controller (SLC) based on "skills" is described here, a range of one or more other Industrial Control Systems, or other forms of edge devices, are also contemplated by the present invention.
  • other types of Industrial Control Systems may be implemented based on one or more features presented above without deviating from the spirit of the present invention.
  • FIG. 13 shows an example of a computing environment within which embodiments of the disclosure may be implemented.
  • this computing environment 1300 may be configured to execute the automation system discussed above with reference to FIG. 1 or to execute portions of the method 1200 described above with respect to FIG. 12.
  • Computers and computing environments, such as computer system 1310 and computing environment 1300, are known to those of skill in the art and thus are described briefly here.
  • the computer system 1310 may include a communication mechanism such as a bus 1321 or other communication mechanism for communicating information within the computer system 1310.
  • the computer system 1310 further includes one or more processors 1320 coupled with the bus 1321 for processing the information.
  • the processors 1320 may include one or more central processing units (CPUs), graphical processing units (GPUs), or any other processor known in the art.
  • the computer system 1310 also includes a system memory 1330 coupled to the bus 1321 for storing information and instructions to be executed by processors 1320.
  • the system memory 1330 may include computer readable storage media in the form of volatile and/or nonvolatile memory, such as read only memory (ROM) 1331 and/or random access memory (RAM) 1332.
  • the system memory RAM 1332 may include other dynamic storage device(s) (e.g., dynamic RAM, static RAM, and synchronous DRAM).
  • the system memory ROM 1331 may include other static storage device(s) (e.g., programmable ROM, erasable PROM, and electrically erasable PROM).
  • system memory 1330 may be used for storing temporary variables or other intermediate information during the execution of instructions by the processors 1320.
  • a basic input/output system (BIOS) 1333 containing the basic routines that help to transfer information between elements within computer system 1310, such as during start-up, may be stored in ROM 1331.
  • RAM 1332 may contain data and/or program modules that are immediately accessible to and/or presently being operated on by the processors 1320.
  • System memory 1330 may additionally include, for example, operating system 1334, application programs 1335, other program modules 1336 and program data 1337.
  • the computer system 1310 also includes a disk controller 1340 coupled to the bus 1321 to control one or more storage devices for storing information and instructions, such as a hard disk 1341 and a removable media drive 1342 (e.g., floppy disk drive, compact disc drive, tape drive, and/or solid state drive).
  • the storage devices may be added to the computer system 1310 using an appropriate device interface (e.g., a small computer system interface (SCSI), integrated device electronics (IDE), Universal Serial Bus (USB), or FireWire).
  • SCSI small computer system interface
  • IDE integrated device electronics
  • USB Universal Serial Bus
  • the computer system 1310 may also include a display controller 1365 coupled to the bus 1321 to control a display 1366, such as a cathode ray tube (CRT) or liquid crystal display (LCD), for displaying information to a computer user.
  • the computer system includes an input interface 1360 and one or more input devices, such as a keyboard 1362 and a pointing device 1361, for interacting with a computer user and providing information to the processor 1320.
  • the pointing device 1361 for example, may be a mouse, a trackball, or a pointing stick for communicating direction information and command selections to the processor 1320 and for controlling cursor movement on the display 1366.
  • the display 1366 may provide a touch screen interface which allows input to supplement or replace the communication of direction information and command selections by the pointing device 1361.
  • the computer system 1310 may perform a portion or all of the processing steps of embodiments of the invention in response to the processors 1320 executing one or more sequences of one or more instructions contained in a memory, such as the system memory 1330.
  • a memory such as the system memory 1330.
  • Such instructions may be read into the system memory 1330 from another computer readable medium, such as a hard disk 1341 or a removable media drive 1342.
  • the hard disk 1341 may contain one or more datastores and data files used by embodiments of the present invention. Datastore contents and data files may be encrypted to improve security.
  • the processors 1320 may also be employed in a multi-processing arrangement to execute the one or more sequences of instructions contained in system memory 1330.
  • hard-wired circuitry may be used in place of or in combination with software instructions. Thus, embodiments are not limited to any specific combination of hardware circuitry and software.
  • the computer system 1310 may include at least one computer readable medium or memory for holding instructions programmed according to embodiments of the invention and for containing data structures, tables, records, or other data described herein.
  • the term “computer readable medium” as used herein refers to any medium that participates in providing instructions to the processor 1320 for execution.
  • a computer readable medium may take many forms including, but not limited to, non-volatile media, volatile media, and transmission media.
  • Non-limiting examples of non-volatile media include optical disks, solid state drives, magnetic disks, and magneto optical disks, such as hard disk 1341 or removable media drive 1342.
  • Non-limiting examples of volatile media include dynamic memory, such as system memory 1330.
  • Non-limiting examples of transmission media include coaxial cables, copper wire, and fiber optics, including the wires that make up the bus 1321.
  • Transmission media may also take the form of acoustic or light waves, such as those generated during radio wave and infrared data communications.
  • the computing environment 1300 may further include the computer system 1310 operating in a networked environment using logical connections to one or more remote computers, such as remote computer 1380.
  • Remote computer 1380 may be a personal computer (laptop or desktop), a mobile device, a server, a router, a network PC, a peer device or other common network node, and typically includes many or all of the elements described above relative to computer system 1310.
  • computer system 1310 may include modem 1372 for establishing communications over a network 1371, such as the Internet. Modem 1372 may be connected to bus 1321 via user network interface 1370, or via another appropriate mechanism.
  • Network 1371 may be any network or system generally known in the art, including the Internet, an intranet, a local area network (LAN), a wide area network (WAN), a metropolitan area network (MAN), a direct connection or series of connections, a cellular telephone network, or any other network or medium capable of facilitating communication between computer system 1310 and other computers (e.g., remote computer 1380).
  • the network 1371 may be wired, wireless or a combination thereof. Wired connections may be implemented using Ethernet, Universal Serial Bus (USB), RJ-11 or any other wired connection generally known in the art.
  • Wireless connections may be implemented using Wi-Fi, WiMAX, and Bluetooth, infrared, cellular networks, satellite or any other wireless connection methodology generally known in the art.
  • the computer system 1310 may be utilized in conjunction with a parallel processing platform comprising a plurality of processing units. This platform may allow parallel execution of one or more of the tasks associated with optimal design generation, as described above. For example, in some embodiments, execution of multiple product lifecycle simulations may be performed in parallel, thereby allowing reduced overall processing times for optimal design selection.
  • the embodiments of the present disclosure may be implemented with any combination of hardware and software.
  • the embodiments of the present disclosure may be included in an article of manufacture (e.g., one or more computer program products) having, for example, computer-readable, non-transitory media.
  • the media has embodied therein, for instance, computer readable program code for providing and facilitating the mechanisms of the embodiments of the present disclosure.
  • the article of manufacture can be included as part of a computer system or sold separately.
  • An executable application comprises code or machine readable instructions for conditioning the processor to implement predetermined functions, such as those of an operating system, a context data acquisition system or other information processing system, for example, in response to user command or input.
  • An executable procedure is a segment of code or machine readable instruction, sub-routine, or other distinct section of code or portion of an executable application for performing one or more particular processes. These processes may include receiving input data and/or parameters, performing operations on received input data and/or performing functions in response to received input parameters, and providing resulting output data and/or parameters.
  • a graphical user interface comprises one or more display images, generated by a display processor and enabling user interaction with a processor or other device and associated data acquisition and processing functions.
  • the GUI also includes an executable procedure or executable application.
  • the executable procedure or executable application conditions the display processor to generate signals representing the GUI display images. These signals are supplied to a display device which displays the image for viewing by the user.
  • the processor under control of an executable procedure or executable application, manipulates the GUI display images in response to signals received from the input devices. In this way, the user may interact with the display image using the input devices, enabling user interaction with the processor or other device.
  • the functions and process steps herein may be performed automatically or wholly or partially in response to user command.
  • An activity (including a step) performed automatically is performed in response to one or more executable instructions or device operation without user direct initiation of the activity.
  • Computer readable medium instructions for carrying out operations of the present disclosure may be assembler instructions, instruction-set-architecture (ISA) instructions, machine instructions, machine dependent instructions, microcode, firmware instructions, state-setting data, or either source code or object code written in any combination of one or more programming languages, including an object oriented programming language such as Smalltalk, C++ or the like, and conventional procedural programming languages, such as the "C" programming language or similar programming languages.
  • the computer readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server.
  • the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).
  • electronic circuitry including, for example, programmable logic circuitry, field-programmable gate arrays (FPGA), or programmable logic arrays (PLA) may execute the computer readable program instructions by utilizing state information of the computer readable program instructions to personalize the electronic circuitry, in order to perform aspects of the present disclosure.
  • program modules, applications, computer-executable instructions, code, or the like depicted in FIG. 13 as being stored in the system memory are merely illustrative and not exhaustive and that processing described as being supported by any particular module may alternatively be distributed across multiple modules or performed by a different module.
  • various program module(s), script(s), plug-in(s), Application Programming Interface(s) (API(s)), or any other suitable computer-executable code hosted locally on the computer system 1310, the remote device, and/or hosted on other computing device(s) accessible via one or more of the network(s) may be provided to support functionality provided by the program modules, applications, or computer-executable code depicted in FIG. 13 and/or additional or alternate functionality.
  • functionality may be modularized differently such that processing described as being supported collectively by the collection of program modules depicted in FIG. 13 may be performed by a fewer or greater number of modules, or functionality described as being supported by any particular module may be supported, at least in part, by another module.
  • program modules that support the functionality described herein may form part of one or more applications executable across any number of systems or devices in accordance with any suitable computing model such as, for example, a client-server model, a peer-to-peer model, and so forth.
  • any of the functionality described as being supported by any of the program modules depicted in FIG. 13 may be implemented, at least partially, in hardware and/or firmware across any number of devices.
  • the computer system 1310 may include alternate and/or additional hardware, software, or firmware components beyond those described or depicted without departing from the scope of the disclosure. More particularly, it should be appreciated that software, firmware, or hardware components depicted as forming part of the computer system 1310 are merely illustrative and that some components may not be present or additional components may be provided in various embodiments. While various illustrative program modules have been depicted and described as software modules stored in system memory, it should be appreciated that functionality described as being supported by the program modules may be enabled by any combination of hardware, software, and/or firmware. It should further be appreciated that each of the above-mentioned modules may, in various embodiments, represent a logical partitioning of supported functionality.
  • This logical partitioning is depicted for ease of explanation of the functionality and may not be representative of the structure of software, hardware, and/or firmware for implementing the functionality. Accordingly, it should be appreciated that functionality described as being provided by a particular module may, in various embodiments, be provided at least in part by one or more other modules. Further, one or more depicted modules may not be present in certain embodiments, while in other embodiments, additional modules not depicted may be present and may support at least a portion of the described functionality and/or additional functionality. Moreover, while certain modules may be depicted and described as sub-modules of another module, in certain embodiments, such modules may be provided as independent modules or as sub-modules of other modules.
  • any operation, element, component, data, or the like described herein as being based on another operation, element, component, data, or the like can be additionally based on one or more other operations, elements, components, data, or the like. Accordingly, the phrase “based on,” or variants thereof, should be interpreted as “based at least in part on.”
  • each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s).
  • the functions noted in the block may occur out of the order noted in the Figures.
  • two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved.
  • each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration can be implemented by special purpose hardware-based systems that perform the specified functions or acts or carry out combinations of special purpose hardware and computer instructions.
  • any examples or illustrations given herein are not to be regarded in any way as restrictions on, limits to, or express definitions of, any term or terms with which they are utilized. Instead, these examples or illustrations are to be regarded as being described with respect to one particular embodiment and as illustrative only. Those of ordinary skill in the art will appreciate that any term or terms with which these examples or illustrations are utilized will encompass other embodiments which may or may not be given therewith or elsewhere in the specification and all such embodiments are intended to be included within the scope of that term or terms.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Programmable Controllers (AREA)

Abstract

A Skill Logic Controller (SLC) is provided for controlling an automation process of an industrial control system involving edge devices. The SLC comprises a Central Processor Unit (CPU) module including a processor and an accessible memory storing an automation main component as a programmed skill. The programmed skill is a self-contained algorithm that takes inputs from an environment such as smart sensors and gives commands in sequence to smart machines to accomplish a task, such that the programmed skill is configured to interact with its environment, such as the smart sensors and the smart machines, or with other programmed skills by asking questions and seeking clarifications. The accessible memory further stores a SLC program comprising software instructions that when executed by the processor are configured to provide combinatorial logic and sequential control for the automation process.

Description

SKILL LOGIC CONTROLLER (SLC)
BACKGROUND
1. Field
[0001] Aspects of the present invention generally relate to a Skill Logic Controller (SLC) having an automation main component as a skill for controlling an automation process by integrating the SLC with other automation devices.
2. Description of the Related Art
[0002] Today’s automation is centered around Programmable Logic Controllers (PLCs). Wikipedia defines a Programmable Logic Controller (PLC) as “an industrial digital computer which has been ruggedized and adapted for the control of manufacturing processes, such as assembly lines, or robotic devices, or any activity that requires high reliability control and ease of programming and process fault diagnosis.” The PLC hardware is a ruggedized and dedicated box, supplied by various controller vendors such as Siemens and Rockwell. The programming languages of a PLC, such as Ladder Logic and Structured Text, are based on the IEC 61131-3 standard; this programming paradigm has not fundamentally changed since the PLC was invented in the 1950s and 1960s during the second industrial revolution, more than half a century ago. The world has since entered the fourth industrial revolution. Technological advances in digitalization, robotics, autonomy and artificial intelligence have shown their impact on automation and manufacturing processes. However, since traditional automation is still dependent on the PLC, an old automation platform that has not improved much since its inception, there is a bottleneck that is preventing the full exploitation of novel manufacturing technologies. For instance, it is not uncommon to have intelligent robots and AI-based cameras on the factory floor. However, the integration of these smart sensors and tools with a PLC is a daunting task. The programming paradigm, the communication protocols, and the runtime execution models of a PLC and today’s intelligent devices are vastly different. A lot of programming and integration effort is necessary to make the outdated PLC understand the new environment and work effectively.
[0003] As long as today’s automation relies on the traditional PLC, the integration of advanced machines and sensors with the PLC will be slow, difficult, and will not exploit the full potential that new technologies have to offer. Typical difficulties in integrating PLC with modern automation equipment include:
[0004] - Tedious programming of PLC at Function-level (e.g. Function Block (FB), Organization Block (OB)) while machines/sensors understand “goals”
[0005] - Translate sensor inputs and command outputs as “Tags” (the process image/data in PLC)
[0006] - “Micromanaging” smart machines and sensors as dumb devices
[0007] - Functionality needs to be manually abstracted by the automation engineer. These abstracted FBs provide a reusable function, but they are typically reusable only for that environment and for that factory.
[0008] Therefore, the traditional PLC, being the brain of the factory, needs an upgrade.
SUMMARY
[0009] Briefly described, aspects of the present invention relate to a revolutionary Skill Logic Controller (SLC) to replace today’s Programmable Logic Controller (PLC). A Skill Logic Controller (SLC) has an automation main component as a skill for controlling an automation process by integrating the SLC with other automation devices. Integration of smart sensors and tools such as intelligent robots and AI-based cameras on the factory floor with a SLC is not a daunting task. The automation main component in a SLC is a Skill. A skill is a self-contained algorithm that takes inputs from the environment (e.g. sensors) and gives commands in sequence to machines to accomplish a task. The skill can also interact with its environment (sensors, machines) or other skills by asking questions and seeking clarifications. Advantages include more flexibility to create reusable automation programs compared to the traditional PLC, better interoperability with state-of-the-art automation equipment and a new skills editor to compose skill programs faster. Technical features include a combination of programmed and learned skills, discovery of skills from automation devices, on-the-fly composition of skill programs using microservices and a skill interface as opposed to a Function Block (FB) interface. A new engineering tool is provided which makes it easy to integrate software and hardware.
[0010] In accordance with one illustrative embodiment of the present invention, a Skill Logic Controller (SLC) is provided for controlling an automation process. The SLC comprises a Central Processor Unit (CPU) module including a processor and an accessible memory storing an automation main component as a programmed skill. The programmed skill is a self-contained algorithm that takes inputs from an environment such as smart sensors and gives commands in sequence to smart machines to accomplish a task, such that the programmed skill is configured to interact with its environment, such as the smart sensors and the smart machines, or with other programmed skills by asking questions and seeking clarifications. The accessible memory stores a SLC program comprising software instructions that when executed by the processor are configured to provide combinatorial logic and sequential control for the automation process. The combinatorial logic performs logical operations on parameters that have binary states and operates devices with binary states. The sequential control determines an order in which actions follow one another.
[0011] In accordance with another illustrative embodiment of the present invention, a method is provided for controlling an automation process. The method comprises a step of providing a Skill Logic Controller (SLC) comprising a Central Processor Unit (CPU) module including a processor and an accessible memory storing an automation main component as a programmed skill. The programmed skill is a self-contained algorithm that takes inputs from an environment such as smart sensors and gives commands in sequence to smart machines to accomplish a task, such that the programmed skill is configured to interact with its environment, such as the smart sensors and the smart machines, or with other programmed skills by asking questions and seeking clarifications. The accessible memory stores a SLC program comprising software instructions that when executed by the processor are configured to provide combinatorial logic and sequential control for the automation process. The combinatorial logic performs logical operations on parameters that have binary states and operates devices with binary states. The sequential control determines an order in which actions follow one another.
BRIEF DESCRIPTION OF THE DRAWINGS
[0012] FIG. 1 illustrates a block diagram of a Skill Logic Controller (SLC) in accordance with an exemplary embodiment of the present invention.
[0013] FIG. 2 illustrates a block diagram of a program cycle of the SLC in accordance with an exemplary embodiment of the present invention.
[0014] FIG. 3 illustrates a block diagram of a skill in accordance with an exemplary embodiment of the present invention.
[0015] FIG. 4 illustrates a block diagram of a skill-to-skill interaction in accordance with an exemplary embodiment of the present invention.
[0016] FIG. 5 illustrates a block diagram of an automation system in accordance with an exemplary embodiment of the present invention.
[0017] FIG. 6 illustrates an architecture and interactions among a Skill Logic Controller, Skill Engineering Environment, a smart sensor and a smart machine in accordance with an exemplary embodiment of the present invention.
[0018] FIG. 7 illustrates an example in which a SLC works with skills of a camera/inspection station, a robot station and a conveyor system in accordance with an exemplary embodiment of the present invention.
[0019] FIG. 8 illustrates a system architecture for SLC in accordance with an exemplary embodiment of the present invention.
[0020] FIGs. 9-11 illustrate schematic views of wiring skills on-the-fly in accordance with an exemplary embodiment of the present invention.
[0021] FIG. 12 illustrates a schematic view of a flow chart of a method for providing a Skill Logic Controller (SLC) to control an automation process in accordance with an exemplary embodiment of the present invention.
[0022] FIG. 13 shows an example of a computing environment within which embodiments of the disclosure may be implemented.
DETAILED DESCRIPTION
[0023] To facilitate an understanding of embodiments, principles, and features of the present invention, they are explained hereinafter with reference to implementation in illustrative embodiments. In particular, they are described in the context of a Skill Logic Controller (SLC) to control an automation process. Embodiments of the present invention, however, are not limited to use in the described devices or methods.
[0024] The components and materials described hereinafter as making up the various embodiments are intended to be illustrative and not restrictive. Many suitable components and materials that would perform the same or a similar function as the materials described herein are intended to be embraced within the scope of embodiments of the present invention.
[0025] These and other embodiments of an automation system according to the present disclosure are described below with reference to FIGs. 1-13 herein. Like reference numerals used in the drawings identify similar or identical elements throughout the several views. The drawings are not necessarily drawn to scale.
[0026] Consistent with one embodiment of the present invention, FIG. 1 represents a block diagram of a Skill Logic Controller (SLC) 105 for controlling an automation process 106 in accordance with an exemplary embodiment of the present invention. The SLC 105 comprises a Central Processor Unit (CPU) module 107 including a processor 110. The SLC 105 further comprises an accessible memory 112 for storing an automation main component (AMC) 115 as one or more programmed skills 115(1-m). The programmed skill 115(1) is a self-contained algorithm that takes inputs 117 from an environment such as smart sensors 118(1) and gives commands in sequence to smart machines 118(2) to accomplish a task, such that the programmed skill 115(1) is configured to interact with its environment, such as the smart sensors 118(1) and the smart machines 118(2), or with other programmed skills by asking questions and seeking clarifications.
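The following is a minimal, hypothetical sketch of what the interface of a programmed skill such as 115(1) could look like in software; the class and method names are assumptions made here for illustration and do not appear in the patent.

```python
from abc import ABC, abstractmethod
from typing import Any, Dict, List


class Skill(ABC):
    """A self-contained algorithm: reads environment inputs, issues commands in
    sequence, and may ask questions of sensors, machines or other skills."""

    @abstractmethod
    def run(self, inputs: Dict[str, Any]) -> List[str]:
        """Return the ordered list of commands needed to accomplish the task."""

    def ask(self, peer: Any, question: str) -> Any:
        """Seek clarification from the environment or another skill."""
        return getattr(peer, "answer", lambda q: None)(question)


class PickFromTray(Skill):
    """Illustrative skill that commands a smart machine in sequence."""

    def run(self, inputs: Dict[str, Any]) -> List[str]:
        x, y = inputs["tray_position"]
        return [f"move_arm({x}, {y})", "close_gripper()", "lift()"]


print(PickFromTray().run({"tray_position": (120, 45)}))
```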
[0027] The accessible memory 112 further stores a SLC program 120 comprising software instructions 122 that when executed by the processor 110 are configured to provide combinatorial logic 125(1) and sequential control 125(2) for the automation process 106. The combinatorial logic 125(1) performs logical operations on parameters that have binary states and operates devices with binary states. The sequential control 125(2) determines an order in which actions follow one another.
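As a small assumed example (not from the patent), combinatorial logic maps binary-state parameters to binary-state device outputs, while sequential control fixes the order in which actions follow one another:

```python
# Combinatorial logic: operate a binary-state device from binary-state parameters.
def motor_enabled(start_pressed: bool, guard_closed: bool, fault: bool) -> bool:
    return start_pressed and guard_closed and not fault

# Sequential control: the order in which actions follow one another is fixed.
sequence = ["open_valve", "fill_tank", "close_valve", "start_mixer"]

print(motor_enabled(True, True, False))  # True -> energize the motor output
for action in sequence:
    print("execute:", action)
```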
[0028] The SLC 105 further comprises a plurality of input modules 127(1-n) to receive input signals 130(1-2) at the SLC 105 such that the input signals 130(1-2) are conditioned and converted to digital input data values 132(1) compatible with the Central Processor Unit (CPU) module 107. The SLC 105 further comprises a first buffer 135(1) to store an input image 137(1) formed by the digital input data values 132(1). The SLC program 120 determines a logical operation performed on a state of the digital input data values 132(1).
[0029] The SLC 105 further comprises an input skill interface 140(1) to provide an interface between an engineering system 142(1), a skill library 142(2), a skill composer 142(3) and the first buffer 135(1). The engineering system 142(1) or an engineering environment is configured to populate the skill library 142(2) of skills by querying available programmed and learned skills from the smart sensors 118(1), the smart machines 118(2) and other SLCs. The skill composer 142(3) provides a graphical editor for an automation engineer to program the SLC 105 with skills.
[0030] The SLC 105 further comprises learned skills 145(1-m) learned using artificial intelligence, which enables the SLC 105 to learn during runtime a new and more efficient way to perform a task and change the wiring of skills on-the-fly.
[0031] The SLC 105 further comprises a second buffer 135(2) to store an output image 137(2) formed by digital output data values 132(2). The digital input data values 132(1) and the SLC program 120 determine how the processor 110 is to set the digital output data values 132(2). The SLC 105 further comprises an output skill interface 140(2) to provide an interface between the engineering system 142(1), the skill library 142(2), the skill composer 142(3) and the second buffer 135(2). The SLC 105 further comprises a plurality of output modules 150(1-n) to convert the output image 137(2) into electrical control signals 152(1-2) that operate the smart sensors 118(1) and the smart machines 118(2) under control. The smart sensors 118(1) and the smart machines 118(2) host one or more programmed and learned skills that specify how to operate themselves.
[0032] Referring to FIG. 2, it illustrates a block diagram of a program cycle 205 of the SLC 105 in accordance with an exemplary embodiment of the present invention. The SLC 105 might control a machine or a process in real-time by continuously looping a five-step sequence shown by an arrow 207. FIG. 2 describes this process as a "read-execute-write" cycle. Before the "read-execute-write" cycle in the program cycle 205, skills are input via a skill interface in step 210. Likewise, inputs are provided via input modules in step 212. After the "read-execute-write" cycle in the program cycle 205, output modules provide outputs in step 215. Skills are written in step 217 and provided further by a skill interface in step 220.
[0033] The "read-execute-write" cycle of the program cycle 205 includes a step 222 of reading the input image. The next step is a step 225 of reading the skill interface. At step 227, a control or user program is executed. Then, at step 230, skills are executed. The output image is written next, in step 232.
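A rough sketch of this read-execute-write ordering is shown below; it mirrors steps 222-232 of FIG. 2, but the function names and data shapes are placeholders and not the actual SLC runtime.

def scan_cycle(read_inputs, read_skill_interface, user_program, skills, write_outputs):
    """One pass of an illustrative read-execute-write cycle (steps 222-232)."""
    input_image = read_inputs()                     # step 222: snapshot of all inputs
    skill_data = read_skill_interface()             # step 225: read skill interface
    state = user_program(input_image, skill_data)   # step 227: execute control/user program
    for skill in skills:                            # step 230: execute skills in order
        state = skill(state)
    write_outputs(state)                            # step 232: write the output image
    return state


# Toy wiring: each callable stands in for a real module or buffer.
result = scan_cycle(
    read_inputs=lambda: {"start_button": 1},
    read_skill_interface=lambda: {"defect_threshold": 0.8},
    user_program=lambda i, s: {**i, **s, "motor_on": i["start_button"] == 1},
    skills=[lambda st: {**st, "logged": True}],
    write_outputs=lambda st: None,
)
print(result["motor_on"])  # True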
[0034] All external inputs and outputs are directly connected to the SLC 105 at the same time (in parallel). At the start of the program cycle 205 a 'snapshot' is taken of the 'input image' which has a one-to-one mapping to the external inputs at the sampling instant. Taking a snapshot of the inputs solves the problem of false states arising from changing inputs. If the same input is read at two different points in the program, it is possible that the input could change state between processing the two points and affect the output. The snapshot ensures that all inputs are consistent throughout the course of the program cycle 205.
[0035] The control or user program is executed in its entirety from start to finish using the data obtained from the input image and the skill interface. The results obtained by the executed program are saved as an 'output image', having a one-to-one mapping to the external outputs. Finally, the output image is written to the SLC’s physical output interfaces, including the skill interface. The whole process is then repeated in a continuous cycle known as the "scan cycle". The cyclic program conforms to the definition of "classical sequential programming" where "actions are strictly ordered as a time sequence". The SLC 105 processes the control or user program in a strict, unaltered sequence and all actions are carried out: the SLC 105 reads the inputs and applies the entire fixed-length program, then sets the outputs accordingly.
[0036] All of these actions are fixed in terms of duration and determine the 'scan time' of the SLC 105. Although the scan time is a function of the program length, assuming that no conditional jumps are used in the logic, its duration remains fixed, and therefore so does the response time, making the SLC 105 a 'real-time' control device.
[0037] FIG. 3 illustrates a block diagram of a skill 305 in accordance with an exemplary embodiment of the present invention. An automation main component in the SLC 105 is the skill 305. The skill 305 is a self-contained algorithm that takes inputs from the environment (e.g. sensors) and gives commands in sequence to smart machines to accomplish a task. The skill 305 can also interact with its environment (sensors, machines) or other skills by asking questions and seeking clarifications.

[0038] FIG. 4 illustrates a block diagram of a skill-to-skill interaction in accordance with an exemplary embodiment of the present invention. One skill 405(1) can also be linked to another skill 405(2). The outputs of one skill are the inputs of another skill.
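The linking of FIG. 4 can be pictured as simple function composition, as in the sketch below; take_photo and detect_defect are invented example skills, not elements of the disclosure.

def chain(*skills):
    """Wire skills so each skill's output becomes the next skill's input."""
    def composed(inputs):
        data = inputs
        for skill in skills:
            data = skill(data)
        return data
    return composed


take_photo = lambda d: {**d, "image": "frame_001"}
detect_defect = lambda d: {**d, "defective": d["image"].endswith("001")}

pipeline = chain(take_photo, detect_defect)   # outputs of one skill feed the next
print(pipeline({"tray_id": 7}))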
[0039] FIG. 5 illustrates a block diagram of an automation system 505 in accordance with an exemplary embodiment of the present invention. The automation system 505 includes a programming environment 507, a SLC 510, smart machines 512 and smart sensors 515. The programming environment 507 includes a skill composer 517, a skill library 520 and a skill deployment 522. The SLC 510 includes a plurality of programmed skills 525(1-n) and a plurality of learned skills 527(1-n).
[0040] As shown in FIG. 5, the SLC 510 hosts multiple programmed skills 525(1-n), which are programmed with the skill composer 517. When the SLC 510 is in operation, it interacts with the smart sensors 515 and the smart machines 512 to accomplish a task specified by the skills 525(1-n), 527(1-n). Not all the skills 525(1-n), 527(1-n) are available during programming. Some skills can be learned as the learned skills 527(1-n) using artificial intelligence. Regardless of whether it is a learned or a programmed skill, the interface between a skill 525(1) or 527(1) and the smart machines 512 and the smart sensors 515 remains the same. Skills 525(1-n), 527(1-n) are microservices that can compose themselves at engineering time and at runtime as long as their inputs and outputs have compatible types. This enables the SLC 510 to “learn” during runtime a new and more efficient way to perform a task and to change the wiring of the skills 525(1-n), 527(1-n) on-the-fly.
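One possible reading of "compatible types" is a simple port-type check before two skills are wired together, as sketched below; SkillPort and the type strings are assumptions made for the example.

from dataclasses import dataclass


@dataclass
class SkillPort:
    """Illustrative description of the types a skill consumes and produces."""
    name: str
    input_type: str
    output_type: str


def compatible(upstream: SkillPort, downstream: SkillPort) -> bool:
    # Two skills may be composed only if the upstream output type
    # matches the downstream input type.
    return upstream.output_type == downstream.input_type


camera = SkillPort("take_photo", input_type="Trigger", output_type="Image")
inspect = SkillPort("detect_defect", input_type="Image", output_type="Verdict")
print(compatible(camera, inspect))   # True: Image feeds Image
print(compatible(inspect, camera))   # False: Verdict does not feed Trigger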
[0041] FIG. 6 illustrates an architecture and interactions among a SLC 605, a Skill Engineering Environment 607, a smart sensor 610 and a smart machine 612 in accordance with an exemplary embodiment of the present invention. The Skill Engineering Environment 607 includes a skill composer 617, a skill library 620 and a skill deployment component 622. The skill composer 617 links a sensor skill (S-Skill) 625 with a skill 627 and a learned skill 630 from another SLC and triggers a machine skill (M-Skill) 632 to complete an operation. The skill library 620 includes programmed skills 627 and 635(1), learned skills 637(1-3) and 630, the sensor skill (S-Skill) 625 and the machine skill (M-Skill) 632. The skill deployment component 622 is configured to deploy skill programs to the SLC 605 once engineering is completed.
[0042] The Skill Engineering Environment 607 is the interface with automation engineers. The environment will populate a library of skills by querying the available skills from sensors, machines and other SLCs. The skill composer 617 provides a graphical editor for the automation engineer to program the SLC 605 with skills. In the above example, it links a sensor skill (S-Skill) with a skill and a learned skill from another SLC, and eventually triggers a machine skill (M-Skill) to complete the operation. Once the engineering is completed, the skill programs are deployed to the SLC 605 through the skill deployment component 622.
[0043] The SLC 605 further comprises an onboard native skill library 640 including native skills 642(1-2) present since manufacturing and learned skills 645(1-2) acquired over a period of SLC operation. The SLC 605 further comprises a runtime environment 650 that holds skill programs engineered and deployed from the Skill Engineering Environment 607 such that during a runtime execution the SLC 605 interacts with the smart sensor 610 and the smart machine 612 to complete an operational task. The smart sensor 610 includes learned skills 652(1), 637(1), a sensor skill (S-Skill) 655(1) and the sensor skill (S-Skill) 625. The smart machine 612 includes learned skills 657(1), 637(2), a machine skill (M-Skill) 660(1) and the machine skill (M-Skill) 632.
[0044] The smart sensor 610 and the smart machine 612 host the skills that specify how to operate themselves. For instance, a smart camera will have a skill to take a photo with high resolution. A smart milling machine can expose a skill for a milling operation with input parameters.
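A possible sketch of a device advertising its hosted skills (the kind of listing an engineering system could query) is given below; SmartDevice and the particular skill names are illustrative assumptions, not part of the disclosed devices.

class SmartDevice:
    """Illustrative smart sensor/machine that hosts and advertises its own skills."""

    def __init__(self, name: str):
        self.name = name
        self._skills = {}

    def register_skill(self, skill_name: str, fn):
        self._skills[skill_name] = fn

    def list_skills(self):
        # What an engineering system could query to populate its skill library.
        return sorted(self._skills)

    def invoke(self, skill_name: str, **params):
        return self._skills[skill_name](**params)


camera = SmartDevice("smart_camera")
camera.register_skill("take_photo", lambda fps=30: f"photo@{fps}fps")

mill = SmartDevice("milling_machine")
mill.register_skill("mill", lambda depth_mm, feed_rate: f"milled {depth_mm} mm at {feed_rate}")

print(camera.list_skills(), camera.invoke("take_photo", fps=120))
print(mill.invoke("mill", depth_mm=2, feed_rate="500 mm/min"))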
[0045] FIG. 7 illustrates an example in which a SLC 705 works with skills 707(1-5) of a camera/inspection station 710, a robot station 712 and a conveyor system 715 in accordance with an exemplary embodiment of the present invention. FIG. 7 shows a concrete example using the SLC 705 and smart sensors/machines. In this example, the camera/inspection station 710 can offer two skills: (i) a skill 707(1) for taking a photo at a specific frame rate, e.g. 30, 60 or 120 frames per second (fps); (ii) a skill 707(2) to detect a defective product. The robot station 712 can also offer two skills: (i) a skill 707(3) to pick the object from a tray on the conveyor; (ii) a skill 707(4) to drop the object at the table of the robot station 712. The conveyor system 715 can offer a move skill 707(5), which moves the tray on the conveyor from one station (e.g. inspection station) to another station (e.g. robot station).
[0046] Using these skills 707(1-5) from the devices, a skill program 725 can be composed and deployed to the SLC 705. As shown in the diagram in the SLC 705, an engineer can simply wire together the different skills 707(1-5) to compose the skill program 725 that: (1) takes a photo, (2) detects a defective product, (3) moves a tray to another station, (4) picks the tray, and (5) places the tray in a buffer.
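The wiring described above could look roughly like the sketch below, with each device skill modelled as a plain callable; the function names and the state dictionary are assumptions made for the example and are not the deployed skill program 725 itself.

def take_photo(state, fps=60):              # skill 707(1): photo at a given frame rate
    return {**state, "photo": f"frame@{fps}fps"}


def detect_defect(state):                   # skill 707(2): flag a defective product
    return {**state, "defective": "scratch" in state["photo"]}


def move_tray(state, to_station="robot"):   # skill 707(5): conveyor moves the tray
    return {**state, "station": to_station}


def pick(state):                            # skill 707(3): robot picks from the tray
    return {**state, "held": True}


def place(state, where="buffer"):           # skill 707(4): robot drops the object
    return {**state, "held": False, "location": where}


def skill_program(state):
    """Composed program: photo -> defect check -> move -> pick -> place."""
    for step in (take_photo, detect_defect, move_tray, pick, place):
        state = step(state)
    return state


print(skill_program({"tray": 1}))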
[0047] FIG. 8 illustrates a system architecture 805 for a SLC in accordance with an exemplary embodiment of the present invention. The system architecture 805 comprises a bottom layer 807 for communication, a top layer 810 for interaction and a middle layer 822 including a scheduler 812, tasks 815, a process image 817 and services 820.

[0048] FIGs. 9-11 illustrate schematic views of wiring skills on-the-fly in accordance with an exemplary embodiment of the present invention. In FIG. 9, a high-level skill such as "move object" is mapped to lower-level skills such as "conveyor move" and "pick_and_place". Skill matching is performed via type checking. In this example, "conveyor move" and "pick_and_place" map to the "move object" skill because of the input/output type matching.
[0049] Similarly, in FIG. 10, the "pick_and_place" skill can be decomposed into compositions of lower-level skills provided by the machines to the engineering system. For example, Machine 1 may realize the "pick_and_place" skill via a <grasp, lift, move, lift, lift> composition of skills. Machine 2 may realize the "pick_and_place" via a <pick, lift, move, lift, lift> composition of skills that applies a concentric force on the object with the arm's gripper. Notice that Machine 2 requires a robotic "arm" to perform the action whereas Machine 1 does not require any additional equipment. Similarly, Machine 3 requires two robotic "arms" to "push" the object from opposite sides and is realized via a <(push | push), lift, move, lift, lift> composition of skills. In addition, note that the machine-specific skills require a "box" as an input and the higher-level skill "pick_and_place" specifies an "object" as an input. In this case, the system supports the specialization of inputs and outputs, meaning that "box" is a subtype of "object".
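A minimal sketch of such type checking with specialization, assuming a single-parent subtype table, is shown below; the table entries and the skill records are invented for the example.

# Hypothetical subtype table: "box" is a subtype of "object".
SUBTYPES = {"box": "object"}


def is_subtype(candidate: str, required: str) -> bool:
    """Walk the subtype chain until the required type is found (or the chain ends)."""
    while candidate is not None:
        if candidate == required:
            return True
        candidate = SUBTYPES.get(candidate)
    return False


def matches(high_level: dict, machine_skill: dict) -> bool:
    """Type-check a machine-specific skill against a higher-level skill."""
    return (is_subtype(machine_skill["input"], high_level["input"])
            and is_subtype(machine_skill["output"], high_level["output"]))


pick_and_place = {"name": "pick_and_place", "input": "object", "output": "object"}
machine_1 = {"name": "grasp_lift_move", "input": "box", "output": "box"}
print(matches(pick_and_place, machine_1))   # True: "box" specializes "object"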
[0050] The on-the-fly wiring of skills comes into play when a task must be performed, but the environmental conditions change. For example, if during normal operation Machine 2 goes offline, then the rewiring of the skill program consists of: (1) finding an execution alternative, and (2) switching the execution to the alternative. In this case, Machine 3 or Machine 1 can take over that task assuming all the inputs and conditions are satisfied and the hardware is available. The system can re-wire the skills on-the-fly while a machine is in malfunction or maintenance, when the plant owner wants to balance the load of machines, or simply to save energy.
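A bare-bones sketch of the two rewiring steps (find an alternative, then switch) could look like the code below, assuming a preference-ordered provider table; the machine names echo the example above, but the data structures are invented.

def rewire(skill_name: str, providers: dict, online: set) -> str:
    """Find an execution alternative for a skill whose current provider is unavailable."""
    for machine in providers.get(skill_name, []):   # step (1): search for an alternative
        if machine in online:
            return machine                          # step (2): switch execution to it
    raise RuntimeError(f"no machine currently available for {skill_name}")


providers = {"pick_and_place": ["Machine 2", "Machine 3", "Machine 1"]}
online = {"Machine 1", "Machine 3"}                 # Machine 2 has gone offline
print(rewire("pick_and_place", providers, online))  # Machine 3 takes over the task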
[0051] So far, the compositions of skills presented are composed by experts. That is, they are engineered a priori, before the operation. However, on-the-fly skill wiring is useful to compose skills during operation. To illustrate this, consider in FIG. 11 the case when "move object" cannot be fulfilled by any of the machines because the distance d between the object's initial position (specified in the input) and the object's final position (specified by the output) is greater than the reach of the robot (Machine 1). This is:
[0052] Reach(robot, A, B) < d
[0053] In this case, the learned skills come into play. Using, for example, reinforcement learning, the system can learn to use a robotic arm in combination with a conveyor and compose <pick_and_place, start conveyor, stop conveyor, pick_and_place>. Another alternative is to use two robotic arms to throw and catch the object, thus composing a <throw, catch> program during operation, further decomposed into a <pick_for_throw, catch_from_throw> skill program.
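To make the fallback concrete, a toy planner is sketched below: if the robot's reach is shorter than the distance d, it returns the learned conveyor-assisted composition instead of a single pick_and_place. The numeric values and skill names are illustrative assumptions only.

def plan_move(distance: float, robot_reach: float) -> list:
    """Choose a skill composition for moving an object over a given distance."""
    if robot_reach >= distance:
        return ["pick_and_place"]                    # a single robot suffices
    # Learned alternative: hand the object to a conveyor for the long stretch.
    return ["pick_and_place", "start_conveyor", "stop_conveyor", "pick_and_place"]


print(plan_move(distance=3.5, robot_reach=1.2))
# ['pick_and_place', 'start_conveyor', 'stop_conveyor', 'pick_and_place']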
[0054] The whole skill software may be downloaded from the devices into the engineering system (and, e.g., later on the complete code may be transferred to the SLC 105). Downloading the entire skill software from the device to the engineering system has the advantage of allowing changes to the skill. This can be used to adapt skills, or to change their specification. Simply downloading the skill-access information may be inflexible. In the present invention, having the ability to change a skill's code gives the flexibility to fix bugs, improve performance, and make changes.
[0055] In the present invention, a skill is a software component that does not belong to a fixed asset or equipment (e.g. a machine or a controller). With this concept, the skill can be programmed, improved, patched, and upgraded or downgraded anywhere, at any time. A smart machine or a smart sensor brings in its initial skills from the OEM. But later on, the skills can be improved during engineering, or during operation via learning. The improved skills can be redeployed back to the engineering environment or the smart machine if needed. If a new security vulnerability is discovered later, the security patch can be applied even during runtime.
[0056] FIG. 12 illustrates a schematic view of a flow chart of a method 1200 for providing the SLC 105 to control an automation process in accordance with an exemplary embodiment of the present invention. Reference is made to the elements and features described in FIGs. 1-11. It should be appreciated that some steps are not required to be performed in any particular order, and that some steps are optional.

[0057] The method 1200 performed by an automation system comprises a step 1205 of providing the SLC 105 comprising a Central Processor Unit (CPU) module including a processor and an accessible memory. The accessible memory stores an automation main component as a programmed skill. The programmed skill is a self-contained algorithm that takes inputs from an environment such as smart sensors and gives commands in sequence to machines to accomplish a task, such that the programmed skill is configured to interact with its environment, such as the smart sensors and machines, or with other programmed skills by asking questions and seeking clarifications.
[0058] The accessible memory further stores a SLC program comprising software instructions that, when executed by the processor, are configured to provide combinatorial logic and sequential control for the automation process. The combinatorial logic performs logical operations on parameters that have binary states and operates devices with binary states. The sequential control determines an order in which actions follow one another.
[0059] While “a Skill Logic Controller (SLC)” based on “skills” is described here, a range of one or more other Industrial Control Systems, or other forms of edge devices, is also contemplated by the present invention. For example, other types of Industrial Control Systems may be implemented based on one or more features presented above without deviating from the spirit of the present invention.
[0060] The techniques described herein can be particularly useful for automation systems. While particular embodiments are described in terms of the automation system, the techniques described herein are not limited to automation systems but can also be used with other systems.

[0061] FIG. 13 shows an example of a computing environment within which embodiments of the disclosure may be implemented. For example, this computing environment 1300 may be configured to execute the automation system discussed above with reference to FIG. 1 or to execute portions of the method 1200 described above with respect to FIG. 12. Computers and computing environments, such as computer system 1310 and computing environment 1300, are known to those of skill in the art and thus are described briefly here.
[0062] As shown in FIG. 13, the computer system 1310 may include a communication mechanism such as a bus 1321 or other communication mechanism for communicating information within the computer system 1310. The computer system 1310 further includes one or more processors 1320 coupled with the bus 1321 for processing the information. The processors 1320 may include one or more central processing units (CPUs), graphical processing units (GPUs), or any other processor known in the art.
[0063] The computer system 1310 also includes a system memory 1330 coupled to the bus 1321 for storing information and instructions to be executed by processors 1320. The system memory 1330 may include computer readable storage media in the form of volatile and/or nonvolatile memory, such as read only memory (ROM) 1331 and/or random access memory (RAM) 1332. The system memory RAM 1332 may include other dynamic storage device(s) (e.g., dynamic RAM, static RAM, and synchronous DRAM). The system memory ROM 1331 may include other static storage device(s) (e.g., programmable ROM, erasable PROM, and electrically erasable PROM). In addition, the system memory 1330 may be used for storing temporary variables or other intermediate information during the execution of instructions by the processors 1320. A basic input/output system (BIOS) 1333 containing the basic routines that help to transfer information between elements within computer system 1310, such as during start-up, may be stored in ROM 1331. RAM 1332 may contain data and/or program modules that are immediately accessible to and/or presently being operated on by the processors 1320. System memory 1330 may additionally include, for example, operating system 1334, application programs 1335, other program modules 1336 and program data 1337.

[0064] The computer system 1310 also includes a disk controller 1340 coupled to the bus 1321 to control one or more storage devices for storing information and instructions, such as a hard disk 1341 and a removable media drive 1342 (e.g., floppy disk drive, compact disc drive, tape drive, and/or solid state drive). The storage devices may be added to the computer system 1310 using an appropriate device interface (e.g., a small computer system interface (SCSI), integrated device electronics (IDE), Universal Serial Bus (USB), or FireWire).
[0065] The computer system 1310 may also include a display controller 1365 coupled to the bus 1321 to control a display 1366, such as a cathode ray tube (CRT) or liquid crystal display (LCD), for displaying information to a computer user. The computer system includes an input interface 1360 and one or more input devices, such as a keyboard 1362 and a pointing device 1361, for interacting with a computer user and providing information to the processor 1320. The pointing device 1361, for example, may be a mouse, a trackball, or a pointing stick for communicating direction information and command selections to the processor 1320 and for controlling cursor movement on the display 1366. The display 1366 may provide a touch screen interface which allows input to supplement or replace the communication of direction information and command selections by the pointing device 1361.
[0066] The computer system 1310 may perform a portion or all of the processing steps of embodiments of the invention in response to the processors 1320 executing one or more sequences of one or more instructions contained in a memory, such as the system memory 1330. Such instructions may be read into the system memory 1330 from another computer readable medium, such as a hard disk 1341 or a removable media drive 1342. The hard disk 1341 may contain one or more datastores and data files used by embodiments of the present invention. Datastore contents and data files may be encrypted to improve security. The processors 1320 may also be employed in a multiprocessing arrangement to execute the one or more sequences of instructions contained in system memory 1330. In alternative embodiments, hard-wired circuitry may be used in place of or in combination with software instructions. Thus, embodiments are not limited to any specific combination of hardware circuitry and software.
[0067] As stated above, the computer system 1310 may include at least one computer readable medium or memory for holding instructions programmed according to embodiments of the invention and for containing data structures, tables, records, or other data described herein. The term “computer readable medium” as used herein refers to any medium that participates in providing instructions to the processor 1320 for execution. A computer readable medium may take many forms including, but not limited to, non-volatile media, volatile media, and transmission media. Non-limiting examples of non-volatile media include optical disks, solid state drives, magnetic disks, and magneto optical disks, such as hard disk 1341 or removable media drive 1342. Non-limiting examples of volatile media include dynamic memory, such as system memory 1330. Non-limiting examples of transmission media include coaxial cables, copper wire, and fiber optics, including the wires that make up the bus 1321. Transmission media may also take the form of acoustic or light waves, such as those generated during radio wave and infrared data communications.
[0068] The computing environment 1300 may further include the computer system 1310 operating in a networked environment using logical connections to one or more remote computers, such as remote computer 1380. Remote computer 1380 may be a personal computer (laptop or desktop), a mobile device, a server, a router, a network PC, a peer device or other common network node, and typically includes many or all of the elements described above relative to computer system 1310. When used in a networking environment, computer system 1310 may include modem 1372 for establishing communications over a network 1371, such as the Internet. Modem 1372 may be connected to bus 1321 via user network interface 1370, or via another appropriate mechanism.
[0069] Network 1371 may be any network or system generally known in the art, including the Internet, an intranet, a local area network (LAN), a wide area network (WAN), a metropolitan area network (MAN), a direct connection or series of connections, a cellular telephone network, or any other network or medium capable of facilitating communication between computer system 1310 and other computers (e.g., remote computer 1380). The network 1371 may be wired, wireless or a combination thereof. Wired connections may be implemented using Ethernet, Universal Serial Bus (USB), RJ-11 or any other wired connection generally known in the art. Wireless connections may be implemented using Wi-Fi, WiMAX, Bluetooth, infrared, cellular networks, satellite or any other wireless connection methodology generally known in the art. Additionally, several networks may work alone or in communication with each other to facilitate communication in the network 1371.

[0070] In some embodiments, the computer system 1310 may be utilized in conjunction with a parallel processing platform comprising a plurality of processing units. This platform may allow parallel execution of one or more of the tasks associated with optimal design generation, as described above. For example, in some embodiments, execution of multiple product lifecycle simulations may be performed in parallel, thereby allowing reduced overall processing times for optimal design selection.
[0071] The embodiments of the present disclosure may be implemented with any combination of hardware and software. In addition, the embodiments of the present disclosure may be included in an article of manufacture (e.g., one or more computer program products) having, for example, computer-readable, non-transitory media. The media has embodied therein, for instance, computer readable program code for providing and facilitating the mechanisms of the embodiments of the present disclosure. The article of manufacture can be included as part of a computer system or sold separately.
[0072] While various aspects and embodiments have been disclosed herein, other aspects and embodiments will be apparent to those skilled in the art. The various aspects and embodiments disclosed herein are for purposes of illustration and are not intended to be limiting, with the true scope and spirit being indicated by the following claims.
[0073] An executable application, as used herein, comprises code or machine readable instructions for conditioning the processor to implement predetermined functions, such as those of an operating system, a context data acquisition system or other information processing system, for example, in response to user command or input. An executable procedure is a segment of code or machine readable instruction, sub-routine, or other distinct section of code or portion of an executable application for performing one or more particular processes. These processes may include receiving input data and/or parameters, performing operations on received input data and/or performing functions in response to received input parameters, and providing resulting output data and/or parameters.
[0074] A graphical user interface (GUI), as used herein, comprises one or more display images, generated by a display processor and enabling user interaction with a processor or other device and associated data acquisition and processing functions. The GUI also includes an executable procedure or executable application. The executable procedure or executable application conditions the display processor to generate signals representing the GUI display images. These signals are supplied to a display device which displays the image for viewing by the user. The processor, under control of an executable procedure or executable application, manipulates the GUI display images in response to signals received from the input devices. In this way, the user may interact with the display image using the input devices, enabling user interaction with the processor or other device.
[0075] The functions and process steps herein may be performed automatically or wholly or partially in response to user command. An activity (including a step) performed automatically is performed in response to one or more executable instructions or device operation without user direct initiation of the activity.
[0076] The system and processes of the figures are not exclusive. Other systems, processes and menus may be derived in accordance with the principles of the invention to accomplish the same objectives. Although this invention has been described with reference to particular embodiments, it is to be understood that the embodiments and variations shown and described herein are for illustration purposes only. Modifications to the current design may be implemented by those skilled in the art, without departing from the scope of the invention. As described herein, the various systems, subsystems, agents, managers and processes can be implemented using hardware components, software components, and/or combinations thereof.
[0077] Computer readable medium instructions for carrying out operations of the present disclosure may be assembler instructions, instruction-set-architecture (ISA) instructions, machine instructions, machine dependent instructions, microcode, firmware instructions, state-setting data, or either source code or object code written in any combination of one or more programming languages, including an object oriented programming language such as Smalltalk, C++ or the like, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The computer readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider). In some embodiments, electronic circuitry including, for example, programmable logic circuitry, field-programmable gate arrays (FPGA), or programmable logic arrays (PLA) may execute the computer readable program instructions by utilizing state information of the computer readable program instructions to personalize the electronic circuitry, in order to perform aspects of the present disclosure.

[0078] Aspects of the present disclosure are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the disclosure. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, may be implemented by computer readable medium instructions.
[0079] It should be appreciated that the program modules, applications, computer-executable instructions, code, or the like depicted in FIG. 13 as being stored in the system memory are merely illustrative and not exhaustive and that processing described as being supported by any particular module may alternatively be distributed across multiple modules or performed by a different module. In addition, various program module(s), script(s), plug-in(s), Application Programming Interface(s) (API(s)), or any other suitable computer-executable code hosted locally on the computer system 1310, the remote device, and/or hosted on other computing device(s) accessible via one or more of the network(s), may be provided to support functionality provided by the program modules, applications, or computer-executable code depicted in FIG. 13 and/or additional or alternate functionality. Further, functionality may be modularized differently such that processing described as being supported collectively by the collection of program modules depicted in FIG. 13 may be performed by a fewer or greater number of modules, or functionality described as being supported by any particular module may be supported, at least in part, by another module. In addition, program modules that support the functionality described herein may form part of one or more applications executable across any number of systems or devices in accordance with any suitable computing model such as, for example, a client-server model, a peer-to-peer model, and so forth. In addition, any of the functionality described as being supported by any of the program modules depicted in FIG. 13 may be implemented, at least partially, in hardware and/or firmware across any number of devices.
[0080] It should further be appreciated that the computer system 1310 may include alternate and/or additional hardware, software, or firmware components beyond those described or depicted without departing from the scope of the disclosure. More particularly, it should be appreciated that software, firmware, or hardware components depicted as forming part of the computer system 1310 are merely illustrative and that some components may not be present or additional components may be provided in various embodiments. While various illustrative program modules have been depicted and described as software modules stored in system memory, it should be appreciated that functionality described as being supported by the program modules may be enabled by any combination of hardware, software, and/or firmware. It should further be appreciated that each of the above-mentioned modules may, in various embodiments, represent a logical partitioning of supported functionality. This logical partitioning is depicted for ease of explanation of the functionality and may not be representative of the structure of software, hardware, and/or firmware for implementing the functionality. Accordingly, it should be appreciated that functionality described as being provided by a particular module may, in various embodiments, be provided at least in part by one or more other modules. Further, one or more depicted modules may not be present in certain embodiments, while in other embodiments, additional modules not depicted may be present and may support at least a portion of the described functionality and/or additional functionality. Moreover, while certain modules may be depicted and described as sub-modules of another module, in certain embodiments, such modules may be provided as independent modules or as sub-modules of other modules.
[0081] Although specific embodiments of the disclosure have been described, one of ordinary skill in the art will recognize that numerous other modifications and alternative embodiments are within the scope of the disclosure. For example, any of the functionality and/or processing capabilities described with respect to a particular device or component may be performed by any other device or component. Further, while various illustrative implementations and architectures have been described in accordance with embodiments of the disclosure, one of ordinary skill in the art will appreciate that numerous other modifications to the illustrative implementations and architectures described herein are also within the scope of this disclosure. In addition, it should be appreciated that any operation, element, component, data, or the like described herein as being based on another operation, element, component, data, or the like can be additionally based on one or more other operations, elements, components, data, or the like. Accordingly, the phrase “based on,” or variants thereof, should be interpreted as “based at least in part on.”
[0082] Although embodiments have been described in language specific to structural features and/or methodological acts, it is to be understood that the disclosure is not necessarily limited to the specific features or acts described. Rather, the specific features and acts are disclosed as illustrative forms of implementing the embodiments. Conditional language, such as, among others, “can,” “could,” “might,” or “may,” unless specifically stated otherwise, or otherwise understood within the context as used, is generally intended to convey that certain embodiments could include, while other embodiments do not include, certain features, elements, and/or steps. Thus, such conditional language is not generally intended to imply that features, elements, and/or steps are in any way required for one or more embodiments or that one or more embodiments necessarily include logic for deciding, with or without user input or prompting, whether these features, elements, and/or steps are included or are to be performed in any particular embodiment.
[0083] The flowchart and block diagrams in the Figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods, and computer program products according to various embodiments of the present disclosure.
In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the Figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts or carry out combinations of special purpose hardware and computer instructions.
[0084] While embodiments of the present invention have been disclosed in exemplary forms, it will be apparent to those skilled in the art that many modifications, additions, and deletions can be made therein without departing from the spirit and scope of the invention and its equivalents, as set forth in the following claims.
[0085] Embodiments and the various features and advantageous details thereof are explained more fully with reference to the non-limiting embodiments that are illustrated in the accompanying drawings and detailed in the following description. Descriptions of well-known starting materials, processing techniques, components and equipment are omitted so as not to unnecessarily obscure embodiments in detail. It should be understood, however, that the detailed description and the specific examples, while indicating preferred embodiments, are given by way of illustration only and not by way of limitation. Various substitutions, modifications, additions and/or rearrangements within the spirit and/or scope of the underlying inventive concept will become apparent to those skilled in the art from this disclosure.
[0086] As used herein, the terms “comprises,” “comprising,” “includes,” “including,” “has,” “having” or any other variation thereof, are intended to cover a non-exclusive inclusion. For example, a process, article, or apparatus that comprises a list of elements is not necessarily limited to only those elements but may include other elements not expressly listed or inherent to such process, article, or apparatus.
[0087] Additionally, any examples or illustrations given herein are not to be regarded in any way as restrictions on, limits to, or express definitions of, any term or terms with which they are utilized. Instead, these examples or illustrations are to be regarded as being described with respect to one particular embodiment and as illustrative only. Those of ordinary skill in the art will appreciate that any term or terms with which these examples or illustrations are utilized will encompass other embodiments which may or may not be given therewith or elsewhere in the specification and all such embodiments are intended to be included within the scope of that term or terms.
[0088] In the foregoing specification, the invention has been described with reference to specific embodiments. However, one of ordinary skill in the art appreciates that various modifications and changes can be made without departing from the scope of the invention. Accordingly, the specification and figures are to be regarded in an illustrative rather than a restrictive sense, and all such modifications are intended to be included within the scope of invention.
[0089] Although the invention has been described with respect to specific embodiments thereof, these embodiments are merely illustrative, and not restrictive of the invention. The description herein of illustrated embodiments of the invention is not intended to be exhaustive or to limit the invention to the precise forms disclosed herein (and in particular, the inclusion of any particular embodiment, feature or function is not intended to limit the scope of the invention to such embodiment, feature or function). Rather, the description is intended to describe illustrative embodiments, features and functions in order to provide a person of ordinary skill in the art context to understand the invention without limiting the invention to any particularly described embodiment, feature or function. While specific embodiments of, and examples for, the invention are described herein for illustrative purposes only, various equivalent modifications are possible within the spirit and scope of the invention, as those skilled in the relevant art will recognize and appreciate. As indicated, these modifications may be made to the invention in light of the foregoing description of illustrated embodiments of the invention and are to be included within the spirit and scope of the invention. Thus, while the invention has been described herein with reference to particular embodiments thereof, a latitude of modification, various changes and substitutions are intended in the foregoing disclosures, and it will be appreciated that in some instances some features of embodiments of the invention will be employed without a corresponding use of other features without departing from the scope and spirit of the invention as set forth. Therefore, many modifications may be made to adapt a particular situation or material to the essential scope and spirit of the invention.
[0090] Respective appearances of the phrases "in one embodiment," "in an embodiment," or "in a specific embodiment" or similar terminology in various places throughout this specification are not necessarily referring to the same embodiment. Furthermore, the particular features, structures, or characteristics of any particular embodiment may be combined in any suitable manner with one or more other embodiments. It is to be understood that other variations and modifications of the embodiments described and illustrated herein are possible in light of the teachings herein and are to be considered as part of the spirit and scope of the invention.
[0091] In the description herein, numerous specific details are provided, such as examples of components and/or methods, to provide a thorough understanding of embodiments of the invention. One skilled in the relevant art will recognize, however, that an embodiment may be able to be practiced without one or more of the specific details, or with other apparatus, systems, assemblies, methods, components, materials, parts, and/or the like. In other instances, well-known structures, components, systems, materials, or operations are not specifically shown or described in detail to avoid obscuring aspects of embodiments of the invention. While the invention may be illustrated by using a particular embodiment, this is not and does not limit the invention to any particular embodiment and a person of ordinary skill in the art will recognize that additional embodiments are readily understandable and are a part of this invention.
[0092] It will also be appreciated that one or more of the elements depicted in the drawings/figures can also be implemented in a more separated or integrated manner, or even removed or rendered as inoperable in certain cases, as is useful in accordance with a particular application.
[0093] Benefits, other advantages, and solutions to problems have been described above with regard to specific embodiments. However, the benefits, advantages, solutions to problems, and any component(s) that may cause any benefit, advantage, or solution to occur or become more pronounced are not to be construed as a critical, required, or essential feature or component.

Claims

What is claimed is:
1. A Skill Logic Controller (SLC) for controlling an automation process, the SLC comprising: a Central Processor Unit (CPU) module including a processor; and an accessible memory storing: an automation main component as a programmed skill, wherein the programmed skill is a self-contained algorithm that takes inputs from an environment such as smart sensors and gives commands in sequence to smart machines to accomplish a task such that the programmed skill is configured to interact with its environment such as the smart sensors and the smart machines or other programmed skills by asking questions and seeking clarifications, and a SLC program comprising software instructions that when executed by the processor are configured to: provide combinatorial logic and sequential control for the automation process, wherein the combinatorial logic performs logical operations on parameters that have binary states and operates devices with binary states, and wherein the sequential control determines an order in which actions follow one another.
2. The SLC of claim 1, further comprising: a plurality of input modules to receive input signals by the SLC such that the input signals are conditioned and converted to digital input data values compatible with the Central Processor (CPU) module.
3. The SLC of claim 1, further comprising: a first buffer to store an input image formed by the digital input data values, wherein the SLC program determines a logical operation performed on a state of the digital input data values.
4. The SLC of claim 3, further comprising: an input skill interface to provide an interface between an engineering system, a skill library, a skill composer and the first buffer.
5. The SLC of claim 1, further comprising: a learned skill learned using artificial intelligence which enables the SLC to learn during runtime a new and more efficient way to perform a task and change the wiring of skills on-the-fly.
6. The SLC of claim 1, further comprising: a second buffer to store an output image formed by digital output data values, wherein the digital input data values and the SLC program determine how the processor is to set the digital output data values.
7. The SLC of claim 6, further comprising: an output skill interface to provide an interface between an engineering system, a skill library, a skill composer and the second buffer.
8. The SLC of claim 7, further comprising: a plurality of output modules to convert the output image into electrical control signals that operate the smart sensors and the smart machines under control.
9. The SLC of claim 8, wherein the smart sensors and the smart machines host one or more programmed and learned skills that specify how to operate themselves.
10. The SLC of claim 7, wherein the engineering system or an engineering environment is configured to populate the skill library of skills by querying available programmed and learned skills from the smart sensors, the smart machines and other SLCs.
11. The SLC of claim 7, wherein the skill composer provides a graphical editor for an automation engineer to program the SLC with skills.
12. The SLC of claim 11, wherein the skill composer links a sensor skill (S-Skill) with a skill and a learned skill from another SLC and triggers a machine skill (M-Skill) to complete an operation.
13. The SLC of claim 1, wherein a skill deployment component is configured to deploy skill programs to the SLC once engineering is completed.
14. The SLC of claim 1, further comprising: an onboard native skill library including native skills since manufacturing and learned skills over a period of SLC operation.
15. The SLC of claim 1, further comprising: a runtime environment that holds skill programs engineered and deployed from a skill engineering environment such that during a runtime execution the SLC interacts with the smart sensors and the smart machines to complete an operational task.
16. The SLC of claim 1, wherein the SLC can re-wire skills on-the-fly while a machine is in malfunction or maintenance, or a plant owner wants to balance load of machines, or simply to save energy.
17. The SLC of claim 16, wherein on-the-fly wiring of skills comes into play when a task must be performed but environmental conditions change.
18. The SLC of claim 16, wherein if during normal operation a machine goes offline, then rewiring of a skill program consists of: (1) finding an execution alternative, and (2) switching the execution to the execution alternative.
19. A method for controlling an automation process, the method comprising: providing a Skill Logic Controller (SLC) comprising: a Central Processor Unit (CPU) module including a processor; and an accessible memory storing: an automation main component as a programmed skill, wherein the programmed skill is a self-contained algorithm that takes inputs from an environment such as smart sensors and gives commands in sequence to smart machines to accomplish a task such that the programmed skill is configured to interact with its environment such as the smart sensors and the smart machines or other programmed skills by asking questions and seeking clarifications, and a SLC program comprising software instructions that when executed by the processor are configured to: provide combinatorial logic and sequential control for the automation process, wherein the combinatorial logic performs logical operations on parameters that have binary states and operates devices with binary states, and wherein the sequential control determines an order in which actions follow one another.
20. The method of claim 19, further comprising: providing an input skill interface to provide an interface between an engineering system, a skill library, a skill composer and a first buffer; and providing an output skill interface to provide an interface between the engineering system, the skill library, the skill composer and a second buffer.
21. The method of claim 19, wherein the smart sensors and the smart machines host one or more programmed and learned skills that specify how to operate themselves.
22. The method of claim 21, wherein the engineering system or an engineering environment is configured to populate the skill library of skills by querying available programmed and learned skills from the smart sensors, the smart machines and other SLCs.
23. The method of claim 19, further comprising: providing an onboard native skill library including native skills since manufacturing and learned skills over a period of SLC operation; and providing a runtime environment that holds skill programs engineered and deployed from a skill engineering environment such that during a runtime execution the SLC interacts with the smart sensors and the smart machines to complete an operational task.