WO2022009802A1 - Operating system, processing system, operating method, program, and storage medium - Google Patents

Operating system, processing system, operating method, program, and storage medium Download PDF

Info

Publication number
WO2022009802A1
Authority
WO
WIPO (PCT)
Prior art keywords
file
command
coordinates
signal
output
Prior art date
Application number
PCT/JP2021/025172
Other languages
French (fr)
Japanese (ja)
Inventor
康徳 渕上
Original Assignee
株式会社 東芝
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社 東芝
Publication of WO2022009802A1 publication Critical patent/WO2022009802A1/en

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00 Programme-control systems
    • G05B19/02 Programme-control systems electric
    • G05B19/04 Programme control other than numerical control, i.e. in sequence controllers or logic controllers
    • G05B19/042 Programme control other than numerical control, i.e. in sequence controllers or logic controllers using digital processors
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00 Arrangements for program control, e.g. control units
    • G06F9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44 Arrangements for executing specific programs
    • G06F9/451 Execution arrangements for user interfaces

Definitions

  • Embodiments of the present invention relate to an operation system, a processing system, an operation method, a program, and a storage medium.
  • An object to be solved by the present invention is to provide an operation system, a processing system, an operation method, a program, and a storage medium that can reduce the burden on the user.
  • the operation system includes an acquisition unit, a calculation unit, and a generation unit.
  • the acquisition unit acquires the output from the first device to be operated.
  • the arithmetic unit generates an operation instruction based on the acquisition result of the output and the operation file created in advance.
  • the generation unit generates an operation signal corresponding to the first device based on the operation command and transmits it to the first device.
  • the operation file includes a first file in which a generation operation for generating the operation instruction is described, and a second file that is referred to when the generation operation is executed and that contains information used for identification of the output and generation of the operation signal.
  • FIG. 1 is a schematic diagram showing an operation system according to an embodiment.
  • the operation system 10 according to the embodiment includes an acquisition unit 11, a calculation unit 12, a generation unit 13, and a storage device 14.
  • the operation system 10 is used to automatically operate the processing device 21 (first device).
  • the processing device 21 processes the work.
  • the processing is, for example, at least one selected from transport, machining, washing, heating, cooling, and drying.
  • Machining is, for example, at least one selected from film formation, etching, polishing, lithography, and joining.
  • the processing device 21 includes a control unit 21a. Each component of the processing device 21 operates based on the instruction transmitted from the control unit 21a to process the work.
  • the control unit 21a includes, for example, a central processing unit (CPU), a read-only memory (ROM), a random access memory (RAM), a storage device, an input interface, an output interface, a communication interface, and a bus connecting them.
  • the control unit 21a may be a dedicated computer for the processing device 21 or a general-purpose computer.
  • the function of the control unit 21a may be realized by the collaboration of a plurality of computers.
  • the input device 22 is used by the user to input information to the processing device 21.
  • the user is, for example, an operator in charge of processing executed by the processing device 21.
  • the input device 22 includes, for example, at least one selected from a keyboard, mouse, touchpad, and microphone (voice input).
  • the display device 23 displays the signal output from the processing device 21 so that the user can visually recognize it.
  • the display device 23 includes a monitor.
  • the acquisition unit 11 acquires the output from the processing device 21, which is the operation target, to the display device 23.
  • the output includes a video signal transmitted from the processing device 21 to the display device 23.
  • the acquisition unit 11 receives the video signal.
  • the acquisition unit 11 acquires the video signal and transmits the video signal to the display device 23.
  • the acquisition unit 11 includes, for example, a video signal distributor and a capture unit.
  • the acquisition unit 11 acquires the video signal as a moving image or a still image and transmits it to the calculation unit 12.
  • the output may be light emitted from the processing device 21.
  • the acquisition unit 11 includes an image pickup device.
  • the acquisition unit 11 takes a picture of the light emitting device that emits light.
  • the acquisition unit 11 acquires a moving image or a still image and transmits it to the calculation unit 12.
  • the output may be a sound emitted from the processing device 21.
  • the acquisition unit 11 includes a microphone. The acquisition unit 11 acquires the sound emitted from the processing device 21 as an electric signal and transmits it to the calculation unit 12.
  • the calculation unit 12 recognizes the acquisition result by the acquisition unit 11. For example, the calculation unit 12 recognizes the image acquired by the acquisition unit 11. In image recognition, processing such as pattern matching, character recognition, and color matching is executed. Alternatively, the calculation unit 12 recognizes the sound acquired by the acquisition unit 11. In sound recognition, processing such as pattern matching is executed.
  • the calculation unit 12 refers to the operation file created in advance.
  • the operation file includes the first file and the second file.
  • the first file describes the generation operation for generating the operation instruction.
  • the first file is a program file in which an operation sequence for generating an operation instruction and transmitting it to the generation unit 13 is described.
  • the second file is referenced when the generation operation is executed.
  • the second file contains the information used in the identification of the output signal and the generation of the operation signal described below.
  • the second file contains coordinates selected by the input device 22, data information referred to during image recognition, and the like.
  • the calculation unit 12 executes the generation operation of the first file based on the acquisition result by the acquisition unit 11 while referring to the information included in the second file. As a result, an operation command for operating the processing device 21 is generated.
  • the arithmetic unit 12 transmits an operation instruction to the generation unit 13.
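  • As an illustration only (the publication does not specify file formats or APIs), the sketch below shows how one step of a hypothetical first file could look up coordinates and a template file name from a text-based second file and emit an operation command; the key names and the helper functions are assumptions.

```python
# A hypothetical sketch: the "second file" is assumed to be simple "key = value"
# text, and one step of the "first file" reads values from it to build a command.
from pathlib import Path

def load_second_file(path):
    """Parse hypothetical 'key = value' lines into a dictionary."""
    values = {}
    for line in Path(path).read_text(encoding="utf-8").splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        values[key.strip()] = value.strip()
    return values

def select_icon_step(second, icon_key, send_command):
    """One generation step: click the coordinates described for icon_key."""
    x, y = (int(v) for v in second[icon_key].split(","))
    send_command({"type": "click", "x": x, "y": y})   # operation command to the generation unit
    return second[icon_key + "_template"]             # template file name used to verify the result
```

  • In such a sketch, correcting a machine difference only requires editing the text file passed to load_second_file; the program logic in the first file stays unchanged.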
  • the arithmetic unit 12 includes, for example, a CPU, a ROM, a RAM, a storage device, an input interface, an output interface, a communication interface, and a bus connecting these.
  • the arithmetic unit 12 is a dedicated or general-purpose computer.
  • the function of the arithmetic unit 12 may be realized by the collaboration of a plurality of computers.
  • the generation unit 13 generates an operation signal corresponding to the processing device 21 based on the operation command.
  • the generation unit 13 inputs the generated operation signal to the processing device 21 (control unit 21a).
  • when the input device 22 includes a mouse or a keyboard, for example, the operation signal corresponds to a signal transmitted from the mouse or the keyboard.
  • when the processing device 21 receives the operation signal, it moves the pointer, inputs characters, and so on according to the operation signal.
  • the generation unit 13 includes, for example, a microcomputer programmed to generate an operation signal corresponding to a signal transmitted from the input device 22.
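  • As a sketch only: the publication describes the generation unit as a microcomputer that emits signals equivalent to those of the input device, but it does not define an interface to it. The example below assumes a microcontroller that enumerates as a USB mouse/keyboard and accepts a simple line-based serial protocol via pyserial; the protocol, the port name, and the baud rate are all assumptions.

```python
# Hypothetical bridge from an operation command to the generation unit.
# Assumes a microcontroller acting as a USB HID device and listening on a
# serial port for "CLICK x y" / "TYPE text" lines (an invented protocol).
import serial  # pyserial

def send_operation_signal(command, port="/dev/ttyUSB0"):
    with serial.Serial(port, 115200, timeout=1) as link:
        if command["type"] == "click":
            link.write(f"CLICK {command['x']} {command['y']}\n".encode())
        elif command["type"] == "type":
            link.write(f"TYPE {command['text']}\n".encode())
        else:
            raise ValueError(f"unsupported command type: {command['type']}")
```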
  • the storage device 14 stores various data necessary for operating the processing device 21.
  • the storage device 14 stores an operation file.
  • the storage device 14 stores reference data necessary for image recognition.
  • the storage device 14 includes a hard disk drive (HDD), a solid state drive (SSD), network-attached storage (NAS), and the like.
  • the operation file and the reference data necessary for recognizing the acquisition result are prepared in advance by the user using the input device 15 and the display device 16.
  • the input device 15 is used by the user to input information to the calculation unit 12.
  • the input device 15 includes, for example, at least one selected from a keyboard, mouse, touchpad, and microphone.
  • the display device 16 displays the signal output from the calculation unit 12 so that the user can visually recognize it.
  • the display device 16 includes a monitor.
  • the acquisition unit 11 mainly acquires the video signal output from the processing device 21.
  • in image recognition, at least one selected from template matching, character recognition, and color recognition is executed.
  • the user prepares reference data necessary for these processes in advance and stores them in the storage device 14.
  • the acquisition unit 11 acquires the video signal from the processing device 21 to the display device 23 when performing image recognition.
  • the calculation unit 12 causes the display device 16 to display an image based on the video signal.
  • the user uses the input device 15 to select a portion of the displayed image.
  • the selected image is stored in the storage device 14 as a template image.
  • the user describes the file name of the template image to be referred to in the second file described later.
  • the template image is prepared for all the screens on which the automatic operation by the operation system 10 is executed.
  • when the processing device 21 is operated, the screen changes.
  • the displayed screen transitions to another screen, or a window is displayed.
  • a partial image that is not included in the screen before the change but is included in the screen after the change is used as the template image.
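  • The publication leaves the choice of the template region to the user; as an optional aid, one could compare a capture taken before the screen change with one taken after it and crop the largest changed region. A sketch assuming OpenCV (a library not mentioned in the publication) follows; the threshold and minimum area are illustrative values.

```python
# Sketch: propose a template image as the largest region that differs between
# the screen before the change and the screen after the change.
import cv2

def candidate_template(before_path, after_path, min_area=500):
    before = cv2.imread(before_path)
    after = cv2.imread(after_path)
    diff = cv2.cvtColor(cv2.absdiff(before, after), cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(diff, 30, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    boxes = [cv2.boundingRect(c) for c in contours if cv2.contourArea(c) >= min_area]
    if not boxes:
        return None
    x, y, w, h = max(boxes, key=lambda b: b[2] * b[3])  # largest changed region
    return after[y:y + h, x:x + w]                       # crop to store as the template image
```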
  • the acquisition unit 11 acquires the video signal from the processing device 21 to the display device 23 when it is necessary to recognize the character.
  • the calculation unit 12 causes the display device 16 to display an image based on the video signal.
  • the user confirms the range in which character recognition is executed in the displayed image by using the input device 15.
  • the user describes the range in the second file described later.
  • the acquisition unit 11 acquires a video signal from the processing device 21 to the display device 23 when it is necessary to recognize the color.
  • the calculation unit 12 causes the display device 16 to display an image based on the video signal.
  • the user uses the input device 15 to select a range in which color recognition is performed in the displayed image.
  • the images in the selected range are stored in the storage device 14.
  • the user describes the range in the second file described later.
  • the calculation unit 12 determines whether an image similar to the template image is included in the image based on the video signal.
  • the arithmetic unit 12 cuts out an image in a range stored in advance.
  • the arithmetic unit 12 executes character recognition processing such as Optical Character Recognition (OCR) on the image.
  • the calculation unit 12 cuts out an image in a range stored in advance.
  • the calculation unit 12 determines whether or not the colors of the respective units match between the image stored in advance and the image cut out. By recognizing the image by these processes, the calculation unit 12 determines whether or not the operation on the processing device 21 by the operation signal is completed.
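  • The checks described above (template matching and a color comparison over a stored range) could be realized, for example, with OpenCV; the sketch below is illustrative, and its thresholds and tolerance are assumptions rather than values from the publication.

```python
# Illustrative completion checks: does the expected template appear in the frame,
# and does a fixed range of the frame match the stored reference image of that range?
import cv2
import numpy as np

def template_found(frame, template, threshold=0.9):
    score = cv2.matchTemplate(frame, template, cv2.TM_CCOEFF_NORMED).max()
    return score >= threshold

def colors_match(frame, reference, rect, tolerance=10):
    x, y, w, h = rect                                    # range described in the second file
    crop = frame[y:y + h, x:x + w].astype(np.int16)      # reference has the same width and height
    return np.abs(crop - reference.astype(np.int16)).mean() <= tolerance
```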
  • the first file and the second file are created.
  • the first file is described in a general-purpose programming language such as C language.
  • in the second file, the selected coordinates of the pointing device, the template image to be referenced, the characters to be input, and the like are described in a text format.
  • the user describes the coordinates to be selected (clicked) in the second file and stores them in the storage device 14.
  • the acquisition unit 11 acquires a video signal from the processing device 21 to the display device 23 when operating the pointing device.
  • the calculation unit 12 causes the display device 16 to display an image based on the video signal.
  • the user points the pointer at the selected part of the displayed image and confirms the coordinates.
  • the user describes the coordinates in the second file.
  • the acquisition unit 11 acquires the video signal from the processing device 21 to the display device 23 when inputting characters with the keyboard.
  • the calculation unit 12 causes the display device 16 to display an image based on the video signal. In the displayed image, the user points the pointer at the text box for inputting characters and confirms the coordinates. The user describes the coordinates and one or more characters to be entered in the text box in the second file.
  • the calculation unit 12 determines whether the processing device 21 has been operated according to the operation signal by performing image recognition. If it is determined that the operation is not performed, the calculation unit 12 may send a notification to the display device 16 or another terminal device that the operation cannot be performed.
  • the file name of the template image used for image recognition, the range used for character recognition, the image and range used for color recognition, and the like are described in the second file as described above.
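  • As an illustration of the text format (the actual layout is not specified in the publication), a second file for one device might look like the following; every key and value here is hypothetical.

```
# device01.txt -- hypothetical second file
icon_b3            = 412, 318          # coordinates clicked on screen A1
icon_b3_template   = screen_a2.png     # template confirming the transition to A2
lot_id_textbox     = 655, 240          # text box selected before typing
lot_id_text        = LOT-0001          # characters entered in the text box
ocr_range          = 120, 80, 300, 40  # x, y, width, height for character recognition
color_range        = 40, 500, 16, 16   # x, y, width, height for color recognition
```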
  • FIG. 4A is a schematic diagram illustrating a part of the first file.
  • FIG. 4B is a schematic diagram illustrating a part of the second file.
  • the screens shown in FIGS. 2 and 3 are displayed on the display device 23 based on the video signal output from the processing device 21.
  • on the screen A1 shown in FIG. 2A, icons B1 to B7 and information C1 regarding the work being processed by the processing device 21 are displayed.
  • on the screen A2 shown in FIG. 2B, icons B1 to B4, icons B11 to B14, and a schematic diagram D of the processing device 21 are displayed.
  • on the screen A3 shown in FIG. 3A, icons B1 to B4, icons B21 to B24, and the schematic diagram D are displayed.
  • on the screen A4 shown in FIG. 3B, icons B1 to B4, icons B21 to B24, and information C2 regarding the processing conditions of the processing device 21 are displayed.
  • the user moves the pointer to the icon B3 on the screen A1 shown in FIG. 2A and selects the icon B3.
  • the screen A2 shown in FIG. 2B is displayed on the display device 23.
  • the user moves the pointer to the icon B14 on the screen A2 and selects the icon B14.
  • the screen A3 shown in FIG. 3A is displayed on the display device 23.
  • the user moves the pointer to the icon B21 on the screen A3 and selects the icon B21.
  • the screen A4 shown in FIG. 3B is displayed on the display device 23.
  • when the processing device 21 is operated by the operation system 10, the calculation unit 12 causes the screen A1 to be displayed on the display device 16. The user confirms the coordinates of the icon B3 on the screen A1 and describes them in the second file. Next, the calculation unit 12 causes the display device 16 to display the screen A2. The user cuts out, from the screen A2, a template image for determining the transition from the screen A1 to the screen A2. For example, any of the schematic diagram D and the icons B11 to B14 is used as the template image. The user stores the cut-out template image as image data in the storage device 14. The user describes, in the second file, the file name of the template image referred to when determining the transition from the screen A1 to the screen A2.
  • the storage of the template image and the description of the file name in the second file are repeated in the same manner.
  • the user cuts out a template image for determining the transition of the screen from the screen A3 and stores it in the storage device 14 as image data.
  • the user describes the file name of the template image in the second file.
  • the first file and the second file are prepared for the operation by the operation system 10.
  • in the first file, the operations executed by the calculation unit 12, such as the generation of the operation instruction for selecting a specific icon and the reference to the template image described above, are described.
  • in the second file, the selected coordinates, the file name of the template image to be referenced, and the like are described.
  • the trigger for starting the operation by the operation system 10 is arbitrary.
  • the operation may be started based on a signal input to the processing device 21, a signal sent from another sensor, a measuring instrument, or the like.
  • the operation may be initiated in response to a signal transmitted from another operation system 10 or a signal transmitted from another processing device 21 operated by another operation system 10.
  • the signal may be transmitted at the time of operation by another operation system 10, or may be transmitted at a time other than the operation by another operation system 10.
  • the operation may be started when a specific screen is displayed on the display device 23.
  • the operation may be started when the user inputs a specific instruction to the arithmetic unit 12.
  • FIG. 5 is a flowchart showing a procedure for operation by the operation system according to the embodiment.
  • the user creates the first file (step S1).
  • the user creates reference data (step S2) and creates a second file (step S3).
  • the operation system 10 automatically operates the processing device 21 using the created first file, reference data, and second file (step S4).
  • the operation system 10 acquires an output signal from the processing device 21 and inputs an operation signal corresponding to the processing device 21. Therefore, the operation system 10 can be applied to the existing processing device 21. By applying the operation system 10 to the processing device 21 whose operation is not automated, the operation of the processing device 21 can be automated. For example, the operation of the processing device 21 can be automated without modifying the processing device 21 or rewriting the program.
  • when operating the processing device 21, the operation file is referred to.
  • the operation file includes the first file and the second file.
  • the first file describes the generation operation for generating the operation instruction.
  • the second file is created separately from the first file and is referred to when the generation operation based on the description of the first file is executed.
  • a plurality of processing devices 21 of the same type may be used.
  • the plurality of processing devices 21 can be automatically operated by the plurality of operation systems 10.
  • a common operation file and common reference data may be used.
  • the operation file and reference data are duplicated and used in the operation by each operation system 10.
  • the displayed screens may differ slightly even between processing devices 21 of the same type.
  • the positions of the same icons on the same screen may be slightly different between the processing devices 21.
  • in that case, the desired icon may not be selected.
  • the appearance of the template image may differ due to the influence of noise, deterioration of the video signal output unit of the processing device 21 over time, and the like. As a result, the accuracy of image recognition may decrease.
  • the information used when generating the operation instruction is described in the second file, which is separate from the first file.
  • when the position of an icon deviates, the description of the coordinates may be changed so as to correct the deviation.
  • when the appearance differs, a template image having the different appearance may be prepared, and the description of the second file may be changed so as to refer to that image data.
  • as a result, machine differences can be easily corrected, and the burden on the user required for preparing the operation system 10 can be reduced.
  • a processing system 1 including an operation system 10 and a processing device 21 that can reduce the burden on the user is provided.
  • the second file is, for example, a text file. Therefore, even a user who has no knowledge about programming can describe the second file in a text format and can easily modify the second file.
  • the user may try the operation in order to confirm whether there is a problem in the operation by the operation system 10 before the operation of the production line. For example, the user confirms whether the icon is properly selected, whether the image is recognized normally, and the like in the operation attempt. When a malfunction in the operation is confirmed, the user appropriately corrects the description in the second file.
  • the calculation unit 12 may automatically correct the coordinates described in the second file. For example, in an operation trial, the arithmetic unit 12 generates a selection operation command for selecting the coordinates described in the second file and transmits it to the generation unit 13. The generation unit 13 generates an operation signal based on the selection operation command and transmits it to the processing device 21. The calculation unit 12 recognizes the image acquired by the acquisition unit 11 after the operation signal is transmitted, and determines whether or not the template image prepared in advance is displayed. When the template image is not displayed, the arithmetic unit 12 generates another selection operation command for selecting another coordinate different from the coordinates described in the second file. For example, the other coordinates are randomly determined from the vicinity of the coordinates described in the second file.
  • the arithmetic unit 12 transmits another selection operation command to the generation unit 13.
  • the generation unit 13 generates another operation signal based on another selection operation command and transmits it to the processing device 21.
  • the generation of another selection operation instruction is repeated, for example, until a prepared template image is displayed or a preset number of times is reached.
  • the arithmetic unit 12 generates another selection operation command so as to select another coordinate different from the previous coordinates.
  • FIG. 6 is a flowchart showing a part of an operation trial by the operation system according to the embodiment.
  • an operation using the first file, the second file, and the reference data is tried after steps S1 to S3 and before step S4.
  • the arithmetic unit 12 generates a selection operation instruction for selecting a part of the screen (step S11).
  • the generation unit 13 generates an operation signal based on the selection operation command and transmits it to the processing device 21.
  • the calculation unit 12 determines whether the operation to the processing device 21 by the operation signal is completed (step S12).
  • the calculation unit 12 determines whether the number of trials of the selection operation is less than the specified number (step S13). When the number of trials is less than the specified number of times, the arithmetic unit 12 generates another selection operation instruction for selecting another coordinate (step S14). The generation unit 13 generates an operation signal based on the selection operation command and transmits it to the processing device 21. When the number of trials of the selection operation reaches the specified number of times, the calculation unit 12 notifies the user (step S15). For example, the display device 16 displays information informing that the operation cannot be properly executed.
  • when it is determined in step S12 that the operation corresponding to the selection operation command is completed, the arithmetic unit 12 modifies the coordinates described in the second file to the coordinates used for the selection operation command with which the selection operation could be executed (step S16). If the selection operation can be executed with the original coordinates described in the second file, the modification of the second file is not executed.
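  • A sketch of the trial loop in steps S11 to S16: it retries random coordinates in the vicinity of the described coordinates until the operation is confirmed or a specified count is reached. The helper functions, the default count, and the retry radius are assumptions.

```python
# Hypothetical coordinate-correction trial (steps S11 to S16 above).
import random

def trial_select(x0, y0, send_click, operation_completed, max_tries=10, radius=5):
    x, y = x0, y0                          # start from the coordinates in the second file
    for _ in range(max_tries):
        send_click(x, y)                   # operation signal via the generation unit (S11)
        if operation_completed():          # e.g. the prepared template image is displayed (S12)
            return (x, y)                  # coordinates to write back if they differ (S16)
        # choose another coordinate in the vicinity of the described coordinates (S14)
        x = x0 + random.randint(-radius, radius)
        y = y0 + random.randint(-radius, radius)
    return None                            # specified count reached: notify the user (S15)
```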
  • the response by the processing device 21 may be delayed.
  • in that case, the screen has not yet changed at the time of the determination because of the delay, and may change normally after the determination. Therefore, for a processing device 21 in which the display of the screen on the display device 23 may be delayed, the interval for transmitting the operation signal is set longer, or the operation signal based on the selection operation command is transmitted a plurality of times. For example, in the flowchart shown in FIG. 6, when the operation is not completed after a certain operation signal is input to the processing device 21, the same operation signal is input to the processing device 21 again from the generation unit 13.
  • FIG. 7 is a schematic diagram showing an operation system according to the first modification.
  • An input display device having the functions of both an input device and a display device may be connected to the processing device 21.
  • a touch panel 24 is provided as an input display device.
  • the operation system 10a according to the first modification further includes a switching unit 17.
  • the processing system 1a includes an operation system 10a and a processing device 21.
  • the generation unit 13 generates an operation signal corresponding to the signal transmitted from the touch panel 24.
  • the switching unit 17 switches the signal input to the processing device 21. Specifically, a signal transmitted from the touch panel 24 and an operation signal transmitted from the generation unit 13 are input to the switching unit 17.
  • the switching unit 17 switches between a state in which the signal transmitted from the touch panel 24 is input to the processing device 21 and a state in which the operation signal transmitted from the generation unit 13 is input to the processing device 21. During the operation by the operation system 10a, the switching unit 17 inputs the operation signal to the processing device 21.
  • the switching unit 17 can prevent the user from operating the processing device 21 by the touch panel 24 during the operation by the operation system 10a. For example, it is possible to avoid the transition to another screen by the user's operation and the interruption of the operation by the operation system 10a.
  • the switching unit 17 includes, for example, a relay circuit or a switching circuit. By providing the switching unit 17, the processing device 21 in which the touch panel 24 is used can be operated by the operation system 10a.
  • in a touch panel, the coordinate deviation of the generated operation signal tends to be large. This is because the positional relationship between the coordinates of the touch-detecting sensor and the display is adjusted individually. Therefore, even when a plurality of touch panels 24 of the same type are used for a plurality of processing devices 21 of the same type, the coordinates selected or referred to at the time of automatic operation must be corrected for each processing device 21.
  • the invention according to the embodiment is particularly suitable for the processing device 21 in which the touch panel 24 is used. According to the first modification, the burden on the user on the processing system 1a including the touch panel 24 can be reduced.
  • FIG. 8 is a schematic diagram showing an operation system according to the second modification.
  • the operation system 10b according to the second modification further includes a switching unit 17 as compared with the operation system 10.
  • the signal transmitted from the generation unit 13 or the input device 22 to the processing device 21 is input to the processing device 21 via the switching unit 17.
  • the switching unit 17 switches between a state in which the signal transmitted from the input device 22 is input to the processing device 21 and a state in which the operation signal transmitted from the generation unit 13 is input to the processing device 21.
  • the switching unit 17 inputs an operation signal to the processing device 21. This makes it possible to prevent the user from operating the processing device 21 using the input device 22 during the operation by the operation system 10b. According to the second modification, the convenience of the processing system 1b including the operation system 10b and the processing device 21 can be improved.
  • FIG. 9 is a schematic diagram showing an operation system according to the third modification.
  • the operation system 10c according to the third modification further includes reading devices 18a and 18b as compared with the operation system 10.
  • the reading devices 18a and 18b read the identification information.
  • As the identification information, a one-dimensional bar code, a two-dimensional bar code, or a Radio Frequency IDentifier (RFID) can be used.
  • the readers 18a and 18b are barcode readers or RFID readers.
  • the reading device 18a reads the identification information of the operator in charge of the processing device 21.
  • the reading device 18b reads the identification information of the work processed by the processing device 21.
  • when executing the process by the processing device 21, the user (operator) places the work in a place where the reading device 18b can read the identification information of the work. After that, the user causes the reading device 18a to read his or her own identification information.
  • the calculation unit 12 starts the operation of the processing device 21 in response to the reading of the identification information by the reading device 18a. For example, the arithmetic unit 12 causes the reading device 18b to read the identification information of the work, and then starts generating an operation command.
  • the operation system 10c may generate an operation signal according to the read identification information and input it to the processing device 21.
  • the identification information may be directly input from the arithmetic unit 12 to the processing device 21 via a communication port or the like by serial communication or the like.
  • the operation system 10c may include other devices related to the processing executed by the processing device 21.
  • the operation of the processing device 21 may be started according to the operation of the other device. This eliminates the need for the user to operate the input device 15 or 22 to initiate the operation.
  • the convenience of the operation system 10c can be improved.
  • the convenience of the processing system 1c including the operation system 10c and the processing device 21 can be improved.
  • FIG. 10 is a schematic diagram showing an operation system according to the fourth modification.
  • the operation system 10d according to the fourth modification is connected to the management device 30 (second device).
  • the arithmetic unit 12 is connected to the management device 30 by network, wired communication, or wireless communication.
  • the management device 30 is a higher-level device that transmits a command to the processing device 21.
  • the operation system 10d receives the command.
  • the operation system 10d executes an operation in response to an instruction.
  • the arithmetic unit 12 generates an operation instruction according to the instruction and transmits it to the generation unit 13.
  • a plurality of operation files are stored in the storage device 14.
  • the arithmetic unit 12 selects one of the plurality of operation files according to the instruction transmitted from the management device 30.
  • the arithmetic unit 12 executes the program included in the selected operation file and generates an operation instruction.
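  • A minimal sketch of the dispatch described above: the command received from the management device 30 selects one operation file pair, whose program is then executed. The command names, the file names, and the run_operation helper are assumptions.

```python
# Hypothetical mapping from a management-device command to an operation file pair.
OPERATION_FILES = {
    "START_LOT": ("start_lot_first.py", "start_lot_second.txt"),
    "UNLOAD":    ("unload_first.py", "unload_second.txt"),
}

def handle_command(command, run_operation):
    try:
        first_file, second_file = OPERATION_FILES[command]
    except KeyError:
        raise ValueError(f"no operation file registered for command {command!r}")
    return run_operation(first_file, second_file)   # execute the selected first file
```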
  • the instruction transmitted from the management device 30 may be used as a trigger for starting the operation.
  • the management device 30 may be connected to a plurality of operation systems 10d.
  • the operation by the operation system 10d may be started in response to a signal transmitted from another operation system 10d or a signal transmitted from another processing device 21 operated by another operation system 10d.
  • the signal may be transmitted at the time of operation by another operation system 10d, or may be transmitted at a time other than the operation by another operation system 10d.
  • the operation system 10d may input parameters related to the processing executed by the processing device 21 to the processing device 21.
  • the parameters are the number of workpieces to be processed, conditions at the time of processing execution, and the like.
  • the conditions are, for example, temperature, pressure, gas or liquid flow rate, current, voltage, processing time and the like.
  • the operation system 10d generates an operation signal for inputting a numerical value of a parameter and transmits it to the processing device 21. As a result, a numerical value is input to the text box or the like displayed on the display device 23.
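  • A sketch of entering a numerical parameter: the text box is selected by the coordinates described in the second file, and the digits are then sent as keyboard-equivalent operation signals. The key naming and the helper functions are assumptions.

```python
# Hypothetical parameter entry via the generation unit.
def input_parameter(second, name, value, send_click, send_keys):
    x, y = (int(v) for v in second[name + "_textbox"].split(","))
    send_click(x, y)          # place the cursor in the text box shown on the display device
    send_keys(str(value))     # type the numerical value received from the management device
```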
  • the management device 30 may transmit parameters input from the operation system 10d to the processing device 21 to the operation system 10d.
  • the operation system 10d receives a numerical value input as a parameter and generates an operation signal according to the numerical value. As a result, the numerical value transmitted from the management device 30 is input to the processing device 21.
  • the management device 30 may transmit information about the workpiece to be processed to the operation system 10d.
  • the management device 30 transmits the work identification information (ID) to the operation system 10d.
  • the operation system 10d inputs the identification information to the processing device 21 via the communication port.
  • the operation system 10d may generate an operation signal according to the identification information and input it to the processing device 21.
  • a higher-level device may be installed to manage a plurality of processing devices.
  • in some cases, a command cannot be directly transmitted from the management device 30 to the processing device 21 because of incompatibility of the communication interfaces between the processing device 21 and the management device 30, differences in communication standards, and the like.
  • the processing device 21 may not accept the command transmitted from the management device 30.
  • by using the operation system 10d, the processing device 21 can be managed by the management device 30.
  • for management by the management device 30, it is not necessary to modify the processing device 21, and the production efficiency can be improved more easily.
  • the management device 30 can easily acquire information on the operating rate or the operating status of the single or a plurality of processing systems 1d.
  • FIG. 11 is a schematic diagram showing the hardware configuration of the operation system according to the embodiment.
  • the arithmetic unit 12 is a computer and includes a ROM 12a, a RAM 12b, a CPU 12c, and an HDD 12d.
  • the ROM 12a stores a program that controls the operation of the computer.
  • the ROM 12a stores programs necessary for realizing various processes in a computer.
  • the RAM 12b functions as a storage area in which the program stored in the ROM 12a is expanded.
  • the CPU 12c includes a processing circuit.
  • the CPU 12c reads the control program stored in the ROM 12a and controls the operation of the computer according to the control program.
  • the CPU 12c expands various data obtained by the operation of the computer into the RAM 12b.
  • the HDD 12d stores data necessary for processing and data obtained in the course of processing.
  • the HDD 12d may function as the storage device 14 shown in FIG.
  • Convenience can be improved by using the operation system, processing system, or operation method described above.
  • the convenience of the operation system can be improved by using a program for causing the computer to execute the above-mentioned processing or a storage medium in which the program is stored.
  • a program for causing a computer to execute the above-mentioned various data processing may be recorded on a non-transitory computer-readable storage medium such as a magnetic disk (flexible disk, hard disk, etc.), an optical disk (CD-ROM, CD-R, CD-RW, DVD-ROM, DVD±R, DVD±RW, etc.), or a semiconductor memory.
  • the data recorded on the recording medium can be read by a computer (or embedded system).
  • the computer reads a program from the recording medium and causes the CPU to execute the instructions described in the program based on the program.
  • the acquisition (or reading) of the program may be performed through the network.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Automation & Control Theory (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

An operating system according to an embodiment of the present invention is provided with an acquiring unit, a computing unit, and a generating unit. The acquiring unit acquires an output from a first device which is to be operated. The computing unit generates an operation instruction on the basis of the output acquisition result and an action file created in advance. The generating unit generates an operation signal corresponding to the first device on the basis of the operation instruction, and transmits the same to the first device. The action file includes a first file in which a generation action for generating the operation instruction is described, and a second file which is referred to when the generation action is executed, and which includes information used to identify the output and to generate the operation signal.

Description

Operation system, processing system, operation method, program, and storage medium
Embodiments of the present invention relate to an operation system, a processing system, an operation method, a program, and a storage medium.
There is a system that automatically operates a device. Such a system is required to reduce the burden on the user.
Japanese Unexamined Patent Publication No. 2004-110489
An object to be solved by the present invention is to provide an operation system, a processing system, an operation method, a program, and a storage medium that can reduce the burden on the user.
The operation system according to the embodiment includes an acquisition unit, a calculation unit, and a generation unit. The acquisition unit acquires the output from the first device to be operated. The calculation unit generates an operation instruction based on the acquisition result of the output and an operation file created in advance. The generation unit generates an operation signal corresponding to the first device based on the operation instruction and transmits it to the first device. The operation file includes a first file in which a generation operation for generating the operation instruction is described, and a second file that is referred to when the generation operation is executed and that contains information used for identification of the output and generation of the operation signal.
FIG. 1 is a schematic diagram showing an operation system according to an embodiment.
FIG. 2 is a schematic diagram illustrating a screen of a display device.
FIG. 3 is a schematic diagram illustrating a screen of a display device.
FIG. 4 is a schematic diagram illustrating a part of the first file and a part of the second file.
FIG. 5 is a flowchart showing a procedure for operation by the operation system according to the embodiment.
FIG. 6 is a flowchart showing a part of an operation trial by the operation system according to the embodiment.
FIG. 7 is a schematic diagram showing an operation system according to a first modification.
FIG. 8 is a schematic diagram showing an operation system according to a second modification.
FIG. 9 is a schematic diagram showing an operation system according to a third modification.
FIG. 10 is a schematic diagram showing an operation system according to a fourth modification.
FIG. 11 is a schematic diagram showing the hardware configuration of the operation system according to the embodiment.
Hereinafter, each embodiment of the present invention will be described with reference to the drawings.
In the present specification and each figure, the same elements as those already described are designated by the same reference numerals, and detailed description thereof will be omitted as appropriate.
FIG. 1 is a schematic diagram showing an operation system according to an embodiment.
As shown in FIG. 1, the operation system 10 according to the embodiment includes an acquisition unit 11, a calculation unit 12, a generation unit 13, and a storage device 14. The operation system 10 is used to automatically operate the processing device 21 (first device).
The processing device 21 processes the work. The processing is, for example, at least one selected from transport, machining, washing, heating, cooling, and drying. Machining is, for example, at least one selected from film formation, etching, polishing, lithography, and joining.
The processing device 21 includes a control unit 21a. Each component of the processing device 21 operates based on the instruction transmitted from the control unit 21a to process the work. The control unit 21a includes, for example, a central processing unit (CPU), a read-only memory (ROM), a random access memory (RAM), a storage device, an input interface, an output interface, a communication interface, and a bus connecting them. The control unit 21a may be a dedicated computer for the processing device 21 or a general-purpose computer. The function of the control unit 21a may be realized by the collaboration of a plurality of computers.
The input device 22 is used by the user to input information to the processing device 21. The user is, for example, an operator in charge of the processing executed by the processing device 21. The input device 22 includes, for example, at least one selected from a keyboard, a mouse, a touchpad, and a microphone (voice input).
The display device 23 displays the signal output from the processing device 21 so that the user can visually recognize it. The display device 23 includes a monitor.
The acquisition unit 11 acquires the output from the processing device 21, which is the operation target, to the display device 23.
For example, the output includes a video signal transmitted from the processing device 21 to the display device 23. The acquisition unit 11 receives the video signal. The acquisition unit 11 acquires the video signal and transmits the video signal to the display device 23. The acquisition unit 11 includes, for example, a video signal distributor and a capture unit. The acquisition unit 11 acquires the video signal as a moving image or a still image and transmits it to the calculation unit 12.
Alternatively, the output may be light emitted from the processing device 21. For example, the acquisition unit 11 includes an image pickup device. The acquisition unit 11 takes a picture of the light-emitting device that emits the light. The acquisition unit 11 acquires a moving image or a still image and transmits it to the calculation unit 12.
Alternatively, the output may be a sound emitted from the processing device 21. For example, the acquisition unit 11 includes a microphone. The acquisition unit 11 acquires the sound emitted from the processing device 21 as an electric signal and transmits it to the calculation unit 12.
The calculation unit 12 recognizes the acquisition result obtained by the acquisition unit 11. For example, the calculation unit 12 recognizes the image acquired by the acquisition unit 11. In image recognition, processing such as pattern matching, character recognition, and color matching is executed. Alternatively, the calculation unit 12 recognizes the sound acquired by the acquisition unit 11. In sound recognition, processing such as pattern matching is executed.
The calculation unit 12 refers to the operation file created in advance. The operation file includes the first file and the second file. The first file describes the generation operation for generating the operation instruction. For example, the first file is a program file in which an operation sequence for generating an operation instruction and transmitting it to the generation unit 13 is described. The second file is referred to when the generation operation is executed. The second file contains the information used in the identification of the output signal and the generation of the operation signal described later. For example, the second file contains coordinates selected by the input device 22, information on the data referred to during image recognition, and the like.
The calculation unit 12 executes the generation operation of the first file based on the acquisition result obtained by the acquisition unit 11 while referring to the information included in the second file. As a result, an operation command for operating the processing device 21 is generated. The calculation unit 12 transmits the operation command to the generation unit 13.
The calculation unit 12 includes, for example, a CPU, a ROM, a RAM, a storage device, an input interface, an output interface, a communication interface, and a bus connecting these. The calculation unit 12 is a dedicated or general-purpose computer. The function of the calculation unit 12 may be realized by the collaboration of a plurality of computers.
The generation unit 13 generates an operation signal corresponding to the processing device 21 based on the operation command. The generation unit 13 inputs the generated operation signal to the processing device 21 (control unit 21a). For example, when the input device 22 includes a mouse or a keyboard, the operation signal corresponds to a signal transmitted from the mouse or the keyboard. When the processing device 21 receives the operation signal, it moves the pointer, inputs characters, and so on according to the operation signal. The generation unit 13 includes, for example, a microcomputer programmed to generate an operation signal corresponding to a signal transmitted from the input device 22.
The storage device 14 stores various data necessary for operating the processing device 21. For example, the storage device 14 stores the operation file. The storage device 14 stores reference data necessary for image recognition. The storage device 14 includes a hard disk drive (HDD), a solid state drive (SSD), network-attached storage (NAS), and the like.
The operation file and the reference data necessary for recognizing the acquisition result are prepared in advance by the user using the input device 15 and the display device 16. The input device 15 is used by the user to input information to the calculation unit 12. The input device 15 includes, for example, at least one selected from a keyboard, a mouse, a touchpad, and a microphone. The display device 16 displays the signal output from the calculation unit 12 so that the user can visually recognize it. The display device 16 includes a monitor.
Here, the case where the acquisition unit 11 acquires the video signal output from the processing device 21 will be mainly described.
In image recognition, at least one selected from template matching, character recognition, and color recognition is executed. The user prepares the reference data necessary for these processes in advance and stores it in the storage device 14.
In the preparation of a template image, the acquisition unit 11 acquires the video signal from the processing device 21 to the display device 23 at the time when image recognition is to be executed. The calculation unit 12 causes the display device 16 to display an image based on the video signal. The user uses the input device 15 to select a portion of the displayed image. The selected image is stored in the storage device 14 as a template image. The user describes the file name of the template image to be referred to in the second file described later. For example, template images are prepared for all the screens on which automatic operation by the operation system 10 is executed.
For example, when the processing device 21 is operated, the screen changes. For example, the displayed screen transitions to another screen, or a window is displayed. A partial image that is not included in the screen before the change but is included in the screen after the change is used as the template image.
In the preparation of a character recognition range, the acquisition unit 11 acquires the video signal from the processing device 21 to the display device 23 at the time when characters need to be recognized. The calculation unit 12 causes the display device 16 to display an image based on the video signal. The user uses the input device 15 to confirm the range in which character recognition is to be executed in the displayed image. The user describes the range in the second file described later.
In preparation for color recognition, the acquisition unit 11 acquires the video signal from the processing device 21 to the display device 23 at the time when colors need to be recognized. The calculation unit 12 causes the display device 16 to display an image based on the video signal. The user uses the input device 15 to select the range in which color recognition is to be executed in the displayed image. The image in the selected range is stored in the storage device 14. The user describes the range in the second file described later.
In image recognition using a template image, the calculation unit 12 determines whether an image similar to the template image is included in the image based on the video signal. In character recognition, the calculation unit 12 cuts out the image in the range stored in advance. The calculation unit 12 executes character recognition processing such as Optical Character Recognition (OCR) on the image. In color recognition, the calculation unit 12 cuts out the image in the range stored in advance. The calculation unit 12 determines whether the colors of the respective parts match between the image stored in advance and the cut-out image. By recognizing images through these processes, the calculation unit 12 determines whether the operation on the processing device 21 by the operation signal has been completed.
 動作ファイルの準備では、第1ファイル及び第2ファイルの作成が行われる。第1ファイルは、例えば、C言語などの汎用のプログラミング言語で記述される。第2ファイルは、ポインティングデバイスの選択座標、参照されるテンプレート画像、入力される文字などがテキスト形式で記述される。 In the preparation of the operation file, the first file and the second file are created. The first file is described in a general-purpose programming language such as C language. In the second file, the selected coordinates of the pointing device, the template image to be referenced, the characters to be input, and the like are described in a text format.
 マウスなどのポインティングデバイスに関して、ユーザは、選択(クリック)する座標を第2ファイルに記述し、記憶装置14に記憶する。例えば、取得部11は、ポインティングデバイスを操作するときの処理装置21から表示装置23への映像信号を取得する。演算部12は、映像信号に基づく画像を、表示装置16に表示させる。ユーザは、表示された画像において選択する部分にポインタを合わせ、その座標を確認する。ユーザは、座標を第2ファイルに記述する。 Regarding a pointing device such as a mouse, the user describes the coordinates to be selected (clicked) in the second file and stores them in the storage device 14. For example, the acquisition unit 11 acquires a video signal from the processing device 21 to the display device 23 when operating the pointing device. The calculation unit 12 causes the display device 16 to display an image based on the video signal. The user points the pointer at the selected part of the displayed image and confirms the coordinates. The user describes the coordinates in the second file.
 キーボードに関して、取得部11は、キーボードにより文字を入力するときの処理装置21から表示装置23への映像信号を取得する。演算部12は、映像信号に基づく画像を、表示装置16に表示させる。ユーザは、表示された画像において、文字を入力するテキストボックスにポインタを合わせ、その座標を確認する。ユーザは、座標と、そのテキストボックスに入力する1つ以上の文字と、を第2ファイルに記述する。 Regarding the keyboard, the acquisition unit 11 acquires the video signal from the processing device 21 to the display device 23 when inputting characters with the keyboard. The calculation unit 12 causes the display device 16 to display an image based on the video signal. In the displayed image, the user points the pointer at the text box for inputting characters and confirms the coordinates. The user describes the coordinates and one or more characters to be entered in the text box in the second file.
 処理装置21に操作信号が送られた後、演算部12は、画像認識を行うことにより、操作信号に応じて処理装置21が操作されたか判定する。操作されていないと判定される場合、演算部12は、表示装置16又は他の端末装置に、操作できない旨の通知を送信しても良い。画像認識に使用されるテンプレート画像のファイル名、文字認識に使用される範囲、色認識に使用される画像及び範囲などは、上述したように第2ファイルに記述される。 After the operation signal is sent to the processing device 21, the calculation unit 12 determines whether the processing device 21 has been operated according to the operation signal by performing image recognition. If it is determined that the operation is not performed, the calculation unit 12 may send a notification to the display device 16 or another terminal device that the operation cannot be performed. The file name of the template image used for image recognition, the range used for character recognition, the image and range used for color recognition, and the like are described in the second file as described above.
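 The embodiment does not fix a concrete syntax for the second file beyond stating that it is written in text format. Purely as an illustration, the sketch below assumes a simple key = value layout and shows how the coordinates, template file names, input characters, and recognition ranges could be read from such a file; every key name and value here is hypothetical.

```python
SECOND_FILE_EXAMPLE = """\
# hypothetical second file: the layout and key names are illustrative only
click_icon_B3     = 912,385
template_A1_to_A2 = screen_A2_parts.png
textbox_lot_x_y   = 640,220
textbox_lot_text  = LOT-0001
ocr_range_C2      = 100,200,300,40
"""


def load_second_file(text):
    """Parse key = value lines into a dictionary that the generation logic can look up."""
    entries = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        entries[key.strip()] = value.strip()
    return entries


params = load_second_file(SECOND_FILE_EXAMPLE)
x, y = map(int, params["click_icon_B3"].split(","))  # coordinates for the selection operation
```

 A parsed dictionary like params can then be referenced each time the generation operations described in the first file are executed.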
 具体例を参照しながら、操作システム10について説明する。
 図2及び図3は、表示装置の画面を例示する模式図である。
 図4(a)は、第1ファイルの一部を例示する模式図である。図4(b)は、第2ファイルの一部を例示する模式図である。
The operation system 10 will be described with reference to a specific example.
2 and 3 are schematic views illustrating the screen of the display device.
FIG. 4A is a schematic diagram illustrating a part of the first file. FIG. 4B is a schematic diagram illustrating a part of the second file.
 処理装置21による処理過程の一部において、処理装置21から出力された映像信号に基づき、図2及び図3に表した画面が、表示装置23に表示される。図2(a)に表した画面A1には、アイコンB1~B7と、処理装置21により処理されているワークに関する情報C1と、が表示されている。図2(b)に表した画面A2には、アイコンB1~B4、アイコンB11~B14、処理装置21の模式図Dと、が表示されている。図3(a)に表した画面A3には、アイコンB1~B4、アイコンB21~B24、模式図Dと、が表示されている。図3(b)に表した画面A4には、アイコンB1~B4、アイコンB21~B24、処理装置21により処理条件に関する情報C2、が表示されている。 In a part of the processing process by the processing device 21, the screens shown in FIGS. 2 and 3 are displayed on the display device 23 based on the video signal output from the processing device 21. On the screen A1 shown in FIG. 2A, icons B1 to B7 and information C1 regarding the work being processed by the processing device 21 are displayed. On the screen A2 shown in FIG. 2B, icons B1 to B4, icons B11 to B14, and a schematic diagram D of the processing device 21 are displayed. On the screen A3 shown in FIG. 3A, icons B1 to B4, icons B21 to B24, and the schematic diagram D are displayed. On the screen A4 shown in FIG. 3B, icons B1 to B4, icons B21 to B24, and information C2 regarding the processing conditions of the processing device 21 are displayed.
 例えば、操作システム10を用いない場合、ユーザは、図2(a)に表した画面A1において、アイコンB3にポインタを移動させ、アイコンB3を選択する。表示装置23に、図2(b)に表した画面A2が表示される。ユーザは、画面A2において、アイコンB14にポインタを移動させ、アイコンB14を選択する。表示装置23に、図3(a)に表した画面A3が表示される。ユーザは、画面A3において、アイコンB21にポインタを移動させ、アイコンB21を選択する。表示装置23に、図3(b)に表した画面A4が表示される。 For example, when the operation system 10 is not used, the user moves the pointer to the icon B3 on the screen A1 shown in FIG. 2A and selects the icon B3. The screen A2 shown in FIG. 2B is displayed on the display device 23. The user moves the pointer to the icon B14 on the screen A2 and selects the icon B14. The screen A3 shown in FIG. 3A is displayed on the display device 23. The user moves the pointer to the icon B21 on the screen A3 and selects the icon B21. The screen A4 shown in FIG. 3B is displayed on the display device 23.
 操作システム10により処理装置21を操作する場合、演算部12が、画面A1を表示装置16に表示させる。ユーザは、画面A1において、アイコンB3の座標を確認し、第2ファイルに記述する。次に、演算部12が、画面A2を表示装置16に表示させる。ユーザは、画面A1から画面A2への遷移を判定するためのテンプレート画像を、画面A2から切り出す。例えば、模式図D及びアイコンB11~B14のいずれかが、テンプレート画像として使用される。ユーザは、切り出したテンプレート画像を、画像データとして記憶装置14に記憶する。ユーザは、画面A1から画面A2への遷移の判定時に参照されるテンプレート画像のファイル名を、第2ファイルに記述する。 When the processing device 21 is operated by the operation system 10, the calculation unit 12 causes the screen A1 to be displayed on the display device 16. The user confirms the coordinates of the icon B3 on the screen A1 and describes them in the second file. Next, the calculation unit 12 causes the display device 16 to display the screen A2. The user cuts out a template image for determining the transition from the screen A1 to the screen A2 from the screen A2. For example, any of the schematic diagram D and the icons B11 to B14 is used as a template image. The user stores the cut out template image as image data in the storage device 14. The user describes the file name of the template image referred to at the time of determining the transition from the screen A1 to the screen A2 in the second file.
 以降は、同様に、テンプレート画像の記憶、第2ファイルへのファイル名の記述が繰り返される。例えば、画面A3が表示装置16に表示されると、ユーザは、画面の遷移を判定するためのテンプレート画像を画面A3から切り出し、画像データとして記憶装置14に記憶する。ユーザは、そのテンプレート画像のファイル名を、第2ファイルに記述する。 After that, the storage of the template image and the description of the file name in the second file are repeated in the same manner. For example, when the screen A3 is displayed on the display device 16, the user cuts out a template image for determining the transition of the screen from the screen A3 and stores it in the storage device 14 as image data. The user describes the file name of the template image in the second file.
 例えば図4(a)及び図4(b)に表したように、操作システム10による操作のために、第1ファイル及び第2ファイルが準備される。第1ファイルには、上述した、特定のアイコンを選択するための操作命令の生成、テンプレート画像の参照など、演算部12により実行される動作が記述される。第2ファイルには、選択される座標、参照されるテンプレート画像のファイル名などが記述される。 For example, as shown in FIGS. 4 (a) and 4 (b), the first file and the second file are prepared for the operation by the operation system 10. In the first file, the operation executed by the calculation unit 12, such as the generation of the operation instruction for selecting a specific icon and the reference of the template image described above, is described. In the second file, the selected coordinates, the file name of the template image to be referenced, and the like are described.
 操作システム10による操作の開始のトリガは、任意である。例えば、処理装置21に入力された信号や、別のセンサや計測器等から送られた信号などに基づいて操作が開始されても良い。別の操作システム10から送信された信号、又は別の操作システム10によって操作される別の処理装置21から送信された信号に応じて、操作が開始されても良い。信号は、別の操作システム10による操作時に送信されても良いし、別の操作システム10による操作時以外で送信されても良い。表示装置23に特定の画面が表示されたときに、操作が開始されても良い。ユーザが演算部12に特定の命令を入力したときに、操作が開始されても良い。 The trigger for starting the operation by the operation system 10 is arbitrary. For example, the operation may be started based on a signal input to the processing device 21, a signal sent from another sensor, a measuring instrument, or the like. The operation may be initiated in response to a signal transmitted from another operating system 10 or a signal transmitted from another processing device 21 operated by another operating system 10. The signal may be transmitted at the time of operation by another operation system 10, or may be transmitted at a time other than the operation by another operation system 10. The operation may be started when a specific screen is displayed on the display device 23. The operation may be started when the user inputs a specific instruction to the arithmetic unit 12.
 図5は、実施形態に係る操作システムによる操作のための手順を表すフローチャートである。
 ユーザは、第1ファイルを作成する(ステップS1)。次に、ユーザは、参照データを作成し(ステップS2)、第2ファイルを作成する(ステップS3)。操作システム10は、作成された、第1ファイル、参照データ、及び第2ファイルを用いて、処理装置21を自動的に操作する(ステップS4)。
FIG. 5 is a flowchart showing a procedure for operation by the operation system according to the embodiment.
The user creates the first file (step S1). Next, the user creates reference data (step S2) and creates a second file (step S3). The operation system 10 automatically operates the processing device 21 using the created first file, reference data, and second file (step S4).
 実施形態の効果を説明する。
 操作システム10は、処理装置21からの出力信号を取得するとともに、処理装置21に対応した操作信号を入力する。このため、操作システム10は、既存の処理装置21に適用できる。操作が自動化されていない処理装置21に操作システム10を適用することで、処理装置21の操作を自動化できる。例えば、処理装置21の改造やプログラムの書き換え等を行わずに、処理装置21の操作を自動化できる。
The effect of the embodiment will be described.
The operation system 10 acquires an output signal from the processing device 21 and inputs an operation signal corresponding to the processing device 21. Therefore, the operation system 10 can be applied to the existing processing device 21. By applying the operation system 10 to the processing device 21 whose operation is not automated, the operation of the processing device 21 can be automated. For example, the operation of the processing device 21 can be automated without modifying the processing device 21 or rewriting the program.
 操作システム10では、処理装置21を操作する際に、動作ファイルが参照される。動作ファイルは、第1ファイル及び第2ファイルを含む。第1ファイルは、操作命令を生成するための生成動作が記述されている。第2ファイルは、第1ファイルとは別に作成され、第1ファイルの記述に基づく生成動作の実行時に参照される。 In the operation system 10, when operating the processing device 21, the operation file is referred to. The operation file includes the first file and the second file. The first file describes the generation operation for generating the operation instruction. The second file is created separately from the first file and is referred to when the generation operation based on the description of the first file is executed.
 製造現場では、同じ種類の複数の処理装置21が使用されることがある。複数の処理装置21は、複数の操作システム10により、それぞれ自動で操作できる。この場合、共通の動作ファイル及び共通の参照データが使用されても良い。例えば、1つの処理装置21を操作するための動作ファイル及び参照データが用意されると、その動作ファイル及び参照データが複製され、それぞれの操作システム10による操作において使用される。 At the manufacturing site, a plurality of processing devices 21 of the same type may be used. The plurality of processing devices 21 can be automatically operated by the plurality of operation systems 10. In this case, a common operation file and a common reference data may be used. For example, when an operation file and reference data for operating one processing device 21 are prepared, the operation file and reference data are duplicated and used in the operation by each operation system 10.
 同じ種類の処理装置21同士の間でも、表示される画面が僅かに異なることがある。例えば、同じ画面における同じアイコンの位置が、処理装置21同士の間で僅かに異なることがある。この結果、第2ファイルに記述された座標をポインティングデバイスで選択したときに、所望のアイコンを選択できないことがある。同じ種類の処理装置21同士の間でも、ノイズの影響、処理装置21における映像信号の出力部の経年劣化などにより、テンプレート画像の見え方が異なることもある。この結果、画像認識の精度が低下することがある。 The displayed screens may differ slightly even between processing devices 21 of the same type. For example, the positions of the same icons on the same screen may be slightly different between the processing devices 21. As a result, when the coordinates described in the second file are selected by the pointing device, the desired icon may not be selected. Even among processing devices 21 of the same type, the appearance of the template image may differ due to the influence of noise, deterioration of the video signal output unit of the processing device 21 over time, and the like. As a result, the accuracy of image recognition may decrease.
 従来の操作システムでは、このような機差が存在する場合、その機差を補正した新たなプログラムファイルを作成する必要があった。すなわち、演算部12により実行される基本的な動作や、使用される参照データ、入力される文字などが、処理装置21同士の間でほぼ同じにも拘わらず、操作命令を生成するためのプログラムファイルを作り直す必要があった。 In the conventional operation system, when such a machine difference exists, it was necessary to create a new program file in which the machine difference was corrected. That is, the program file for generating the operation commands had to be recreated even though the basic operations executed by the calculation unit 12, the reference data used, the characters to be input, and the like are almost the same between the processing devices 21.
 実施形態に係る操作システム10では、操作命令の生成時に使用される情報が、第1ファイルとは別の第2ファイルに記述される。例えば、アイコンの座標のずれが存在する場合には、そのずれを補正するように、座標の記述を変更すれば良い。テンプレート画像の見え方が異なる場合には、見え方の異なるテンプレート画像を用意し、その画像データを参照するように第2ファイルの記述を変更すれば良い。実施形態によれば、機差を容易に補正でき、操作システム10の準備に必要なユーザの負担を低減できる。 In the operation system 10 according to the embodiment, the information used when generating the operation instruction is described in a second file separate from the first file. For example, if there is a deviation in the coordinates of the icon, the description of the coordinates may be changed so as to correct the deviation. When the appearance of the template image is different, the template image having a different appearance may be prepared, and the description of the second file may be changed so as to refer to the image data. According to the embodiment, the machine difference can be easily corrected, and the burden on the user required for preparing the operation system 10 can be reduced.
 実施形態によれば、ユーザによる負担を低減可能な、操作システム10及び処理装置21を含む処理システム1が提供される。 According to the embodiment, a processing system 1 including an operation system 10 and a processing device 21 that can reduce the burden on the user is provided.
 第2ファイルは、例えばテキストファイルである。このため、プログラミングに関する知識の無いユーザでも、テキスト形式で第2ファイルを記述でき、第2ファイルを容易に修正できる。 The second file is, for example, a text file. Therefore, even a user who has no knowledge about programming can describe the second file in a text format and can easily modify the second file.
 ユーザは、製造ラインの稼働前に、操作システム10による操作に不具合が存在しないか確かめるため、操作を試行しても良い。例えば、ユーザは、操作の試行において、アイコンが適切に選択されるか、画像が正常に認識されているか、などを確認する。操作の不具合が確認された場合には、ユーザは、第2ファイルの記述を適宜修正する。 The user may try the operation in order to confirm whether there is a problem in the operation by the operation system 10 before the operation of the production line. For example, the user confirms whether the icon is properly selected, whether the image is recognized normally, and the like in the operation attempt. When a malfunction in the operation is confirmed, the user appropriately corrects the description in the second file.
 演算部12は、第2ファイルに記述された座標を自動的に修正しても良い。例えば、操作の試行において、演算部12は、第2ファイルに記述された座標を選択するための選択操作命令を生成し、生成部13に送信する。生成部13は、選択操作命令に基づいて操作信号を生成し、処理装置21に送信する。演算部12は、操作信号が送信された後に取得部11によって取得された画像を認識し、予め用意されたテンプレート画像が表示されているか判定する。テンプレート画像が表示されない場合、演算部12は、第2ファイルに記述されていた座標とは異なる別の座標を選択するための別の選択操作命令を生成する。例えば、当該別の座標は、第2ファイルに記述されていた座標近傍からランダムに決定される。演算部12は、生成部13に別の選択操作命令を送信する。生成部13は、別の選択操作命令に基づいて別の操作信号を生成し、処理装置21に送信する。別の選択操作命令の生成は、例えば、予め用意されたテンプレート画像が表示されるか、又は予め設定された回数に達するまで繰り返される。演算部12は、それまでの座標とは異なる別の座標を選択するように、別の選択操作命令を生成する。 The calculation unit 12 may automatically correct the coordinates described in the second file. For example, in an operation trial, the arithmetic unit 12 generates a selection operation command for selecting the coordinates described in the second file and transmits it to the generation unit 13. The generation unit 13 generates an operation signal based on the selection operation command and transmits it to the processing device 21. The calculation unit 12 recognizes the image acquired by the acquisition unit 11 after the operation signal is transmitted, and determines whether or not the template image prepared in advance is displayed. When the template image is not displayed, the arithmetic unit 12 generates another selection operation command for selecting another coordinate different from the coordinates described in the second file. For example, the other coordinates are randomly determined from the vicinity of the coordinates described in the second file. The arithmetic unit 12 transmits another selection operation command to the generation unit 13. The generation unit 13 generates another operation signal based on another selection operation command and transmits it to the processing device 21. The generation of another selection operation instruction is repeated, for example, until a prepared template image is displayed or a preset number of times is reached. The arithmetic unit 12 generates another selection operation command so as to select another coordinate different from the previous coordinates.
 図6は、実施形態に係る操作システムによる操作の試行の一部を表すフローチャートである。
 例えば、図5に表したフローチャートにおいて、ステップS1~S3の後でありステップS4の前に、第1ファイル、第2ファイル、及び参照データを用いた操作が試行される。試行の一部において、演算部12は、画面の一部を選択するための選択操作命令を生成する(ステップS11)。生成部13は、その選択操作命令に基づく操作信号を生成し、処理装置21に送信する。演算部12は、画面の認識結果に基づき、操作信号による処理装置21への操作が完了したか判定する(ステップS12)。
FIG. 6 is a flowchart showing a part of an operation trial by the operation system according to the embodiment.
For example, in the flowchart shown in FIG. 5, an operation using the first file, the second file, and the reference data is tried after steps S1 to S3 and before step S4. In a part of the trial, the arithmetic unit 12 generates a selection operation instruction for selecting a part of the screen (step S11). The generation unit 13 generates an operation signal based on the selection operation command and transmits it to the processing device 21. Based on the recognition result of the screen, the calculation unit 12 determines whether the operation to the processing device 21 by the operation signal is completed (step S12).
 操作が完了しないとき、演算部12は、選択操作の試行回数が規定回数未満か判定する(ステップS13)。試行回数が規定回数未満のとき、演算部12は、別の座標を選択するための別の選択操作命令を生成する(ステップS14)。生成部13は、その選択操作命令に基づく操作信号を生成し、処理装置21に送信する。選択操作の試行回数が規定回数に達すると、演算部12は、ユーザに向けて報知する(ステップS15)。例えば、操作を適切に実行できないことを知らせる情報を、表示装置16に表示させる。 When the operation is not completed, the calculation unit 12 determines whether the number of trials of the selection operation is less than the specified number (step S13). When the number of trials is less than the specified number of times, the arithmetic unit 12 generates another selection operation instruction for selecting another coordinate (step S14). The generation unit 13 generates an operation signal based on the selection operation command and transmits it to the processing device 21. When the number of trials of the selection operation reaches the specified number of times, the calculation unit 12 notifies the user (step S15). For example, the display device 16 displays information informing that the operation cannot be properly executed.
 ステップS12において、選択操作命令に応じた操作が完了したと判定されると、演算部12は、第2ファイルに記述された座標を、選択操作を実行できた選択操作命令に使用された座標に修正する(ステップS16)。第2ファイルに記述されていた元の座標で選択操作を実行できた場合には、第2ファイルの修正は実行されない。 When it is determined in step S12 that the operation corresponding to the selection operation command has been completed, the calculation unit 12 corrects the coordinates described in the second file to the coordinates used in the selection operation command with which the selection operation could be executed (step S16). If the selection operation could be executed with the original coordinates described in the second file, the second file is not modified.
 図6に表したフローチャートによる処理により、第2ファイルに記述された座標を修正する必要がある場合には、その座標が自動的に修正される。このため、操作システム10の利便性をさらに向上できる。 If it is necessary to correct the coordinates described in the second file by the processing according to the flowchart shown in FIG. 6, the coordinates are automatically corrected. Therefore, the convenience of the operation system 10 can be further improved.
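 A compact sketch of the trial loop of FIG. 6 (steps S11 to S16) is given below for illustration. It assumes helper functions send_click, capture_frame, and notify_user, and a template_found check such as the one sketched earlier; the retry count and jitter range are arbitrary example values, not part of the embodiment.

```python
import random


def try_selection(coords, template_path, max_attempts=5, jitter=8):
    """Click, verify the screen change by template matching, and retry nearby coordinates.

    Returns the coordinates that completed the selection (so the second file can be
    corrected), or None when the specified number of attempts is exhausted.
    """
    x, y = coords
    for _ in range(max_attempts):
        send_click(x, y)                          # operation signal from the generation unit (S11)
        frame = capture_frame()                   # image acquired from the video signal
        if template_found(frame, template_path):  # operation completed? (S12)
            return (x, y)                         # coordinates to write back to the second file (S16)
        # choose another coordinate near the one described in the second file (S13, S14)
        x = coords[0] + random.randint(-jitter, jitter)
        y = coords[1] + random.randint(-jitter, jitter)
    notify_user("selection operation could not be completed")  # report to the user (S15)
    return None
```

 If the returned coordinates differ from the ones originally described in the second file, the corresponding entry is overwritten; if the original coordinates already worked, the file is left unchanged.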
 なお、処理装置21に操作信号が入力されたときに、処理装置21による応答が遅延することがある。選択操作命令に応じた操作が完了したか判定された際には、遅延により画面が変化しておらず、判定後に正常に画面が変化する可能性がある。このため、表示装置23への画面の表示に遅延が生じうる処理装置21に対しては、操作信号を送信する間隔をより長く設定するか、選択操作命令に基づく操作信号を複数回送信する。例えば、図6に表したフローチャートにおいて、ある操作信号が処理装置21に入力された後に操作が完了しない場合、生成部13から同じ操作信号が処理装置21に再度入力される。 Note that when an operation signal is input to the processing device 21, the response by the processing device 21 may be delayed. When it is determined whether the operation corresponding to the selection operation command is completed, the screen does not change due to the delay, and the screen may change normally after the determination. Therefore, for the processing device 21 where the display of the screen on the display device 23 may be delayed, the interval for transmitting the operation signal is set longer, or the operation signal based on the selection operation command is transmitted a plurality of times. For example, in the flowchart shown in FIG. 6, when the operation is not completed after a certain operation signal is input to the processing device 21, the same operation signal is input to the processing device 21 again from the generation unit 13.
(第1変形例)
 図7は、第1変形例に係る操作システムを表す模式図である。
 処理装置21には、入力装置と表示装置としての機能を備える入力表示装置が接続されても良い。例えば、図7に表したように、入力表示装置として、タッチパネル24が設けられる。第1変形例に係る操作システム10aは、切替部17をさらに含む。処理システム1aは、操作システム10a及び処理装置21を含む。
(First modification)
FIG. 7 is a schematic diagram showing an operation system according to the first modification.
An input display device having a function as an input device and a display device may be connected to the processing device 21. For example, as shown in FIG. 7, a touch panel 24 is provided as an input display device. The operation system 10a according to the first modification further includes a switching unit 17. The processing system 1a includes an operation system 10a and a processing device 21.
 生成部13は、タッチパネル24から送信される信号に相当する操作信号を生成する。切替部17は、処理装置21に入力される信号を切り替える。具体的には、切替部17には、タッチパネル24から送信された信号と、生成部13から送信された操作信号と、が入力される。切替部17は、タッチパネル24から送信された信号が処理装置21に入力される状態と、生成部13から送信された操作信号が処理装置21に入力される状態と、を切り替える。操作システム10aによる操作中、切替部17は、操作信号を処理装置21に入力する。 The generation unit 13 generates an operation signal corresponding to the signal transmitted from the touch panel 24. The switching unit 17 switches the signal input to the processing device 21. Specifically, a signal transmitted from the touch panel 24 and an operation signal transmitted from the generation unit 13 are input to the switching unit 17. The switching unit 17 switches between a state in which the signal transmitted from the touch panel 24 is input to the processing device 21 and a state in which the operation signal transmitted from the generation unit 13 is input to the processing device 21. During the operation by the operation system 10a, the switching unit 17 inputs the operation signal to the processing device 21.
 切替部17により、操作システム10aによる操作中に、ユーザがタッチパネル24により処理装置21を操作することを回避できる。例えば、ユーザの操作により、別の画面に遷移し、操作システム10aによる操作が中断されることを回避できる。 The switching unit 17 can prevent the user from operating the processing device 21 by the touch panel 24 during the operation by the operation system 10a. For example, it is possible to avoid the transition to another screen by the user's operation and the interruption of the operation by the operation system 10a.
 切替部17は、例えばリレー回路又はスイッチング回路を含む。切替部17を設けることで、タッチパネル24が使用される処理装置21を、操作システム10aにより操作することが可能となる。 The switching unit 17 includes, for example, a relay circuit or a switching circuit. By providing the switching unit 17, the processing device 21 in which the touch panel 24 is used can be operated by the operation system 10a.
 タッチパネル24については、同じ種類の装置であったとしても、生成される操作信号の座標のずれが大きい。これは、タッチを検出するセンサの座標と表示器の位置関係が個別に調整されているためである。従って、同じ種類の複数の処理装置21に、それぞれ、同じ種類の複数のタッチパネル24を用いた場合でも、自動操作時に選択又は参照される座標を、処理装置21ごとに修正しなければならないことがある。 Regarding the touch panel 24, even between devices of the same type, the deviation of the coordinates in the generated operation signals is large. This is because the positional relationship between the coordinates of the touch-detecting sensor and the display is adjusted individually. Therefore, even when a plurality of touch panels 24 of the same type are used for a plurality of processing devices 21 of the same type, the coordinates selected or referred to during automatic operation may have to be corrected for each processing device 21.
 実施形態によれば、第2ファイルに記述された座標を修正することで、座標のずれに関する機差を容易に修正できる。このため、実施形態に係る発明は、特にタッチパネル24が用いられる処理装置21に好適である。第1変形例によれば、タッチパネル24を含む処理システム1aに対するユーザの負担を低減できる。 According to the embodiment, by correcting the coordinates described in the second file, it is possible to easily correct the machine difference regarding the deviation of the coordinates. Therefore, the invention according to the embodiment is particularly suitable for the processing device 21 in which the touch panel 24 is used. According to the first modification, the burden on the user on the processing system 1a including the touch panel 24 can be reduced.
(第2変形例)
 図8は、第2変形例に係る操作システムを表す模式図である。
 第2変形例に係る操作システム10bは、操作システム10と比較して、切替部17をさらに含む。生成部13又は入力装置22から処理装置21に向けて送信された信号は、切替部17を介して、処理装置21に入力される。切替部17は、入力装置22から送信された信号が処理装置21に入力される状態と、生成部13から送信された操作信号が処理装置21に入力される状態と、を切り替える。
(Second modification)
FIG. 8 is a schematic diagram showing an operation system according to the second modification.
The operation system 10b according to the second modification further includes a switching unit 17 as compared with the operation system 10. The signal transmitted from the generation unit 13 or the input device 22 to the processing device 21 is input to the processing device 21 via the switching unit 17. The switching unit 17 switches between a state in which the signal transmitted from the input device 22 is input to the processing device 21 and a state in which the operation signal transmitted from the generation unit 13 is input to the processing device 21.
 操作システム10bによる操作中、切替部17は、操作信号を処理装置21に入力する。これにより、操作システム10bによる操作中に、ユーザが入力装置22を用いて処理装置21を操作することを回避できる。第2変形例によれば、操作システム10b及び処理装置21を含む処理システム1bの利便性を向上できる。 During operation by the operation system 10b, the switching unit 17 inputs an operation signal to the processing device 21. This makes it possible to prevent the user from operating the processing device 21 using the input device 22 during the operation by the operation system 10b. According to the second modification, the convenience of the processing system 1b including the operation system 10b and the processing device 21 can be improved.
(第3変形例)
 図9は、第3変形例に係る操作システムを表す模式図である。
 図9に表したように、第3変形例に係る操作システム10cは、操作システム10に比べて、読取装置18a及び18bをさらに含む。読取装置18a及び18bは、識別情報を読み取る。識別情報として、一次元バーコード、二次元バーコード、又はRadio Frequency IDentifier(RFID)を用いることができる。読取装置18a及び18bは、バーコードリーダ又はRFIDリーダである。例えば、読取装置18aは、処理装置21を担当するオペレータの識別情報を読み取る。読取装置18bは、処理装置21で処理されるワークの識別情報を読み取る。
(Third modification example)
FIG. 9 is a schematic diagram showing an operation system according to the third modification.
As shown in FIG. 9, the operation system 10c according to the third modification further includes reading devices 18a and 18b as compared with the operation system 10. The reading devices 18a and 18b read the identification information. As the identification information, a one-dimensional bar code, a two-dimensional bar code, or a Radio Frequency IDentifier (RFID) can be used. The readers 18a and 18b are barcode readers or RFID readers. For example, the reading device 18a reads the identification information of the operator in charge of the processing device 21. The reading device 18b reads the identification information of the work processed by the processing device 21.
 例えば、処理装置21による処理を実行する際、ユーザ(オペレータ)は、読取装置18bがワークの識別情報を読み取り可能な場所に、ワークを置く。その後、ユーザは、読取装置18aに自身の識別情報を読み取らせる。演算部12は、読取装置18aによる識別情報の読み取りに応じて、処理装置21の操作を開始する。例えば、演算部12は、ワークの識別情報を読取装置18bに読み取らせ、その後に操作命令の生成を開始する。 For example, when executing the process by the processing device 21, the user (operator) places the work in a place where the reading device 18b can read the identification information of the work. After that, the user causes the reading device 18a to read his / her identification information. The calculation unit 12 starts the operation of the processing device 21 in response to the reading of the identification information by the reading device 18a. For example, the arithmetic unit 12 causes the reading device 18b to read the identification information of the work, and then starts generating an operation command.
 操作システム10cは、読み取られた識別情報に応じた操作信号を生成し、処理装置21へ入力しても良い。又は、演算部12から処理装置21へ、通信ポートを介してシリアル通信等により識別情報が直接的に入力されても良い。 The operation system 10c may generate an operation signal according to the read identification information and input it to the processing device 21. Alternatively, the identification information may be directly input from the arithmetic unit 12 to the processing device 21 via a communication port or the like by serial communication or the like.
 第3変形例のように、操作システム10cは、処理装置21により実行される処理に関係する他の装置を含んでいても良い。他の装置の動作に応じて、処理装置21の操作が開始されても良い。これにより、ユーザは、操作の開始のために入力装置15又は22を操作する必要が無い。第3変形例によれば、操作システム10cの利便性を向上させることができる。操作システム10c及び処理装置21を含む処理システム1cの利便性を向上させることができる。 As in the third modification, the operation system 10c may include other devices related to the processing executed by the processing device 21. The operation of the processing device 21 may be started according to the operation of the other device. This eliminates the need for the user to operate the input device 15 or 22 to initiate the operation. According to the third modification, the convenience of the operation system 10c can be improved. The convenience of the processing system 1c including the operation system 10c and the processing device 21 can be improved.
(第4変形例)
 図10は、第4変形例に係る操作システムを表す模式図である。
 第4変形例に係る操作システム10dは、図10に表したように、管理装置30(第2装置)と接続される。例えば、演算部12が、ネットワーク、有線通信、又は無線通信により管理装置30と接続される。管理装置30は、処理装置21に向けて命令を送信する上位の装置である。
(Fourth modification)
FIG. 10 is a schematic diagram showing an operation system according to the fourth modification.
As shown in FIG. 10, the operation system 10d according to the fourth modification is connected to the management device 30 (second device). For example, the arithmetic unit 12 is connected to the management device 30 by network, wired communication, or wireless communication. The management device 30 is a higher-level device that transmits a command to the processing device 21.
 管理装置30が処理装置21に向けた命令を送信すると、操作システム10dは、その命令を受信する。操作システム10dは、命令に応じて操作を実行する。例えば、演算部12は、命令に応じた操作命令を生成し、生成部13へ送信する。 When the management device 30 transmits a command to the processing device 21, the operation system 10d receives the command. The operation system 10d executes an operation in response to an instruction. For example, the arithmetic unit 12 generates an operation instruction according to the instruction and transmits it to the generation unit 13.
 例えば、記憶装置14には、複数の動作ファイルが記憶される。演算部12は、管理装置30から送信された命令に応じて、複数の動作ファイルの1つを選択する。演算部12は、選択した動作ファイルに含まれるプログラムを実行し、操作命令を生成する。又は、管理装置30から送信される命令は、操作の開始のトリガとして用いられても良い。 For example, a plurality of operation files are stored in the storage device 14. The arithmetic unit 12 selects one of the plurality of operation files according to the instruction transmitted from the management device 30. The arithmetic unit 12 executes the program included in the selected operation file and generates an operation instruction. Alternatively, the instruction transmitted from the management device 30 may be used as a trigger for starting the operation.
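 As an illustration of how a command received from the management device 30 might be mapped to one of the stored operation files, the sketch below uses a simple lookup table. The command names, file names, and the run_operation entry point are assumptions made for this example only.

```python
# hypothetical mapping from management-device commands to stored operation file pairs
OPERATION_FILES = {
    "START_LOT": ("start_lot_first.c", "start_lot_second.txt"),
    "CHANGE_RECIPE": ("change_recipe_first.c", "change_recipe_second.txt"),
}


def on_command(command):
    """Select the operation file pair registered for the received command and execute it."""
    if command not in OPERATION_FILES:
        raise ValueError(f"no operation file registered for command {command!r}")
    first_file, second_file = OPERATION_FILES[command]
    run_operation(first_file, second_file)  # assumed entry point that runs the first file with the second file
```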
 管理装置30は、複数の操作システム10dと接続されても良い。別の操作システム10cから送信された信号、又は別の操作システム10dによって操作される別の処理装置21から送信された信号に応じて、操作システム10dによる操作が開始されても良い。信号は、別の操作システム10dによる操作時に送信されても良いし、別の操作システム10dによる操作時以外で送信されても良い。複数の操作システム10d、複数の処理装置21、及び管理装置30の3つ以上の間で、操作の開始のための信号が送受信されることで、例えば、複数の処理装置21を効率的に稼動できる。 The management device 30 may be connected to a plurality of operation systems 10d. The operation by the operation system 10d may be started in response to a signal transmitted from another operation system 10d or a signal transmitted from another processing device 21 operated by another operation system 10d. The signal may be transmitted at the time of an operation by the other operation system 10d, or at a time other than an operation by the other operation system 10d. By transmitting and receiving signals for starting operations among three or more of the plurality of operation systems 10d, the plurality of processing devices 21, and the management device 30, for example, the plurality of processing devices 21 can be operated efficiently.
 操作システム10dは、処理装置21により実行される処理に関するパラメータを、処理装置21に入力しても良い。パラメータは、処理されるワークの数、処理実行時の条件などである。条件は、例えば、温度、圧力、気体又は液体の流量、電流、電圧、処理時間などである。操作システム10dは、パラメータの数値を入力するための操作信号を生成し、処理装置21へ送信する。これにより、表示装置23に表示されたテキストボックス等に、数値が入力される。 The operation system 10d may input parameters related to the processing executed by the processing device 21 to the processing device 21. The parameters are the number of workpieces to be processed, conditions at the time of processing execution, and the like. The conditions are, for example, temperature, pressure, gas or liquid flow rate, current, voltage, processing time and the like. The operation system 10d generates an operation signal for inputting a numerical value of a parameter and transmits it to the processing device 21. As a result, a numerical value is input to the text box or the like displayed on the display device 23.
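 One way such a numerical parameter could be turned into operation signals is sketched below, assuming send_click and send_key helpers that emit pointing-device and keyboard operation signals; the helper names and the ENTER key handling are assumptions for illustration.

```python
def enter_parameter(textbox_coords, value):
    """Select the text box described in the second file, then type the received value."""
    send_click(*textbox_coords)   # move the pointer to the text box and select it
    for character in str(value):
        send_key(character)       # one keyboard operation signal per character
    send_key("ENTER")             # confirm the entry
```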
 管理装置30は、操作システム10dから処理装置21に入力されるパラメータを、操作システム10dへ送信しても良い。操作システム10dは、パラメータとして入力される数値を受信し、その数値に応じた操作信号を生成する。これにより、管理装置30から送信された数値が処理装置21へ入力される。 The management device 30 may transmit, to the operation system 10d, a parameter to be input from the operation system 10d to the processing device 21. The operation system 10d receives the numerical value to be input as the parameter and generates an operation signal according to the numerical value. As a result, the numerical value transmitted from the management device 30 is input to the processing device 21.
 管理装置30は、処理されるワークに関する情報を操作システム10dへ送信しても良い。例えば、管理装置30は、ワークの識別情報(ID)を、操作システム10dへ送信する。操作システム10dは、通信ポートを介して識別情報を処理装置21へ入力する。又は、操作システム10dは、識別情報に応じた操作信号を生成し、処理装置21へ入力しても良い。 The management device 30 may transmit information about the workpiece to be processed to the operation system 10d. For example, the management device 30 transmits the work identification information (ID) to the operation system 10d. The operation system 10d inputs the identification information to the processing device 21 via the communication port. Alternatively, the operation system 10d may generate an operation signal according to the identification information and input it to the processing device 21.
 製造現場では、複数の処理装置を管理するために、より上位の装置が設けられることがある。一方、処理装置21と管理装置30との間の通信IFの互換性、通信規格の相違などにより、管理装置30から処理装置21へ、直接命令を送信できない場合がある。例えば、処理装置21が古いと、管理装置30から送信される命令を、処理装置21が受け付けられないことがある。この場合、操作システム10dを用いることで、管理装置30により処理装置21を管理することが可能となる。例えば、処理装置21の改造等を行う必要が無く、より容易に生産効率を向上できる。操作システム10dを用いることで、単体又は複数の処理装置21の稼働状況を管理装置30へ送信できる。このため、管理装置30が、処理システム1dの単体又は複数の稼動率又は稼働状況に関する情報を容易に取得できる。 At a manufacturing site, a higher-level device may be provided to manage a plurality of processing devices. On the other hand, a command may not be directly transmittable from the management device 30 to the processing device 21 due to communication interface compatibility between the processing device 21 and the management device 30, differences in communication standards, and the like. For example, if the processing device 21 is old, the processing device 21 may not accept the command transmitted from the management device 30. In this case, by using the operation system 10d, the processing device 21 can be managed by the management device 30. For example, the production efficiency can be improved more easily without modifying the processing device 21. By using the operation system 10d, the operating status of one or a plurality of processing devices 21 can be transmitted to the management device 30. Therefore, the management device 30 can easily acquire information on the operating rate or operating status of one or a plurality of processing systems 1d.
 図11は、実施形態に係る操作システムのハードウェア構成を表す模式図である。
 例えば、演算部12は、コンピュータであり、ROM12a、RAM12b、CPU12c、およびHDD12dを含む。ROM12aは、コンピュータの動作を制御するプログラムを記憶している。ROM12aには、各種処理をコンピュータに実現させるために必要なプログラムが記憶されている。
FIG. 11 is a schematic diagram showing the hardware configuration of the operation system according to the embodiment.
For example, the arithmetic unit 12 is a computer and includes a ROM 12a, a RAM 12b, a CPU 12c, and an HDD 12d. The ROM 12a stores a program that controls the operation of the computer. The ROM 12a stores programs necessary for realizing various processes in a computer.
 RAM12bは、ROM12aに記憶されたプログラムが展開される記憶領域として機能する。CPU12cは、処理回路を含む。CPU12cは、ROM12aに記憶された制御プログラムを読み込み、当該制御プログラムに従ってコンピュータの動作を制御する。CPU12cは、コンピュータの動作によって得られた様々なデータをRAM12bに展開する。HDD12dは、読み取りに必要なデータや、読み取りの過程で得られたデータを記憶する。HDD12dは、図1に表した記憶装置14として機能しても良い。 The RAM 12b functions as a storage area in which the program stored in the ROM 12a is expanded. The CPU 12c includes a processing circuit. The CPU 12c reads the control program stored in the ROM 12a and controls the operation of the computer according to the control program. The CPU 12c expands various data obtained by the operation of the computer into the RAM 12b. The HDD 12d stores data necessary for reading and data obtained in the process of reading. The HDD 12d may function as the storage device 14 shown in FIG.
 以上で説明した操作システム、処理システム、又は操作方法を用いることで、利便性を向上できる。同様に、コンピュータに上述した処理を実行させるためのプログラム又はそのプログラムを記憶した記憶媒体を用いることで、操作システムの利便性を向上できる。 Convenience can be improved by using the operation system, processing system, or operation method described above. Similarly, the convenience of the operation system can be improved by using a program for causing the computer to execute the above-mentioned processing or a storage medium in which the program is stored.
 上記の種々のデータの処理は、コンピュータに実行させることのできるプログラムとして、磁気ディスク(フレキシブルディスク及びハードディスクなど)、光ディスク(CD-ROM、CD-R、CD-RW、DVD-ROM、DVD±R、DVD±RWなど)、半導体メモリなどの、コンピュータで読取可能な非一時的な記憶媒体(non-transitory computer-readable storage medium)に記録されても良い。 The above-described processing of the various data may be recorded, as a program that can be executed by a computer, on a non-transitory computer-readable storage medium such as a magnetic disk (a flexible disk, a hard disk, etc.), an optical disk (CD-ROM, CD-R, CD-RW, DVD-ROM, DVD±R, DVD±RW, etc.), or a semiconductor memory.
 例えば、記録媒体に記録されたデータは、コンピュータ(または組み込みシステム)により読み出されることが可能である。記録媒体において、記録形式(記憶形式)は任意である。例えば、コンピュータは、記録媒体からプログラムを読み出し、このプログラムに基づいてプログラムに記述されている指示をCPUで実行させる。コンピュータにおいて、プログラムの取得(または読み出し)は、ネットワークを通じて行われても良い。 For example, the data recorded on the recording medium can be read by a computer (or embedded system). In the recording medium, the recording format (storage format) is arbitrary. For example, the computer reads a program from the recording medium and causes the CPU to execute the instructions described in the program based on the program. In the computer, the acquisition (or reading) of the program may be performed through the network.
 以上、本発明のいくつかの実施形態を例示したが、これらの実施形態は、例として提示したものであり、発明の範囲を限定することは意図していない。これら新規な実施形態は、その他の様々な形態で実施されることが可能であり、発明の要旨を逸脱しない範囲で、種々の省略、置き換え、変更などを行うことができる。これら実施形態やその変形例は、発明の範囲や要旨に含まれるとともに、請求の範囲に記載された発明とその均等の範囲に含まれる。前述の各実施形態は、相互に組み合わせて実施することができる。 Although some embodiments of the present invention have been exemplified above, these embodiments are presented as examples and are not intended to limit the scope of the invention. These novel embodiments can be implemented in various other embodiments, and various omissions, replacements, changes, and the like can be made without departing from the gist of the invention. These embodiments and variations thereof are included in the scope and gist of the invention, and are also included in the scope of the invention described in the claims and the equivalent scope thereof. Each of the above embodiments can be implemented in combination with each other.

Claims (19)

  1.  操作対象である第1装置からの出力を取得する取得部と、
     前記出力の取得結果及び予め作成された動作ファイルに基づいて、操作命令を生成する演算部と、
     前記操作命令に基づいて前記第1装置に対応した操作信号を生成し、前記第1装置へ送信する生成部と、
     を備え、
     前記動作ファイルは、
      前記操作命令を生成するための生成動作が記述された第1ファイルと、
      前記生成動作の実行時に参照され、前記出力の識別及び前記操作信号の生成に使用される情報を含む第2ファイルと、
     を含む、操作システム。
    An acquisition unit that acquires the output from the first device that is the operation target, and
    An arithmetic unit that generates operation instructions based on the output acquisition result and the operation file created in advance.
    A generation unit that generates an operation signal corresponding to the first device based on the operation command and transmits it to the first device.
    Equipped with
    The operation file is
    The first file in which the generation operation for generating the operation instruction is described, and
    A second file that is referenced during the execution of the generation operation and contains information used to identify the output and generate the operation signal.
    Including the operation system.
  2.  前記出力は、前記第1装置から表示装置に向けて送信される映像信号を含み、
     前記第2ファイルは、前記表示装置の画面上において、入力装置による前記第1装置への選択操作が実行される座標を含む、請求項1記載の操作システム。
    The output includes a video signal transmitted from the first device to the display device.
    The operation system according to claim 1, wherein the second file includes coordinates on which a selection operation to the first device by an input device is executed on the screen of the display device.
  3.  前記演算部は、前記映像信号が示す画像を認識し、前記画像の認識結果に基づいて、前記操作信号による前記第1装置への操作が完了したか判定する、請求項2記載の操作システム。 The operation system according to claim 2, wherein the calculation unit recognizes an image indicated by the video signal, and determines whether or not the operation of the operation signal to the first device is completed based on the recognition result of the image.
  4.  前記操作命令が前記座標を選択する選択操作命令を含み、且つ前記選択操作命令に基づく前記操作信号による前記第1装置への操作が完了しない場合に、前記演算部は、前記座標とは別の座標を選択する別の選択操作命令を生成する、請求項3記載の操作システム。 The operation system according to claim 3, wherein, when the operation command includes a selection operation command for selecting the coordinates and the operation on the first device by the operation signal based on the selection operation command is not completed, the calculation unit generates another selection operation command for selecting a coordinate different from the coordinates.
  5.  前記演算部は、前記画像の認識において、パターンマッチング、文字認識、及び色認識から選択される少なくとも1つを実行する請求項3又は4に記載の操作システム。 The operation system according to claim 3 or 4, wherein the calculation unit executes at least one selected from pattern matching, character recognition, and color recognition in the recognition of the image.
  6.  入力装置及び前記生成部の一方からの信号を前記第1装置へ送信する切替部をさらに備えた請求項1~5のいずれか1つに記載の操作システム。 The operation system according to any one of claims 1 to 5, further comprising a switching unit for transmitting a signal from one of the input device and the generation unit to the first device.
  7.  前記第2ファイルは、テキスト形式で記述される、請求項1~6のいずれか1つに記載の操作システム。 The second file is the operation system according to any one of claims 1 to 6, which is described in a text format.
  8.  前記演算部は、
      前記第1装置に向けた命令を送信する第2装置から、前記命令を受信し、
      前記命令に応じて前記操作命令の生成を開始する、
     請求項1~7のいずれか1つに記載の操作システム。
    The arithmetic unit
    Receives the command from a second device that transmits the command toward the first device, and
    In response to the instruction, the generation of the operation instruction is started.
    The operation system according to any one of claims 1 to 7.
  9.  識別情報を読み取る読取装置をさらに備え、
     前記演算部は、前記読取装置による識別情報の読み取りに応じて、前記操作命令の生成を開始する、請求項1~8のいずれか1つに記載の操作システム。
    Further equipped with a reader to read the identification information,
    The operation system according to any one of claims 1 to 8, wherein the calculation unit starts generating the operation instruction in response to reading the identification information by the reading device.
  10.  請求項1~8のいずれか1つに記載の操作システムと、
     ワークの処理を実行する前記第1装置と、
     を備えた処理システム。
    The operation system according to any one of claims 1 to 8.
    The first device that executes the processing of the work and
    Processing system equipped with.
  11.  操作対象である第1装置からの出力を取得し、
     前記出力の取得結果及び予め作成された動作ファイルに基づいて、操作命令を生成し、
     前記操作命令に基づいて前記第1装置に対応した操作信号を生成し、前記第1装置へ送信する操作方法であって、
     前記動作ファイルは、
      前記操作命令を生成するための生成動作が記述された第1ファイルと、
      前記生成動作の実行時に参照され、前記出力の識別及び前記操作信号の生成に使用される情報を含む第2ファイルと、
     を含む、操作方法。
    Acquires the output from the first device that is the operation target,
    An operation instruction is generated based on the acquisition result of the output and the operation file created in advance.
    An operation method for generating an operation signal corresponding to the first device based on the operation command and transmitting the operation signal to the first device.
    The operation file is
    The first file in which the generation operation for generating the operation instruction is described, and
    A second file that is referenced during the execution of the generation operation and contains information used to identify the output and generate the operation signal.
    Operation method including.
  12.  前記出力は、前記第1装置から表示装置に向けて送信される映像信号を含み、
     前記第2ファイルは、前記表示装置の画面上において、入力装置による前記第1装置への選択操作が実行される座標を含む、請求項11記載の操作方法。
    The output includes a video signal transmitted from the first device to the display device.
    The operation method according to claim 11, wherein the second file includes coordinates on which a selection operation to the first device by an input device is executed on the screen of the display device.
  13.  前記映像信号が示す画像を識別し、前記画像の認識結果に基づいて、前記操作信号による前記第1装置への操作が完了したか判定する、請求項12記載の操作方法。 The operation method according to claim 12, wherein the image indicated by the video signal is identified, and based on the recognition result of the image, it is determined whether or not the operation to the first device by the operation signal is completed.
  14.  前記操作命令が前記座標を選択する選択操作命令を含み、且つ前記選択操作命令に基づく前記操作信号による前記第1装置への操作が完了しない場合に、前記座標とは別の座標を選択する別の選択操作命令を生成する、請求項13記載の操作方法。 The operation method according to claim 13, wherein, when the operation command includes a selection operation command for selecting the coordinates and the operation on the first device by the operation signal based on the selection operation command is not completed, another selection operation command for selecting a coordinate different from the coordinates is generated.
  15.  操作対象である第1装置からの出力の取得結果の受信に応じてコンピュータに操作命令を生成させるプログラムであって、
     前記操作命令を生成するための生成動作の記述を含み、
     前記生成動作の実行時に、前記出力の識別及び操作信号の生成に使用される情報を含むファイルを前記コンピュータに参照させる、プログラム。
    A program that causes a computer to generate an operation command in response to the reception of the output acquisition result from the first device that is the operation target.
    Includes a description of the generation operation for generating the operation instruction.
    A program that causes the computer to refer to a file containing information used to identify the output and generate an operation signal when performing the generation operation.
  16.  前記出力は、前記第1装置から表示装置に向けて送信される映像信号を含み、
     前記ファイルは、前記表示装置の画面上において、入力装置による前記第1装置への選択操作が実行される座標を含む、請求項15記載のプログラム。
    The output includes a video signal transmitted from the first device to the display device.
    The program according to claim 15, wherein the file includes coordinates on which a selection operation to the first device by an input device is executed on the screen of the display device.
  17.  前記コンピュータに、前記映像信号が示す画像を認識させ、前記画像の認識結果に基づいて、前記操作信号による前記第1装置への操作が完了したか判定させる、請求項16記載のプログラム。 The program according to claim 16, wherein the computer recognizes an image indicated by the video signal, and based on the recognition result of the image, determines whether or not the operation of the operation signal to the first device is completed.
  18.  前記操作命令が前記座標を選択する選択操作命令を含み、且つ前記選択操作命令に基づく前記操作信号による前記第1装置への操作が完了しない場合に、前記コンピュータに、前記座標とは別の座標を選択する別の選択操作命令を生成させる、請求項17記載のプログラム。 The program according to claim 17, which, when the operation command includes a selection operation command for selecting the coordinates and the operation on the first device by the operation signal based on the selection operation command is not completed, causes the computer to generate another selection operation command for selecting a coordinate different from the coordinates.
  19.  請求項15~18のいずれか1つに記載のプログラムを記憶した記憶媒体。 A storage medium that stores the program according to any one of claims 15 to 18.