WO2022009802A1 - Operation system, processing system, operation method, program, and storage medium


Info

Publication number
WO2022009802A1
Authority
WO
WIPO (PCT)
Prior art keywords
file
command
coordinates
signal
output
Application number
PCT/JP2021/025172
Other languages
English (en)
Japanese (ja)
Inventor
康徳 渕上
Original Assignee
株式会社 東芝
Application filed by 株式会社 東芝
Publication of WO2022009802A1

Classifications

    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05B - CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00 - Programme-control systems
    • G05B19/02 - Programme-control systems electric
    • G05B19/04 - Programme control other than numerical control, i.e. in sequence controllers or logic controllers
    • G05B19/042 - Programme control other than numerical control, i.e. in sequence controllers or logic controllers using digital processors
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00 - Arrangements for program control, e.g. control units
    • G06F9/06 - Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44 - Arrangements for executing specific programs
    • G06F9/451 - Execution arrangements for user interfaces

Definitions

  • Embodiments of the present invention relate to an operation system, a processing system, an operation method, a program, and a storage medium.
  • The problem to be solved by the present invention is to provide an operation system, a processing system, an operation method, a program, and a storage medium that can reduce the burden on the user.
  • The operation system includes an acquisition unit, a calculation unit, and a generation unit.
  • The acquisition unit acquires the output from a first device to be operated.
  • The calculation unit generates an operation command based on the acquisition result of the output and an operation file created in advance.
  • The generation unit generates an operation signal corresponding to the first device based on the operation command and transmits it to the first device.
  • The operation file includes a first file, in which a generation operation for generating the operation command is described, and a second file containing information that is referred to when the generation operation is executed and that is used for identifying the output and generating the operation signal.
  • FIG. 1 is a schematic diagram showing an operation system according to an embodiment.
  • the operation system 10 according to the embodiment includes an acquisition unit 11, a calculation unit 12, a generation unit 13, and a storage device 14.
  • the operation system 10 is used to automatically operate the processing device 21 (first device).
  • The processing device 21 processes a workpiece.
  • The processing is, for example, at least one selected from transport, machining, cleaning, heating, cooling, and drying.
  • Machining is, for example, at least one selected from film formation, etching, polishing, lithography, and bonding.
  • The processing device 21 includes a control unit 21a. Each component of the processing device 21 operates based on instructions transmitted from the control unit 21a to process the workpiece.
  • the control unit 21a includes, for example, a central processing unit (CPU), a read-only memory (ROM), a random access memory (RAM), a storage device, an input interface, an output interface, a communication interface, and a bus connecting them.
  • the control unit 21a may be a dedicated computer for the processing device 21 or a general-purpose computer.
  • the function of the control unit 21a may be realized by the collaboration of a plurality of computers.
  • the input device 22 is used by the user to input information to the processing device 21.
  • the user is, for example, an operator in charge of processing executed by the processing device 21.
  • the input device 22 includes, for example, at least one selected from a keyboard, mouse, touchpad, and microphone (voice input).
  • the display device 23 displays the signal output from the processing device 21 so that the user can visually recognize it.
  • the display device 23 includes a monitor.
  • The acquisition unit 11 acquires the output transmitted from the processing device 21, which is the operation target, to the display device 23.
  • the output includes a video signal transmitted from the processing device 21 to the display device 23.
  • the acquisition unit 11 receives the video signal.
  • the acquisition unit 11 acquires the video signal and transmits the video signal to the display device 23.
  • the acquisition unit 11 includes, for example, a video signal distributor and a capture unit.
  • the acquisition unit 11 acquires the video signal as a moving image or a still image and transmits it to the calculation unit 12.
  • the output may be light emitted from the processing device 21.
  • the acquisition unit 11 includes an image pickup device.
  • The acquisition unit 11 photographs the light-emitting part that emits the light.
  • the acquisition unit 11 acquires a moving image or a still image and transmits it to the calculation unit 12.
  • the output may be a sound emitted from the processing device 21.
  • the acquisition unit 11 includes a microphone. The acquisition unit 11 acquires the sound emitted from the processing device 21 as an electric signal and transmits it to the calculation unit 12.
  • the calculation unit 12 recognizes the acquisition result by the acquisition unit 11. For example, the calculation unit 12 recognizes the image acquired by the acquisition unit 11. In image recognition, processing such as pattern matching, character recognition, and color matching is executed. Alternatively, the calculation unit 12 recognizes the sound acquired by the acquisition unit 11. In sound recognition, processing such as pattern matching is executed.
  • the calculation unit 12 refers to the operation file created in advance.
  • the operation file includes the first file and the second file.
  • In the first file, the generation operation for generating the operation command is described.
  • The first file is a program file in which an operation sequence for generating operation commands and transmitting them to the generation unit 13 is described.
  • the second file is referenced when the generation operation is executed.
  • the second file contains the information used in the identification of the output signal and the generation of the operation signal described below.
  • the second file contains coordinates selected by the input device 22, data information referred to during image recognition, and the like.
  • the calculation unit 12 executes the generation operation of the first file based on the acquisition result by the acquisition unit 11 while referring to the information included in the second file. As a result, an operation command for operating the processing device 21 is generated.
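The division of labor above can be sketched as follows: a generation procedure (the role of the first file) resolves its targets through data from the second file and emits operation commands for the generation unit. This is a minimal illustration, not the implementation described here; the step format, the `second_file` dictionary, and the command tuples are all assumptions for the example.

```python
# Minimal sketch: a generation procedure (the role of the first file)
# resolves targets through data from the second file and emits
# operation commands for the generation unit.

def generate_commands(procedure, second_file):
    """Turn each step into an operation command, looking up coordinates
    and template file names in the second file (hypothetical format)."""
    commands = []
    for step in procedure:
        if step["op"] == "click":
            x, y = second_file["coords"][step["target"]]
            commands.append(("click", x, y))
        elif step["op"] == "wait_template":
            commands.append(("wait", second_file["templates"][step["target"]]))
    return commands

second_file = {  # data the user would describe in text format
    "coords": {"icon_B3": (120, 64)},
    "templates": {"screen_A2": "a2_icons.png"},
}
procedure = [  # sequence the first file would describe
    {"op": "click", "target": "icon_B3"},
    {"op": "wait_template", "target": "screen_A2"},
]
print(generate_commands(procedure, second_file))
# [('click', 120, 64), ('wait', 'a2_icons.png')]
```

Because the procedure only names targets, the coordinates and file names can be changed in the second file without touching the procedure itself.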
  • The calculation unit 12 transmits the operation command to the generation unit 13.
  • The calculation unit 12 includes, for example, a CPU, a ROM, a RAM, a storage device, an input interface, an output interface, a communication interface, and a bus connecting these.
  • The calculation unit 12 is a dedicated or general-purpose computer.
  • The function of the calculation unit 12 may be realized by the cooperation of a plurality of computers.
  • the generation unit 13 generates an operation signal corresponding to the processing device 21 based on the operation command.
  • the generation unit 13 inputs the generated operation signal to the processing device 21 (control unit 21a).
  • The control unit 21a controls the processing device 21 in accordance with the operation signal.
  • When the processing device 21 receives the operation signal, it moves the pointer, inputs characters, and so on according to the operation signal.
  • the generation unit 13 includes, for example, a microcomputer programmed to generate an operation signal corresponding to a signal transmitted from the input device 22.
  • the storage device 14 stores various data necessary for operating the processing device 21.
  • the storage device 14 stores an operation file.
  • the storage device 14 stores reference data necessary for image recognition.
  • The storage device 14 includes a hard disk drive (HDD), a solid-state drive (SSD), network-attached storage (NAS), or the like.
  • the operation file and the reference data necessary for recognizing the acquisition result are prepared in advance by the user using the input device 15 and the display device 16.
  • the input device 15 is used by the user to input information to the calculation unit 12.
  • the input device 15 includes, for example, at least one selected from a keyboard, mouse, touchpad, and microphone.
  • the display device 16 displays the signal output from the calculation unit 12 so that the user can visually recognize it.
  • the display device 16 includes a monitor.
  • the acquisition unit 11 mainly acquires the video signal output from the processing device 21.
  • In image recognition, at least one selected from template matching, character recognition, and color recognition is executed.
  • the user prepares reference data necessary for these processes in advance and stores them in the storage device 14.
  • the acquisition unit 11 acquires the video signal from the processing device 21 to the display device 23 when performing image recognition.
  • the calculation unit 12 causes the display device 16 to display an image based on the video signal.
  • the user uses the input device 15 to select a portion of the displayed image.
  • the selected image is stored in the storage device 14 as a template image.
  • the user describes the file name of the template image to be referred to in the second file described later.
  • the template image is prepared for all the screens on which the automatic operation by the operation system 10 is executed.
  • When the processing device 21 is operated, the screen changes.
  • For example, the displayed screen transitions to another screen, or a window is displayed.
  • A partial image that is not included in the screen before the change but is included in the screen after the change is used as the template image.
  • the acquisition unit 11 acquires the video signal from the processing device 21 to the display device 23 when it is necessary to recognize the character.
  • the calculation unit 12 causes the display device 16 to display an image based on the video signal.
  • the user confirms the range in which character recognition is executed in the displayed image by using the input device 15.
  • the user describes the range in the second file described later.
  • the acquisition unit 11 acquires a video signal from the processing device 21 to the display device 23 when it is necessary to recognize the color.
  • the calculation unit 12 causes the display device 16 to display an image based on the video signal.
  • the user uses the input device 15 to select a range in which color recognition is performed in the displayed image.
  • the images in the selected range are stored in the storage device 14.
  • the user describes the range in the second file described later.
  • the calculation unit 12 determines whether an image similar to the template image is included in the image based on the video signal.
  • For character recognition, the calculation unit 12 cuts out an image in a range stored in advance.
  • The calculation unit 12 executes character recognition processing such as optical character recognition (OCR) on the cut-out image.
  • the calculation unit 12 cuts out an image in a range stored in advance.
  • the calculation unit 12 determines whether or not the colors of the respective units match between the image stored in advance and the image cut out. By recognizing the image by these processes, the calculation unit 12 determines whether or not the operation on the processing device 21 by the operation signal is completed.
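The completion check by template matching can be sketched as a naive exact-match scan over pixel grids. A real system would use an image-processing library with similarity thresholds; this only illustrates the decision the calculation unit 12 makes, on hypothetical pixel data.

```python
# Naive template matching sketch: slide a small template over every
# position of a screen image (2D lists of pixel values) and report
# whether an exact match exists anywhere. This stands in for the check
# that an operation on the device has completed.

def template_found(screen, template):
    th, tw = len(template), len(template[0])
    sh, sw = len(screen), len(screen[0])
    for y in range(sh - th + 1):
        for x in range(sw - tw + 1):
            if all(screen[y + dy][x + dx] == template[dy][dx]
                   for dy in range(th) for dx in range(tw)):
                return True
    return False

screen = [
    [0, 0, 0, 0],
    [0, 1, 2, 0],
    [0, 3, 4, 0],
]
template = [[1, 2],
            [3, 4]]
print(template_found(screen, template))
# True
```

Color recognition in the description works the same way, except that the compared region is fixed in advance rather than searched for.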
  • the first file and the second file are created.
  • the first file is described in a general-purpose programming language such as C language.
  • In the second file, the coordinates to be selected with the pointing device, the template images to be referenced, the characters to be input, and the like are described in a text format.
  • the user describes the coordinates to be selected (clicked) in the second file and stores them in the storage device 14.
  • the acquisition unit 11 acquires a video signal from the processing device 21 to the display device 23 when operating the pointing device.
  • the calculation unit 12 causes the display device 16 to display an image based on the video signal.
  • the user points the pointer at the selected part of the displayed image and confirms the coordinates.
  • the user describes the coordinates in the second file.
  • the acquisition unit 11 acquires the video signal from the processing device 21 to the display device 23 when inputting characters with the keyboard.
  • the calculation unit 12 causes the display device 16 to display an image based on the video signal. In the displayed image, the user points the pointer at the text box for inputting characters and confirms the coordinates. The user describes the coordinates and one or more characters to be entered in the text box in the second file.
  • the calculation unit 12 determines whether the processing device 21 has been operated according to the operation signal by performing image recognition. If it is determined that the operation is not performed, the calculation unit 12 may send a notification to the display device 16 or another terminal device that the operation cannot be performed.
  • the file name of the template image used for image recognition, the range used for character recognition, the image and range used for color recognition, and the like are described in the second file as described above.
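The second file is a plain text file, but its concrete layout is not specified in this publication; the following is only an illustrative sketch of what such a file might contain, with every key name and value an assumption.

```
# coordinates to be selected with the pointing device (x, y)
icon_B3  = 120, 64
icon_B14 = 480, 210

# file names of template images referred to for screen transitions
screen_A2 = a2_icons.png
screen_A3 = a3_icons.png

# range in which character recognition is executed (left, top, right, bottom)
ocr_range = 40, 300, 360, 340
```

Because the format is plain text, a user without programming knowledge can adjust a coordinate or swap a template file name for one machine without touching the first file.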
  • FIG. 4A is a schematic diagram illustrating a part of the first file.
  • FIG. 4B is a schematic diagram illustrating a part of the second file.
  • the screens shown in FIGS. 2 and 3 are displayed on the display device 23 based on the video signal output from the processing device 21.
  • On the screen A1, icons B1 to B7 and information C1 regarding the workpiece processed by the processing device 21 are displayed.
  • On the screen A2, icons B1 to B4, icons B11 to B14, and a schematic diagram D of the processing device 21 are displayed.
  • On the screen A3, icons B1 to B4, icons B21 to B24, and the schematic diagram D are displayed.
  • On the screen A4, icons B1 to B4, icons B21 to B24, and information C2 regarding processing conditions of the processing device 21 are displayed.
  • the user moves the pointer to the icon B3 on the screen A1 shown in FIG. 2A and selects the icon B3.
  • the screen A2 shown in FIG. 2B is displayed on the display device 23.
  • the user moves the pointer to the icon B14 on the screen A2 and selects the icon B14.
  • the screen A3 shown in FIG. 3A is displayed on the display device 23.
  • the user moves the pointer to the icon B21 on the screen A3 and selects the icon B21.
  • the screen A4 shown in FIG. 3B is displayed on the display device 23.
  • When the processing device 21 is to be operated by the operation system 10, the calculation unit 12 first causes the screen A1 to be displayed on the display device 16. The user confirms the coordinates of the icon B3 on the screen A1 and describes them in the second file. Next, the calculation unit 12 causes the display device 16 to display the screen A2. The user cuts out, from the screen A2, a template image for determining the transition from the screen A1 to the screen A2. For example, any of the schematic diagram D and the icons B11 to B14 is used as the template image. The user stores the cut-out template image as image data in the storage device 14. The user describes, in the second file, the file name of the template image referred to when determining the transition from the screen A1 to the screen A2.
  • the storage of the template image and the description of the file name in the second file are repeated in the same manner.
  • the user cuts out a template image for determining the transition of the screen from the screen A3 and stores it in the storage device 14 as image data.
  • the user describes the file name of the template image in the second file.
  • the first file and the second file are prepared for the operation by the operation system 10.
  • In the first file, the operations executed by the calculation unit 12, such as the generation of the operation command for selecting a specific icon and the reference to the template images described above, are described.
  • In the second file, the coordinates to be selected, the file names of the template images to be referenced, and the like are described.
  • the trigger for starting the operation by the operation system 10 is arbitrary.
  • the operation may be started based on a signal input to the processing device 21, a signal sent from another sensor, a measuring instrument, or the like.
  • The operation may be started in response to a signal transmitted from another operation system 10, or a signal transmitted from another processing device 21 operated by another operation system 10.
  • The signal may be transmitted during the operation by the other operation system 10, or at a time other than during that operation.
  • the operation may be started when a specific screen is displayed on the display device 23.
  • The operation may be started when the user inputs a specific instruction to the calculation unit 12.
  • FIG. 5 is a flowchart showing a procedure for operation by the operation system according to the embodiment.
  • the user creates the first file (step S1).
  • the user creates reference data (step S2) and creates a second file (step S3).
  • the operation system 10 automatically operates the processing device 21 using the created first file, reference data, and second file (step S4).
  • the operation system 10 acquires an output signal from the processing device 21 and inputs an operation signal corresponding to the processing device 21. Therefore, the operation system 10 can be applied to the existing processing device 21. By applying the operation system 10 to the processing device 21 whose operation is not automated, the operation of the processing device 21 can be automated. For example, the operation of the processing device 21 can be automated without modifying the processing device 21 or rewriting the program.
  • When operating the processing device 21, the operation file is referred to.
  • the operation file includes the first file and the second file.
  • In the first file, the generation operation for generating the operation command is described.
  • the second file is created separately from the first file and is referred to when the generation operation based on the description of the first file is executed.
  • a plurality of processing devices 21 of the same type may be used.
  • the plurality of processing devices 21 can be automatically operated by the plurality of operation systems 10.
  • In that case, a common operation file and common reference data may be used.
  • The operation file and the reference data are duplicated and used in the operation by each operation system 10.
  • the displayed screens may differ slightly even between processing devices 21 of the same type.
  • For example, the positions of the same icons on the same screen may be slightly different between the processing devices 21.
  • In that case, if common coordinates are used, the desired icon may not be selected.
  • the appearance of the template image may differ due to the influence of noise, deterioration of the video signal output unit of the processing device 21 over time, and the like. As a result, the accuracy of image recognition may decrease.
  • In the embodiment, the information used when generating the operation command is described in the second file, separate from the first file.
  • the description of the coordinates may be changed so as to correct the deviation.
  • the template image having a different appearance may be prepared, and the description of the second file may be changed so as to refer to the image data.
  • Thereby, differences between individual machines can be easily corrected, and the burden on the user in preparing the operation system 10 can be reduced.
  • a processing system 1 including an operation system 10 and a processing device 21 that can reduce the burden on the user is provided.
  • the second file is, for example, a text file. Therefore, even a user who has no knowledge about programming can describe the second file in a text format and can easily modify the second file.
  • the user may try the operation in order to confirm whether there is a problem in the operation by the operation system 10 before the operation of the production line. For example, the user confirms whether the icon is properly selected, whether the image is recognized normally, and the like in the operation attempt. When a malfunction in the operation is confirmed, the user appropriately corrects the description in the second file.
  • The calculation unit 12 may automatically correct the coordinates described in the second file. For example, in an operation trial, the calculation unit 12 generates a selection operation command for selecting the coordinates described in the second file and transmits it to the generation unit 13. The generation unit 13 generates an operation signal based on the selection operation command and transmits it to the processing device 21. The calculation unit 12 recognizes the image acquired by the acquisition unit 11 after the operation signal is transmitted, and determines whether or not the template image prepared in advance is displayed. When the template image is not displayed, the calculation unit 12 generates another selection operation command for selecting other coordinates different from those described in the second file. For example, the other coordinates are randomly determined from the vicinity of the coordinates described in the second file.
  • The calculation unit 12 transmits the other selection operation command to the generation unit 13.
  • The generation unit 13 generates another operation signal based on the other selection operation command and transmits it to the processing device 21.
  • The generation of another selection operation command is repeated, for example, until the prepared template image is displayed or until a preset number of attempts is reached.
  • Each time, the calculation unit 12 generates the selection operation command so as to select coordinates different from those tried previously.
  • FIG. 6 is a flowchart showing a part of an operation trial by the operation system according to the embodiment.
  • an operation using the first file, the second file, and the reference data is tried after steps S1 to S3 and before step S4.
  • The calculation unit 12 generates a selection operation command for selecting a part of the screen (step S11).
  • the generation unit 13 generates an operation signal based on the selection operation command and transmits it to the processing device 21.
  • the calculation unit 12 determines whether the operation to the processing device 21 by the operation signal is completed (step S12).
  • If not, the calculation unit 12 determines whether the number of trials of the selection operation is less than a specified number (step S13). When the number of trials is less than the specified number, the calculation unit 12 generates another selection operation command for selecting other coordinates (step S14). The generation unit 13 generates an operation signal based on that selection operation command and transmits it to the processing device 21. When the number of trials of the selection operation reaches the specified number, the calculation unit 12 notifies the user (step S15). For example, the display device 16 displays information informing the user that the operation cannot be properly executed.
  • When it is determined in step S12 that the operation corresponding to the selection operation command is completed, the calculation unit 12 modifies the coordinates described in the second file to the coordinates used for the selection operation command with which the selection operation could be executed (step S16). If the selection operation could be executed with the original coordinates described in the second file, the second file is not modified.
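Steps S11 to S16 can be sketched as a retry loop. This is only an illustration of the trial: `press` is a stub standing in for the generation unit plus the subsequent image-recognition check, and the search radius and trial limit are assumed parameters.

```python
import random

# Retry sketch for steps S11-S16: try the coordinates described in the
# second file; while the expected template does not appear, retry random
# nearby coordinates up to a trial limit, returning the coordinates to
# write back (and whether the second file needs modification).

def correct_coordinates(x, y, press, max_trials=10, radius=5, rng=random):
    if press(x, y):                  # S11/S12: described coordinates work
        return (x, y), False         # no modification of the second file
    for _ in range(max_trials - 1):  # S13/S14: nearby random retries
        nx = x + rng.randint(-radius, radius)
        ny = y + rng.randint(-radius, radius)
        if press(nx, ny):
            return (nx, ny), True    # S16: coordinates to write back
    return None, False               # S15: notify the user

# Described coordinates already select the icon: no correction needed.
print(correct_coordinates(120, 64, lambda x, y: (x, y) == (120, 64)))
# ((120, 64), False)
```

A `(None, False)` result corresponds to step S15, where the user is notified that the operation could not be executed.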
  • the response by the processing device 21 may be delayed.
  • In that case, the screen may not yet have changed at the time of the determination because of the delay, and may change normally only after the determination. Therefore, for a processing device 21 whose screen display on the display device 23 may be delayed, the interval for transmitting the operation signal is set longer, or the operation signal based on the selection operation command is transmitted a plurality of times. For example, in the flowchart shown in FIG. 6, when the operation is not completed after an operation signal is input to the processing device 21, the same operation signal is input to the processing device 21 again from the generation unit 13.
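The retransmission policy for slow devices can be sketched as follows; `send` and `completed` are stubs for the generation unit and the image-recognition check, and the waiting interval and send count are assumed parameters.

```python
import time

# Sketch of retransmission for devices with delayed display: send the
# same operation signal up to `max_sends` times, waiting `interval`
# seconds before each completion check.

def operate_with_retransmit(send, completed, interval=0.0, max_sends=3):
    for _ in range(max_sends):
        send()
        time.sleep(interval)  # allow the display to catch up
        if completed():
            return True
    return False              # give up; the caller notifies the user

# Hypothetical device that only reacts to the second transmission.
sent = []
ok = operate_with_retransmit(lambda: sent.append(1),
                             lambda: len(sent) >= 2)
print(ok, len(sent))
# True 2
```

Setting a longer `interval` corresponds to the other mitigation mentioned above: lengthening the transmission interval instead of retransmitting.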
  • FIG. 7 is a schematic diagram showing an operation system according to the first modification.
  • An input display device, which functions as both an input device and a display device, may be connected to the processing device 21.
  • a touch panel 24 is provided as an input display device.
  • the operation system 10a according to the first modification further includes a switching unit 17.
  • the processing system 1a includes an operation system 10a and a processing device 21.
  • the generation unit 13 generates an operation signal corresponding to the signal transmitted from the touch panel 24.
  • the switching unit 17 switches the signal input to the processing device 21. Specifically, a signal transmitted from the touch panel 24 and an operation signal transmitted from the generation unit 13 are input to the switching unit 17.
  • the switching unit 17 switches between a state in which the signal transmitted from the touch panel 24 is input to the processing device 21 and a state in which the operation signal transmitted from the generation unit 13 is input to the processing device 21. During the operation by the operation system 10a, the switching unit 17 inputs the operation signal to the processing device 21.
  • the switching unit 17 can prevent the user from operating the processing device 21 by the touch panel 24 during the operation by the operation system 10a. For example, it is possible to avoid the transition to another screen by the user's operation and the interruption of the operation by the operation system 10a.
  • the switching unit 17 includes, for example, a relay circuit or a switching circuit. By providing the switching unit 17, the processing device 21 in which the touch panel 24 is used can be operated by the operation system 10a.
  • When a touch panel is used, the coordinate deviation of the generated operation signal tends to be large. This is because the positional relationship between the coordinates of the sensor that detects the touch and the display is adjusted individually for each panel. Therefore, even when a plurality of touch panels 24 of the same type are used for a plurality of processing devices 21 of the same type, the coordinates selected or referred to during automatic operation must be corrected for each processing device 21.
  • the invention according to the embodiment is particularly suitable for the processing device 21 in which the touch panel 24 is used. According to the first modification, the burden on the user on the processing system 1a including the touch panel 24 can be reduced.
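One way to hold such a per-panel correction, sketched under the assumption that the deviation is a uniform shift, is to keep the shared coordinates once and apply a per-device offset measured on each touch panel; the offset table and key names below are assumptions for the example.

```python
# Sketch of per-device coordinate correction: coordinates from a shared
# second file are shifted by an offset measured once per touch panel,
# instead of editing every coordinate for every machine.

def apply_offset(coords, offset):
    dx, dy = offset
    return {name: (x + dx, y + dy) for name, (x, y) in coords.items()}

shared = {"icon_B3": (120, 64), "icon_B14": (480, 210)}
offsets = {"device_1": (0, 0), "device_2": (3, -2)}  # measured per panel

print(apply_offset(shared, offsets["device_2"]))
# {'icon_B3': (123, 62), 'icon_B14': (483, 208)}
```

With this scheme, only one offset per machine needs describing in each machine's second file, rather than every coordinate.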
  • FIG. 8 is a schematic diagram showing an operation system according to the second modification.
  • the operation system 10b according to the second modification further includes a switching unit 17 as compared with the operation system 10.
  • the signal transmitted from the generation unit 13 or the input device 22 to the processing device 21 is input to the processing device 21 via the switching unit 17.
  • the switching unit 17 switches between a state in which the signal transmitted from the input device 22 is input to the processing device 21 and a state in which the operation signal transmitted from the generation unit 13 is input to the processing device 21.
  • During the operation by the operation system 10b, the switching unit 17 inputs the operation signal to the processing device 21. This makes it possible to prevent the user from operating the processing device 21 with the input device 22 during the operation by the operation system 10b. According to the second modification, the convenience of the processing system 1b including the operation system 10b and the processing device 21 can be improved.
  • FIG. 9 is a schematic diagram showing an operation system according to the third modification.
  • the operation system 10c according to the third modification further includes reading devices 18a and 18b as compared with the operation system 10.
  • the reading devices 18a and 18b read the identification information.
  • As the identification information, a one-dimensional barcode, a two-dimensional barcode, or a radio-frequency identifier (RFID) can be used.
  • The reading devices 18a and 18b are barcode readers or RFID readers.
  • the reading device 18a reads the identification information of the operator in charge of the processing device 21.
  • The reading device 18b reads the identification information of the workpiece processed by the processing device 21.
  • When executing the processing by the processing device 21, the user (operator) places the workpiece in a place where the reading device 18b can read the identification information of the workpiece. After that, the user causes the reading device 18a to read his or her own identification information.
  • The calculation unit 12 starts the operation of the processing device 21 in response to the reading of the identification information by the reading device 18a. For example, the calculation unit 12 causes the reading device 18b to read the identification information of the workpiece, and then starts generating an operation command.
  • the operation system 10c may generate an operation signal according to the read identification information and input it to the processing device 21.
  • alternatively, the identification information may be input directly from the arithmetic unit 12 to the processing device 21, for example via a communication port using serial communication.
  • the operation system 10c may include other devices related to the processing executed by the processing device 21.
  • the operation of the processing device 21 may be started according to the operation of the other device. This eliminates the need for the user to operate the input device 15 or 22 to initiate the operation.
  • the convenience of the operation system 10c can be improved.
  • the convenience of the processing system 1c including the operation system 10c and the processing device 21 can be improved.
  • FIG. 10 is a schematic diagram showing an operation system according to the fourth modification.
  • the operation system 10d according to the fourth modification is connected to the management device 30 (second device).
  • the arithmetic unit 12 is connected to the management device 30 via a network, wired communication, or wireless communication.
  • the management device 30 is a higher-level device that transmits a command to the processing device 21.
  • the operation system 10d receives the command.
  • the operation system 10d executes an operation in response to the command.
  • the arithmetic unit 12 generates an operation command according to the received command and transmits it to the generation unit 13.
  • a plurality of operation files are stored in the storage device 14.
  • the arithmetic unit 12 selects one of the plurality of operation files according to the command transmitted from the management device 30.
  • the arithmetic unit 12 executes the program included in the selected operation file and generates an operation command.
  • the command transmitted from the management device 30 may also be used as a trigger for starting the operation.
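As a sketch, the file-selection step might look like the following. The command strings, file contents, and function names are hypothetical, assuming each operation file's program yields a sequence of operation commands.

```python
# Hypothetical sketch: one operation file per command from the management device.
# Each "operation file" is represented here as a callable returning the
# operation commands its program generates.
OPERATION_FILES = {
    "run_process": lambda: ["click START", "wait for dialog", "click OK"],
    "load_recipe": lambda: ["open menu", "select recipe", "click APPLY"],
}

def handle_command(command):
    # Select the operation file that matches the received command
    # and execute its program to generate operation commands.
    program = OPERATION_FILES[command]
    return program()
```

A lookup of this kind is what lets one higher-level device drive devices with different GUIs: only the stored operation files differ per device.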
  • the management device 30 may be connected to a plurality of operation systems 10d.
  • the operation by the operation system 10d may be started in response to the signal transmitted from another operation system 10c or the signal transmitted from another processing device 21 operated by another operation system 10d.
  • the signal may be transmitted at the time of operation by another operation system 10d, or may be transmitted at a time other than the operation by another operation system 10d.
  • the operation system 10d may input parameters related to the processing executed by the processing device 21 to the processing device 21.
  • examples of the parameters include the number of workpieces to be processed, conditions for executing the process, and the like.
  • the conditions are, for example, temperature, pressure, gas or liquid flow rate, current, voltage, processing time, and the like.
  • the operation system 10d generates an operation signal for inputting the numerical value of a parameter and transmits it to the processing device 21. As a result, the numerical value is entered into a text box or the like displayed on the display device 23.
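The translation of a parameter value into an operation signal can be sketched as follows. The event encoding and coordinates are invented for illustration, assuming an operation signal is a sequence of click and key events.

```python
# Hypothetical sketch: turn a numeric parameter into an operation signal that
# clicks a text box and types the value, ending with ENTER to confirm.
def parameter_to_signal(value, textbox_xy):
    events = [("click", textbox_xy)]                # focus the text box
    events += [("key", ch) for ch in str(value)]    # type each character
    events.append(("key", "ENTER"))                 # confirm the entry
    return events
```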
  • the management device 30 may transmit to the operation system 10d the parameters that are to be input from the operation system 10d to the processing device 21.
  • the operation system 10d receives the numerical value of such a parameter and generates an operation signal according to that value. As a result, the numerical value transmitted from the management device 30 is input to the processing device 21.
  • the management device 30 may transmit information about the workpiece to be processed to the operation system 10d.
  • the management device 30 transmits the work identification information (ID) to the operation system 10d.
  • the operation system 10d inputs the identification information to the processing device 21 via the communication port.
  • the operation system 10d may generate an operation signal according to the identification information and input it to the processing device 21.
  • a higher-level device may be installed to manage a plurality of processing devices.
  • in some cases, a command cannot be transmitted directly from the management device 30 to the processing device 21 because of incompatible communication interfaces between the processing device 21 and the management device 30, differences in communication standards, and the like.
  • in such cases, the processing device 21 may not accept the command transmitted from the management device 30.
  • by using the operation system 10d, the processing device 21 can be managed by the management device 30.
  • since it is not necessary to modify the processing device 21 for the management device 30, production efficiency can be improved more easily.
  • the management device 30 can easily acquire information on the operating rate or the operating status of one or more processing systems 1d.
  • FIG. 11 is a schematic diagram showing the hardware configuration of the operation system according to the embodiment.
  • the arithmetic unit 12 is a computer and includes a ROM 12a, a RAM 12b, a CPU 12c, and an HDD 12d.
  • the ROM 12a stores a program that controls the operation of the computer.
  • the ROM 12a stores programs necessary for realizing various processes in a computer.
  • the RAM 12b functions as a storage area in which the program stored in the ROM 12a is expanded.
  • the CPU 12c includes a processing circuit.
  • the CPU 12c reads the control program stored in the ROM 12a and controls the operation of the computer according to the control program.
  • the CPU 12c expands various data obtained by the operation of the computer into the RAM 12b.
  • the HDD 12d stores data necessary for the processing and data obtained in the course of the processing.
  • the HDD 12d may function as the storage device 14 shown in FIG.
  • Convenience can be improved by using the operation system, processing system, or operation method described above.
  • the convenience of the operation system can be improved by using a program for causing the computer to execute the above-mentioned processing or a storage medium in which the program is stored.
  • a program that causes a computer to execute the various data processing described above may be recorded on a non-transitory computer-readable storage medium such as a magnetic disk (flexible disk, hard disk, etc.), an optical disk (CD-ROM, CD-R, CD-RW, DVD-ROM, DVD±R, DVD±RW, etc.), or a semiconductor memory.
  • the data recorded on the storage medium can be read by a computer (or an embedded system).
  • any recording format (storage format) may be used.
  • the computer reads the program from the storage medium and causes the CPU to execute the instructions described in the program.
  • the acquisition (or reading) of the program may be performed through the network.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • Human Computer Interaction (AREA)
  • Automation & Control Theory (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

An operation system according to an embodiment of the present invention includes an acquisition unit, an arithmetic unit, and a generation unit. The acquisition unit acquires an output from a first device to be operated. The arithmetic unit generates an operation command based on the result of acquiring the output and on an operation file created in advance. The generation unit generates an operation signal corresponding to the first device based on the operation command and transmits it to the first device. The operation file includes a first file in which a generation action for generating the operation command is described, and a second file that is called when the generation action is executed and that contains information used to identify the output and to generate the operation signal.
PCT/JP2021/025172 2020-07-06 2021-07-02 Operation system, processing system, operation method, program, and storage medium WO2022009802A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020116434A JP7174014B2 (ja) 2020-07-06 2020-07-06 Operation system, processing system, operation method, and program
JP2020-116434 2020-07-06

Publications (1)

Publication Number Publication Date
WO2022009802A1 true WO2022009802A1 (fr) 2022-01-13

Family

ID=79552628

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/025172 WO2022009802A1 (fr) 2020-07-06 2021-07-02 Système d'exploitation, système de traitement, procédé d'exploitation, programme et support de stockage

Country Status (2)

Country Link
JP (2) JP7174014B2 (fr)
WO (1) WO2022009802A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2024067151A (ja) * 2022-11-04 2024-05-17 株式会社東芝 Equipment control system, equipment control device, and equipment control method

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2011115249A1 (fr) * 2010-03-19 2011-09-22 株式会社日立国際電気 Appareil de traitement de substrat
WO2013031735A1 (fr) * 2011-08-27 2013-03-07 株式会社エネサイバー Procédé de contrôle et de surveillance, et dispositif de contrôle et de surveillance
WO2013190714A1 (fr) * 2012-06-19 2013-12-27 日本電能株式会社 Système de commande automatique et procédé permettant d'automatiser un fonctionnement
JP2018535459A (ja) * 2015-07-02 2018-11-29 アクセンチュア グローバル サービスィズ リミテッド ロボットによるプロセス自動化

Also Published As

Publication number Publication date
JP2023014104A (ja) 2023-01-26
JP2022014215A (ja) 2022-01-19
JP7174014B2 (ja) 2022-11-17

Similar Documents

Publication Publication Date Title
JP5077826B2 (ja) Plant information display device and plant information display method
JP2010131705A (ja) Robot system provided with a plurality of robot mechanism units
WO2022009802A1 (fr) Operation system, processing system, operation method, program, and storage medium
JP2010152429A (ja) GUI application test support device and test support method
KR102291974B1 (ko) Semiconductor system and data editing support method
WO2022009764A1 (fr) Adjustment code generation device, industrial machine, adjustment code generation method, and adjustment code generation program
JP2003345506A (ja) Operation input device and image forming apparatus
JP2011186607A (ja) Field device, field device setting method, field device management apparatus, and program
EP3101537A1 Control device, control system, control method for control device, and control method for control system
US11703830B2 Production system, recovery system, production method, and information storage medium
JP2007034837A (ja) Electronic component mounting system
JP4558675B2 (ja) Numerical control device
US11036440B2 Image formation system having a plurality of image formation apparatuses and method for controlling them
WO2023276875A1 (fr) Operation system, processing system, processing system construction method, computer, operation method, program, and storage medium
JP4346387B2 (ja) Display control system
EP3247097B1 Printing apparatus, control method thereof, and storage medium
JP6643966B2 (ja) Safety system development support device
JP2007335963A (ja) Device setting apparatus, device setting method, and program
WO2017217317A1 (fr) Part data creation device, part data creation method, part data creation program, sheet metal data creation device, sheet metal data creation method, and sheet metal data creation program
JP6390468B2 (ja) Work instruction system for a manufacturing line
JPWO2019234897A1 (ja) Numerical control device and information processing device
US20170374219A1 Image processing apparatus, debugging assistance method and non-transitory computer-readable recording medium encoded with debugging assistance program
CN112272802B (zh) Support device and recording medium
JP7283965B2 (ja) Wireless tag reading device and program
TWI802163B (zh) Script-based control system and control method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21837541

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21837541

Country of ref document: EP

Kind code of ref document: A1