WO2023167324A1 - Production system and reproduction method - Google Patents


Info

Publication number
WO2023167324A1
Authority
WO
WIPO (PCT)
Prior art keywords
virtual
production
event
unit
display
Prior art date
Application number
PCT/JP2023/008145
Other languages
French (fr)
Japanese (ja)
Inventor
裕樹 大江
悠太 畑中
彩 松永
佳史 斧山
智史 山口
Original Assignee
株式会社安川電機
Priority date
Filing date
Publication date
Application filed by 株式会社安川電機 (Yaskawa Electric Corporation)
Publication of WO2023167324A1

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00: Controls for manipulators

Definitions

  • This disclosure relates to a production system and a reproduction method.
  • Patent Document 1 describes a controller that operates one or more robots in a real space based on an operation program; a virtual controller that operates one or more virtual robots, corresponding to the one or more robots, in a virtual space based on the operation program; an operation interrupting unit that interrupts the motion of the one or more robots by the controller; an interrupted-state reproducing unit that reproduces in the virtual space the state of the real space in which the motion of the one or more robots was interrupted; and a restart simulator that restarts the motion of the one or more virtual robots, based on the operation program, by the virtual controller in the virtual space reproducing the state of the real space.
  • The present disclosure provides a system effective for ex-post verification of the operation of production equipment.
  • A production system includes a simulator that operates a virtual production apparatus corresponding to a production apparatus in a virtual space, based on transition information representing transitions of control signals collected from a controller that controls the production apparatus including a robot, and on model information of the production apparatus; and a display unit that displays on a monitor the virtual space in which the virtual production apparatus operates.
  • A reproduction method includes operating a virtual production apparatus corresponding to a production apparatus in a virtual space, based on transition information representing transitions of control signals collected from a controller that controls the production apparatus including a robot, and on model information of the production apparatus, and displaying on a monitor the virtual space in which the virtual production apparatus operates.
  • FIG. 1 is a schematic diagram illustrating the configuration of a production system
  • FIG. 1 is a schematic diagram illustrating the configuration of a robot
  • FIG. 2 is a block diagram illustrating functional configurations of a data collection device and a simulation device
  • FIG. 10 is a diagram illustrating extraction of state changes of a virtual peripheral object
  • FIG. 10 is a diagram illustrating extraction of state changes of a virtual peripheral object
  • It is a block diagram illustrating a modification of the simulation device
  • FIG. 11 is a block diagram showing a further modified example of the simulation device
  • FIG. 4 is a diagram illustrating a display screen
  • 2 is a block diagram illustrating hardware configurations of a data collection device, a simulation device, a host controller, and a local controller
  • FIG. 9 is a flowchart illustrating a procedure for displaying an interval setting screen
  • FIG. 8 is a flowchart illustrating a procedure for displaying an initial screen
  • FIG. 12 is a flowchart illustrating the simulation procedure in FIG. 11
  • FIG. 5 is a flowchart illustrating a display update procedure
  • FIG. 14 is a flowchart illustrating the screen transition procedure in FIG. 13
  • FIG. 14 is a flowchart illustrating the reproduction procedure in FIG. 13
  • a production system 1 shown in FIG. 1 is a system for producing a product in a physical space.
  • a product may be any tangible object produced by mechanical processing and assembly of one or more parts.
  • Real space is the space in which tangible objects actually exist.
  • the production system 1 includes a production device 2 and a controller 3.
  • the production device 2 produces a product by executing a processing process on the work W in the real space.
  • the work W is a tangible object handled by the production apparatus 2 to constitute at least part of the product.
  • the work W may be a part to be assembled into a product, an intermediate product formed by assembling parts or the like, or a final finished product itself.
  • the processing process for work W includes multiple tasks.
  • a task is a pre-ordered set of operations for a given work purpose. At least one of the plurality of tasks can be shared by two or more different processing processes. Depending on the content of the processing process, the execution order of the plurality of tasks can be changed, but the order of operations within one task cannot be changed. Examples of the plurality of tasks include carrying a base part into the work area, assembling a part to the base part, fixing the part to the base part by fastening, welding, or the like, and carrying out of the work area the product completed by assembling one or more parts to the base part.
  • the production equipment 2 includes a plurality of machines 5.
  • the plurality of machines 5 includes at least robots.
  • the plurality of machines 5 may further include machines other than robots.
  • Examples of machines other than robots include, but are not limited to, transfer devices that transfer the workpieces W, devices that adjust the position and orientation of the workpieces W to be worked on by the robots, and machine tools that machine the workpieces W. The plurality of machines 5 can have any structure as long as they can execute a task on the work W.
  • a plurality of tasks may include two or more tasks to be executed sequentially in series, or may include two or more tasks that can be executed concurrently by a plurality of machines 5 .
  • Examples of tasks executed by the robot include transportation of the work W, assembly of the work W, fixation of the work W by fastening, loading/unloading of the work W to/from peripheral machines such as machine tools, and the like.
  • Examples of tasks executed by the machine tool include opening and closing a door, chucking the loaded workpiece W, rotating or moving the workpiece W, changing tools, arranging or moving a tool relative to the workpiece W, and releasing the chuck after machining.
  • the plurality of machines 5 shown in FIG. 1 includes a transfer device 5A, robots 5B, 5C, and 5D, and a machine tool 5E, but is not limited to this.
  • the number and types of the machines 5 may be changed in any way, as long as the machines 5 include at least a robot.
  • the transport device 5A transports the work W by being driven by, for example, an electric motor.
  • Examples of the conveying device 5A include belt conveyors, roller conveyors, carousels, and the like.
  • the robots 5B and 5C are vertically articulated robots, and have an articulated arm 10 and an end effector 50, as shown in FIG.
  • the end effector 50 acts on the workpiece W.
  • Examples of the end effector 50 include a hand for gripping the work W, a suction nozzle for sucking the work W, a welding torch for welding the work W, a screw tightening tool for screwing the work W, and the like.
  • the end effector 50 is connected to the articulated arm 10.
  • the articulated arm 10 changes the position and orientation of the end effector 50 by multi-joint motion.
  • the articulated arm 10 includes a base portion 11, a turning portion 12, a first arm 13, a second arm 14, a swing portion 15, a third arm 17, a tip portion 18, actuators 41, 42, 43, 44, 45, 46.
  • the base 11 is installed around the transport device 5A.
  • the swivel part 12 is provided on the base part 11 so as to swivel around a vertical axis 21 .
  • the first arm 13 is connected to the swivel portion 12 so as to swing about an axis 22 that intersects (for example, is perpendicular to) the axis 21 .
  • Here, "intersecting" includes a skewed relationship in which the axes cross without meeting, like a grade separation.
  • the second arm 14 is connected to the tip of the first arm 13 so as to swing about an axis 23 substantially parallel to the axis 22 .
  • the second arm 14 includes a swinging portion 15 and a turning portion 16 .
  • the swinging portion 15 is connected to the tip of the first arm 13 and extends along an axis 24 that intersects (for example, orthogonally) the axis 23 .
  • the swivel portion 16 is connected to the distal end portion of the swing portion 15 so as to swivel around the axis 24 and further extends along the axis 24 .
  • the third arm 17 is connected to the distal end of the turning section 16 so as to swing about an axis 25 intersecting (for example, perpendicular to) the axis 24 .
  • the distal end portion 18 is connected to the distal end portion of the third arm 17 so as to pivot about an axis 26 that intersects (for example, is perpendicular to) the axis 25 .
  • An end effector 50 is attached to the distal end portion 18 .
  • the articulated arm 10 includes a joint 31 connecting the base portion 11 and the swivel portion 12; a joint 32 connecting the swivel portion 12 and the first arm 13; a joint 33 connecting the first arm 13 and the second arm 14; a joint 34 connecting the swinging portion 15 and the turning portion 16 in the second arm 14; a joint 35 connecting the turning portion 16 and the third arm 17; and a joint 36 connecting the third arm 17 and the tip portion 18.
  • the actuators 41, 42, 43, 44, 45, 46 include, for example, electric motors and speed reducers, and drive the joints 31, 32, 33, 34, 35, 36, respectively.
  • the actuator 41 turns the swivel portion 12 about the axis 21
  • the actuator 42 swings the first arm 13 about the axis 22
  • the actuator 43 swings the second arm 14 about the axis 23
  • the actuator 44 pivots the pivot 16 about the axis 24
  • the actuator 45 pivots the third arm 17 about the axis 25
  • the actuator 46 pivots the tip 18 about the axis 26 .
  • the configuration of the articulated arm 10 can be changed as appropriate.
  • the articulated arm 10 may be a 7-axis redundant robot in which a 1-axis joint is added to the 6-axis configuration described above, or may be a so-called SCARA-type articulated robot.
  • the robot 5D is a robot capable of autonomous travel.
  • the robot 5D is similar to the robots 5B and 5C, but the base 11 of the articulated arm 10 is self-propelled.
  • An example of the self-propellable base 11 is an electric automatic guided vehicle (AGV: Automated Guided Vehicle).
  • the machine tool 5E performs cutting and the like on the work W that is carried in and out by the robots 5B and 5C.
  • Examples of the machine tool 5E include an NC lathe, an NC milling machine, a machining center, and the like.
  • the production device 2 may further include peripheral devices whose movable ranges overlap with any of the robots 5B, 5C, and 5D.
  • the movable range of the robot 5C may overlap with that of the robot 5B.
  • the controller 3 controls a plurality of machines 5.
  • the controller 3 has a plurality of local controllers 400 that respectively control the plurality of machines 5, and a host controller 300.
  • the plurality of local controllers 400 include a local controller 400A that controls the transfer device 5A, a local controller 400B that controls the robot 5B, a local controller 400C that controls the robot 5C, a local controller 400D that controls the robot 5D, and a local controller 400E that controls the machine tool 5E.
  • the host controller 300 collects state information of the physical space, and outputs task execution commands to a plurality of local controllers 400 based on a predetermined control program and the collected state information. Each of the multiple local controllers 400 controls the machine 5 based on execution instructions output from the host controller 300 .
  • the local controller 400 determines the output to be supplied to the machine 5 in order to perform the task corresponding to the execution command, and supplies the determined output to the machine 5 . To determine the output, local controller 400 acquires, generates, or outputs one or more control signals.
  • the local controller 400B determines the output to be supplied to the robot 5B in order to perform the task corresponding to the execution command, and repeatedly supplies the determined output to the robot 5B at a predetermined control cycle from the start to the completion of the task. In order to supply the above outputs to the robot 5B, the local controller 400B acquires, generates, or outputs one or more control signals in each control cycle.
  • the execution command may include designation of one of a plurality of motion programs held by local controller 400B.
  • the local controller 400B reads out the motion program corresponding to the execution command, calculates, based on the read motion program, the target position/target orientation of the end effector 50 for performing the task, and calculates target angles of the joints 31, 32, 33, 34, 35, and 36 for placing the end effector 50 at the target position/target orientation. Based on the target angles, the local controller 400B determines target currents to be supplied to the actuators 41, 42, 43, 44, 45, and 46. The target currents are examples of the above outputs, and the target position/target orientation and the target angles are included in the one or more control signals.
  • the one or more control signals also include a program execution history indicating which part (eg, line or block) of the motion program has been executed.
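As a rough illustration of the per-cycle output determination described above, the sketch below derives target currents from the deviation between target and current joint angles. The patent does not disclose a control law; the proportional gain and all names here are hypothetical.

```python
def control_cycle(target_angles, current_angles, gain=0.5):
    """One control cycle: derive target currents (examples of the outputs
    supplied to the robot) from the deviation between the target joint
    angles and the current joint angles. A plain proportional law stands
    in for whatever control algorithm the local controller actually uses."""
    return [gain * (t - c) for t, c in zip(target_angles, current_angles)]

# Two joints: one behind its target, one ahead of it.
currents = control_cycle([0.2, 0.4], [0.0, 0.5])
```

In a real controller this loop would repeat at the fixed control period until the task completes, with each cycle's inputs and outputs logged as control signals.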
  • the local controller 400B may output commands to the end effector 50 placed at the target position/target posture. For example, when the end effector 50 is a hand, the local controller 400B may output a workpiece W gripping command to the end effector 50 . The local controller 400B may output a screw tightening command to the end effector 50 when the end effector 50 is a screw tightening tool. Both the grip command and the screw tightening command are included in one or more control signals.
  • the robot 5B may have cameras 51 and 52 on the tip portion 18 or the turning portion 12 or the like.
  • the local controller 400B may recognize the task execution target based on imaging data acquired from the cameras 51 and 52, and calculate the target position/target orientation of the end effector 50 based on the recognition result.
  • the recognition result of the task execution target is included in the control signal of the production system 1 .
  • the local controller 400C also acquires, generates, or outputs one or more control signals similar to those of the local controller 400B in each control cycle.
  • the local controller 400D of the robot 5D, whose base 11 is movable, acquires, generates, or outputs for each control cycle, in addition to one or more control signals similar to those of the local controller 400B, a target position of the base 11, a current position of the base 11, a target speed of the base 11 based on the deviation between the target position and the current position, and the like. The target position of the base 11, the current position of the base 11, and the target speed of the base 11 are also included in the one or more control signals.
  • the local controller 400E sequentially outputs operation commands to the machine tool 5E for performing tasks corresponding to the execution commands.
  • the execution command may include designation of one of a plurality of machining programs held by the local controller 400E.
  • the local controller 400E reads the machining program corresponding to the execution command and, based on the read machining program, sequentially outputs to the machine tool 5E a door opening command, a chucking command for the work W, a tool setting command, a tool placement command, a rotation drive command for the tool or the work, a feed command for the tool or the work, and the like.
  • the local controller 400E may acquire a notification of completion of the corresponding operation from the machine tool 5E for each operation command, and output the next operation command to the machine tool 5E after receiving the completion notification. All of the above execution commands, each motion command, and completion notification for each motion command are included in one or more control signals.
  • the one or more control signals also include a program execution history indicating which part (e.g., line or block) of the machining program has been executed.
  • the control signal acquired, generated or output by each of the plurality of local controllers 400 may further include one or more status signals representing the status of at least one of the corresponding machine 5 and local controller 400.
  • each of the plurality of local controllers 400 may generate a completion notification indicating that a task has been completed normally, and output it to the host controller 300.
  • each of the plurality of local controllers 400 may generate an alarm to report an abnormality in the machine 5 or an abnormality in the local controller 400 and output it to the host controller 300 .
  • the local controllers 400B, 400C, and 400D may generate an alarm notifying the occurrence of an overload or overcurrent in at least one of the actuators 41, 42, 43, 44, 45, and 46, and output it to the host controller 300. All of the completion notifications and alarms described above are included in the one or more status signals.
  • the host controller 300 collects the one or more control signals from the plurality of local controllers 400.
  • the host controller 300 performs wired or wireless synchronous communication with a plurality of local controllers 400 to collect one or more control signals.
  • Synchronous communication means communication performed every cycle in synchronization with a synchronization frame of a constant communication period; the plurality of machines 5 also communicate with the local controllers 400 every cycle by this synchronous communication.
  • the host controller 300 may store state records in which one or more control signals collected in the same cycle are associated with the same time.
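The state record described above, which ties the control signals collected in one synchronous cycle to a single timestamp, could be organized along these lines. This is a Python sketch; the record layout and signal names are assumptions, not taken from the patent.

```python
from dataclasses import dataclass, field

@dataclass
class StateRecord:
    """Control signals collected in one synchronous cycle, keyed by machine.

    Hypothetical structure; the field names are illustrative only.
    """
    time: float                                   # common timestamp for this cycle
    signals: dict = field(default_factory=dict)   # e.g. {"robot_5B": {...}}

class HostController:
    """Minimal sketch of state-record storage in the host controller."""

    def __init__(self):
        self.records = []

    def collect_cycle(self, time, per_machine_signals):
        # Signals collected in the same cycle share the same time stamp.
        self.records.append(StateRecord(time, dict(per_machine_signals)))

host = HostController()
host.collect_cycle(0.000, {"robot_5B": {"joint_angles": [0.0] * 6}})
host.collect_cycle(0.004, {"robot_5B": {"joint_angles": [0.01] * 6}})
```

Keying every cycle's signals to one shared timestamp is what later allows the records to be replayed in chronological order.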
  • the host controller 300 may further collect position information and the like of the work W based on the detection results of the environment sensor 6 provided in the physical space, and store the information as part of the state record.
  • the positional information of the workpiece W and the like collected by the host controller 300 may be obtained as one or more control signals by a plurality of local controllers 400 .
  • Examples of the environment sensor 6 include a load sensor provided at a position where the work W can be placed, or a camera that captures the position where the work W can be placed. If the environment sensor 6 is a camera, the host controller 300 may generate position information and the like of the workpiece W based on the image processing result of image data captured by the environment sensor 6 . The host controller 300 may further collect data captured by the environment sensor 6 and the cameras 51 and 52, and store the collected captured data in the state record.
  • the host controller 300 generates task execution commands for the plurality of local controllers 400 based on the stored state records and on production instructions based on the production plan.
  • At least one of the plurality of local controllers 400 may autonomously determine the execution timing of the task corresponding to the execution command based on the state records stored by the host controller 300.
  • the data collection device 100 accumulates transition information representing transitions of the multiple types of control signals collected by the multiple local controllers 400 .
  • the data collection device 100 acquires and sequentially accumulates the state records stored by the host controller 300. As a result, transition information representing the transitions of the plurality of control signals is accumulated.
  • Transition means change over time, and is represented by a plurality of control signals arranged in time series.
  • the plurality of control signals need only be stored in association with time so that they can be arranged in chronological order, and they do not necessarily have to be arranged in chronological order within the data collection device 100 .
  • the data collection device 100 may acquire transition information about at least one of the plurality of types of control signals from any one of the plurality of local controllers 400 without going through the host controller 300 .
  • At least one of the plurality of machines 5 may be directly controlled by the host controller 300 without going through the local controller 400 .
  • Control signals for the machine 5 directly controlled by the host controller 300 are collected by the host controller 300 .
  • All machines 5 may be directly controlled by the host controller 300 (the controller 3 may not be divided into the host controller 300 and a plurality of local controllers 400).
  • the simulation apparatus 200 executes, for various purposes such as prior verification of the operation of the production apparatus 2, a simulation that operates a virtual production apparatus V2 (a model of the production apparatus 2) corresponding to the production apparatus 2 in a virtual space VS. For example, the simulation apparatus 200 calculates, based on the model information of the production apparatus 2, simulation data representing the state of the plurality of machines 5 after operation.
  • the model information of the production equipment 2 includes model information of a plurality of machines 5 .
  • Each of the model information of the plurality of machines 5 includes, for example, information on the arrangement, structure, shape, size, etc. of the corresponding machine 5 in the real space.
  • the shape and size of the machine 5 in the physical space are represented by three-dimensional data such as polygon data or voxel data.
  • the simulation data calculated based on the model information of the production device 2 is the virtual production device V2 (model of the production device 2), and the coordinate system on which the simulation data is based is the virtual space VS.
  • the configurations of the data collection device 100 and the simulation device 200 are illustrated in detail below.
  • the data collection device 100 includes a collection unit 111 and a database 113 as functional components (hereinafter referred to as "function blocks").
  • the collection unit 111 acquires status records from the host controller 300 and sequentially accumulates them in the database 113 .
  • the simulation device 200 has a program storage unit 211, a virtual controller 212, a model holding unit 213, a simulator 214, a result holding unit 215, and a display unit 216 as functional blocks.
  • the program storage unit 211 stores the programs (for example, the above control program, the plurality of motion programs, the plurality of machining programs, and the like) executed by the host controller 300 and the plurality of local controllers 400 to control the plurality of machines 5.
  • the virtual controller 212 executes programs stored in the program storage unit 211 in the same procedure as when the host controller 300 and the plurality of local controllers 400 control the plurality of machines 5 .
  • the model holding unit 213 stores model information of one or more models included in the virtual space VS.
  • the one or more models include at least the models of the production device 2 (models of the plurality of machines 5) described above.
  • the simulator 214 causes the virtual production apparatus V2 to operate in the virtual space VS based on the model information stored by the model holding unit 213 and the execution result of the program by the virtual controller 212 .
  • the simulator 214 operates the model of the robot 5B in the virtual space VS by forward kinematics calculation based on the angles of the joints 31, 32, 33, 34, 35, and 36 calculated by the virtual controller 212 and on the model information of the robot 5B. The same applies to the robots 5C and 5D.
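The forward kinematics calculation mentioned above can be illustrated with a deliberately simplified planar chain (homogeneous transforms over two revolute joints) rather than the 6-axis arm of the embodiment. Everything here is a sketch, not the disclosed implementation.

```python
import math

def rot_z(theta):
    """Homogeneous 2D transform: rotation by theta (no translation)."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]

def trans_x(d):
    """Homogeneous 2D transform: translation by d along x."""
    return [[1.0, 0.0, d], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def forward_kinematics(joint_angles, link_lengths):
    """Chain rotations and link translations; return end-effector (x, y)."""
    t = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]  # identity
    for theta, length in zip(joint_angles, link_lengths):
        t = matmul(t, rot_z(theta))
        t = matmul(t, trans_x(length))
    return t[0][2], t[1][2]

# Two unit links: first joint at +90 degrees, second at -90 degrees.
x, y = forward_kinematics([math.pi / 2, -math.pi / 2], [1.0, 1.0])
```

The real simulator would apply the same idea in three dimensions, driving the robot model's pose from the joint angles supplied by the virtual controller.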
  • the simulator 214 operates the virtual production apparatus V2 in the virtual space VS each time the execution result of the program by the virtual controller 212 is updated, and accumulates the simulation results including the virtual production apparatus V2 after the operation in the result holding unit 215.
  • the simulation results include models of multiple machines 5 after operation.
  • the models of the machines 5 include the three-dimensional data representing the appearance of the machines 5 after operation.
  • the display unit 216 displays the virtual space VS in which the virtual production device V2 is operating on a monitor (for example, the monitor of the user interface 295 described later) based on the simulation results accumulated in the result holding unit 215 in chronological order. For example, the display unit 216 reads the simulation results accumulated in the result holding unit 215 in chronological order, converts the three-dimensional data included in the simulation results into two-dimensional data from a predetermined viewpoint, and displays the data on the monitor. As a result, the moving image of the virtual space VS in which the virtual production device V2 is operating is reproduced on the monitor.
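The conversion from three-dimensional simulation data to two-dimensional data from a predetermined viewpoint can be reduced, for illustration, to an orthographic projection that drops the coordinate along the view axis. A real display unit would use perspective rendering; this stand-in and its names are assumptions.

```python
def project_orthographic(points_3d, viewpoint_axis="z"):
    """Drop the coordinate along the view axis (orthographic projection).

    A deliberately simple stand-in for the rendering a real display
    unit would perform.
    """
    drop = {"x": 0, "y": 1, "z": 2}[viewpoint_axis]
    keep = [i for i in range(3) if i != drop]
    return [(p[keep[0]], p[keep[1]]) for p in points_3d]

# Vertices of a unit cube viewed along z collapse onto the four corners
# of the unit square (each corner appearing twice).
cube = [(x, y, z) for x in (0, 1) for y in (0, 1) for z in (0, 1)]
flat = project_orthographic(cube, "z")
```

Changing `viewpoint_axis` corresponds to the patent's point that reproduction results can be displayed from any viewpoint.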
  • By using the simulation, it is possible to construct the production apparatus 2 efficiently while verifying the operation of the virtual production apparatus V2 in the virtual space VS. On the other hand, it may be necessary to verify the operation of the production apparatus 2 after the fact. For example, when an unexpected event occurs in the operation of the production apparatus 2 in the real space, it may be necessary to verify the cause of the event after the fact.
  • Therefore, the simulation device 200 operates the virtual production device V2 in the virtual space VS based on the transition information representing the transitions of the control signals collected from the controller 3 and on the model information of the production device 2, and displays on the monitor the virtual space VS in which the virtual production device V2 operates based on the transition information.
  • control signals are, for example, past control signals acquired, generated, or output by the plurality of local controllers 400 when operating the plurality of machines 5 in the physical space, as described above.
  • transition means change over time and is represented by a plurality of control signals arranged in time series.
  • Hereinafter, displaying on the monitor the virtual space VS in which the virtual production apparatus V2 operates based on the transition information of past control signals is referred to as "log reproduction".
  • With log reproduction, phenomena occurring in the real space can be reproduced even if the entirety of the production apparatus 2 having the plurality of machines 5 is not exhaustively photographed. Reproduction results can also be displayed from any viewpoint. Log reproduction is therefore effective for ex-post verification of system operation.
  • the simulation device 200 further includes a transition information extraction unit 217 and a transition information storage unit 218 as functional blocks.
  • the transition information extraction unit 217 extracts the transition information of the target section of log reproduction from the database 113 and stores it in the transition information storage unit 218 .
  • the target section means a section on the time axis of transition information.
  • the transition information extraction unit 217 extracts a plurality of types of control signals included in the state record corresponding to the target section from the database 113 and stores them in the transition information storage unit 218 in association with the time included in the state record.
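The extraction of the target section, that is, the records whose timestamps fall within an interval on the time axis, might look like the following. The `(time, signals)` record format is a hypothetical stand-in for the state records.

```python
def extract_target_section(state_records, t_start, t_end):
    """Return the records whose time stamps fall within [t_start, t_end].

    Each record is a (time, signals) pair; records need not arrive sorted,
    so the result is ordered by time for later chronological playback.
    """
    section = [r for r in state_records if t_start <= r[0] <= t_end]
    return sorted(section, key=lambda r: r[0])

records = [(0.008, {"j1": 0.2}), (0.000, {"j1": 0.0}), (0.004, {"j1": 0.1})]
section = extract_target_section(records, 0.003, 0.009)
```

The sorted output is what a transition information storage unit would hand to the simulator for replay.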
  • When executing log reproduction, the simulator 214 operates the virtual production apparatus V2 in the virtual space VS based on the transition information stored in the transition information storage unit 218, instead of the result of program execution by the virtual controller 212. For example, the simulator 214 reads the multiple types of control signals from the transition information storage unit 218 in chronological order, and operates the virtual production apparatus V2 in the virtual space VS based on the read control signals and the model of the production apparatus 2 stored in the model holding unit 213.
  • the simulator 214 operates the virtual robot V5B corresponding to the robot 5B in the virtual space VS by forward kinematics calculation based on the angles of the joints 31, 32, 33, 34, 35, and 36 read from the transition information storage unit 218 and on the model information of the robot 5B. The same applies to the robots 5C and 5D.
  • the simulator 214 causes the virtual production apparatus V2 to operate in the virtual space VS, and accumulates the simulation results including the virtual production apparatus V2 after operation in the result holding unit 215.
  • the display unit 216 causes the monitor to display the virtual space VS in which the virtual production device V2 is operating, based on the simulation results accumulated in the result holding unit 215 in time series corresponding to the transition information of the target section. For example, the display unit 216 reads the plurality of simulation results corresponding to the transition information of the target section in chronological order, converts the three-dimensional data included in the simulation results into two-dimensional data from a predetermined viewpoint, and displays the data on the monitor. As a result, a moving image of the virtual space VS in which the virtual production apparatus V2 is operating is reproduced on the monitor based on the transition information of the target section.
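Chronological playback of the accumulated results (reading the logged signals in time order, operating the model, and collecting one frame per record) can be sketched as follows; the model update is reduced to a placeholder function, and all names are hypothetical.

```python
def log_reproduction(transition_info, model_fk):
    """Replay logged joint angles through a model's forward-kinematics
    function and collect one simulation result (frame) per record.

    `model_fk` stands in for the simulator's model update; here it maps
    joint angles to an end-effector pose.
    """
    frames = []
    for time, joint_angles in transition_info:        # chronological order
        pose = model_fk(joint_angles)                 # operate the model
        frames.append({"time": time, "pose": pose})   # accumulate result
    return frames

# Toy model: the "pose" is just the sum of the joint angles.
frames = log_reproduction(
    [(0.0, [0.0, 0.0]), (0.004, [0.1, 0.2])],
    model_fk=lambda q: sum(q),
)
```

Feeding the frames to a display loop at the original cycle period would reproduce the motion as a moving image.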
  • Examples of peripheral objects include the work W, a work tool held by the end effector 50, and the like.
  • the simulation apparatus 200 further extracts the state change of the virtual space VS based on the operation of the virtual production apparatus V2 in the virtual space VS, and displays the extracted state change of the virtual space VS on the monitor.
  • the transition of the control signal stored in the database 113 can also be effectively used to reproduce the state change of surrounding objects. By including state changes of virtual surrounding objects in the display, phenomena occurring in the real space can be verified more easily.
  • the model holding unit 213 stores model information of one or more peripheral objects in addition to the model information of the production device 2 .
  • the model information of the surrounding objects includes information such as the arrangement, structure, shape and size of the surrounding objects.
  • the simulator 214 accumulates, in the result holding unit 215, simulation results further including one or more virtual peripheral objects (models of one or more peripheral objects) arranged in the virtual space VS.
  • the simulation device 200 further has an extraction unit 219 as a functional block.
  • the extraction unit 219 extracts state changes in the virtual space VS based on the operation of the virtual production apparatus V2 in the virtual space VS.
  • the extraction unit 219 extracts the state change of the virtual space VS in the portion not operated by the control signal, based on the operation of the virtual production apparatus V2 by the control signal. For example, when one virtual machine V5 (model of machine 5) operates according to the control signal, the extraction unit 219 extracts state changes of virtual peripheral objects (models of peripheral objects) that do not operate according to the control signal.
  • the virtual peripheral object may be a work W or work tool, or may be another virtual machine V5 (a model of another machine 5).
  • the extraction unit 219 may extract at least one state change of a plurality of models based on the positional relationship between the plurality of models (the plurality of models included in the virtual space VS) stored in the model holding unit 213. Based on the positional relationship, state changes can be easily extracted.
  • a state change may be a change in the position of the model, a change in the shape of the model, or a change in the size of the model.
  • for example, the extraction unit 219 may extract a change that removes, from the model of the work W, a portion that interfered with the model of the grindstone.
  • the state change may be a change from a state in which the position of the model in the production system 1 is independent of the positions of the other models (hereinafter referred to as the "independent state") to a state in which its position is related to the positions of the other models (hereinafter referred to as the "dependent state").
  • for example, when the model of the end effector 50 grips the model of the work W, the extraction unit 219 extracts, as a state change, that the model of the work W has changed from the independent state to the dependent state with respect to the model of the end effector 50.
  • the extraction unit 219 converts the position/orientation of the model of the work W in the virtual space VS into a relative position/orientation based on the position/orientation of the model of the end effector 50 .
  • when the simulator 214 operates the virtual production device V2 in the virtual space VS, the model of the work W moves together with the model of the end effector 50 based on the relative position/relative orientation. Since the movement of the model of the work W accompanying the movement of the model of the end effector 50 is displayed on the monitor by the display unit 216, the change from the independent state to the dependent state is shown on the monitor.
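The independent-to-dependent rewrite described above can be sketched as follows; this is a minimal illustration with hypothetical names, using planar poses (x, y, yaw) rather than the full three-dimensional poses of the virtual space VS. At grip time, the work's world pose is rewritten relative to the end effector, so subsequent effector motion carries the work along.

```python
import math

def compose(parent, rel):
    """Compose a parent world pose (x, y, yaw) with a relative pose."""
    px, py, pyaw = parent
    rx, ry, ryaw = rel
    c, s = math.cos(pyaw), math.sin(pyaw)
    return (px + c * rx - s * ry, py + s * rx + c * ry, pyaw + ryaw)

def to_relative(parent, world):
    """Rewrite a world pose as a pose relative to the parent (inverse compose)."""
    px, py, pyaw = parent
    wx, wy, wyaw = world
    c, s = math.cos(pyaw), math.sin(pyaw)
    dx, dy = wx - px, wy - py
    return (c * dx + s * dy, -s * dx + c * dy, wyaw - pyaw)

effector = (1.0, 0.0, 0.0)
work = (1.0, 0.5, 0.0)             # independent state: pose in world coordinates
rel = to_relative(effector, work)  # grip: switch to the dependent state

effector = (3.0, 2.0, 0.0)         # the simulator moves the effector model
work = compose(effector, rel)      # the work model follows
print(work)                        # -> (3.0, 2.5, 0.0)
```

After grip release, storing `work` back as a plain world pose (and discarding `rel`) restores the independent state, matching the behavior described for the extraction unit 219.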
  • the state change to be extracted is a change from the state before the button is pressed to the state after the button is pressed.
  • for example, the change to be extracted is that the state of the model of the work W with respect to the model of the end effector 50 changes from the independent state to the above dependent state.
  • the extraction unit 219 may extract at least the state change of the model of the production system 1 included in the virtual space VS based on the control signal and the purpose information associated with the control signal. Based on the purpose information, it is possible to more accurately extract the state change of the virtual peripheral object.
  • the purpose information represents, for example, the work purpose of the task corresponding to the execution command, which is an example of the control signal.
  • work purposes include “grasping”, “conveying”, “joining”, and “fastening”.
  • purpose information may be associated with at least one of a plurality of programs (a plurality of motion programs and a plurality of machining programs).
  • at least one of the plurality of types of control signals is associated with the purpose information via the program.
  • control signals representing program execution history are associated with the purpose information.
  • the extraction unit 219 identifies, based on the control signal representing the execution history of the program, the program that was executed when the control signal was acquired, generated, or output, and acquires the purpose information associated with the identified program from the program storage unit 211 as the purpose information associated with the control signal.
  • the extraction unit 219 may identify the virtual peripheral object on which the virtual end effector V50 acts, based on the positional relationship between the one or more virtual peripheral objects included in the virtual space VS and the virtual end effector V50 (the model of the end effector 50), and extract the state change of the identified virtual peripheral object based on the control signal and the purpose information relating to the operation of the virtual end effector V50.
  • for example, the extraction unit 219 recognizes that the program that was being executed when the control signal for gripping the workpiece W with the end effector 50 (a hand) was generated is associated with the purpose information "gripping". Based on the recognition result that "gripping" is associated with the program, the extraction unit 219 rewrites the position/orientation of the virtual work VW (the model of the work W) in the virtual space VS to a relative position/relative orientation based on the position of the virtual end effector V50. After the rewriting, when the simulator 214 moves the virtual end effector V50 in the virtual space VS, the virtual work VW moves together with the virtual end effector V50 based on the relative position/relative orientation.
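A minimal sketch of the purpose-information lookup (the names and the data shapes are assumptions, not the actual format of the program storage unit 211): the control signal's execution history names a program, and the program is associated with purpose information.

```python
program_storage = {                  # program name -> associated purpose information
    "pick_program": "gripping",
    "place_program": "grip release",
    "weld_program": "joining",
}

def purpose_of(control_signal):
    """Look up the purpose info of the program named in the execution history."""
    program = control_signal.get("executed_program")
    return program_storage.get(program)  # None if no purpose info is associated

signal = {"executed_program": "pick_program", "value": 1}
print(purpose_of(signal))  # -> gripping
```

The extraction unit would then branch on the returned purpose (for example, "gripping" triggers the relative-pose rewrite described above).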
  • FIGS. 4 and 5 include virtual robot V5B and virtual robot V5C operating in virtual space VS.
  • Virtual robots V5B and V5C are models of robots 5B and 5C, each having a virtual end effector V50 (model of end effector 50).
  • the end effector 50 of the robot 5B is a hand
  • the end effector 50 of the robot 5C is a joining tool for welding or the like.
  • the virtual space VS includes virtual tables VT1 and VT2, which are virtual peripheral objects, in addition to the virtual robots V5B and V5C.
  • Virtual works VW1 and VW2, which are virtual peripheral objects, are arranged on the virtual table VT1.
  • based on the transition information and the model information of the production apparatus 2, the simulator 214 causes the virtual robot V5B to move the virtual end effector V50 to the position for gripping the virtual work VW1 (see FIG. 4(a)), and then operates the virtual end effector V50 so as to grip the virtual work VW1.
  • the extraction unit 219 recognizes that the control signal for operating the virtual end effector V50 to grip the virtual work VW1 is associated with the purpose information "gripping", and rewrites the position/orientation of the virtual work VW1 in the virtual space VS to a relative position/orientation based on the position of the virtual end effector V50.
  • the simulator 214 causes the virtual robot V5B to move the virtual end effector V50 to a position where the virtual work VW1 is placed on the virtual work VW2 based on the transition information and the model information of the production apparatus 2 .
  • the simulator 214 moves the virtual work VW1 together with the virtual end effector V50 based on the movement of the virtual end effector V50 and the relative position/orientation, and places it on the virtual work VW2 (see FIG. 4(b)).
  • the simulator 214 operates the virtual end effector V50 to release the grip of the virtual work VW1.
  • the extraction unit 219 recognizes that the control signal for operating the virtual end effector V50 to release the grip of the virtual work VW1 is associated with the purpose information "grip release", and rewrites the relative position/relative orientation of the virtual work VW1 with respect to the virtual end effector V50 to a position/orientation in the virtual space VS.
  • the simulator 214 causes the virtual robot V5B to withdraw the virtual end effector V50 from the virtual work VW1 based on the transition information and the model information of the production apparatus 2 .
  • since the position of the virtual work VW1 has been rewritten to a position/orientation in the virtual space VS, the virtual work VW1 remains on the virtual work VW2 without moving together with the virtual end effector V50 (see FIG. 4(c)).
  • based on the transition information and the model information of the production apparatus 2, the simulator 214 causes the virtual robot V5C to bring the virtual end effector V50, which is the joining tool, into contact with the virtual work VW1 and the virtual work VW2 (see FIG. 5(a)).
  • the extraction unit 219 recognizes that the control signal for bringing the virtual end effector V50 into contact with the virtual work VW1 and the virtual work VW2 is associated with the purpose information "joining", and rewrites the position/orientation of the virtual work VW2 in the virtual space VS to a relative position/orientation based on the position of the virtual work VW1.
  • based on the transition information and the model information of the production apparatus 2, the simulator 214 causes the virtual robot V5C to withdraw the virtual end effector V50 from the virtual works VW1 and VW2, and causes the virtual robot V5B to move its virtual end effector V50 to the position for gripping the virtual work VW1 (see FIG. 5(b)).
  • the simulator 214 operates the virtual end effector V50 so as to grip the virtual work VW1.
  • the extraction unit 219 recognizes, from the program storage unit 211, that the control signal for operating the virtual end effector V50 so as to grip the virtual work VW1 is associated with the purpose information "gripping the work W", and rewrites the position/orientation of the virtual work VW1 in the virtual space VS to a relative position/orientation based on the position of the virtual end effector V50. After that, based on the transition information and the model information of the production apparatus 2, the simulator 214 causes the virtual robot V5B to move the virtual end effector V50 to a position where the virtual work VW3 is arranged on the virtual table VT2.
  • based on the movement of the virtual end effector V50, the relative position/relative orientation of the virtual work VW1 with respect to the virtual end effector V50, and the relative position/relative orientation of the virtual work VW2 with respect to the virtual work VW1, the simulator 214 moves the virtual work VW1 together with the virtual end effector V50 and moves the virtual work VW2 together with the virtual work VW1. As a result, the virtual works VW1 and VW2 move onto the virtual table VT2 on which the virtual work VW3 is arranged (see FIG. 5(c)).
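The chained dependency in this example (VW2 dependent on VW1, VW1 dependent on the end effector) can be sketched as a small parent-child graph; positions are simplified to one dimension and all names are hypothetical.

```python
parents = {"VW1": "V50", "VW2": "VW1"}   # dependent-state graph (child -> parent)
relative = {"VW1": 0.5, "VW2": 0.25}     # offsets fixed when the grip/join occurred
world = {"V50": 0.0}                     # only the effector has a world position

def resolve(name):
    """World position = parent's world position + stored relative offset."""
    if name in world:
        return world[name]
    return resolve(parents[name]) + relative[name]

world["V50"] = 10.0                      # the simulator moves the effector model
print(resolve("VW1"), resolve("VW2"))    # -> 10.5 10.75
```

Releasing the grip would correspond to removing "VW1" from `parents` and storing its resolved position directly in `world`, so it no longer follows the effector.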
  • the simulation device 200 may be configured to further display a graph representing the transition of the control signal on the monitor.
  • the simulation device 200 may display the graph on the monitor so that it can be visually recognized together with the virtual space VS.
  • the simulation device 200 may display the graph in association with a time point index indicating a point in time in the operation of the virtual production apparatus V2 displayed on the monitor.
  • the value of the control signal in the graph can be instantly associated with the display of the virtual space VS.
  • the simulation device 200 further includes a graph generator 221 and a synchronizer 222 as functional blocks.
  • the graph generation unit 221 reads the transition information of the target section from the transition information storage unit 218 and generates the above graph based on the read transition information.
  • the synchronization unit 222 synchronizes the time point in the operation of the virtual production apparatus V2 displayed on the monitor with the time point indicated by the time point index on the graph. For example, the synchronization unit 222 identifies, on the time axis of the transition information, the time point corresponding to the operation of the virtual production apparatus V2 displayed on the monitor (the time point reproduced by that operation), and identifies the position on the graph corresponding to the identified time point. The display unit 216 displays the graph generated by the graph generation unit 221 on the monitor, and displays the time point index in association with the position identified by the synchronization unit 222.
  • as long as the position identified by the synchronization unit 222 can be visually recognized, there is no particular limitation on the display form of the time point index.
  • examples of the time point index include a line drawn on the graph through the position identified by the synchronization unit 222, a point displayed at that position, and an arrow pointing to that position.
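A sketch of how a time point index position might be computed (the layout numbers are assumptions): the synchronization unit's mapping from a time on the transition-information time axis to a horizontal position on the graph is a linear interpolation.

```python
def time_to_x(t, t_start, t_end, x_left, x_right):
    """Map a time in [t_start, t_end] to a pixel x in [x_left, x_right]."""
    frac = (t - t_start) / (t_end - t_start)
    return x_left + frac * (x_right - x_left)

# Suppose the graph spans 0-60 s of transition information over pixels 100-700.
x = time_to_x(15.0, 0.0, 60.0, 100, 700)
print(x)  # -> 250.0
```

The index line (or point, or arrow) is then drawn at the returned x-position across the stacked graphs.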
  • the display unit 216 may cause the monitor to display an event index representing the event in association with the position on the graph corresponding to the time represented by the event information, based on the event information representing the time when the specific event occurred.
  • An example of an event index is a combination of a position index indicating a position on a graph and an icon indicating occurrence of an event.
  • based on the event information indicating the time when the specific event occurred, the display unit 216 may cause the monitor to display a notification indicating that the event has occurred, in association with the operation of the virtual production apparatus V2 corresponding to that time. For example, the display unit 216 may display the notification, at the timing when the operation of the virtual production apparatus V2 corresponding to the event information is displayed on the monitor, at a position that can be visually recognized together with that operation. This makes it possible to quickly recognize how the production device 2 was operating when the event occurred.
  • as long as the notification can be associated with the operation of the virtual production apparatus V2 corresponding to the time represented by the event information, there is no particular limitation on its display form.
  • examples of the display form of the notification include displaying text or a diagram notifying the occurrence of the event together with the operation of the virtual production apparatus V2, and highlighting the operation of the virtual production apparatus V2 corresponding to the time represented by the event information by at least partially changing its color.
  • the simulation device 200 may further include a detection unit 223, an event storage unit 225, and an event notification unit 226.
  • the detection unit 223 detects the occurrence of an event based on the control signal and generates the event information. For example, the detection unit 223 may detect the occurrence of an event based on the status signal such as the alarm described above, or may detect the occurrence of the event based on the magnitude of the deviation described above. Since the detection unit 223 detects an event based on the control signal, convenience is improved.
  • the detection unit 223 may detect that the robots 5B, 5C, and 5D are overloaded as an event.
  • the detector 223 may detect overload based on the status signal, or may detect overload based on the deviation between the target angle and the feedback value.
  • the detection unit 223 sequentially reads the plurality of status records stored in the database 113, checks for the presence or absence of an event based on each read status record, and stores a detected event in the event storage unit 225 in association with the time common to the status record. As a result, the event information is accumulated in the event storage unit 225.
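A hypothetical sketch of the detection unit 223's scan over status records: an alarm flag or a deviation between the target value and the feedback value exceeding a threshold yields event information. The record format and the threshold value are assumptions for illustration.

```python
DEVIATION_LIMIT = 0.5  # assumed overload threshold

def detect_events(records):
    """Scan status records in time order and build event information."""
    events = []
    for rec in records:
        if rec.get("alarm"):
            events.append({"time": rec["time"], "event": "alarm"})
        elif abs(rec["target"] - rec["feedback"]) > DEVIATION_LIMIT:
            events.append({"time": rec["time"], "event": "overload"})
    return events

records = [
    {"time": 0.0, "target": 1.0, "feedback": 1.0, "alarm": False},
    {"time": 0.1, "target": 1.0, "feedback": 0.2, "alarm": False},
    {"time": 0.2, "target": 1.0, "feedback": 1.0, "alarm": True},
]
print(detect_events(records))
```

Each returned entry corresponds to one piece of event information stored in the event storage unit, keyed by the time common to the status record.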
  • the event notification unit 226 confirms, based on the event information stored in the event storage unit 225, whether there is an event occurring at the time corresponding to the operation of the virtual production apparatus V2 displayed on the monitor by the display unit 216.
  • when the event notification unit 226 identifies an event that occurred at the time corresponding to the operation of the virtual production apparatus V2, it causes the display unit 216 to display the event index and the notification of the identified event.
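A minimal sketch of the event notification check (the event table and the matching tolerance are assumptions): while a given time is displayed, events recorded for that time are turned into on-screen notifications.

```python
events = {0.2: "overload", 0.9: "alarm"}  # event time -> event name (assumed)

def notifications_for(display_time, tol=1e-6):
    """Events whose recorded time matches the currently displayed time."""
    return [name for t, name in events.items() if abs(t - display_time) <= tol]

print(notifications_for(0.2))  # -> ['overload']
print(notifications_for(0.5))  # -> []
```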
  • the simulation device 200 may be configured to set the target section based on user input.
  • the simulation device 200 may further have a section setting unit 227.
  • the section setting unit 227 sets the target section based on user input.
  • the section setting unit 227 displays an input screen on which the start time and the end time can be individually input, and sets the section from the input start time to the input end time as the target section.
  • the simulation device 200 may further include an event identification unit 228.
  • the event identification unit 228 identifies at least an event of the production system 1 based on user input.
  • the section setting unit 227 may set the target section so as to include the point in time when the event specified by the event specifying unit 228 occurs.
  • the event identification unit 228 displays a list of the events detected by the detection unit 223 based on the event information stored in the event storage unit 225, and identifies an event selected from the list by user input as the above event.
  • the section setting unit 227 sets, as the start time, the point in time a predetermined period before the occurrence of the event identified by the event identification unit 228, sets the point in time at which the event occurred as the end time, and sets the section from the start time to the end time as the target section.
  • the section setting unit 227 may set the end time to a point in time past the point at which the event occurred.
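The section setting around an identified event might look like the following sketch; the period and margin values are assumptions, not values given in the text.

```python
PRE_EVENT_PERIOD = 10.0   # assumed: seconds before the event
POST_EVENT_MARGIN = 2.0   # assumed: optional margin past the event

def target_section(event_time, include_margin=False):
    """Target section: a predetermined period before the event up to the event
    (or slightly past it when a margin is requested)."""
    start = event_time - PRE_EVENT_PERIOD
    end = event_time + (POST_EVENT_MARGIN if include_margin else 0.0)
    return start, end

print(target_section(42.0))        # -> (32.0, 42.0)
print(target_section(42.0, True))  # -> (32.0, 44.0)
```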
  • based on the photographed data of the production apparatus 2 photographed in the real space, the display unit 216 may cause the monitor to display a photographing index representing the photographed data in association with the position on the graph corresponding to the time when the photographed data was photographed. This makes it easier to associate the photographed data of the real space with the control signal.
  • the display unit 216 may display on the monitor the frames captured in the real space at the time indicated by the operation of the virtual production apparatus V2. It is possible to easily associate the frames captured in the real space with the virtual space.
  • a frame may be a still image or one frame of a moving image.
  • the simulation device 200 may further include a photographed data extraction unit 231, a photographed data storage unit 232, and a frame identification unit 233.
  • the photographed data extraction unit 231 extracts, from the database 113, the photographed data included in the state records corresponding to the target section, and stores it in the photographed data storage unit 232 in association with the time when the photographed data was photographed (for example, the time included in the state record).
  • the display unit 216 causes the monitor to display the shooting index based on the time points associated with the shooting data in the shooting data storage unit 232 .
  • an example of the shooting index is a combination of a position index indicating a position on the graph and an icon indicating the existence of photographed data.
  • the frame identification unit 233 identifies a frame captured in the real space at the time indicated by the operation of the virtual production apparatus V2 based on the time associated with the captured data stored in the captured data storage unit 232.
  • the display unit 216 displays the virtual space VS and the frame specified by the frame specifying unit 233 on the monitor.
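The frame identification can be sketched as a nearest-timestamp lookup; the frame times below are illustrative, not from the text.

```python
import bisect

frame_times = [0.0, 0.5, 1.0, 1.5, 2.0]  # times stored with the photographed data

def identify_frame(t):
    """Return the index of the frame captured closest to time t."""
    i = bisect.bisect_left(frame_times, t)
    candidates = [j for j in (i - 1, i) if 0 <= j < len(frame_times)]
    return min(candidates, key=lambda j: abs(frame_times[j] - t))

print(identify_frame(1.2))  # -> 2
```

The display unit would then show the frame at the returned index alongside the virtual space VS for the same playback time.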
  • the simulation device 200 may be configured to change the viewpoint based on user input. Calculation results from the simulator can be effectively used to reproduce moving images from arbitrary viewpoints.
  • the simulation device 200 further has a viewpoint setting section 241 and a viewpoint storage section 242 .
  • the viewpoint setting unit 241 sets the viewpoint for the virtual space VS based on user input.
  • the viewpoint setting unit 241 sets the position and direction of the viewpoint in the virtual space VS.
  • the display unit 216 displays the virtual space VS viewed from the viewpoint set by the viewpoint setting unit 241 .
  • the viewpoint storage unit 242 stores the viewpoint change history by the viewpoint setting unit 241 .
  • the viewpoint storage unit 242 stores a change history of the viewpoint on the time axis of the transition information.
  • the viewpoint storage unit 242 associates and stores the point in time (point in time on the time axis) displayed by the display unit 216 and the changed viewpoint.
  • the display unit 216 changes the viewpoint based on the viewpoint change history stored in the viewpoint storage unit 242 when displaying the operation of the virtual production apparatus V2. Viewpoint change history can be reused to improve convenience.
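A sketch of storing and reusing the viewpoint change history on the transition-information time axis (names hypothetical): the viewpoint in effect at time t is the last change recorded at or before t.

```python
history = []  # list of (time on the transition time axis, viewpoint)

def record_viewpoint(t, viewpoint):
    history.append((t, viewpoint))
    history.sort(key=lambda entry: entry[0])

def viewpoint_at(t, default="front"):
    """Viewpoint in effect at time t: the last change at or before t."""
    current = default
    for when, view in history:
        if when <= t:
            current = view
        else:
            break
    return current

record_viewpoint(0.0, "front")
record_viewpoint(5.0, "top")
print(viewpoint_at(3.0), viewpoint_at(7.0))  # -> front top
```

On replay, the display unit would query `viewpoint_at` for each displayed time instead of requiring the user to re-enter the viewpoint.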
  • the simulation device 200 may further have a confirmation unit 245 .
  • the confirmation unit 245 checks whether the configuration of the production apparatus 2 and the configuration of the virtual production apparatus V2 match, based on the transition information (for example, a plurality of state records) stored in the database 113 and the model information stored in the model holding unit 213. If the confirmation unit 245 determines that the two configurations do not match, it may display a mismatch notification on the monitor. This makes it easy to prevent inappropriate verification caused by a mismatch.
  • the configuration here includes at least the type and number of machines.
  • the confirmation unit 245 identifies the configuration of the production device 2 based on the data item labels (column headers, keys, etc.) in the transition information, identifies the configuration of the virtual production device V2 based on the model information of the production device 2, and compares the two identified configurations.
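A hypothetical sketch of the configuration comparison: machine types and counts are recovered from the data item labels and compared with the model information. The label format shown is an assumption for illustration.

```python
from collections import Counter

def config_from_labels(labels):
    """Count machine types from data item labels such as 'robot_5B.angle'."""
    return Counter(label.split("_")[0] for label in labels)

def configs_match(labels, model_machines):
    """Compare types and counts from the transition data with the model info."""
    return config_from_labels(labels) == Counter(model_machines)

labels = ["robot_5B.angle", "robot_5C.angle", "conveyor_5A.speed"]
print(configs_match(labels, ["robot", "robot", "conveyor"]))  # -> True
print(configs_match(labels, ["robot", "conveyor"]))           # -> False
```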
  • FIG. 8 is a schematic diagram illustrating an image that the display unit 216 causes the monitor to display.
  • the image illustrated in FIG. 8 includes a simulation area 250, a graph 260, a slide bar 261, a play button 262, a time indicator 263, an event indicator 264, and a plurality of shooting indicators 265.
  • the simulation area 250 is a two-dimensional simulation image showing the virtual space VS viewed from the above viewpoint, and includes a virtual transport device V5A, virtual robots V5B, V5C, and V5D, a virtual machine tool V5E, and a virtual work VW.
  • a graph 260 represents temporal changes in target sections of a plurality of control signals.
  • the graph 260 classifies the plurality of control signals into two or more groups and includes a graph for each group. The graphs are arranged vertically.
  • the horizontal axis represents time on the time axis of the transition information
  • the vertical axis represents the value of the control signal.
  • the time index 263 is displayed in association with the position on the graph 260 corresponding to the time displayed by the simulation image on the time axis of the transition information.
  • the time indicator 263 is linearly displayed on the graph 260 across multiple graphs.
  • the event index 264 is displayed at the position on the graph 260 corresponding to the time when the event occurred.
  • the shooting index 265 includes a still image shooting index 266 and a moving image shooting index 267.
  • the still image shooting index 266 indicates the shooting time point of the still image included in the shooting data.
  • the moving image shooting index 267 indicates the shooting section of the moving image included in the shooting data.
  • the slide bar 261 is an object that graphically represents the time on the time axis of transition information.
  • Slide bar 261 has a laterally extending track 268 and a thumb 269 that moves along track 268 .
  • the position of thumb 269 on track 268 represents the time on the time axis of the transition information.
  • when a time is designated by moving the thumb 269, the synchronization unit 222 identifies the display position of the time indicator 263 based on the designated time.
  • the display unit 216 changes the display of the simulation area 250 based on the simulation result at the designated time, and moves the time indicator 263 to the display position identified by the synchronization unit 222.
  • the play button 262 is a button for instructing to operate the virtual production apparatus V2 in the simulation area 250 based on the transition information.
  • the display unit 216 reads the simulation results accumulated in the result holding unit 215 in chronological order, moves the thumb 269 of the slide bar 261 to the position representing the time associated with each read simulation result, and changes the display of the simulation area 250 based on the read simulation result.
  • the synchronization unit 222 specifies the display position of the time indicator 263 based on the time associated with the simulation result read by the display unit 216, and the display unit 216 displays the time indicator 263 at the display position specified by the synchronization unit 222. move.
  • in FIG. 8, the time index 263 overlaps the still image shooting index 266 and the moving image shooting index 267, so a photographed data window 271 and a captured data window 272 are additionally displayed.
  • the photographed data window 271 displays a still image photographed by the camera 51 in a state where the end effector 50, which is a hand, is arranged at a position where the workpiece W can be grasped.
  • the captured data window 272 displays a frame corresponding to the point in time indicated by the time indicator 263 in the moving image captured by the environment sensor 6 .
  • FIG. 9 is a block diagram illustrating the hardware configuration of the local controller 400, the host controller 300, the data collection device 100, and the simulation device 200.
  • local controller 400 includes circuitry 490 .
  • Circuitry 490 includes processor 491 , memory 492 , storage 493 , driver circuitry 494 and communication port 495 .
  • the storage 493 is composed of one or more non-volatile memory devices such as flash memory or hard disk. The storage 493 stores programs for causing the local controller 400 to control the machine 5 .
  • the memory 492 is composed of one or more volatile memory devices such as random access memory. Memory 492 temporarily stores programs loaded from storage 493 .
  • the processor 491 is composed of one or more computing devices such as a CPU (Central Processing Unit) or a GPU (Graphics Processing Unit). The processor 491 executes a program loaded in the memory 492 to configure each functional block described above in the local controller 400 . A calculation result by the processor 491 is temporarily stored in the memory 492 .
  • the driver circuit 494 operates the machine 5 according to requests from the processor 491 .
  • the communication port 495 communicates with the host controller 300 via the communication network NW1 for control in response to requests from the processor 491 .
  • the host controller 300 has a circuit 390.
  • Circuit 390 has processor 391 , memory 392 , storage 393 , input/output port 394 , communication port 395 and communication port 396 .
  • the storage 393 is composed of one or more non-volatile memory devices such as flash memory or hard disk.
  • the storage 393 stores a program for causing the host controller 300 to collect the status records and generate the execution command described above.
  • the memory 392 is composed of one or more volatile memory devices such as random access memory. Memory 392 temporarily stores programs loaded from storage 393 .
  • the processor 391 is composed of one or more computing devices such as CPU or GPU. The processor 391 executes a program loaded in the memory 392 to configure each functional block described above in the host controller 300 . A calculation result by the processor 391 is temporarily stored in the memory 392 .
  • the input/output port 394 exchanges information with the environment sensor 6 in response to requests from the processor 391 .
  • Communication port 395 communicates with local controller 400 via communication network NW1 in response to requests from processor 391 .
  • the communication port 396 communicates with the data collection device 100 and the simulation device 200 via the communication network NW2 in response to requests from the processor 391 .
  • the communication network NW2 may be a network different from the communication network NW1, or may be the same network as the communication network NW1.
  • the data collection device 100 has a circuit 190.
  • Circuitry 190 includes processor 191 , memory 192 , storage 193 , communication port 194 and user interface 195 .
  • the storage 193 is composed of one or more non-volatile memory devices such as flash memory or hard disk.
  • the storage 193 stores a program for configuring the data collection device 100 with the functional blocks described above.
  • the memory 192 is composed of one or more volatile memory devices such as random access memory. Memory 192 temporarily stores programs loaded from storage 193 .
  • the processor 191 is composed of one or more arithmetic devices such as CPU or GPU. The processor 191 executes the program loaded in the memory 192 to configure the data collection device 100 with each functional block described above. A calculation result by the processor 191 is temporarily stored in the memory 192 .
  • The communication port 194 communicates with the simulation device 200 and the upper controller 300 via the communication network NW2 in response to requests from the processor 191.
  • The user interface 195 inputs and outputs information to and from an operator in response to requests from the processor 191.
  • The user interface 195 has a display device such as a liquid crystal monitor or an organic EL (Electro-Luminescence) monitor, and an input device such as a keyboard or mouse.
  • The input device may be integrated with the display device as a touch panel.
  • The simulation device 200 has a circuit 290.
  • The circuit 290 includes a processor 291, a memory 292, a storage 293, a communication port 294, and a user interface 295.
  • The storage 293 is composed of one or more non-volatile memory devices such as flash memory or hard disks.
  • The storage 293 stores a program for causing the simulation device 200 to configure the functional blocks described above.
  • The memory 292 is composed of one or more volatile memory devices such as random access memory. The memory 292 temporarily stores programs loaded from the storage 293.
  • The processor 291 is composed of one or more computing devices such as a CPU or GPU. The processor 291 executes the program loaded in the memory 292 to configure the above functional blocks in the simulation device 200. Calculation results produced by the processor 291 are temporarily stored in the memory 292.
  • The communication port 294 communicates with the data collection device 100 and the upper controller 300 via the communication network NW2 in response to requests from the processor 291.
  • The user interface 295 inputs and outputs information to and from an operator in response to requests from the processor 291.
  • The user interface 295 has a display device such as a liquid crystal monitor or an organic EL (Electro-Luminescence) monitor, and an input device such as a keyboard or mouse.
  • The input device may be integrated with the display device as a touch panel.
  • The simulation device 200 may be incorporated in the data collection device 100.
  • The simulation device 200 and the data collection device 100 may be incorporated in the host controller 300.
  • [Log playback procedure] As an example of the reproduction method, a log reproduction procedure executed by the simulation device 200 is illustrated. The log reproduction procedure includes operating the virtual production device V2 corresponding to the production device 2 in the virtual space VS, based on the transition information representing the transition of the control signals collected from the controller 3 and on the model information of the production device 2, and causing the monitor to display the virtual space VS in which the virtual production device V2 operates.
  • The log playback procedure includes, in order, a section setting screen display procedure, an initial screen display procedure, and a display update procedure. These procedures are exemplified below.
  • In step S01, the confirmation unit 245 waits for the log reproduction function to be called.
  • In step S02, the confirmation unit 245 acquires the configuration information of the production device 2 and the configuration information of the virtual production device V2.
  • The confirmation unit 245 acquires the configuration information of the production device 2 based on a plurality of state records stored in the database 113, and acquires the configuration information of the virtual production device V2 based on the model information stored in the model holding unit 213.
  • In step S03, the confirmation unit 245 confirms whether or not the configuration of the production device 2 and the configuration of the virtual production device V2 match, based on the configuration information acquired in step S02. If it is determined in step S03 that the configurations do not match, the simulation device 200 executes step S04. In step S04, the confirmation unit 245 causes the monitor to display a mismatch notification. After that, the simulation device 200 returns the processing to step S01.
  • In step S05, the detection unit 223 sequentially reads a plurality of state records from the database 113, checks for the occurrence of events based on the read state records, and stores event information in the event storage unit 225 based on the check results.
  • In step S06, the section setting unit 227 displays a target section setting screen.
  • The section setting unit 227 displays a setting screen including input boxes for the start time and the end time, and a list of the detected events. This completes the section setting screen display procedure.
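The configuration check and event scan above (steps S01-S06) can be condensed to a short sketch. All names here (`state_records`, the `Event` record, the threshold rule for what counts as an event) are illustrative assumptions, not taken from the actual implementation:

```python
from dataclasses import dataclass

@dataclass
class Event:
    time: float       # time at which the event was observed
    description: str  # human-readable summary shown in the event list

def configurations_match(real_config, virtual_config):
    """Step S03: the real production device and its virtual model
    must agree before log reproduction is allowed to proceed."""
    return real_config == virtual_config

def detect_events(state_records, threshold):
    """Step S05: sequentially scan state records and collect event
    information. An 'event' is illustrated here as a control-signal
    value exceeding a threshold; the actual detection rule is not
    specified in the text."""
    events = []
    for record in state_records:
        if record["signal"] > threshold:
            events.append(Event(record["time"], "signal exceeded threshold"))
    return events
```

A mismatch between the two configurations would lead to the mismatch notification of step S04; otherwise the collected events populate the list shown on the section setting screen.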
  • In step S11, the section setting unit 227 confirms whether or not a section has been specified by inputting the start time and the end time on the section setting screen.
  • In step S12, the event specifying unit 228 checks whether or not an event of the production system 1 has been selected from the event list on the section setting screen. If it is determined in step S12 that no event has been selected, the simulation device 200 returns the process to step S11.
  • When it is determined in step S11 that a section has been designated, the simulation device 200 executes step S13.
  • In step S13, the section setting unit 227 sets the target section from the start time to the end time.
  • If it is determined in step S12 that an event has been selected, the simulation device 200 executes step S14.
  • In step S14, the event specifying unit 228 specifies the selected event, and the section setting unit 227 sets the target section so as to include the point in time when the specified event occurs.
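The two ways of fixing the target section (steps S13 and S14) reduce to a small decision. The sketch below assumes a symmetric `margin` around the selected event's time, which the text does not specify:

```python
def set_target_section(start=None, end=None, event_time=None, margin=5.0):
    """Steps S11-S14 sketch: the target section is either given directly
    by a start and end time (S13), or derived so that it contains the
    point in time when the selected event occurred (S14)."""
    if start is not None and end is not None:
        return (start, end)
    if event_time is not None:
        # Assumed policy: center the section on the event occurrence.
        return (event_time - margin, event_time + margin)
    raise ValueError("either a time range or an event must be specified")
```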
  • After that, the simulation device 200 executes steps S15, S16, and S17.
  • In step S15, the transition information extraction unit 217 extracts the transition information of the log reproduction target section from the database 113 and stores it in the transition information storage unit 218.
  • In step S16, the graph generation unit 221 reads the transition information of the target section from the transition information storage unit 218, and generates the graph 260 based on the read transition information.
  • In step S17, the photographed data extraction unit 231 extracts the photographed data included in the state records corresponding to the target section from the database 113, and stores the photographed data in the photographed data storage unit 232 in association with the time at which it was photographed (for example, the time included in the state record).
  • In step S18, the simulator 214 operates the virtual production device V2 in the virtual space VS based on the transition information stored in the transition information storage unit 218.
  • In step S19, the display unit 216 sets the start time of the target section as the display target time, reads the simulation result corresponding to the display target time from the transition information storage unit 218, and generates a two-dimensional simulation image of the virtual space VS viewed from a predetermined viewpoint based on the read simulation result.
  • The display unit 216 causes the monitor to display the simulation area 250 including the generated simulation image, the graph 260 generated in step S16, and the imaging index 265 based on the imaging data extracted in step S17. This completes the initial screen display procedure.
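Steps S15 and S17 both amount to filtering the stored records down to the target section. A minimal sketch, assuming each record is a dict with a `time` key that carries either a control-signal value or a captured frame (these field names are assumptions):

```python
def extract_section(records, section):
    """S15/S17 sketch: keep only records inside the target section,
    separating control-signal transitions (graphed in S16) from
    captured footage (indexed by capture time)."""
    start, end = section
    in_section = [r for r in records if start <= r["time"] <= end]
    transitions = [r for r in in_section if "signal" in r]
    footage = {r["time"]: r["frame"] for r in in_section if "frame" in r}
    return transitions, footage
```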
  • FIG. 12 is a flowchart illustrating the simulation procedure in step S18. As shown in FIG. 12, the simulation apparatus 200 first executes steps S21 and S22.
  • In step S21, the simulator 214 extracts a plurality of types of control signals corresponding to the start time of the target section from the transition information stored in the transition information storage unit 218.
  • In step S22, the simulator 214 generates the virtual space VS as it was immediately before the virtual production device V2 begins to operate, based on the extracted control signals and the model information of the production device 2 stored in the model holding unit 213.
  • In step S23, the simulator 214 extracts the next plurality of types of control signals from the transition information stored in the transition information storage unit 218.
  • In step S24, the simulator 214 operates the virtual production device V2 in the virtual space VS based on the next control signals and the model information of the production device 2 stored in the model holding unit 213.
  • In step S25, the extraction unit 219 confirms whether or not the virtual end effector V50 is positioned at the start position of work on a virtual peripheral object, based on the control signal related to the operation of the virtual end effector V50 and the purpose information. If it is determined in step S25 that the virtual end effector V50 is positioned at the work start position, the simulation device 200 executes step S26. In step S26, the extraction unit 219 identifies the virtual peripheral object on which the virtual end effector V50 acts, based on the positional relationship between one or more virtual peripheral objects included in the virtual space VS and the virtual end effector V50 (the model of the end effector 50).
  • In step S27, the extraction unit 219 confirms whether or not the virtual end effector V50 is positioned at the position where the work on the virtual peripheral object is completed, based on the control signal related to the operation of the virtual end effector V50 and the purpose information. If it is determined in step S27 that the virtual end effector V50 is positioned at the work completion position, the simulation device 200 executes step S28. In step S28, the extraction unit 219 extracts the state change of the virtual peripheral object based on the purpose information, and changes the state of the virtual peripheral object in the virtual space VS based on the extracted state change.
  • In step S29, the simulator 214 confirms whether or not the operation of the virtual production device V2 based on the transition information has been completed. If it has not been completed, the simulation device 200 returns the process to step S23. If it has been completed, the simulation device 200 completes the simulation procedure.
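The simulation loop (steps S21-S29) can be condensed to the following sketch. The record fields (`pose`, `work_done`, `target`) and the `act_on` callback standing in for the purpose information are all illustrative assumptions:

```python
def simulate(transitions, model, act_on):
    """Replay control-signal transitions in order (S23-S24), starting
    from the state just before operation (S21-S22). When a record marks
    work on a peripheral object as completed (S27), the object's state
    is changed accordingly (S28). The loop ends when the transitions
    run out (S29)."""
    state = {"pose": transitions[0]["pose"],
             "objects": dict(model["objects"])}
    for step in transitions[1:]:
        state["pose"] = step["pose"]
        if step.get("work_done"):
            target = step["target"]
            state["objects"][target] = act_on(state["objects"][target])
    return state
```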
  • In step S31, the display unit 216 confirms whether or not the slide bar 261 has been operated (whether or not the thumb 269 has been moved). If so, the simulation device 200 executes steps S32 and S33.
  • In step S32, the display unit 216 identifies the display target time on the time axis of the transition information based on the position of the thumb 269.
  • In step S33, the display unit 216 executes screen transition processing to the display target time. The contents of the screen transition processing will be described later.
  • In step S34, the viewpoint setting unit 241 confirms whether or not an operation to change the viewpoint has been performed, such as dragging within the simulation area 250. If it is determined in step S34 that such an operation has been performed, the simulation device 200 executes steps S35 and S36.
  • In step S35, the viewpoint setting unit 241 changes the viewpoint with respect to the virtual space VS and causes the viewpoint storage unit 242 to store the viewpoint change history.
  • In step S36, the display unit 216 causes the simulation area 250 to display the virtual space VS viewed from the changed viewpoint.
  • If it is determined in step S34 that an operation to change the viewpoint has not been performed, the simulation device 200 executes step S37.
  • In step S37, the display unit 216 confirms whether or not a reproduction operation has been performed using the reproduction button 262. If so, the simulation device 200 executes step S38.
  • In step S38, the display unit 216 executes reproduction processing. Details of the reproduction processing will be described later.
  • After that, the simulation device 200 executes step S39. If it is determined in step S37 that the reproduction operation has not been performed, the simulation device 200 executes step S39 without executing steps S33, S36, and S38.
  • In step S39, the display unit 216 confirms whether or not an instruction to end the log reproduction function has been given. If it is determined in step S39 that the end of the log reproduction function has not been instructed, the simulation device 200 returns the process to step S31. If it is determined that the end has been instructed, the simulation device 200 completes the display update procedure.
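The branching in the display update loop (S31-S39) is a plain dispatch over user operations. The operation dicts and state keys below are assumptions for illustration:

```python
def handle_operation(op, state):
    """One pass of the display update loop: slider moves jump the
    display target time (S32-S33), viewpoint changes are applied and
    logged (S35-S36), the play button starts reproduction (S38), and a
    quit request ends the log reproduction function (S39)."""
    if op["kind"] == "slider":
        state["time"] = op["time"]
    elif op["kind"] == "viewpoint":
        state["viewpoint"] = op["view"]
        state["history"].append(op["view"])   # change history (S35)
    elif op["kind"] == "play":
        state["playing"] = True
    elif op["kind"] == "quit":
        state["done"] = True
    return state
```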
  • FIG. 14 is a flowchart illustrating the screen transition procedure in step S33.
  • As shown in FIG. 14, the simulation device 200 first executes steps S41, S42, S43, and S44.
  • In step S41, the display unit 216 reads the simulation result corresponding to the display target time from the result holding unit 215.
  • In step S42, the viewpoint setting unit 241 sets the viewpoint corresponding to the display target time based on the change history stored in the viewpoint storage unit 242.
  • In step S43, the display unit 216 generates a simulation image of the virtual space VS viewed from the set viewpoint based on the read simulation result.
  • In step S44, the synchronization unit 222 identifies the display position of the time indicator 263 based on the display target time.
  • In step S45, the event notification unit 226 checks whether an event occurred at the display target time. If it is determined in step S45 that an event occurred, the simulation device 200 executes step S46. In step S46, the display unit 216 adds an event occurrence notification to the generated simulation image.
  • After that, the simulation device 200 executes step S47. If it is determined in step S45 that no event occurred, the simulation device 200 executes step S47 without executing step S46.
  • In step S47, the frame identification unit 233 confirms whether or not there is photographed data captured at the display target time. If it is determined in step S47 that there is photographed data, the simulation device 200 executes step S48.
  • In step S48, the frame identification unit 233 identifies the frame captured in the physical space at the display target time based on the time associated with the photographed data. The display unit 216 then generates captured data windows 271 and 272 for displaying the frames identified by the frame identification unit 233.
  • After that, the simulation device 200 executes step S49. If it is determined in step S47 that there is no photographed data, the simulation device 200 executes step S49 without executing step S48.
  • In step S49, the display unit 216 updates the display of the simulation area 250 based on the generated simulation image, moves the time indicator 263 to the identified display position, and displays the generated captured data windows 271 and 272. This completes the screen transition procedure.
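Screen transition (S41-S49) gathers everything displayed for one target time. Here the result holding unit, viewpoint history, event information, and footage are modeled as plain dict lookups keyed by time, which is an assumption made for illustration:

```python
def screen_transition(t, results, viewpoints, events, footage):
    """Assemble the content shown for display target time t: the
    simulation result (S41), the viewpoint for that time (S42), an
    event notification if one occurred then (S45-S46), and the
    captured frame if footage for that time exists (S47-S48)."""
    return {
        "simulation": results[t],
        "viewpoint": viewpoints.get(t, "default"),
        "event_notice": events.get(t),    # None if no event at t
        "captured_frame": footage.get(t), # None if nothing was filmed
    }
```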
  • FIG. 15 is a flowchart illustrating the reproduction procedure in step S38.
  • As shown in FIG. 15, the simulation device 200 first executes steps S51, S52, and S53.
  • In step S51, the display unit 216 advances the display target time by one cycle (for example, the communication cycle described above).
  • In step S52, the display unit 216 executes screen transition processing to the updated display target time.
  • The procedure of step S52 is the same as the screen transition procedure described above.
  • In step S53, the display unit 216 confirms whether or not the display target time is the end time of the target section. If it is not, the simulation device 200 executes step S54.
  • In step S54, the viewpoint setting unit 241 confirms whether or not an operation to change the viewpoint has been performed. If it is determined in step S54 that such an operation has been performed, the simulation device 200 executes step S55.
  • In step S55, the viewpoint setting unit 241 changes the viewpoint with respect to the virtual space VS and causes the viewpoint storage unit 242 to store the viewpoint change history. The result of the viewpoint change is reflected in the display of the simulation area 250 in the next screen transition processing.
  • In step S56, the display unit 216 confirms whether or not a stop operation has been performed using the playback button 262.
  • In step S57, the display unit 216 confirms whether or not one cycle has elapsed since the execution of step S51. If one cycle has not elapsed, the simulation device 200 returns the process to step S54. If one cycle has elapsed, the simulation device 200 returns the process to step S51.
  • If it is determined in step S53 that the display target time is the end time of the target section, or if it is determined in step S56 that a stop operation has been performed, the simulation device 200 completes the reproduction procedure.
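The reproduction loop (S51-S57) advances one communication cycle at a time until the end of the section or a stop operation. In the sketch below, `stop_at` stands in for the asynchronous stop check of step S56 and is an assumption:

```python
def play(section, cycle, stop_at=None):
    """Advance the display target time by one cycle per iteration (S51),
    performing a screen transition at each new time (S52), and stop at
    the end of the target section (S53) or when a stop operation
    occurs (S56)."""
    start, end = section
    t = start
    shown = []
    while True:
        t = round(t + cycle, 9)   # S51: one communication cycle forward
        shown.append(t)           # S52: screen transition to the new time
        if t >= end:              # S53: reached the end of the section
            break
        if stop_at is not None and t >= stop_at:   # S56: stop operation
            break
    return shown
```

Rounding to nine decimal places is only a guard against floating-point drift when accumulating the cycle time; a real implementation would more likely count cycles as integers.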
  • The embodiments described above include the following configurations. (1) A production system 1 comprising: a simulator 214 that operates a virtual production device V2 corresponding to the production device 2 in a virtual space VS, based on transition information representing the transition of control signals collected from the controller 3 that controls the production device 2 including the robot 5B, and on the model information of the production device 2; and a display unit 216 that displays on a monitor the virtual space VS in which the virtual production device V2 operates. Even if the real space is not fully photographed, a phenomenon that occurred in the real space can be reproduced, and the reproduction result can be displayed from any viewpoint. The system is therefore effective for ex-post verification of system operation.
  • the production system 1 according to (1), which is displayed. The transition of the control signal stored in the database can also be effectively used to reproduce state changes of surrounding objects. By including state changes of virtual surrounding objects in the display, phenomena occurring in the real space can be verified more easily.
  • The extraction unit 219 extracts state changes of at least one model included in the virtual space VS based on the control signal and on purpose information associated with the control signal;
  • the production system 1 described above. State changes of virtual peripheral objects can be extracted more accurately.
  • The virtual production device V2 includes a virtual robot V5B having a virtual end effector V50 corresponding to the end effector 50 of the robot 5B.
  • A virtual peripheral object on which the virtual end effector V50 acts is identified based on its positional relationship with the virtual end effector V50, and a state change of the virtual peripheral object is extracted based on the control signal and purpose information relating to the operation of the virtual end effector V50;
  • the production system 1 according to (4). Calculations for changing the state of a virtual peripheral object based on the operation of the virtual production device V2 can be performed more easily.
  • The production system further comprises a graph generation unit 221 that generates a graph representing the transition of the control signal,
  • and a synchronization unit 222 that synchronizes the point in time in the operation of the virtual production device V2 displayed on the monitor with the point in time indicated by the time point index in the graph;
  • the production system 1 according to any one of (1) to (5), wherein the display unit 216 causes the monitor to display the virtual space VS and the graph including the time point index.
  • The value of the control signal in the graph can be instantly associated with the virtual space VS being displayed.
  • The display unit 216 causes the monitor to display an event indicator representing a specific event, based on event information indicating the time when the event occurred, in association with the position on the graph corresponding to the time indicated by the event information;
  • the production system 1 described above. Because it is possible to instantly grasp how the production device 2 was operating and what the value of the control signal was when an event occurred, the cause of the event can be identified quickly.
  • The production system 1 according to (6) or (7), wherein the display unit 216, based on event information indicating the time when a specific event occurred, causes the monitor to display a notification indicating the occurrence of the event in association with the operation of the virtual production device V2 corresponding to the time indicated by the event information. It is possible to quickly recognize how the production device 2 was operating when the event occurred.
  • (9) The production system further comprises a detection unit 223 that detects the occurrence of a specific event based on the control signal and generates event information about the event, and the display unit 216 displays on the monitor a notification representing the occurrence of the event based on the event information;
  • the production system 1 according to any one of (6) to (8). Since the detection unit 223 detects events based on the control signal, convenience is improved.
  • The production system 1 according to any one of (6) to (11), wherein the display unit 216 causes the monitor to display a photographing index representing the photographed data in association with the position on the graph corresponding to the time when the photographed data was photographed. The photographed data of the physical space can easily be associated with the control signal.
  • The production system 1 according to any one of (1) to (12), further comprising a frame identification unit 233 that identifies, based on photographed data of the production device 2 captured in the physical space, the frame captured in the physical space at the time represented by the operation of the virtual production device V2, wherein the display unit 216 displays the virtual space VS and the identified frame on the monitor. Frames captured in the physical space can easily be associated with the virtual space VS.
  • The production system 1 according to any one of (1) to (13), wherein the display unit 216 displays the virtual space VS viewed from the set viewpoint.
  • The calculation result of the simulator 214 can be effectively used for display from any viewpoint.
  • The production system further comprises a viewpoint storage unit 242 that stores a viewpoint change history, and the display unit 216 changes the viewpoint based on the viewpoint change history when displaying the operation of the virtual production device V2; the production system 1 described above. The viewpoint change history can be reused, improving convenience.
  • The production system further comprises an event specifying unit 228 that specifies at least one event based on user input, and a section setting unit 227 that sets a section including the point in time when the event occurs;
  • the production system 1 according to any one of (1) to (15), wherein the simulator 214 operates the virtual production device V2 based on the transition information in the section set by the section setting unit 227. When an event to be verified is specified, the section for which the simulation should be performed is set accordingly, improving convenience.
  • The production device 2 includes a plurality of machines 5 including the robot 5B, and the controller 3 includes a plurality of local controllers 400 that respectively control the plurality of machines 5.
  • The simulator 214 operates the virtual production device V2 in the virtual space VS based on transition information representing the transitions of the plurality of types of control signals collected from the plurality of local controllers 400 and on the models of the plurality of machines 5;
  • the production system 1 described above. Based on the control signals of the plurality of machines 5, the operation of the production device 2 can be reproduced in detail.
  • A reproduction method comprising: operating a virtual production device V2 corresponding to the production device 2 in a virtual space VS, based on transition information representing the transition of control signals collected from the controller 3 that controls the production device 2 including the robot 5B, and on the model of the production device 2;
  • and displaying on a monitor the virtual space VS in which the virtual production device V2 operates.

Abstract

A production system 1 comprises: a simulator 214 with which, on the basis of transition information representing transition of a control signal collected from a controller 3 that controls a production apparatus 2 including a robot 5B and model information about the production apparatus 2, a virtual production apparatus V2 corresponding to the production apparatus 2 is operated in a virtual space VS; and a display part 216 for displaying, on a monitor, the virtual space VS in which the virtual production apparatus V2 is operated.

Description

Production system and reproduction method
 This disclosure relates to a production system and a reproduction method.
  Patent Document 1 describes a system comprising: a controller that operates one or more robots in a real space based on an operation program; a virtual controller that operates, in a virtual space, one or more virtual robots respectively corresponding to the one or more robots based on the operation program; an operation interrupting unit that interrupts the operation of the one or more robots by the controller; an interrupted-state reproducing unit that reproduces in the virtual space the state of the real space in which the operation of the one or more robots was interrupted; and a restart simulator that restarts the operation of the one or more virtual robots based on the operation program by the virtual controller in the virtual space in which the interrupted-state reproducing unit has reproduced the state of the real space.
Japanese Patent No. 6898506
 The present disclosure provides a system effective for ex-post verification of the operation of production equipment.
 A production system according to one aspect of the present disclosure comprises: a simulator that operates a virtual production device corresponding to a production device in a virtual space, based on transition information representing the transition of control signals collected from a controller that controls the production device including a robot, and on model information of the production device; and a display unit that displays on a monitor the virtual space in which the virtual production device operates.
 A reproduction method according to another aspect of the present disclosure includes: operating a virtual production device corresponding to a production device in a virtual space, based on transition information representing the transition of control signals collected from a controller that controls the production device including a robot, and on a model of the production device; and displaying on a monitor the virtual space in which the virtual production device operates.
 According to the present disclosure, it is possible to provide a system effective for ex-post verification of the operation of production equipment.
FIG. 1 is a schematic diagram illustrating the configuration of a production system.
FIG. 2 is a schematic diagram illustrating the configuration of a robot.
FIG. 3 is a block diagram illustrating functional configurations of a data collection device and a simulation device.
FIG. 4 is a diagram illustrating extraction of state changes of a virtual peripheral object.
FIG. 5 is a diagram illustrating extraction of state changes of a virtual peripheral object.
FIG. 6 is a block diagram showing a modification of the simulation device.
FIG. 7 is a block diagram showing a further modification of the simulation device.
FIG. 8 is a diagram illustrating a display screen.
FIG. 9 is a block diagram illustrating hardware configurations of a data collection device, a simulation device, a host controller, and a local controller.
FIG. 10 is a flowchart illustrating a procedure for displaying a section setting screen.
FIG. 11 is a flowchart illustrating a procedure for displaying an initial screen.
FIG. 12 is a flowchart illustrating the simulation procedure in FIG. 11.
FIG. 13 is a flowchart illustrating a display update procedure.
FIG. 14 is a flowchart illustrating the screen transition procedure in FIG. 13.
FIG. 15 is a flowchart illustrating the reproduction procedure in FIG. 13.
 Hereinafter, embodiments will be described in detail with reference to the drawings. In the description, the same reference numerals are given to identical elements or elements having identical functions, and overlapping descriptions are omitted.
[Production system]
 The production system 1 shown in FIG. 1 is a system for producing a product in a physical space. A product may be any tangible object produced by the mechanical processing and assembly of one or more parts. The physical space is the space in which tangible objects actually exist.
 The production system 1 includes a production device 2 and a controller 3. The production device 2 produces a product by executing processing steps on a workpiece W in the physical space. The workpiece W is a tangible object handled by the production device 2 to constitute at least part of the product. For example, the workpiece W may be a part to be assembled into a product, an intermediate product formed by assembling parts, or the final finished product itself.
 The processing of a workpiece W includes a plurality of tasks. A task is a set of operations whose order is determined in advance for a given work purpose. At least one of the plurality of tasks can be shared by two or more different processing steps. Depending on the content of the processing steps, the execution order of the plurality of tasks can be changed, but the order of operations within one task cannot. Examples of tasks include carrying a base part into a work area, assembling a part onto the base part, fixing the part to the base part by fastening or welding, and carrying the completed product, in which one or more parts have been assembled onto the base part, out of the work area.
 For example, the production device 2 includes a plurality of machines 5. The plurality of machines 5 includes at least a robot, and may further include machines other than robots. Examples of machines other than robots include transfer devices that transport workpieces W, devices that adjust the position and orientation of a workpiece W to be worked on by a robot, and machine tools that process workpieces W, but the machines are not limited to these examples. Any machine may be included in the plurality of machines 5, whatever its structure, as long as it can execute a task on the workpiece W.
 The plurality of tasks may include two or more tasks to be executed sequentially in series, and may include two or more tasks that can be executed concurrently by the plurality of machines 5. Examples of tasks executed by a robot include transporting a workpiece W, assembling a workpiece W, fixing a workpiece W by fastening or the like, and loading/unloading a workpiece W to/from a peripheral machine such as a machine tool. Examples of tasks executed by a machine tool include opening and closing a door, chucking a loaded workpiece W, rotating and moving the workpiece W, changing tools, positioning and moving a tool relative to the workpiece W, and releasing the chuck after machining.
 図1に示す複数のマシン5は、搬送装置5Aと、ロボット5B,5C,5Dと、工作機械5Eとを含んでいるが、これに限られない。少なくともロボットを含んでいる限り、マシン5の数及び種類はいかようにも変更可能である。 The plurality of machines 5 shown in FIG. 1 include a transfer device 5A, robots 5B, 5C, and 5D, and a machine tool 5E, but are not limited to these. The number and types of machines 5 can be changed in any way as long as at least a robot is included.
 搬送装置5Aは、例えば電動モータ等により駆動され、ワークWを搬送する。搬送装置5Aの例としては、ベルトコンベヤ、ローラコンベヤ、カルーセル等が挙げられる。 The transport device 5A transports the work W by being driven by, for example, an electric motor. Examples of the conveying device 5A include belt conveyors, roller conveyors, carousels, and the like.
 ロボット5B,5Cは、垂直多関節ロボットであり、図2に示すように、多関節アーム10と、エンドエフェクタ50とを有する。エンドエフェクタ50は、ワークWに対し作用する。エンドエフェクタ50の例としては、ワークW等を把持するハンド、ワークWを吸着する吸着ノズル、ワークWに対し溶接を行う溶接トーチ又はワークWに対しねじ締めを行うねじ締めツール等が挙げられる。 The robots 5B and 5C are vertical articulated robots and, as shown in FIG. 2, each have an articulated arm 10 and an end effector 50. The end effector 50 acts on the workpiece W. Examples of the end effector 50 include a hand that grips the workpiece W or the like, a suction nozzle that picks up the workpiece W by suction, a welding torch that welds the workpiece W, and a screw tightening tool that tightens screws on the workpiece W.
 エンドエフェクタ50は、多関節アーム10に接続される。多関節アーム10は、多関節の動作によってエンドエフェクタ50の位置及び姿勢を変更する。例えば多関節アーム10は、基部11と、旋回部12と、第1アーム13と、第2アーム14と、揺動部15と、第3アーム17と、先端部18と、アクチュエータ41,42,43,44,45,46とを有する。基部11は、搬送装置5Aの周囲に設置されている。旋回部12は、鉛直な軸線21まわりに旋回するように基部11上に設けられている。第1アーム13は、軸線21に交差(例えば直交)する軸線22まわりに揺動するように旋回部12に接続されている。交差は、所謂立体交差のようにねじれの関係にある場合も含む。第2アーム14は、軸線22に実質的に平行な軸線23まわりに揺動するように第1アーム13の先端部に接続されている。第2アーム14は、揺動部15と旋回部16とを含む。揺動部15は、第1アーム13の先端部に接続され、軸線23に交差(例えば直交)する軸線24に沿って延びている。旋回部16は、軸線24まわりに旋回するように揺動部15の先端部に接続され、軸線24に沿って更に延びている。第3アーム17は、軸線24に交差(例えば直交)する軸線25まわりに揺動するように旋回部16の先端部に接続されている。先端部18は、軸線25に交差(例えば直交)する軸線26まわりに旋回するように第3アーム17の先端部に接続されている。先端部18には、エンドエフェクタ50が取り付けられる。 The end effector 50 is connected to the articulated arm 10. The articulated arm 10 changes the position and posture of the end effector 50 by multi-joint motion. For example, the articulated arm 10 has a base portion 11, a swivel portion 12, a first arm 13, a second arm 14, a swing portion 15, a third arm 17, a distal end portion 18, and actuators 41, 42, 43, 44, 45, and 46. The base portion 11 is installed around the transfer device 5A. The swivel portion 12 is provided on the base portion 11 so as to swivel around a vertical axis 21. The first arm 13 is connected to the swivel portion 12 so as to swing about an axis 22 that intersects (for example, is perpendicular to) the axis 21. Intersecting here also includes a skewed relationship such as a so-called grade-separated crossing. The second arm 14 is connected to the distal end of the first arm 13 so as to swing about an axis 23 substantially parallel to the axis 22. The second arm 14 includes the swing portion 15 and a swivel portion 16. The swing portion 15 is connected to the distal end of the first arm 13 and extends along an axis 24 that intersects (for example, is perpendicular to) the axis 23. The swivel portion 16 is connected to the distal end of the swing portion 15 so as to swivel around the axis 24, and extends further along the axis 24. The third arm 17 is connected to the distal end of the swivel portion 16 so as to swing about an axis 25 that intersects (for example, is perpendicular to) the axis 24. The distal end portion 18 is connected to the distal end of the third arm 17 so as to swivel about an axis 26 that intersects (for example, is perpendicular to) the axis 25. The end effector 50 is attached to the distal end portion 18.
 このように、多関節アーム10は、基部11と旋回部12とを接続する関節31と、旋回部12と第1アーム13とを接続する関節32と、第1アーム13と第2アーム14とを接続する関節33と、第2アーム14において揺動部15と旋回部16とを接続する関節34と、旋回部16と第3アーム17とを接続する関節35と、第3アーム17と先端部18とを接続する関節36とを有する。 Thus, the articulated arm 10 has a joint 31 connecting the base portion 11 and the swivel portion 12, a joint 32 connecting the swivel portion 12 and the first arm 13, a joint 33 connecting the first arm 13 and the second arm 14, a joint 34 connecting the swing portion 15 and the swivel portion 16 within the second arm 14, a joint 35 connecting the swivel portion 16 and the third arm 17, and a joint 36 connecting the third arm 17 and the distal end portion 18.
 アクチュエータ41,42,43,44,45,46は、例えば電動モータ及び減速機を含み、関節31,32,33,34,35,36をそれぞれ駆動する。例えばアクチュエータ41は、軸線21まわりに旋回部12を旋回させ、アクチュエータ42は、軸線22まわりに第1アーム13を揺動させ、アクチュエータ43は、軸線23まわりに第2アーム14を揺動させ、アクチュエータ44は、軸線24まわりに旋回部16を旋回させ、アクチュエータ45は、軸線25まわりに第3アーム17を揺動させ、アクチュエータ46は、軸線26まわりに先端部18を旋回させる。 The actuators 41, 42, 43, 44, 45, and 46 include, for example, electric motors and speed reducers, and drive the joints 31, 32, 33, 34, 35, and 36, respectively. For example, the actuator 41 swivels the swivel portion 12 about the axis 21, the actuator 42 swings the first arm 13 about the axis 22, the actuator 43 swings the second arm 14 about the axis 23, the actuator 44 swivels the swivel portion 16 about the axis 24, the actuator 45 swings the third arm 17 about the axis 25, and the actuator 46 swivels the distal end portion 18 about the axis 26.
 なお、多関節アーム10の構成は適宜変更可能である。例えば多関節アーム10は、上述した6軸の構成に更に1軸の関節を追加した7軸の冗長型ロボットであってもよく、所謂スカラー型の多関節ロボットであってもよい。 The configuration of the articulated arm 10 can be changed as appropriate. For example, the articulated arm 10 may be a 7-axis redundant robot in which one more joint axis is added to the 6-axis configuration described above, or may be a so-called SCARA articulated robot.
 図1に戻り、ロボット5Dは、自律走行可能なロボットである。例えばロボット5Dは、ロボット5B,5Cと同様のロボットにおいて、多関節アーム10の基部11が自走可能となったものである。自走可能な基部11の例としては、電動式の無人搬送車(AGV:Automated Guided Vehicle)が挙げられる。 Returning to FIG. 1, the robot 5D is a robot capable of autonomous travel. For example, the robot 5D is a robot similar to the robots 5B and 5C in which the base portion 11 of the articulated arm 10 is self-propelled. An example of the self-propelled base portion 11 is an electric automated guided vehicle (AGV).
 工作機械5Eは、ロボット5B,5Cにより搬入・搬出されるワークWに対する切削加工等を行う。工作機械5Eの例としては、NC旋盤、NCフライス盤、又はマシニングセンタ等が挙げられる。生産装置2は、ロボット5B,5C,5Dのいずれかと可動範囲が重複する周辺装置を更に含んでいてもよい。例えばロボット5C(周辺装置)の可動範囲がロボット5Bと重複していてもよい。 The machine tool 5E performs cutting and the like on the work W that is carried in and out by the robots 5B and 5C. Examples of the machine tool 5E include an NC lathe, an NC milling machine, a machining center, and the like. The production device 2 may further include peripheral devices whose movable ranges overlap with any of the robots 5B, 5C, and 5D. For example, the movable range of the robot 5C (peripheral device) may overlap with that of the robot 5B.
 コントローラ3は、複数のマシン5を制御する。例えばコントローラ3は、複数のマシン5をそれぞれ制御する複数のローカルコントローラ400と、上位コントローラ300とを有する。例えば複数のローカルコントローラ400は、搬送装置5Aを制御するローカルコントローラ400Aと、ロボット5Bを制御するローカルコントローラ400Bと、ロボット5Cを制御するローカルコントローラ400Cと、ロボット5Dを制御するローカルコントローラ400Dと、工作機械5Eを制御するローカルコントローラ400Eとを含む。 The controller 3 controls a plurality of machines 5. For example, the controller 3 has a plurality of local controllers 400 each controlling a plurality of machines 5 and a host controller 300 . For example, the plurality of local controllers 400 include a local controller 400A that controls the transfer device 5A, a local controller 400B that controls the robot 5B, a local controller 400C that controls the robot 5C, a local controller 400D that controls the robot 5D, and a machine tool. and a local controller 400E that controls the machine 5E.
 上位コントローラ300は、現実空間の状態情報を収集し、予め定められた制御プログラムと、収集した状態情報とに基づいて、複数のローカルコントローラ400に対しタスクの実行指令を出力する。複数のローカルコントローラ400のそれぞれは、上位コントローラ300から出力された実行指令に基づいて、マシン5を制御する。 The host controller 300 collects state information of the physical space, and outputs task execution commands to a plurality of local controllers 400 based on a predetermined control program and the collected state information. Each of the multiple local controllers 400 controls the machine 5 based on execution instructions output from the host controller 300 .
 例えばローカルコントローラ400は、実行指令に対応するタスクを遂行するためにマシン5に供給する出力を決定し、決定した出力をマシン5に供給する。出力を決定するために、ローカルコントローラ400は、1以上の制御信号を取得、生成、又は出力する。 For example, the local controller 400 determines the output to be supplied to the machine 5 in order to perform the task corresponding to the execution command, and supplies the determined output to the machine 5 . To determine the output, local controller 400 acquires, generates, or outputs one or more control signals.
 例えばローカルコントローラ400Bは、実行指令に対応するタスクを遂行するためにロボット5Bに供給する出力を決定し、決定した出力をロボット5Bに供給することを、タスクの開始から完了まで所定の制御周期で繰り返し実行する。上記出力をロボット5Bに供給するために、ローカルコントローラ400Bは、制御周期ごとに1以上の制御信号を取得、生成又は出力する。実行指令は、ローカルコントローラ400Bが保持する複数のモーションプログラムのいずれかの指定を含んでいてもよい。例えばローカルコントローラ400Bは、実行指令に対応するモーションプログラムを読み出し、読み出したモーションプログラムに基づいて、タスクを遂行するためのエンドエフェクタ50の目標位置・目標姿勢を算出することと、目標位置・目標姿勢に対応する関節31,32,33,34,35,36の目標角度を算出することと、関節31,32,33,34,35,36の角度のフィードバック値を取得することと、目標角度とフィードバック値との偏差に基づいて、アクチュエータ41,42,43,44,45,46に供給すべき目標電流(上記出力の例)を算出することと、目標電流に対応する電流をアクチュエータ41,42,43,44,45,46に供給することと、を制御周期で繰り返し実行する。上述の実行指令、エンドエフェクタ50の目標位置・目標姿勢、関節31,32,33,34,35,36の目標角度、関節31,32,33,34,35,36の角度のフィードバック値、及び目標電流の全てが、1以上の制御信号に含まれる。また、モーションプログラムのいずれの部分(例えばライン又はブロック)を実行したかを表すプログラムの実行履歴も1以上の制御信号に含まれる。 For example, the local controller 400B determines the output to be supplied to the robot 5B in order to perform the task corresponding to an execution command, and repeatedly supplies the determined output to the robot 5B at a predetermined control cycle from the start to the completion of the task. To supply the output to the robot 5B, the local controller 400B acquires, generates, or outputs one or more control signals in each control cycle. The execution command may include a designation of one of a plurality of motion programs held by the local controller 400B. For example, the local controller 400B reads the motion program corresponding to the execution command and, based on the read motion program, repeats the following in each control cycle: calculating the target position and target posture of the end effector 50 for performing the task; calculating the target angles of the joints 31, 32, 33, 34, 35, and 36 corresponding to the target position and target posture; acquiring feedback values of the angles of the joints 31, 32, 33, 34, 35, and 36; calculating, based on the deviations between the target angles and the feedback values, the target currents (an example of the above output) to be supplied to the actuators 41, 42, 43, 44, 45, and 46; and supplying currents corresponding to the target currents to the actuators 41, 42, 43, 44, 45, and 46. The execution command, the target position and target posture of the end effector 50, the target angles of the joints 31, 32, 33, 34, 35, and 36, the feedback values of the angles of those joints, and the target currents are all included in the one or more control signals. A program execution history indicating which part (for example, a line or block) of the motion program has been executed is also included in the one or more control signals.
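For illustration only, the per-cycle computation described above (target joint angles, angle feedback, and target currents derived from the deviation) can be sketched as follows. This is a minimal hypothetical Python sketch, not the actual implementation of the local controller 400B; the proportional-only control law, the gain value, and all names are assumptions.

```python
# Hypothetical sketch of one control cycle for the six joints 31-36.
# A real controller would also include integral, derivative, and
# feedforward terms, limits, and safety checks.

def control_cycle(target_angles, feedback_angles, kp=10.0):
    """Compute target currents from the deviation between target and feedback angles."""
    errors = [t - f for t, f in zip(target_angles, feedback_angles)]
    # One target current per actuator 41-46 (proportional term only).
    return [kp * e for e in errors]

# One cycle: all six joints commanded 0.1 rad away from their feedback values.
currents = control_cycle([0.1] * 6, [0.0] * 6)
```

In each control cycle the target angles, feedback values, and resulting target currents would all be recorded as control signals, as the text notes.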
 ローカルコントローラ400Bは、目標位置・目標姿勢に配置されたエンドエフェクタ50に対する指令を出力してもよい。例えばエンドエフェクタ50がハンドである場合に、ローカルコントローラ400Bは、エンドエフェクタ50に対してワークWの把持指令を出力してもよい。エンドエフェクタ50がネジ締めツールである場合に、ローカルコントローラ400Bは、エンドエフェクタ50に対してねじ締め指令を出力してもよい。把持指令及びねじ締め指令は、いずれも1以上の制御信号に含まれる。 The local controller 400B may output commands to the end effector 50 placed at the target position/target posture. For example, when the end effector 50 is a hand, the local controller 400B may output a workpiece W gripping command to the end effector 50 . The local controller 400B may output a screw tightening command to the end effector 50 when the end effector 50 is a screw tightening tool. Both the grip command and the screw tightening command are included in one or more control signals.
 ロボット5Bは、先端部18又は旋回部12等にカメラ51,52を有してもよい。この場合に、ローカルコントローラ400Bは、カメラ51,52から取得された撮影データに基づいて、タスクの実行対象を認識し、認識結果に基づいてエンドエフェクタ50の目標位置・目標姿勢を算出してもよい。タスクの実行対象の認識結果は、生産システム1の制御信号に含まれる。 The robot 5B may have cameras 51 and 52 on the distal end portion 18, the swivel portion 12, or the like. In this case, the local controller 400B may recognize the execution target of the task based on image data acquired from the cameras 51 and 52, and calculate the target position and target posture of the end effector 50 based on the recognition result. The recognition result of the task execution target is included in the control signals of the production system 1.
 ローカルコントローラ400Cも、ローカルコントローラ400Bと同様の1以上の制御信号を制御周期ごとに取得、生成又は出力する。基部11が移動可能であるロボット5Dのローカルコントローラ400Dは、ローカルコントローラ400Bと同様の1以上の制御信号に加えて、基部11の目標位置、基部11の現在位置、基部11の目標位置と現在位置との偏差に基づく基部11の目標速度等を制御周期ごとに取得、生成又は出力する。基部11の目標位置、基部11の現在位置、及び基部11の目標速度も1以上の制御信号に含まれる。 The local controller 400C also acquires, generates, or outputs one or more control signals similar to those of the local controller 400B in each control cycle. The local controller 400D of the robot 5D, whose base portion 11 is movable, acquires, generates, or outputs in each control cycle, in addition to one or more control signals similar to those of the local controller 400B, the target position of the base portion 11, the current position of the base portion 11, a target speed of the base portion 11 based on the deviation between the target position and the current position, and the like. The target position, the current position, and the target speed of the base portion 11 are also included in the one or more control signals.
 ローカルコントローラ400Eは、実行指令に対応するタスクを遂行するための動作指令を工作機械5Eに順次出力する。実行指令は、ローカルコントローラ400Eが保持する複数の加工プログラムのいずれかの指定を含んでいてもよい。例えばローカルコントローラ400Eは、実行指令に対応する加工プログラムを読み出し、読み出した加工プログラムに基づいて、扉の開放指令、ワークWのチャック指令、工具のセット指令、工具の配置指令、工具又はワークの回転駆動指令、工具又はワークの送り指令等を工作機械5Eに順次出力する。ローカルコントローラ400Eは、各動作指令ごとに、対応する動作の完了通知を工作機械5Eから取得し、完了通知を受領してから次の動作指令を工作機械5Eに出力してもよい。上述の実行指令、各動作指令、及び各動作指令に対する完了通知の全てが1以上の制御信号に含まれる。また、加工プログラムのいずれの部分(例えばライン又はブロック)を実行したかを表すプログラムの実行履歴も1以上の制御信号に含まれる。 The local controller 400E sequentially outputs, to the machine tool 5E, operation commands for performing the task corresponding to an execution command. The execution command may include a designation of one of a plurality of machining programs held by the local controller 400E. For example, the local controller 400E reads the machining program corresponding to the execution command and, based on the read machining program, sequentially outputs to the machine tool 5E a door opening command, a chucking command for the workpiece W, a tool setting command, a tool placement command, a rotation drive command for the tool or workpiece, a feed command for the tool or workpiece, and the like. For each operation command, the local controller 400E may acquire a completion notification of the corresponding operation from the machine tool 5E and output the next operation command to the machine tool 5E after receiving the completion notification. The execution command, each operation command, and the completion notification for each operation command are all included in the one or more control signals. A program execution history indicating which part (for example, a line or block) of the machining program has been executed is also included in the one or more control signals.
 複数のローカルコントローラ400のそれぞれが取得、生成又は出力する制御信号は、対応するマシン5と、ローカルコントローラ400との少なくともいずれかのステータスを表す1以上のステータス信号を更に含んでいてもよい。例えば複数のローカルコントローラ400のそれぞれは、タスクが正常に完了したことを表す完了通知を生成し、上位コントローラ300に出力してもよい。また、複数のローカルコントローラ400のそれぞれは、マシン5における異常又はローカルコントローラ400における異常を報知するアラームを生成し、上位コントローラ300に出力してもよい。例えばローカルコントローラ400B,400C,400Dは、アクチュエータ41,42,43,44,45,46の少なくともいずれかにおける過負荷又は過電流の発生を報知するアラームを生成し、上位コントローラ300に出力してもよい。上述の完了通知及びアラームは、いずれも1以上のステータス信号に含まれる。 The control signals that each of the plurality of local controllers 400 acquires, generates, or outputs may further include one or more status signals representing the status of the corresponding machine 5, the local controller 400, or both. For example, each of the plurality of local controllers 400 may generate a completion notification indicating that a task has been completed normally and output it to the host controller 300. Each of the plurality of local controllers 400 may also generate an alarm reporting an abnormality in the machine 5 or in the local controller 400 and output it to the host controller 300. For example, the local controllers 400B, 400C, and 400D may generate an alarm reporting the occurrence of an overload or overcurrent in at least one of the actuators 41, 42, 43, 44, 45, and 46, and output it to the host controller 300. The completion notification and the alarm described above are both included in the one or more status signals.
 上位コントローラ300は、複数のローカルコントローラ400から1以上の制御信号を収集する。例えば上位コントローラ300は、複数のローカルコントローラ400と有線又は無線による同期通信を行って、1以上の制御信号を収集する。同期通信とは、一定の通信周期の同期フレームに同期して、1周期ごとに、複数のマシン5もローカルコントローラ400との通信を行うことを意味する。上位コントローラ300は、同一の周期で収集した1以上の制御信号を、同一の時刻に対応付けた状態レコードを記憶してもよい。 The host controller 300 collects one or more control signals from the plurality of local controllers 400. For example, the host controller 300 performs wired or wireless synchronous communication with the plurality of local controllers 400 to collect the one or more control signals. Synchronous communication here means that communication with the plurality of local controllers 400 is performed once per cycle, in synchronization with synchronization frames of a fixed communication period. The host controller 300 may store state records in which the one or more control signals collected in the same cycle are associated with the same time.
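As an illustration of the state record described above, the control signals collected in one communication cycle can be associated with a single time stamp. The following Python fragment is a hypothetical sketch only; the record layout, the controller labels, and the 4 ms cycle are assumptions, not taken from the disclosure.

```python
# Hypothetical state-record construction: all signals collected in the
# same communication cycle share one time value.

def make_state_record(cycle_time, signals_by_controller):
    """Bundle one cycle's worth of control signals under one time stamp."""
    return {"time": cycle_time, "signals": dict(signals_by_controller)}

records = []
cycle = 0.004  # assumed communication period (4 ms)
for i in range(3):
    signals = {
        "400B": {"joint_angles": [0.0] * 6},  # e.g. robot joint feedback
        "400E": {"door": "open"},             # e.g. machine-tool status
    }
    records.append(make_state_record(i * cycle, signals))
```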
 上位コントローラ300は、現実空間に設けられた環境センサ6による検出結果に基づいて、ワークWの位置情報等を更に収集し、上記状態レコードに含めて記憶してもよい。上位コントローラ300により収集されたワークWの位置情報等が、複数のローカルコントローラ400により1以上の制御信号として取得されてもよい。 The host controller 300 may further collect position information and the like of the work W based on the detection results of the environment sensor 6 provided in the physical space, and store the information as part of the state record. The positional information of the workpiece W and the like collected by the host controller 300 may be obtained as one or more control signals by a plurality of local controllers 400 .
 環境センサ6の例としては、ワークWが配置され得る位置に設けられた在荷センサ、又はワークWが配置され得る位置を撮影するカメラ等が挙げられる。環境センサ6がカメラである場合、上位コントローラ300は、環境センサ6による撮影データの画像処理結果に基づいてワークWの位置情報等を生成してもよい。上位コントローラ300は、環境センサ6、及びカメラ51,52による撮影データを更に収集し、収集した撮影データを上記状態レコードに含めて記憶してもよい。 Examples of the environment sensor 6 include a load sensor provided at a position where the work W can be placed, or a camera that captures the position where the work W can be placed. If the environment sensor 6 is a camera, the host controller 300 may generate position information and the like of the workpiece W based on the image processing result of image data captured by the environment sensor 6 . The host controller 300 may further collect data captured by the environment sensor 6 and the cameras 51 and 52, and store the collected captured data in the state record.
 上位コントローラ300は、記憶した状態レコードと、生産計画に基づく生産指示とに基づいて、複数のローカルコントローラ400に対するタスクの実行指令を生成する。 The upper controller 300 generates task execution commands for the multiple local controllers 400 based on the stored status records and production instructions based on the production plan.
 複数のローカルコントローラ400の少なくともいずれか(例えばローカルコントローラ400B,400C,400D)は、上位コントローラ300が記憶する状態レコードに基づいて、実行指令に対応するタスクの実行タイミングを自律的に決定してもよい。 At least one of the plurality of local controllers 400 (for example, local controllers 400B, 400C, and 400D) may autonomously determine the execution timing of the task corresponding to the execution instruction based on the status record stored by the host controller 300. good.
 データ収集装置100は、複数のローカルコントローラ400において収集された複数種類の制御信号の推移を表す推移情報を蓄積する。例えばデータ収集装置100は、上位コントローラ300が記憶する状態レコードを取得して順次蓄積する。これにより、上記複数の制御信号の推移を表す推移情報が蓄積される。推移は時間変化を意味し、時系列に並ぶ複数の制御信号によって表される。複数の制御信号は、時系列順に並べ得るように、それぞれが時刻と対応付けて記憶されていればよく、必ずしもデータ収集装置100内で時系列順に並んでいなくてもよい。なお、データ収集装置100は、複数種類の制御信号の少なくともいずれかについての推移情報を、上位コントローラ300を経ずに複数のローカルコントローラ400のいずれかから取得してもよい。複数のマシン5の少なくともいずれかは、ローカルコントローラ400を介さず、上位コントローラ300により直接制御されてもよい。上位コントローラ300により直接制御されるマシン5の制御信号は、上位コントローラ300により収集される。全てのマシン5が上位コントローラ300により直接制御されてもよい(コントローラ3が上位コントローラ300と複数のローカルコントローラ400とに分かれていなくてもよい)。 The data collection device 100 accumulates transition information representing transitions of the plurality of types of control signals collected by the plurality of local controllers 400. For example, the data collection device 100 acquires and sequentially accumulates the state records stored by the host controller 300, whereby transition information representing the transitions of the plurality of control signals is accumulated. A transition means a change over time and is represented by a plurality of control signals arranged in time series. The plurality of control signals need only be stored in association with times so that they can be arranged in chronological order; they do not necessarily have to be stored in chronological order within the data collection device 100. Note that the data collection device 100 may acquire transition information on at least one of the plurality of types of control signals from one of the plurality of local controllers 400 without going through the host controller 300. At least one of the plurality of machines 5 may be directly controlled by the host controller 300 without going through a local controller 400. Control signals of a machine 5 directly controlled by the host controller 300 are collected by the host controller 300. All of the machines 5 may be directly controlled by the host controller 300 (that is, the controller 3 need not be divided into the host controller 300 and the plurality of local controllers 400).
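The point that the stored control signals need only carry time stamps, and need not be held in chronological order inside the data collection device 100, can be sketched as follows (hypothetical Python; the record fields are assumptions):

```python
# Records may be accumulated in arrival order ...
database = []

def accumulate(record):
    database.append(record)  # no reordering on insert

accumulate({"time": 2.0, "signal": "b"})  # arrives first
accumulate({"time": 1.0, "signal": "a"})  # arrives later

# ... and the time series (the "transition") is recovered by sorting on time.
transition = sorted(database, key=lambda r: r["time"])
```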
 シミュレーション装置200は、生産装置2の動作の事前検証等の様々な目的のために、生産装置2に対応する仮想生産装置V2(生産装置2のモデル)を仮想空間VSで動作させるシミュレーションを実行するように構成されている。仮想生産装置V2を仮想空間VSで動作させることは、生産装置2のモデル情報に基づいて、複数のマシン5を実際に動作させることなく、動作後における複数のマシン5の状態を表すシミュレーションデータを算出することを意味する。生産装置2のモデル情報は、複数のマシン5のモデル情報を含む。複数のマシン5のモデル情報のそれぞれは、例えば、対応するマシン5の現実空間における配置、構造、形状、大きさ等の情報を含む。マシン5の現実空間における形状、大きさは、例えばポリゴンデータ又はボクセルデータなどの三次元データにより表される。 The simulation device 200 is configured to execute, for various purposes such as advance verification of the operation of the production apparatus 2, a simulation that operates a virtual production apparatus V2 (a model of the production apparatus 2) corresponding to the production apparatus 2 in a virtual space VS. Operating the virtual production apparatus V2 in the virtual space VS means calculating, based on model information of the production apparatus 2 and without actually operating the plurality of machines 5, simulation data representing the states of the plurality of machines 5 after operation. The model information of the production apparatus 2 includes model information of the plurality of machines 5. The model information of each of the plurality of machines 5 includes, for example, information on the placement, structure, shape, size, and the like of the corresponding machine 5 in the real space. The shape and size of a machine 5 in the real space are represented by three-dimensional data such as polygon data or voxel data.
 生産装置2のモデル情報に基づき算出される上記シミュレーションデータが 仮想生産装置V2(生産装置2のモデル)であり、上記シミュレーションデータの基準となる座標系が仮想空間VSであるともいえる。以下、データ収集装置100及びシミュレーション装置200の構成を詳細に例示する。 It can also be said that the simulation data calculated based on the model information of the production device 2 is the virtual production device V2 (model of the production device 2), and the coordinate system on which the simulation data is based is the virtual space VS. The configurations of the data collection device 100 and the simulation device 200 are illustrated in detail below.
〔データ収集装置及びシミュレーション装置〕 [Data Collection Device and Simulation Device]
 図3に例示するように、データ収集装置100は、機能上の構成要素(以下、「機能ブロック」という。)として、収集部111と、データベース113とを含む。収集部111は、状態レコードを上位コントローラ300から取得してデータベース113に順次蓄積する。 As illustrated in FIG. 3, the data collection device 100 includes, as functional components (hereinafter referred to as "functional blocks"), a collection unit 111 and a database 113. The collection unit 111 acquires state records from the host controller 300 and sequentially accumulates them in the database 113.
 シミュレーション装置200は、機能ブロックとして、プログラム記憶部211と、仮想コントローラ212と、モデル保持部213と、シミュレータ214と、結果保持部215と、表示部216とを有する。プログラム記憶部211は、複数のマシン5を制御するために上位コントローラ300及び複数のローカルコントローラ400のそれぞれが実行するプログラム(例えば上記制御プログラム、複数のモーションプログラム、及び複数の加工プログラム等)を記憶している。仮想コントローラ212は、上位コントローラ300及び複数のローカルコントローラ400が複数のマシン5を制御する際と同じ手順にて、プログラム記憶部211が記憶するプログラムを実行する。モデル保持部213は、仮想空間VSに含まれる1以上のモデルのモデル情報を記憶する。1以上のモデルは、少なくとも、上述した生産装置2のモデル(複数のマシン5のモデル)を含む。シミュレータ214は、モデル保持部213が記憶するモデル情報と、仮想コントローラ212によるプログラムの実行結果とに基づいて、仮想生産装置V2を仮想空間VSで動作させる。 The simulation device 200 has, as functional blocks, a program storage unit 211, a virtual controller 212, a model holding unit 213, a simulator 214, a result holding unit 215, and a display unit 216. The program storage unit 211 stores the programs executed by the host controller 300 and each of the plurality of local controllers 400 to control the plurality of machines 5 (for example, the control program, the plurality of motion programs, and the plurality of machining programs described above). The virtual controller 212 executes the programs stored in the program storage unit 211 in the same procedure as when the host controller 300 and the plurality of local controllers 400 control the plurality of machines 5. The model holding unit 213 stores model information of one or more models included in the virtual space VS. The one or more models include at least the model of the production apparatus 2 described above (the models of the plurality of machines 5). The simulator 214 operates the virtual production apparatus V2 in the virtual space VS based on the model information stored in the model holding unit 213 and the results of program execution by the virtual controller 212.
 一例として、シミュレータ214は、仮想コントローラ212が算出した関節31,32,33,34,35,36の動作角度と、ロボット5Bのモデル情報とに基づく順運動学演算によって、ロボット5Bのモデルを仮想空間VSにおいて動作させる。ロボット5C,5Dについても同様である。 As one example, the simulator 214 operates the model of the robot 5B in the virtual space VS by forward kinematics computation based on the motion angles of the joints 31, 32, 33, 34, 35, and 36 calculated by the virtual controller 212 and the model information of the robot 5B. The same applies to the robots 5C and 5D.
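The forward kinematics step can be illustrated with a planar two-link arm, a deliberately simplified stand-in for the six-axis computation (the link lengths and all names are assumptions, not taken from the disclosure):

```python
import math

def forward_kinematics_2link(theta1, theta2, l1=0.5, l2=0.3):
    """Tip position of a planar 2-link arm computed from its joint angles.

    The simulator would do the analogous computation for all six joints
    against the robot's model information to pose the model in the
    virtual space.
    """
    x = l1 * math.cos(theta1) + l2 * math.cos(theta1 + theta2)
    y = l1 * math.sin(theta1) + l2 * math.sin(theta1 + theta2)
    return x, y

# Both joints at 0 rad: the arm is fully stretched along the x axis.
x, y = forward_kinematics_2link(0.0, 0.0)
```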
 シミュレータ214は、仮想コントローラ212によるプログラムの実行結果が更新される度に、仮想生産装置V2を仮想空間VSで動作させ、動作後の仮想生産装置V2を含むシミュレーション結果を結果保持部215に蓄積する。シミュレーション結果は、動作後における複数のマシン5のモデルを含む。複数のマシン5のモデルは、動作後における複数のマシン5の外観を表す上記三次元データを含む。 The simulator 214 operates the virtual production apparatus V2 in the virtual space VS each time the execution result of the program by the virtual controller 212 is updated, and accumulates the simulation results including the virtual production apparatus V2 after the operation in the result holding unit 215. . The simulation results include models of multiple machines 5 after operation. The models of the machines 5 include the three-dimensional data representing the appearance of the machines 5 after operation.
 表示部216は、結果保持部215に時系列で蓄積されたシミュレーション結果に基づいて、仮想生産装置V2が動作している仮想空間VSをモニタ(例えば後述するユーザインタフェース295のモニタ)に表示させる。例えば表示部216は、結果保持部215に蓄積されたシミュレーション結果を時系列順に読み出し、シミュレーション結果が含む三次元データを予め定められた視点からの二次元データに変換してモニタに表示させる。これにより、仮想生産装置V2が動作している仮想空間VSの動画がモニタに再生される。 The display unit 216 displays the virtual space VS in which the virtual production device V2 is operating on a monitor (for example, the monitor of the user interface 295 described later) based on the simulation results accumulated in the result holding unit 215 in chronological order. For example, the display unit 216 reads the simulation results accumulated in the result holding unit 215 in chronological order, converts the three-dimensional data included in the simulation results into two-dimensional data from a predetermined viewpoint, and displays the data on the monitor. As a result, the moving image of the virtual space VS in which the virtual production device V2 is operating is reproduced on the monitor.
 シミュレーションの利用によって、仮想空間VSにおいて仮想生産装置V2の動作を検証しながら、効率よく生産装置2を構築することができるが、事前検証に基づき生産装置2が構築されていても、生産装置2の動作を事後的に検証する必要性は生じ得る。例えば、現実空間における生産装置2の動作において想定外の事象が生じた場合に、その事象の発生要因等を事後的に検証する必要が生じ得る。 By using simulation, the production apparatus 2 can be constructed efficiently while the operation of the virtual production apparatus V2 is verified in the virtual space VS. However, even when the production apparatus 2 has been constructed based on such advance verification, a need to verify the operation of the production apparatus 2 after the fact can still arise. For example, when an unexpected event occurs in the operation of the production apparatus 2 in the real space, it may be necessary to verify the cause of the event after the fact.
 現実空間における生産装置2の動作を事後的に検証するためには、現実空間において撮影された撮影データを再生することが考えられる。しかしながら、複数のマシン5を備える生産装置2の全体をくまなく撮影しておくことは困難であり、事後的な検証には限界がある。そこで、シミュレーション装置200は、コントローラ3から収集された制御信号の推移を表す推移情報と、生産装置2のモデル情報とに基づいて、仮想生産装置V2を仮想空間VSで動作させることと、推移情報に基づき仮想生産装置V2が動作する仮想空間VSをモニタに表示させることと、を更に実行するように構成されている。 To verify the operation of the production apparatus 2 in the real space after the fact, one could replay image data captured in the real space. However, it is difficult to photograph the whole of a production apparatus 2 that includes a plurality of machines 5, so such after-the-fact verification has limits. The simulation device 200 is therefore further configured to operate the virtual production apparatus V2 in the virtual space VS based on transition information representing transitions of the control signals collected from the controller 3 and on the model information of the production apparatus 2, and to display on a monitor the virtual space VS in which the virtual production apparatus V2 operates based on the transition information.
 制御信号は、例えば、上述のように、現実空間で複数のマシン5を動作させる際に複数のローカルコントローラ400において取得、生成又は出力された過去の制御信号である。上述のとおり、推移は時間変化を意味し、時系列に並ぶ複数の制御信号により表される。 The control signals are, for example, past control signals acquired, generated, or output by the plurality of local controllers 400 when operating the plurality of machines 5 in the physical space, as described above. As described above, transition means change over time and is represented by a plurality of control signals arranged in time series.
 以下、過去の制御信号の推移情報に基づき仮想生産装置V2が動作する仮想空間VSをモニタに表示させることを「ログ再生」という。ログ再生によれば、複数のマシン5を備える生産装置2の全体がくまなく撮影されていなくても、現実空間で発生した現象を再現することができる。再現結果を任意の視点で表示することもできる。従って、システムの動作の事後検証に有効である。 Hereinafter, displaying on the monitor the virtual space VS in which the virtual production apparatus V2 operates based on the transition information of the past control signal is referred to as "log reproduction". According to the log reproduction, even if the entirety of the production apparatus 2 having a plurality of machines 5 is not photographed thoroughly, phenomena occurring in the real space can be reproduced. Reproduction results can also be displayed from any viewpoint. Therefore, it is effective for ex-post verification of system operation.
 例えばシミュレーション装置200は、機能ブロックとして、推移情報抽出部217と、推移情報記憶部218とを更に有する。推移情報抽出部217は、ログ再生の対象区間の推移情報をデータベース113から抽出し、推移情報記憶部218に記憶させる。対象区間は、推移情報の時間軸における区間を意味する。例えば推移情報抽出部217は、対象区間に対応する状態レコードに含まれる複数種類の制御信号をデータベース113から抽出し、状態レコードが含む上記時刻に対応付けて推移情報記憶部218に記憶させる。 For example, the simulation device 200 further includes a transition information extraction unit 217 and a transition information storage unit 218 as functional blocks. The transition information extraction unit 217 extracts the transition information of the target section of log reproduction from the database 113 and stores it in the transition information storage unit 218 . The target section means a section on the time axis of transition information. For example, the transition information extraction unit 217 extracts a plurality of types of control signals included in the state record corresponding to the target section from the database 113 and stores them in the transition information storage unit 218 in association with the time included in the state record.
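The extraction of the target section can be sketched as a time-range filter over the stored state records (hypothetical Python; the record layout and field names are assumptions):

```python
def extract_section(database, start, end):
    """Return the state records whose times fall within the replay target section."""
    return [rec for rec in database if start <= rec["time"] <= end]

# Four stored state records at times 0..3; the target section is [1.0, 2.0].
db = [{"time": t, "joint_angles": [t] * 6} for t in (0.0, 1.0, 2.0, 3.0)]
section = extract_section(db, 1.0, 2.0)
```

The extracted records, still carrying their original time stamps, would then be handed to the transition information storage for replay.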
 ログ再生の実行に際して、シミュレータ214は、仮想コントローラ212によるプログラムの実行結果の代わりに、推移情報記憶部218が記憶する推移情報に基づいて仮想生産装置V2を仮想空間VSで動作させる。例えばシミュレータ214は、時系列順に推移情報記憶部218から複数種類の制御信号を読み出し、読み出した複数種類の制御信号と、モデル保持部213が記憶する生産装置2のモデルとに基づいて、仮想生産装置V2を仮想空間VSで動作させる。 When executing log reproduction, the simulator 214 causes the virtual production apparatus V2 to operate in the virtual space VS based on transition information stored in the transition information storage unit 218, instead of the result of program execution by the virtual controller 212. For example, the simulator 214 reads multiple types of control signals from the transition information storage unit 218 in chronological order, and performs virtual production based on the read multiple types of control signals and the model of the production apparatus 2 stored in the model holding unit 213. Device V2 is operated in virtual space VS.
 As an example, the simulator 214 operates the virtual robot V5B, which corresponds to the robot 5B, in the virtual space VS by a forward kinematics calculation based on the angles of the joints 31, 32, 33, 34, 35, and 36 read from the transition information storage unit 218 and on the model information of the robot 5B. The same applies to the robots 5C and 5D.
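As an illustration of this forward-kinematics replay, the following is a minimal Python sketch (not part of the patent; all names are hypothetical, and a planar two-joint chain stands in for the six joints 31 to 36):

```python
import math

def forward_kinematics(joint_angles, link_lengths):
    """Planar forward kinematics: returns the end-effector (x, y)
    for a serial chain of revolute joints (a simplified stand-in
    for the full 6-axis calculation)."""
    x = y = 0.0
    theta = 0.0
    for angle, length in zip(joint_angles, link_lengths):
        theta += angle  # joint angles accumulate along the chain
        x += length * math.cos(theta)
        y += length * math.sin(theta)
    return x, y

def replay_log(transition_records, link_lengths):
    """Drive the virtual robot pose by pose from logged joint angles,
    in chronological order, and collect the resulting poses."""
    poses = []
    for record in sorted(transition_records, key=lambda r: r["time"]):
        poses.append(forward_kinematics(record["angles"], link_lengths))
    return poses

# Two logged samples: arm fully extended, then rotated 90 degrees at joint 1.
log = [
    {"time": 0.0, "angles": [0.0, 0.0]},
    {"time": 0.1, "angles": [math.pi / 2, 0.0]},
]
poses = replay_log(log, link_lengths=[1.0, 1.0])
```

Because the replay is driven purely by the logged angles, no motion program needs to be re-executed to obtain each pose.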
 Each time one or more control signals are read from the transition information storage unit 218, the simulator 214 operates the virtual production apparatus V2 in the virtual space VS and accumulates, in the result holding unit 215, a simulation result that includes the virtual production apparatus V2 after the operation.
 The display unit 216 causes the monitor to display the virtual space VS in which the virtual production apparatus V2 is operating, based on the simulation results accumulated in time series in the result holding unit 215 for the transition information of the target section. For example, the display unit 216 reads the plurality of simulation results corresponding to the transition information of the target section in chronological order, converts the three-dimensional data included in each simulation result into two-dimensional data seen from a predetermined viewpoint, and displays the two-dimensional data on the monitor. As a result, a moving image of the virtual space VS in which the virtual production apparatus V2 operates based on the transition information of the target section is played on the monitor.
 The above reproduces the past operation of the production apparatus 2 in the virtual space VS. Merely operating the virtual production apparatus V2 in the virtual space VS, however, cannot reproduce in the virtual space VS the motion of objects around the production apparatus 2 that accompanied its operation. Examples of such peripheral objects include the workpiece W and a work tool held by the end effector 50.
 Therefore, the simulation device 200 may be further configured to extract state changes of the virtual space VS based on the operation of the virtual production apparatus V2 in the virtual space VS, and to display the extracted state changes on the monitor. The control-signal transitions stored in the database 113 can thereby also be put to effective use in reproducing the state changes of peripheral objects. Including the state changes of virtual peripheral objects in the display makes phenomena that occurred in the real space even easier to verify.
 For example, the model holding unit 213 stores model information of one or more peripheral objects in addition to the model information of the production apparatus 2. The model information of a peripheral object includes information such as its placement, structure, shape, and size. Based on the model information of the one or more peripheral objects, the simulator 214 accumulates in the result holding unit 215 simulation results that further include one or more virtual peripheral objects (models of the one or more peripheral objects) placed in the virtual space VS.
 For example, the simulation device 200 further includes an extraction unit 219 as a functional block. The extraction unit 219 extracts state changes of the virtual space VS based on the operation of the virtual production apparatus V2 in the virtual space VS.
 For example, based on the operation of the virtual production apparatus V2 driven by the control signals, the extraction unit 219 extracts state changes of the parts of the virtual space VS that are not driven by the control signals. For example, when one virtual machine V5 (a model of a machine 5) operates according to the control signals, the extraction unit 219 extracts state changes of virtual peripheral objects (models of peripheral objects) that do not operate according to the control signals. A virtual peripheral object may be the workpiece W or a work tool, or may be another virtual machine V5 (a model of another machine 5).
 The extraction unit 219 may extract a state change of at least one of the plurality of models included in the virtual space VS based on the positional relationship between the plurality of models stored in the model holding unit 213. Relying on the positional relationship makes state changes easy to extract.
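A positional-relationship check of this kind can be sketched as follows (a hypothetical illustration, not the patent's implementation; each model is coarsely approximated by a bounding sphere):

```python
def spheres_overlap(center_a, radius_a, center_b, radius_b):
    """Coarse interference test between two models approximated as spheres."""
    dx = center_a[0] - center_b[0]
    dy = center_a[1] - center_b[1]
    dz = center_a[2] - center_b[2]
    dist_sq = dx * dx + dy * dy + dz * dz
    reach = radius_a + radius_b
    return dist_sq <= reach * reach

def extract_state_changes(models):
    """Scan all model pairs and report which pairs interfere; an actual
    extractor would then decide what change (attach, deform, ...) to apply."""
    events = []
    names = sorted(models)
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            if spheres_overlap(models[a]["center"], models[a]["radius"],
                               models[b]["center"], models[b]["radius"]):
                events.append((a, b))
    return events

# Illustrative scene: a grindstone touching a workpiece, and a distant table.
models = {
    "grindstone": {"center": (0.0, 0.0, 0.0), "radius": 0.10},
    "workpiece": {"center": (0.15, 0.0, 0.0), "radius": 0.10},
    "table": {"center": (5.0, 0.0, 0.0), "radius": 1.0},
}
events = extract_state_changes(models)
```

In the grinder example below, such an interference result would trigger removing the interfering portion from the workpiece model.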
 A state change may be a change in the position of a model, a change in its shape, or a change in its size. As an example, when the plurality of machines 5 includes a grinder and the model of the grinder's grindstone interferes with the model of the workpiece W in the virtual space VS, the extraction unit 219 may extract, as the change, the removal from the model of the workpiece W of the portion that interfered with the model of the grindstone.
 A state change may also be a transition from a state in which the position of a model of the production system 1 is independent of the positions of the other models (hereinafter, the "independent state") to a state in which its position is tied to the position of another model (hereinafter, the "dependent state"). As an example, when the end effector 50 of the robot 5B is a suction nozzle and the model of the end effector 50 contacts the model of the workpiece W in the virtual space VS, the extraction unit 219 may extract the transition of the model of the workpiece W, with respect to the model of the end effector 50, from the independent state to the dependent state.
 Upon extracting a transition from the independent state to the dependent state, the extraction unit 219 may rewrite the position and orientation of the model of the workpiece W in the virtual space VS into a relative position and relative orientation referenced to the position and orientation of the model of the end effector 50. After the rewriting, when the simulator 214 operates the virtual production apparatus V2 in the virtual space VS, the model of the workpiece W moves together with the model of the end effector 50 based on that relative position and relative orientation. Because the display unit 216 shows on the monitor the movement of the workpiece W model accompanying the movement of the end effector 50 model, the transition from the independent state to the dependent state is displayed on the monitor.
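The rewrite from the independent state to the dependent state amounts to re-parenting the workpiece model to the end-effector model. A minimal, position-only sketch (hypothetical names; orientation handling omitted for brevity):

```python
class VirtualWork:
    """A virtual workpiece that is either 'independent' (world pose)
    or 'dependent' (pose stored relative to the virtual end effector)."""

    def __init__(self, world_pos):
        self.state = "independent"
        self.world_pos = list(world_pos)
        self.relative_pos = None

    def attach(self, effector_pos):
        # Rewrite the world pose into a pose relative to the effector.
        self.relative_pos = [w - e for w, e in zip(self.world_pos, effector_pos)]
        self.state = "dependent"

    def detach(self, effector_pos):
        # Rewrite the relative pose back into a world pose.
        self.world_pos = [e + r for e, r in zip(effector_pos, self.relative_pos)]
        self.relative_pos = None
        self.state = "independent"

    def position(self, effector_pos):
        """Current position: follows the effector while dependent."""
        if self.state == "dependent":
            return [e + r for e, r in zip(effector_pos, self.relative_pos)]
        return self.world_pos

# A workpiece at (1, 0, 0) is grasped by an effector at (1, 0, 0.1);
# the effector then moves, carrying the workpiece with it.
work = VirtualWork([1.0, 0.0, 0.0])
work.attach(effector_pos=[1.0, 0.0, 0.1])
moved = work.position(effector_pos=[2.0, 2.0, 0.1])
work.detach(effector_pos=[2.0, 2.0, 0.1])
```

After `detach`, the workpiece keeps the world pose it had at release, which is exactly the behavior described for releasing the grip below.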
 There may also be cases in which state changes cannot be extracted accurately from the positional relationships of the models alone. For example, when the end effector 50 is a hand and the model of the end effector 50 approaches the model of a peripheral object in the virtual space VS, the state change that should be extracted can differ depending on whether or not the purpose of the approach is to grip the peripheral object.
 For example, if the peripheral object is a peripheral device having a button, and the purpose of the approach is to press the button with the end effector 50, the state change to be extracted is the change from the state before the button is pressed to the state after it is pressed. If the peripheral object is the workpiece W, and the purpose of the approach is to grip the workpiece W with the end effector 50, the change to be extracted is the transition of the model of the workpiece W, with respect to the model of the end effector 50, from the independent state to the dependent state.
 Therefore, the extraction unit 219 may extract the state change of at least a model of the production system 1 included in the virtual space VS based on a control signal and on purpose information associated with that control signal. Relying on the purpose information allows state changes of virtual peripheral objects to be extracted more accurately.
 The purpose information represents, for example, the work purpose of the task corresponding to an execution command, which is one example of a control signal. Examples of work purposes include "grip," "convey," "join," and "fasten."
 For example, in the program storage unit 211, purpose information may be associated with at least one of the plurality of programs (the plurality of motion programs and the plurality of machining programs). In that case, through the association between a program and its purpose information, at least one of the plural types of control signals becomes associated with the purpose information via the program. For example, a control signal representing the execution history of a program is associated with the purpose information. Based on a control signal representing a program's execution history, the extraction unit 219 identifies the program that was being executed when the control signal was acquired, generated, or output, and obtains from the program storage unit 211 the purpose information associated with the identified program as the purpose information associated with the control signal.
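The lookup of purpose information via the program execution history can be sketched as a simple table lookup (the table contents, program names, and field names are assumptions for illustration, not part of the patent):

```python
# Hypothetical table mirroring the program storage unit 211:
# program name -> associated purpose information.
PURPOSE_BY_PROGRAM = {
    "pick_part.job": "grip",
    "place_part.job": "release",
    "weld_seam.job": "join",
}

def purpose_for_signal(control_signal):
    """Identify the program from the execution-history field of a control
    signal, then look up the purpose information associated with it."""
    program = control_signal.get("executing_program")
    # Returns None when no purpose information is associated with the program.
    return PURPOSE_BY_PROGRAM.get(program)
```

The extraction unit would then branch on the returned purpose ("grip," "join," and so on) to decide which state change to apply.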
 Based on the positional relationship between the one or more virtual peripheral objects included in the virtual space VS and the virtual end effector V50 (the model of the end effector 50), the extraction unit 219 may identify the virtual peripheral object on which the virtual end effector V50 acts, and may extract the state change of that virtual peripheral object based on the control signal related to the operation of the virtual end effector V50 and on the purpose information.
 For example, the extraction unit 219 recognizes that the purpose information "grip" is associated with the program that was being executed when the control signal for gripping the workpiece W with the end effector 50 (a hand) was generated. Based on that recognition, the extraction unit 219 rewrites the position and orientation of the virtual workpiece VW (the model of the workpiece W) in the virtual space VS into a relative position and relative orientation referenced to the position of the virtual end effector V50. After the rewriting, when the simulator 214 moves the virtual end effector V50 in the virtual space VS, the virtual workpiece VW moves together with the virtual end effector V50 based on that relative position and relative orientation.
 The extraction and display of state changes are further illustrated with reference to FIGS. 4 and 5, which show a virtual robot V5B and a virtual robot V5C operating in the virtual space VS. The virtual robots V5B and V5C are models of the robots 5B and 5C, and each has a virtual end effector V50 (a model of an end effector 50). As an example, the end effector 50 of the robot 5B is a hand, and the end effector 50 of the robot 5C is a joining tool such as a welding tool. In addition to the virtual robots V5B and V5C, the virtual space VS includes virtual tables VT1 and VT2, which are virtual peripheral objects. Virtual workpieces VW1 and VW2, also virtual peripheral objects, are placed on the virtual table VT1.
 As shown in FIG. 4, based on the transition information and the model information of the production apparatus 2, the simulator 214 has the virtual robot V5B move the virtual end effector V50 to the position for gripping the virtual workpiece VW1 (see FIG. 4(a)), and then operates the virtual end effector V50 so as to grip the virtual workpiece VW1.
 The extraction unit 219 recognizes that the purpose information "grip" is associated with the control signal that operates the virtual end effector V50 to grip the virtual workpiece VW1, and rewrites the position and orientation of the virtual workpiece VW1 in the virtual space VS into a relative position and relative orientation referenced to the position of the virtual end effector V50. Based on the transition information and the model information of the production apparatus 2, the simulator 214 has the virtual robot V5B move the virtual end effector V50 to the position where the virtual workpiece VW1 is to be placed on top of the virtual workpiece VW2. Based on the movement of the virtual end effector V50 and on the relative position and relative orientation, the simulator 214 moves the virtual workpiece VW1 together with the virtual end effector V50 and places it on the virtual workpiece VW2 (see FIG. 4(b)).
 Thereafter, the simulator 214 operates the virtual end effector V50 so as to release its grip on the virtual workpiece VW1. The extraction unit 219 recognizes that the purpose information "release" is associated with the control signal that operates the virtual end effector V50 to release the grip, and rewrites the relative position and relative orientation of the virtual workpiece VW1, referenced to the position of the virtual end effector V50, back into a position and orientation in the virtual space VS. The simulator 214 then has the virtual robot V5B retract the virtual end effector V50 from the virtual workpiece VW1 based on the transition information and the model information of the production apparatus 2. Because the pose of the virtual workpiece VW1 has been rewritten into a position and orientation in the virtual space VS as described above, the virtual workpiece VW1 does not move with the virtual end effector V50 and remains on the virtual workpiece VW2 (see FIG. 4(c)).
 As shown in FIG. 5, based on the transition information and the model information of the production apparatus 2, the simulator 214 has the virtual robot V5C bring the virtual end effector V50 (the joining tool) into contact with the boundary between the virtual workpiece VW1 and the virtual workpiece VW2 (see FIG. 5(a)). The extraction unit 219 recognizes that the purpose information "join" is associated with the control signal that brings the virtual end effector V50 into contact with the virtual workpieces VW1 and VW2, and rewrites the position and orientation of the virtual workpiece VW2 in the virtual space VS into a relative position and relative orientation referenced to the position of the virtual workpiece VW1. The simulator 214 then has the virtual robot V5C retract the virtual end effector V50 from the virtual workpieces VW1 and VW2 based on the transition information and the model information of the production apparatus 2, and has the virtual robot V5B move its virtual end effector V50 to the position for gripping the virtual workpiece VW1 (see FIG. 5(b)). The simulator 214 further operates the virtual end effector V50 so as to grip the virtual workpiece VW1.
 The extraction unit 219 recognizes that, in the program storage unit 211, the purpose information "grip the workpiece W" is associated with the control signal that operates the virtual end effector V50 to grip the virtual workpiece VW1, and rewrites the position and orientation of the virtual workpiece VW1 in the virtual space VS into a relative position and relative orientation referenced to the position of the virtual end effector V50. Based on the transition information and the model information of the production apparatus 2, the simulator 214 then has the virtual robot V5B move the virtual end effector V50 to the position where the virtual workpiece VW3 (the joined body of the virtual workpieces VW1 and VW2) is to be placed on the virtual table VT2. Based on the movement of the virtual end effector V50, on the relative position and relative orientation of the virtual workpiece VW1 with respect to the virtual end effector V50, and on the relative position and relative orientation of the virtual workpiece VW2 with respect to the virtual workpiece VW1, the simulator 214 moves the virtual workpiece VW1 together with the virtual end effector V50, and the virtual workpiece VW2 together with the virtual workpiece VW1. The virtual workpiece VW3 thereby moves onto the virtual table VT2 (see FIG. 5(c)).
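Moving VW2 with VW1, and VW1 with the end effector, is equivalent to resolving a chain of parent-relative poses. A position-only sketch of that resolution (hypothetical data structure; orientation omitted for brevity):

```python
def resolve_world_position(name, parents, local_pos):
    """Resolve a model's world position by walking its parent chain
    and summing the parent-relative position offsets."""
    pos = [0.0, 0.0, 0.0]
    node = name
    while node is not None:
        offset = local_pos[node]
        pos = [p + o for p, o in zip(pos, offset)]
        node = parents.get(node)  # None for a root (world-frame) node
    return pos

# VW2 is parented to VW1 (joined), VW1 to the end effector V50 (gripped).
parents = {"VW2": "VW1", "VW1": "V50", "V50": None}
local_pos = {
    "V50": [2.0, 0.0, 1.0],    # effector in the world frame
    "VW1": [0.0, 0.0, -0.1],   # VW1 relative to the effector
    "VW2": [0.0, 0.0, -0.05],  # VW2 relative to VW1
}
pos = resolve_world_position("VW2", parents, local_pos)
```

Moving the root of the chain (the effector) then moves every model parented below it, which is the behavior described for the joined workpieces above.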
 Returning to FIG. 3, the simulation device 200 may be configured to further display on the monitor a graph representing the transition of a control signal. For example, the simulation device 200 may display the graph on the monitor so that it can be viewed together with the virtual space VS. The simulation device 200 may also display, in association with the graph, a time-point indicator that marks the point in time within the operation of the virtual production apparatus V2 being displayed on the monitor. The value of the control signal on the graph can then be instantly matched to what is displayed in the virtual space VS.
 For example, the simulation device 200 further includes, as functional blocks, a graph generation unit 221 and a synchronization unit 222. The graph generation unit 221 reads the transition information of the target section from the transition information storage unit 218 and generates the graph based on the read transition information.
 The synchronization unit 222 synchronizes the point in time within the operation of the virtual production apparatus V2 displayed on the monitor with the point in time indicated by the time-point indicator on the graph. For example, the synchronization unit 222 identifies the position on the graph that corresponds to the point in time within the displayed operation of the virtual production apparatus V2. Specifically, on the time axis of the transition information, the synchronization unit 222 identifies the point in time corresponding to the displayed operation of the virtual production apparatus V2 (the point in time being reproduced by that operation), and then identifies the position on the graph corresponding to the identified point in time. The display unit 216 displays the graph generated by the graph generation unit 221 on the monitor and displays the time-point indicator on the monitor in association with the position identified by the synchronization unit 222.
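Mapping the currently displayed time point to a position on the graph can be sketched as a nearest-sample lookup over the sampled transition times (an assumption about how the synchronization unit 222 might be realized, not the patent's implementation):

```python
import bisect

def indicator_position(sample_times, playback_time):
    """Map the time point currently shown in the virtual space to the
    index of the nearest sample on the control-signal graph.
    sample_times must be sorted in ascending order."""
    i = bisect.bisect_left(sample_times, playback_time)
    if i == 0:
        return 0
    if i == len(sample_times):
        return len(sample_times) - 1
    # Choose whichever neighbouring sample is closer in time.
    before, after = sample_times[i - 1], sample_times[i]
    return i if after - playback_time < playback_time - before else i - 1
```

The time-point indicator (a line, point, or arrow, as described below) would then be drawn at the graph position of the returned sample index.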
 The display form of the time-point indicator is not particularly limited as long as it makes the position identified by the synchronization unit 222 visible. Examples include a line drawn across the graph through the identified position, a point displayed at the identified position, and an arrow pointing to the identified position.
 Based on event information indicating the point in time at which a specific event occurred, the display unit 216 may display on the monitor an event indicator representing the event, in association with the position on the graph corresponding to that point in time. Making it possible to grasp at a glance how the production apparatus was operating, and what values the control signals had, when the event occurred helps to identify the cause of the event more quickly.
 The display form of the event indicator is not particularly limited as long as it makes the point in time represented by the event information visible on the graph. One example is a combination of a position indicator showing a position on the graph and an icon indicating that an event occurred.
 Based on event information indicating the point in time at which a specific event occurred, the display unit 216 may also display on the monitor a notification that the event occurred, in association with the operation of the virtual production apparatus V2 corresponding to that point in time. For example, the display unit 216 may display the notification at a position where it can be viewed together with the operation of the virtual production apparatus V2, at the moment the operation corresponding to the event information is displayed on the monitor. This allows the viewer to recognize more quickly how the production apparatus 2 was operating when the event occurred.
 The display form of the notification is not particularly limited as long as it can be associated with the operation of the virtual production apparatus V2 corresponding to the point in time represented by the event information. Examples include displaying text or a figure announcing the occurrence of the event together with the operation of the virtual production apparatus V2, and highlighting the corresponding operation of the virtual production apparatus V2, for example by at least partially changing its color.
 The simulation device 200 may further include a detection unit 223, an event storage unit 225, and an event notification unit 226. The detection unit 223 detects the occurrence of an event based on the control signals and generates the event information. For example, the detection unit 223 may detect the occurrence of an event based on a status signal such as the alarm described above, or based on the magnitude of the deviation described above. Having the detection unit 223 detect events from the control signals improves convenience.
 Based on the control signals, the detection unit 223 may detect, as an event, that an overload has occurred in the robot 5B, 5C, or 5D. The detection unit 223 may detect the overload based on a status signal, or based on the deviation between the target angle and the feedback value.
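Overload detection from the deviation between the target angle and the feedback value can be sketched as a simple threshold check over the state records (the record fields and the threshold are illustrative assumptions):

```python
def detect_overload_events(state_records, deviation_limit):
    """Flag an overload event whenever the deviation between the target
    joint angle and its feedback value exceeds a threshold."""
    events = []
    for record in state_records:
        deviation = abs(record["target_angle"] - record["feedback_angle"])
        if deviation > deviation_limit:
            events.append({"time": record["time"], "event": "overload"})
    return events

# Two logged samples: a small tracking error, then a large one.
records = [
    {"time": 10.0, "target_angle": 1.00, "feedback_angle": 0.99},
    {"time": 10.1, "target_angle": 1.20, "feedback_angle": 0.80},
]
events = detect_overload_events(records, deviation_limit=0.1)
```

Each detected event, keyed by the record's time, would be stored in the event storage unit 225 as described below.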
 For example, the detection unit 223 sequentially reads the plurality of state records stored in the database 113, checks for events based on each read state record, and stores any detected event in the event storage unit 225 in association with the time shared with that state record. The event information is thereby accumulated in the event storage unit 225.
 Based on the event information stored in the event storage unit 225, the event notification unit 226 checks whether any event occurred at the point in time corresponding to the operation of the virtual production apparatus V2 that the display unit 216 is displaying on the monitor. When the event notification unit 226 identifies such an event, it causes the display unit 216 to display the event indicator and the notification for the identified event.
 The simulation device 200 may be configured to set the target section based on user input. For example, the simulation device 200 may further include a section setting unit 227, which sets the target section based on user input. For example, the section setting unit 227 displays an input screen on which a start time and an end time can be entered individually, and sets the span from the entered start time to the entered end time as the target section.
 The simulation device 200 may further include an event identification unit 228. The event identification unit 228 identifies at least one event of the production system 1 based on user input. The section setting unit 227 may set the target section so as to include the point in time at which the event identified by the event identification unit 228 occurred.
 For example, based on the event information stored in the event storage unit 225, the event identification unit 228 displays a list of the events detected by the detection unit 223, and identifies the event selected from the list by user input as the event described above.
 The section setting unit 227 sets, as the target section, the span from a start time that precedes the occurrence of the event identified by the event identification unit 228 by a predetermined period, to an end time at the occurrence of the event. The section setting unit 227 may instead set the end time to a point in time after the event occurred.
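The target-section computation reduces to simple arithmetic around the event time (a sketch; the lookback and lookahead periods are assumed parameters):

```python
def target_section(event_time, lookback, lookahead=0.0):
    """Compute the log-replay target section around an event: start a
    predetermined period before the event, and optionally run past it."""
    return (event_time - lookback, event_time + lookahead)
```

For example, an event at t = 100 s with a 30-second lookback yields the section (70, 100); adding a 5-second lookahead yields (70, 105).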
 表示部216は、現実空間で撮影された生産装置2の撮影データに基づいて、撮影データを表す撮影指標を、撮影データが撮影された時点に対応したグラフ上の位置に関連付けてモニタに表示させてもよい。現実空間の撮影データと、制御信号との対応付けを容易にすることができる。 The display unit 216 causes the monitor to display, based on the photographed data of the production apparatus 2 photographed in the real space, the photographing index representing the photographed data in association with the position on the graph corresponding to the time when the photographed data was photographed. may It is possible to facilitate association between the photographed data of the physical space and the control signal.
 The display unit 216 may cause the monitor to display the frame captured in the real space at the point in time represented by the motion of the virtual production apparatus V2. This facilitates the association between frames captured in the real space and the virtual space. A frame may be a still image or a single frame of a moving image.
 For example, as shown in FIG. 6, the simulation device 200 may further include a captured data extraction unit 231, a captured data storage unit 232, and a frame identification unit 233. The captured data extraction unit 231 extracts from the database 113 the captured data included in the state records corresponding to the target section, and stores the captured data in the captured data storage unit 232 in association with the point in time at which the data was captured (for example, the above-mentioned time included in the state record). The display unit 216 causes the monitor to display the imaging index based on the points in time associated with the captured data in the captured data storage unit 232.
 The display form of the imaging index is not particularly limited as long as the point in time at which the captured data was captured can be visually recognized on the graph. One example of the imaging index is a combination of a position index indicating a position on the graph and an icon indicating the existence of captured data.
 The frame identification unit 233 identifies, based on the times associated with the captured data stored in the captured data storage unit 232, the frame captured in the real space at the point in time represented by the motion of the virtual production apparatus V2. The display unit 216 causes the monitor to display the virtual space VS and the frame identified by the frame identification unit 233.
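The frame lookup by timestamp can be sketched as a search over the sorted capture times. This is one possible implementation under the assumption that frame timestamps are stored in ascending order; the function name is hypothetical:

```python
import bisect

def identify_frame(frame_times, target_time):
    """Return the index of the most recent frame captured at or before
    target_time, or None if the footage begins after target_time.

    frame_times: capture timestamps in ascending order, as stored in
    association with the captured data.
    """
    i = bisect.bisect_right(frame_times, target_time)
    return i - 1 if i > 0 else None
```

The binary search keeps the lookup fast even for long recordings, so the displayed frame can track the simulation time during playback.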
 The simulation device 200 may be configured to change the above-mentioned viewpoint based on user input. The computation results of the simulator can thereby be used effectively for playing back a moving image from an arbitrary viewpoint. For example, as shown in FIG. 7, the simulation device 200 further includes a viewpoint setting unit 241 and a viewpoint storage unit 242. The viewpoint setting unit 241 sets the viewpoint with respect to the virtual space VS based on user input. For example, the viewpoint setting unit 241 sets the position and direction of the viewpoint in the virtual space VS. The display unit 216 displays the virtual space VS as viewed from the viewpoint set by the viewpoint setting unit 241.
 The viewpoint storage unit 242 stores a history of viewpoint changes made by the viewpoint setting unit 241. For example, the viewpoint storage unit 242 stores the viewpoint change history on the time axis of the above-mentioned transition information. For example, when the viewpoint setting unit 241 changes the viewpoint, the viewpoint storage unit 242 stores the point in time being displayed by the display unit 216 (the point in time on the above-mentioned time axis) in association with the changed viewpoint. When displaying the motion of the virtual production apparatus V2, the display unit 216 changes the viewpoint based on the viewpoint change history stored in the viewpoint storage unit 242. Reusing the viewpoint change history improves convenience.
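One way to record and replay the viewpoint history is to keep (display time, viewpoint) pairs and, during playback, apply the most recently set viewpoint at or before the current display time. The class below is a sketch under that assumption; its name and interface are hypothetical:

```python
class ViewpointStore:
    """Stores (display time, viewpoint) pairs as the viewpoint is changed,
    and replays the most recently set viewpoint for any display time."""

    def __init__(self, initial_viewpoint):
        # The initial viewpoint applies from the beginning of the time axis.
        self._history = [(float("-inf"), initial_viewpoint)]

    def record(self, display_time, viewpoint):
        """Called when the viewpoint is changed while display_time is shown."""
        self._history.append((display_time, viewpoint))
        self._history.sort(key=lambda entry: entry[0])

    def viewpoint_at(self, display_time):
        """Return the viewpoint in effect at display_time during playback."""
        current = self._history[0][1]
        for t, viewpoint in self._history:
            if t <= display_time:
                current = viewpoint
        return current
```

Keying the history on the transition-information time axis (rather than on wall-clock time of the editing session) is what lets a recorded camera path be replayed on any later playback of the same section.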
 The simulation device 200 may further include a confirmation unit 245. The confirmation unit 245 confirms, based on the transition information (for example, the plurality of state records) stored in the database 113 and the model information stored in the model holding unit 213, whether the configuration of the production apparatus 2 and the configuration of the virtual production apparatus V2 match. When determining that the configuration of the production apparatus 2 and the configuration of the virtual production apparatus V2 do not match, the confirmation unit 245 may cause the monitor to display a mismatch notification. Inappropriate verification due to a mismatch can thereby be easily prevented.
 The configuration here includes at least the types and numbers of machines. For example, the confirmation unit 245 identifies the configuration of the production apparatus 2 based on the labels of the data items in the transition information (such as column headers or keys), identifies the configuration of the virtual production apparatus V2 based on the model information of the production apparatus 2, and compares the identified configuration of the production apparatus 2 with the configuration of the virtual production apparatus V2.
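Since the configuration covers at least machine type and count, the comparison can be sketched as a multiset check over the machine types extracted from each side. This is an illustrative assumption about the comparison logic, not the disclosed implementation:

```python
from collections import Counter

def configurations_match(real_machines, virtual_machines):
    """Compare two machine lists by type and count: the multiset comparison
    succeeds only when every machine type appears the same number of times
    on both sides (e.g. two robots and one machine tool on each side)."""
    return Counter(real_machines) == Counter(virtual_machines)
```

In practice `real_machines` would be derived from the data-item labels of the transition information and `virtual_machines` from the model information; any mismatch would trigger the notification described above.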
 FIG. 8 is a schematic diagram illustrating an image that the display unit 216 causes the monitor to display. The image illustrated in FIG. 8 includes a simulation area 250, a graph 260, a slide bar 261, a play button 262, a time index 263, an event index 264, and a plurality of imaging indices 265. The simulation area 250 is a two-dimensional simulation image showing the virtual space VS as viewed from the above-mentioned viewpoint, and includes a virtual conveying device V5A, virtual robots V5B, V5C, and V5D, a virtual machine tool V5E, and a virtual workpiece VW.
 The graph 260 represents the changes over time of a plurality of control signals in the target section. In the illustrated example, the graph 260 classifies the plurality of control signals into two or more groups and includes a plurality of graphs, one for each group. The plurality of graphs are arranged vertically. In each of the plurality of graphs, the horizontal axis represents the time on the time axis of the transition information, and the vertical axis represents the value of the control signal.
 The time index 263 is displayed in association with the position on the graph 260 corresponding to the point in time, on the time axis of the transition information, that is being displayed by the above-mentioned simulation image. For example, the time index 263 is displayed as a line on the graph 260 extending across the plurality of graphs.
 The event index 264 is displayed at the position on the graph 260 corresponding to the point in time at which the event occurred.
 The imaging indices 265 include a still image index 266 and a moving image index 267. The still image index 266 indicates the point in time at which a still image included in the captured data was captured. The moving image index 267 indicates the captured section of a moving image included in the captured data.
 The slide bar 261 is an object that graphically represents a time on the time axis of the transition information. The slide bar 261 has a track 268 extending in the horizontal direction and a thumb 269 that moves along the track 268. The position of the thumb 269 on the track 268 represents the time on the time axis of the transition information.
 By changing the position of the thumb 269 by a drag operation or the like, a time on the time axis of the transition information can be designated. The synchronization unit 222 identifies the display position of the time index 263 based on the designated time. The display unit 216 changes the display of the simulation area 250 based on the simulation result at the designated time, and moves the time index 263 to the display position identified by the synchronization unit 222.
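The mapping from the thumb position on the track to a time on the transition-information time axis can be sketched as a linear interpolation over the target section. The names below are illustrative assumptions, not disclosed identifiers:

```python
def thumb_to_time(thumb_pos, track_length, start_time, end_time):
    """Linearly map the thumb position on the track (0 .. track_length,
    e.g. in pixels) to a time in the target section [start_time, end_time]."""
    ratio = thumb_pos / track_length
    return start_time + ratio * (end_time - start_time)
```

The same interpolation run in reverse (time to position) would place the thumb during automatic playback, keeping the slide bar, the time index, and the simulation image synchronized.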
 The play button 262 is a button for instructing the simulation device to operate the virtual production apparatus V2 in the simulation area 250 based on the transition information. When the play button 262 is pressed, the display unit 216 reads the simulation results accumulated in the result holding unit 215 in chronological order, moves the thumb 269 of the slide bar 261 to the position representing the time associated with the read simulation result, and changes the display of the simulation area 250 based on the read simulation result. The synchronization unit 222 identifies the display position of the time index 263 based on the time associated with the simulation result read by the display unit 216, and the display unit 216 moves the time index 263 to the display position identified by the synchronization unit 222.
 In FIG. 8, the time index 263 overlaps the still image index 266 and the moving image index 267. For this reason, a captured data window 271 and a captured data window 272 are additionally displayed. The captured data window 271 displays a still image captured by the camera 51 in a state in which the end effector 50, which is a hand, is positioned so as to be able to grasp the workpiece W. The captured data window 272 displays, out of the moving image captured by the environment sensor 6, the frame corresponding to the point in time indicated by the time index 263.
 Although the case in which the production apparatus 2 performs the processing steps on the workpiece W in the real space has been illustrated above, the production apparatus 2 may be a virtual production apparatus that performs the processing steps on the workpiece W in a virtual space.

[Hardware Configuration]

 FIG. 9 is a block diagram illustrating the hardware configurations of the local controller 400, the host controller 300, the data collection device 100, and the simulation device 200. As shown in FIG. 9, the local controller 400 has a circuit 490. The circuit 490 has a processor 491, a memory 492, a storage 493, a driver circuit 494, and a communication port 495. The storage 493 is composed of one or more non-volatile memory devices such as a flash memory or a hard disk. The storage 493 stores a program for causing the local controller 400 to control the machine 5.
 The memory 492 is composed of one or more volatile memory devices such as a random access memory. The memory 492 temporarily stores the program loaded from the storage 493. The processor 491 is composed of one or more computing devices such as a CPU (Central Processing Unit) or a GPU (Graphics Processing Unit). The processor 491 executes the program loaded into the memory 492 to configure each of the functional blocks described above in the local controller 400. The computation results of the processor 491 are temporarily stored in the memory 492.
 The driver circuit 494 operates the machine 5 in response to requests from the processor 491. The communication port 495 communicates with the host controller 300 via the control communication network NW1 in response to requests from the processor 491.
 The host controller 300 has a circuit 390. The circuit 390 has a processor 391, a memory 392, a storage 393, an input/output port 394, a communication port 395, and a communication port 396. The storage 393 is composed of one or more non-volatile memory devices such as a flash memory or a hard disk. The storage 393 stores a program for causing the host controller 300 to collect the above-described state records and generate the execution commands.
 The memory 392 is composed of one or more volatile memory devices such as a random access memory. The memory 392 temporarily stores the program loaded from the storage 393. The processor 391 is composed of one or more computing devices such as a CPU or a GPU. The processor 391 executes the program loaded into the memory 392 to configure each of the functional blocks described above in the host controller 300. The computation results of the processor 391 are temporarily stored in the memory 392.
 The input/output port 394 exchanges information with the environment sensor 6 in response to requests from the processor 391. The communication port 395 communicates with the local controller 400 via the communication network NW1 in response to requests from the processor 391. The communication port 396 communicates with the data collection device 100 and the simulation device 200 via the communication network NW2 in response to requests from the processor 391. The communication network NW2 may be a network different from the communication network NW1, or may be the same network as the communication network NW1.
 The data collection device 100 has a circuit 190. The circuit 190 has a processor 191, a memory 192, a storage 193, a communication port 194, and a user interface 195. The storage 193 is composed of one or more non-volatile memory devices such as a flash memory or a hard disk. The storage 193 stores a program for configuring the above-described functional blocks in the data collection device 100.
 The memory 192 is composed of one or more volatile memory devices such as a random access memory. The memory 192 temporarily stores the program loaded from the storage 193. The processor 191 is composed of one or more computing devices such as a CPU or a GPU. The processor 191 executes the program loaded into the memory 192 to configure each of the functional blocks described above in the data collection device 100. The computation results of the processor 191 are temporarily stored in the memory 192.
 The communication port 194 communicates with the simulation device 200 and the host controller 300 via the communication network NW2 in response to requests from the processor 191. The user interface 195 inputs and outputs information to and from an operator in response to requests from the processor 191. For example, the user interface 195 has a display device such as a liquid crystal monitor or an organic EL (Electro-Luminescence) monitor and an input device such as a keyboard or a mouse. The input device may be integrated with the display device as a touch panel.
 The simulation device 200 has a circuit 290. The circuit 290 has a processor 291, a memory 292, a storage 293, a communication port 294, and a user interface 295. The storage 293 is composed of one or more non-volatile memory devices such as a flash memory or a hard disk. The storage 293 stores a program for configuring the above-described functional blocks in the simulation device 200.
 The memory 292 is composed of one or more volatile memory devices such as a random access memory. The memory 292 temporarily stores the program loaded from the storage 293. The processor 291 is composed of one or more computing devices such as a CPU or a GPU. The processor 291 executes the program loaded into the memory 292 to configure each of the functional blocks described above in the simulation device 200. The computation results of the processor 291 are temporarily stored in the memory 292.
 The communication port 294 communicates with the data collection device 100 and the host controller 300 via the communication network NW2 in response to requests from the processor 291. The user interface 295 inputs and outputs information to and from an operator in response to requests from the processor 291. For example, the user interface 295 has a display device such as a liquid crystal monitor or an organic EL (Electro-Luminescence) monitor and an input device such as a keyboard or a mouse. The input device may be integrated with the display device as a touch panel.
 The hardware configuration described above is merely an example and may be modified as appropriate. For example, the simulation device 200 may be incorporated in the data collection device 100. Further, the simulation device 200 and the data collection device 100 may be incorporated in the host controller 300.
[Log Playback Procedure]
 As an example of the reproduction method, a log playback procedure executed by the simulation device 200 is illustrated. The log playback procedure includes operating the virtual production apparatus V2 corresponding to the production apparatus 2 in the virtual space VS based on the transition information representing the transitions of the control signals collected from the controller 3 and the model information of the production apparatus 2, and causing a monitor to display the virtual space VS in which the virtual production apparatus V2 operates.
 As an example, the log playback procedure includes, in order, a procedure for displaying a section setting screen, a procedure for displaying an initial screen, and a procedure for updating the display. These procedures are illustrated below.
(Procedure for Displaying the Section Setting Screen)
 As shown in FIG. 10, the simulation device 200 first executes steps S01 and S02. In step S01, the confirmation unit 245 waits for the log playback function to be invoked. In step S02, the confirmation unit 245 acquires the configuration information of the production apparatus 2 and the configuration information of the virtual production apparatus V2. For example, the confirmation unit 245 acquires the configuration information of the production apparatus 2 based on the plurality of state records stored in the database 113, and acquires the configuration information of the virtual production apparatus V2 based on the model information stored in the model holding unit 213.
 Next, the simulation device 200 executes step S03. In step S03, the confirmation unit 245 confirms, based on the configuration information acquired in step S02, whether the configuration of the production apparatus 2 and the configuration of the virtual production apparatus V2 match. When it is determined in step S03 that the configuration of the production apparatus 2 and the configuration of the virtual production apparatus V2 do not match, the simulation device 200 executes step S04. In step S04, the confirmation unit 245 causes the monitor to display a mismatch notification. Thereafter, the simulation device 200 returns the processing to step S01.
 When it is determined in step S03 that the configuration of the production apparatus 2 and the configuration of the virtual production apparatus V2 match, the simulation device 200 executes steps S05 and S06. In step S05, the detection unit 223 sequentially reads the plurality of state records from the database 113, checks for the presence or absence of an event based on the read state records, and stores the above-described event information in the event storage unit 225 based on the result of the check.
 In step S06, the section setting unit 227 displays a setting screen for the target section. For example, the section setting unit 227 displays a setting screen including input boxes for the start time and the end time and a list of the events. This completes the procedure for displaying the section setting screen.
(Procedure for Displaying the Initial Screen)
 As shown in FIG. 11, the simulation device 200 first executes step S11. In step S11, the section setting unit 227 confirms whether a section has been designated by inputting a start time and an end time on the section setting screen. When it is determined in step S11 that no section has been designated, the simulation device 200 executes step S12. In step S12, the event identification unit 228 confirms whether an event of the production system 1 has been selected from the event list on the section setting screen. When it is determined in step S12 that no event has been selected, the simulation device 200 returns the processing to step S11.
 When it is determined in step S11 that a section has been designated, the simulation device 200 executes step S13. In step S13, the section setting unit 227 sets the period from the start time to the end time as the target section.
 When it is determined in step S12 that an event has been selected, the simulation device 200 executes step S14. In step S14, the event identification unit 228 identifies the selected event, and the section setting unit 227 sets the target section so as to include the point in time at which the identified event occurred.
 Following step S13 or S14, the simulation device 200 executes steps S15, S16, and S17. In step S15, the transition information extraction unit 217 extracts the transition information of the target section for log playback from the database 113 and stores it in the transition information storage unit 218. In step S16, the graph generation unit 221 reads the transition information of the target section from the transition information storage unit 218 and generates the graph 260 based on the read transition information. In step S17, the captured data extraction unit 231 extracts from the database 113 the captured data included in the state records corresponding to the target section, and stores the captured data in the captured data storage unit 232 in association with the point in time at which the data was captured (for example, the above-mentioned time included in the state record).
 Next, the simulation device 200 executes steps S18 and S19. In step S18, the simulator 214 operates the virtual production apparatus V2 in the virtual space VS based on the transition information stored in the transition information storage unit 218.
 In step S19, the display unit 216 sets the start time of the target section as the display target time, reads the simulation result corresponding to the display target time from the transition information storage unit 218, and generates, based on the read simulation result, a two-dimensional simulation image of the virtual space VS viewed from a predetermined viewpoint. The display unit 216 causes the monitor to display the simulation area 250 including the generated simulation image, the graph 260 generated in step S16, and the imaging indices 265 based on the captured data extracted in step S17. This completes the procedure for displaying the initial screen.
 FIG. 12 is a flowchart illustrating the simulation procedure in step S18. As shown in FIG. 12, the simulation device 200 first executes steps S21 and S22.
 In step S21, the simulator 214 extracts the plurality of types of control signals corresponding to the start time of the target section from the transition information stored in the transition information storage unit 218. In step S22, the simulator 214 generates, based on the extracted plurality of types of control signals and the model information of the production apparatus 2 stored in the model holding unit 213, the virtual space VS immediately before the virtual production apparatus V2 starts operating.
 Next, the simulation device 200 executes steps S23 and S24. In step S23, the simulator 214 extracts the next plurality of types of control signals from the transition information stored in the transition information storage unit 218. In step S24, the simulator 214 operates the virtual production apparatus V2 in the virtual space VS based on the next plurality of types of control signals and the model information of the production apparatus 2 stored in the model holding unit 213.
 Next, the simulation device 200 executes step S25. In step S25, the extraction unit 219 confirms, based on the control signals related to the operation of the virtual end effector V50 and the above-described purpose information, whether the virtual end effector V50 has been placed at the start position of work on a virtual peripheral object. When it is determined in step S25 that the virtual end effector V50 has been placed at the work start position, the simulation device 200 executes step S26. In step S26, the extraction unit 219 identifies, based on the positional relationship between the one or more virtual peripheral objects included in the virtual space VS and the virtual end effector V50 (the model of the end effector 50), the virtual peripheral object on which the virtual end effector V50 acts.
 When it is determined in step S25 that the virtual end effector V50 has not been placed at the work start position, the simulation device 200 executes step S27. In step S27, the extraction unit 219 confirms, based on the control signals related to the operation of the virtual end effector V50 and the above-described purpose information, whether the virtual end effector V50 has been placed at the completion position of the work on the virtual peripheral object. When it is determined in step S27 that the virtual end effector V50 has been placed at the work completion position, the simulation device 200 executes step S28. In step S28, the extraction unit 219 extracts the state change of the virtual peripheral object based on the purpose information, and changes the state of the virtual peripheral object in the virtual space VS based on the extracted state change.
 ステップS26,S28の次に、シミュレーション装置200はステップS29を実行する。ステップS29では、推移情報に基づく仮想生産装置V2の動作が完了したか否かをシミュレータ214が確認する。ステップS29において、推移情報に基づく仮想生産装置V2の動作は完了していないと判定した場合、シミュレーション装置200は処理をステップS23に戻す。ステップS29において、推移情報に基づく仮想生産装置V2の動作は完了したと判定した場合、シミュレーション装置200はシミュレーション手順を完了する。 After steps S26 and S28, the simulation device 200 executes step S29. In step S29, the simulator 214 confirms whether or not the operation of the virtual production apparatus V2 based on the transition information has been completed. If it is determined in step S29 that the operation of the virtual production apparatus V2 based on the transition information has not been completed, the simulation apparatus 200 returns the process to step S23. If it is determined in step S29 that the operation of the virtual production apparatus V2 based on the transition information has been completed, the simulation apparatus 200 completes the simulation procedure.
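The branch logic of steps S25 to S28 can be sketched in outline as follows. This is an illustrative sketch only: the function names, the distance tolerance, and the purpose-to-state mapping are assumptions made for explanation and do not appear in the specification.

```python
from dataclasses import dataclass

@dataclass
class VirtualObject:
    """A virtual peripheral object in the virtual space VS (illustrative)."""
    name: str
    position: tuple  # (x, y, z)
    state: str = "initial"

def distance(p, q):
    return sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5

def identify_target(effector_pos, objects, tolerance=0.05):
    # Step S26: identify the peripheral object the end effector acts on
    # from the positional relationship in the virtual space VS.
    candidates = [o for o in objects
                  if distance(effector_pos, o.position) <= tolerance]
    return min(candidates,
               key=lambda o: distance(effector_pos, o.position),
               default=None)

def apply_state_change(obj, purpose):
    # Step S28: change the object's state in VS based on the purpose
    # information associated with the control signal (mapping assumed).
    obj.state = {"grip": "held", "weld": "welded"}.get(purpose, obj.state)
    return obj
```

Identification by proximity is one simple way to realize the positional-relationship check; the actual criterion used by the extraction unit 219 may differ.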
(Display update procedure)
 As shown in FIG. 13, the simulation device 200 executes step S31. In step S31, the display unit 216 confirms whether the slide bar 261 has been operated (whether the thumb 269 has been moved). If it is determined in step S31 that the slide bar 261 has been operated, the simulation device 200 executes steps S32 and S33. In step S32, the display unit 216 identifies the display target time on the time axis of the transition information, based on the position of the thumb 269. In step S33, the display unit 216 executes screen transition processing to the display target time. The content of the screen transition processing will be described later.
 If it is determined in step S31 that the slide bar 261 has not been operated, the simulation device 200 executes step S34. In step S34, the viewpoint setting unit 241 confirms whether an operation to change the viewpoint, such as a drag operation on the simulation area 250, has been performed. If it is determined in step S34 that an operation to change the viewpoint has been performed, the simulation device 200 executes steps S35 and S36. In step S35, the viewpoint setting unit 241 changes the viewpoint with respect to the virtual space VS and causes the viewpoint storage unit 242 to store the viewpoint change history. In step S36, the display unit 216 causes the simulation area 250 to display the virtual space VS viewed from the changed viewpoint.
 If it is determined in step S34 that an operation to change the viewpoint has not been performed, the simulation device 200 executes step S37. In step S37, the display unit 216 confirms whether a playback operation has been performed with the play button 262. If it is determined in step S37 that a playback operation has been performed, the simulation device 200 executes step S38. In step S38, the display unit 216 executes playback processing. The content of the playback processing will be described later.
 After steps S33, S36, and S38, the simulation device 200 executes step S39. If it is determined in step S37 that no playback operation has been performed, the simulation device 200 executes step S39 without executing steps S33, S36, and S38. In step S39, the display unit 216 confirms whether termination of the log replay function has been instructed. If it is determined in step S39 that termination of the log replay function has not been instructed, the simulation device 200 returns the process to step S31. If it is determined in step S39 that termination of the log replay function has been instructed, the simulation device 200 completes the display update procedure.
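The branch order of steps S31, S34, S37, and S39 described above can be summarized as a simple event-dispatch loop. The event encoding and the function name below are illustrative assumptions, not part of the specification.

```python
def run_display_update(ui_events):
    # Check each UI event in the S31 -> S34 -> S37 branch order of the
    # flowchart; an "exit" event (step S39) ends the log-replay function.
    actions = []
    for ev in ui_events:
        if ev == "exit":                      # S39: termination instructed
            break
        if ev.startswith("slider:"):          # S31 -> S32/S33: slide bar
            actions.append(("transition", float(ev.split(":", 1)[1])))
        elif ev == "drag":                    # S34 -> S35/S36: viewpoint
            actions.append(("viewpoint",))
        elif ev == "play":                    # S37 -> S38: playback
            actions.append(("playback",))
    return actions
```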
 FIG. 14 is a flowchart illustrating the screen transition procedure in step S33. As shown in FIG. 14, the simulation device 200 executes steps S41, S42, S43, and S44. In step S41, the display unit 216 reads the simulation result corresponding to the display target time from the result holding unit 215. In step S42, the viewpoint setting unit 241 sets the viewpoint corresponding to the display target time, based on the change history stored in the viewpoint storage unit 242. In step S43, the display unit 216 generates a simulation image of the virtual space VS viewed from the set viewpoint, based on the read simulation result. In step S44, the synchronization unit 222 identifies the display position of the time index 263 based on the display target time.
 Next, the simulation device 200 executes step S45. In step S45, the event notification unit 226 confirms whether an event occurred at the display target time. If it is determined in step S45 that an event occurred, the simulation device 200 executes step S46. In step S46, the display unit 216 adds an event occurrence notification to the generated simulation image.
 Next, the simulation device 200 executes step S47. If it is determined in step S45 that no event occurred, the simulation device 200 executes step S47 without executing step S46. In step S47, the frame identification unit 233 confirms whether there is captured data captured at the display target time. If it is determined in step S47 that there is captured data, the simulation device 200 executes step S48. In step S48, the frame identification unit 233 identifies the frame captured in the real space at the display target time, based on the time associated with the captured data. The display unit 216 generates captured-data windows 271 and 272 for displaying the frame identified by the frame identification unit 233.
 Next, the simulation device 200 executes step S49. If it is determined in step S47 that there is no captured data, the simulation device 200 executes step S49 without executing step S48. In step S49, the display unit 216 updates the display of the simulation area 250 based on the generated simulation image, moves the time index 263 to the identified display position, and displays the generated captured-data windows 271 and 272. The screen transition procedure is thus completed.
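The screen transition procedure of steps S41 to S49 amounts to assembling, for one display target time, a simulation image, a viewpoint taken from the change history, and optional event and camera-frame overlays. The following sketch is illustrative only; the data structures and names are assumptions.

```python
import bisect

def viewpoint_at(history, t):
    # Step S42: pick the viewpoint recorded at or before time t from the
    # change history, stored as a sorted list of (time, viewpoint) pairs.
    times = [time for time, _ in history]
    i = bisect.bisect_right(times, t) - 1
    return history[i][1] if i >= 0 else None

def build_screen(t, results, history, events, frames):
    # Assemble everything step S49 puts on the monitor for target time t.
    screen = {"image": (results[t], viewpoint_at(history, t)),
              "indicator": t}                   # S44: time-index position
    if t in events:   # S45/S46: overlay an event-occurrence notification
        screen["notice"] = events[t]
    if t in frames:   # S47/S48: window for the frame captured at time t
        screen["frame"] = frames[t]
    return screen
```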
 FIG. 15 is a flowchart illustrating the playback procedure in step S38. As shown in FIG. 15, the simulation device 200 executes steps S51, S52, and S53. In step S51, the display unit 216 adds one period (for example, the communication period described above) to update the display target time. In step S52, the display unit 216 executes screen transition processing to the updated display target time. The procedure of step S52 is the same as the screen transition procedure described above. In step S53, the display unit 216 confirms whether the display target time is the end time of the target section.
 If it is determined in step S53 that the display target time is not the end time, the simulation device 200 executes step S54. In step S54, the viewpoint setting unit 241 confirms whether an operation to change the viewpoint has been performed. If it is determined in step S54 that an operation to change the viewpoint has been performed, the simulation device 200 executes step S55. In step S55, the viewpoint setting unit 241 changes the viewpoint with respect to the virtual space VS and causes the viewpoint storage unit 242 to store the viewpoint change history. The result of the viewpoint change is reflected in the display of the simulation area 250 in the next screen transition processing.
 Next, the simulation device 200 executes step S56. In step S56, the display unit 216 confirms whether a stop operation has been performed with the play button 262. If it is determined in step S56 that no stop operation has been performed, the simulation device 200 executes step S57. In step S57, the display unit 216 confirms whether the above-described one period has elapsed since the execution of step S51. If it is determined in step S57 that one period has not elapsed, the simulation device 200 returns the process to step S54. If it is determined in step S57 that one period has elapsed, the simulation device 200 returns the process to step S51.
 If it is determined in step S53 that the display target time is the end time of the target section, or if it is determined in step S56 that a stop operation has been performed, the simulation device 200 completes the playback procedure.
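The playback procedure of steps S51 to S57 can be outlined as a loop that advances the display target time by one period per cycle until the target section ends or a stop is requested. The function below is an illustrative sketch; its names and return value are assumptions made for explanation.

```python
def playback(start, end, period, stop_after=None):
    # Advance the display-target time by one period per cycle (step S51)
    # until the end of the target section (S53) or until a stop request
    # arrives (S56). Returns the sequence of displayed target times.
    t = start
    displayed = []
    while True:
        t += period            # S51: add one communication period
        displayed.append(t)    # S52: screen transition to the new time
        if t >= end:           # S53: end time of the target section
            break
        if stop_after is not None and t >= stop_after:  # S56: stop pressed
            break
    return displayed
```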
〔Summary〕
 The embodiments described above include the following configurations.
(1) A production system 1 comprising: a simulator 214 that operates a virtual production device V2 corresponding to a production device 2 in a virtual space VS, based on transition information representing the transition of control signals collected from a controller 3 that controls the production device 2 including a robot 5B and on model information of the production device 2; and a display unit 216 that displays on a monitor the virtual space VS in which the virtual production device V2 operates.
 Even if the real space is not photographed in its entirety, a phenomenon that occurred in the real space can be reproduced. The reproduction result can also be displayed from an arbitrary viewpoint. This is therefore effective for ex-post verification of system operation.
(2) The production system 1 according to (1), further comprising an extraction unit 219 that extracts a state change of the virtual space VS based on the operation of the virtual production device V2 in the virtual space VS, wherein the display unit 216 displays the extracted state change of the virtual space VS on the monitor.
 The transition of the control signals stored in the database can also be used effectively to reproduce state changes of peripheral objects. By including state changes of virtual peripheral objects in the display, phenomena that occurred in the real space can be verified even more easily.
(3) The production system 1 according to (2), wherein the extraction unit 219 extracts the state change of at least one of a plurality of models included in the virtual space VS, based on the positional relationship between the plurality of models.
 Basing the extraction on the positional relationship makes state changes easy to extract.
(4) The production system 1 according to (2) or (3), wherein the extraction unit 219 extracts the state change of at least one model included in the virtual space VS, based on the control signal and purpose information associated with the control signal.
 State changes of virtual peripheral objects can be extracted more accurately.
(5) The production system 1 according to (4), wherein the virtual production device V2 includes a virtual robot V5B having a virtual end effector V50 corresponding to the end effector 50 of the robot 5B, and the extraction unit 219 identifies the virtual peripheral object on which the virtual end effector V50 acts, based on the positional relationship between the one or more virtual peripheral objects included in the virtual space VS and the virtual end effector V50, and extracts the state change of the virtual peripheral object based on the control signal relating to the operation of the virtual end effector V50 and the purpose information.
 The calculation that changes the state of a virtual peripheral object according to the operation of the virtual production device V2 can be performed even more easily.
(6) The production system 1 according to any one of (1) to (5), further comprising: a graph generation unit 221 that generates a graph representing the transition of the control signal; and a synchronization unit 222 that synchronizes the time point in the operation of the virtual production device V2 displayed on the monitor with the time point indicated by a time-point index in the graph, wherein the display unit 216 displays the virtual space VS and the graph including the time-point index on the monitor.
 The value of the control signal in the graph can be instantly associated with the virtual space VS being displayed.
(7) The production system 1 according to (6), wherein the display unit 216, based on event information indicating the time point at which a specific event occurred, displays on the monitor an event index representing the event, in association with the position on the graph corresponding to the time point indicated by the event information.
 Making it possible to instantly grasp how the production device 2 was operating and what value the control signal had when an event occurred speeds up identification of the cause of the event.
(8) The production system 1 according to (6) or (7), wherein the display unit 216, based on event information indicating the time point at which a specific event occurred, displays on the monitor a notification indicating that the event occurred, in association with the operation of the virtual production device V2 corresponding to the time point indicated by the event information.
 How the production device 2 was operating when the event occurred can be recognized more quickly.
(9) The production system 1 according to any one of (6) to (8), further comprising a detection unit 223 that detects the occurrence of a specific event based on the control signal and generates event information about the event, wherein the display unit 216 displays on the monitor a notification indicating that the event occurred, based on the event information.
 Since the detection unit 223 detects events based on the control signal, convenience is improved.
(10) The production system 1 according to (9), wherein the robot 5B has an articulated arm 10, and the detection unit 223 detects, as the event, that an overload occurred in the robot 5B, based on the control signal.
 The cause of the overload can be recognized quickly.
(11) The production system 1 according to (10), wherein the production device 2 further includes a peripheral device whose movable range overlaps with that of the robot 5B.
 Since a collision between the peripheral device and the robot 5B can occur, a configuration that allows the cause of an overload to be recognized quickly is even more beneficial.
(12) The production system 1 according to any one of (6) to (11), wherein the display unit 216, based on captured data of the production device 2 captured in the real space, displays on the monitor a capture index representing the captured data, in association with the position on the graph corresponding to the time point at which the captured data was captured.
 The captured data of the real space can easily be associated with the control signal.
(13) The production system 1 according to any one of (1) to (12), further comprising a frame identification unit 233 that identifies, based on captured data of the production device 2 captured in the real space, the frame captured in the real space at the time point represented by the operation of the virtual production device V2, wherein the display unit 216 displays the virtual space VS and the identified frame on the monitor.
 Frames captured in the real space can easily be associated with the virtual space VS.
(14) The production system 1 according to any one of (1) to (13), further comprising a viewpoint setting unit 241 that sets a viewpoint with respect to the virtual space VS based on user input, wherein the display unit 216 displays the virtual space VS viewed from the set viewpoint.
 The calculation results of the simulator 214 can be used effectively for display from an arbitrary viewpoint.
(15) The production system 1 according to (14), further comprising a viewpoint storage unit 242 that stores the viewpoint change history, wherein the display unit 216 changes the viewpoint based on the viewpoint change history when displaying the operation of the virtual production device V2.
 The viewpoint change history can be reused, improving convenience.
(16) The production system 1 according to any one of (1) to (15), further comprising: an event identification unit 228 that identifies at least one event based on user input; and a section setting unit 227 that sets a section including the time point at which the event occurred, wherein the simulator 214 operates the virtual production device V2 based on the transition information in the section set by the section setting unit 227.
 Specifying an event to be verified sets the period to be simulated, which improves convenience.
(17) The production system 1 according to any one of (1) to (16), further comprising a confirmation unit 245 that confirms, based on the transition information and the model, whether the configuration of the production device 2 matches the configuration of the virtual production device V2.
 Inappropriate verification due to a mismatch can easily be prevented.
(18) The production system 1 according to any one of (1) to (17), wherein the production device 2 includes a plurality of machines 5 including the robot 5B, the controller 3 includes a plurality of local controllers 400 that respectively control the plurality of machines 5, and the simulator 214 operates the virtual production device V2 in the virtual space VS based on transition information representing the transitions of a plurality of types of control signals collected from the plurality of local controllers 400 and on the models of the plurality of machines 5.
 Basing the reproduction on the control signals of the plurality of machines 5 allows the operation of the production device 2 to be reproduced in detail.
(19) A reproduction method including: operating a virtual production device V2 corresponding to a production device 2 in a virtual space VS, based on transition information representing the transition of control signals collected from a controller 3 that controls the production device 2 including a robot 5B and on a model of the production device 2; and displaying on a monitor the virtual space VS in which the virtual production device V2 operates.
 Although embodiments have been described above, the present invention is not necessarily limited to the embodiments described above, and various modifications can be made without departing from the gist thereof.
 DESCRIPTION OF SYMBOLS: 1... production system; 2... production device; W... workpiece; 5... machine; 5B... robot; 10... articulated arm; 50... end effector; 3... controller; 400... local controller; 214... simulator; 216... display unit; 219... extraction unit; VW... virtual workpiece; VS... virtual space; V5B... virtual robot; V50... virtual end effector; 221... graph generation unit; 222... synchronization unit; 223... detection unit; 227... section setting unit; 228... event identification unit; 233... frame identification unit; 241... viewpoint setting unit; 242... viewpoint storage unit; 245... confirmation unit.

Claims (19)

  1.  A production system comprising:
     a simulator that operates a virtual production device corresponding to a production device in a virtual space, based on transition information representing a transition of control signals collected from a controller that controls the production device including a robot, and on model information of the production device; and
     a display unit that displays on a monitor the virtual space in which the virtual production device operates.
  2.  The production system according to claim 1, further comprising an extraction unit that extracts a state change of the virtual space based on an operation of the virtual production device in the virtual space,
     wherein the display unit displays the extracted state change of the virtual space on the monitor.
  3.  The production system according to claim 2, wherein the extraction unit extracts the state change of at least one of a plurality of models included in the virtual space, based on a positional relationship between the plurality of models.
  4.  The production system according to claim 2, wherein the extraction unit extracts a state change of at least one model included in the virtual space, based on the control signal and purpose information associated with the control signal.
  5.  The production system according to claim 4, wherein the virtual production device includes a virtual robot having a virtual end effector corresponding to an end effector of the robot, and
     the extraction unit:
      identifies a virtual peripheral object on which the virtual end effector acts, based on a positional relationship between one or more virtual peripheral objects included in the virtual space and the virtual end effector; and
      extracts a state change of the virtual peripheral object based on the control signal relating to an operation of the virtual end effector and the purpose information.
  6.  The production system according to any one of claims 1 to 5, further comprising:
     a graph generation unit that generates a graph representing the transition of the control signal; and
     a synchronization unit that synchronizes a time point in the operation of the virtual production device displayed on the monitor with a time point indicated by a time-point index in the graph,
     wherein the display unit displays the virtual space and the graph including the time-point index on the monitor.
  7.  The production system according to claim 6, wherein the display unit, based on event information indicating a time point at which a specific event occurred, displays on the monitor an event index representing the event, in association with a position on the graph corresponding to the time point indicated by the event information.
  8.  The production system according to claim 6, wherein the display unit, based on event information indicating a time point at which a specific event occurred, displays on the monitor a notification indicating that the event occurred, in association with an operation of the virtual production device corresponding to the time point indicated by the event information.
  9.  The production system according to claim 6, further comprising a detection unit that detects occurrence of a specific event based on the control signal and generates event information about the event,
     wherein the display unit displays on the monitor a notification indicating that the event occurred, based on the event information.
  10.  The production system according to claim 9, wherein the robot has an articulated arm, and
     the detection unit detects, as the event, that an overload occurred in the robot, based on the control signal.
  11.  The production system according to claim 10, wherein the production device further includes a peripheral device whose movable range overlaps with that of the robot.
  12.  The production system according to claim 6, wherein the display unit, based on captured data of the production device captured in real space, displays on the monitor a capture index representing the captured data, in association with a position on the graph corresponding to a time point at which the captured data was captured.
  13.  The production system according to any one of claims 1 to 5, further comprising a frame identification unit that identifies, based on captured data of the production device captured in real space, a frame captured in the real space at a time point represented by the operation of the virtual production device,
     wherein the display unit displays the virtual space and the identified frame on the monitor.
  14.  The production system according to any one of claims 1 to 5, further comprising a viewpoint setting unit that sets a viewpoint with respect to the virtual space based on user input,
     wherein the display unit displays the virtual space viewed from the set viewpoint.
  15.  The production system according to claim 14, further comprising a viewpoint storage unit that stores a change history of the viewpoint,
     wherein the display unit changes the viewpoint based on the change history of the viewpoint when displaying the operation of the virtual production device.
  16.  The production system further comprises:
     an event identification unit that identifies at least one event based on user input; and
     an interval setting unit that sets an interval including the point in time at which the event occurred,
     and the simulator operates the virtual production apparatus based on the transition information in the interval set by the interval setting unit,
    The production system according to any one of claims 1 to 5.
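Claim 16's interval setting unit brackets a user-identified event with a replay window, and the simulator then drives the virtual device only from the transition samples inside that window. A minimal sketch under those assumptions (the function names and the pre/post margins are illustrative, not from the publication):

```python
def set_interval(event_time, pre=1.0, post=1.0):
    """Set an interval (seconds) that includes the time the event occurred."""
    return (event_time - pre, event_time + post)

def select_transitions(transition_log, interval):
    """Keep only the (timestamp, signals) samples inside the interval."""
    start, end = interval
    return [(t, signals) for t, signals in transition_log if start <= t <= end]
```

Replaying only the selected samples lets an operator review the moments around an event without re-running the whole production log.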
  17.  The production system further comprises a confirmation unit that confirms, based on the transition information and the model, whether the configuration of the production apparatus matches the configuration of the virtual production apparatus,
    The production system according to any one of claims 1 to 5.
  18.  The production apparatus includes a plurality of machines including the robot,
     the controller includes a plurality of local controllers that respectively control the plurality of machines,
     and the simulator operates the virtual production apparatus in the virtual space based on the transition information, which represents transitions of a plurality of types of control signals collected from the plurality of local controllers, and on models of the plurality of machines,
    The production system according to any one of claims 1 to 5.
  19.  A reproduction method comprising:
     operating a virtual production apparatus corresponding to a production apparatus in a virtual space, based on transition information representing transitions of control signals collected from a controller that controls the production apparatus including a robot, and on a model of the production apparatus; and
     displaying on a monitor the virtual space in which the virtual production apparatus operates.
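Claim 19's reproduction method can be pictured as replaying the logged control-signal transitions through a model of the device and rendering the result. A minimal sketch, assuming the transition information is a time-ordered list of (timestamp, command) samples; `VirtualDevice`, `reproduce`, and the joint-angle model are illustrative stand-ins, not the publication's implementation:

```python
class VirtualDevice:
    """Toy kinematic model standing in for the virtual production device."""
    def __init__(self, n_joints):
        self.joint_angles = [0.0] * n_joints

    def apply(self, command):
        # command: joint angles recorded from the real controller
        self.joint_angles = list(command)

def reproduce(transition_info, model, render=print):
    """Drive the virtual device through the recorded control-signal transitions.

    transition_info: (timestamp, command) samples collected from the controller.
    render: callback that draws the virtual space at each step.
    """
    for timestamp, command in transition_info:
        model.apply(command)
        render(timestamp, model.joint_angles)
    return model
```

Because the replay is driven by the same signals the real controller produced, the virtual space reproduces what the physical cell actually did rather than what the program intended.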
PCT/JP2023/008145 2022-03-03 2023-03-03 Production system and reproduction method WO2023167324A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202263315982P 2022-03-03 2022-03-03
US63/315,982 2022-03-03

Publications (1)

Publication Number Publication Date
WO2023167324A1 true WO2023167324A1 (en) 2023-09-07

Family

ID=87883811

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/008145 WO2023167324A1 (en) 2022-03-03 2023-03-03 Production system and reproduction method

Country Status (1)

Country Link
WO (1) WO2023167324A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0619922A (en) * 1990-03-24 1994-01-28 Reflex Mfg Syst Ltd Network field interface for production system
JP2011039731A (en) * 2009-08-10 2011-02-24 Takenaka Komuten Co Ltd Virtual space use type apparatus control system, real space control system and virtual space use type apparatus control program
JP2021056900A (en) * 2019-09-30 2021-04-08 キヤノン株式会社 Image processor, image processing method, and program
JP2021180507A (en) * 2019-05-13 2021-11-18 株式会社安川電機 Machine control system, program, machine, and communication method


Similar Documents

Publication Publication Date Title
CN108161904B (en) Robot on-line teaching device based on augmented reality, system, method, equipment
JP3708083B2 (en) Robot teaching device
US20200101599A1 (en) Robot controller and display device using augmented reality and mixed reality
US20160303737A1 (en) Method and apparatus for robot path teaching
EP3579100A1 (en) Apparatus and method for skill-based robot programming
JP2020075354A (en) External input device, robot system, control method for robot system, control program, and recording medium
JP2004351570A (en) Robot system
US11826913B2 (en) Control system, robot system and control method
JP2003117863A (en) Robot simulation device
JP2018167334A (en) Teaching device and teaching method
KR102525831B1 (en) Control system, controller and control method
US20230107431A1 (en) Comparison between real control and virtual control of robot
WO2020054281A1 (en) Robot system, robot system control device, robot system control method, imaging device, control program, and recording medium
JP2022009322A (en) Information collection device, information collection method and program for production system
CN113874177A (en) Robot system, recovery program generation device, control support device, control device, program, recovery program generation method, and recovery program output method
WO2023167324A1 (en) Production system and reproduction method
JP2019188545A (en) Robot control device
JP7374867B2 (en) Control system, local controller and control method
CN114161424B (en) Control method and control system for dynamic braking of SCARA robot
JP6825134B2 (en) Control device, work work device, work work system and control method
CN111899629B (en) Flexible robot teaching system and method
JP2577003B2 (en) Robot control method
Ohsato et al. Robot Tele-Operation/Teaching System using VR Device
WO2022091306A1 (en) Data collection device, program, and data collection method
Emrick Shared Autonomous System for Robot-Assisted Sewing

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23763581

Country of ref document: EP

Kind code of ref document: A1