WO2022190545A1 - Safety verification device, safety verification method, and program - Google Patents
- Publication number
- WO2022190545A1 (application PCT/JP2021/047127)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- dimensional
- work area
- robot
- safety verification
- dimensional model
- Prior art date
Classifications
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J19/00—Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
- B25J19/06—Safety devices
Definitions
- the present invention relates to a safety verification device, a safety verification method, and a program.
- in the FA (factory automation) field, computer simulation is beginning to be used for teaching and verifying the operation of industrial robots. For example, by arranging three-dimensional models of the robot and its peripheral elements (workbench, workpieces, other machines, safety fences, etc.) in a virtual space and running a motion simulation, the robot's movements and its timing with other machines can be verified, and it can be confirmed whether there are any safety issues.
- Patent Document 1 discloses an interference determination device that performs a simulation using three-dimensional models of a robot, a workpiece, and peripheral objects (such as trays), and determines interference with the workpiece and the peripheral objects while the robot is moving.
- the present invention has been devised in view of the above circumstances, and aims to provide a technique for easily verifying whether the actual environment has been constructed according to the design assumed in the simulation.
- the present invention adopts the following configuration.
- a first aspect of the present invention is a safety verification device comprising: real environment information acquisition means for acquiring three-dimensional space information representing a work area of a robot; virtual environment information acquisition means for acquiring a three-dimensional model of a virtual space in which the robot and its peripheral elements are arranged; comparison means for comparing, using the acquired three-dimensional model and the three-dimensional space information of the work area, the configurations of at least one of the robot and the peripheral elements between the virtual space of the simulation and the actual work area; and output means for outputting the result of the comparison by the comparison means.
- 3D space information is information that represents the actual environment (robot's work area).
- three-dimensional space information may be obtained by measuring the actual environment (work area of the robot) with a three-dimensional sensor.
- the three-dimensional space information includes, for example, three-dimensional shape/position information of the robot and its peripheral elements.
- a three-dimensional model is information representing a virtual environment rather than a measurement of the real environment.
- the three-dimensional model may be data that the simulator uses for simulation.
- the safety verification device compares the two and confirms whether or not there is a difference in the configuration of the robot and peripheral elements.
- the output means may output a comparison screen between the three-dimensional model and the three-dimensional space information of the work area to a display device. This allows easy and visual confirmation of the match/difference between the virtual space of the simulation and the actual work area.
- the output means may superimpose the three-dimensional model and the three-dimensional space information on the comparison screen.
- Such a comparison screen makes it easy to compare the shape/arrangement assumed in the simulation with the actual shape/arrangement.
- the output means may output the difference portion between the three-dimensional model and the three-dimensional space information on the comparison screen in a manner identifiable from other portions. With such a comparison screen, it is possible to easily and intuitively grasp the difference between the shape/arrangement assumed in the simulation and the actual shape/arrangement.
- the output means may output, on the comparison screen, a first difference portion that exists in the three-dimensional model but does not exist in the three-dimensional space information, and a second difference portion that does not exist in the three-dimensional model but exists in the three-dimensional space information, in mutually distinguishable manners.
- further, the three-dimensional model for simulation may be modified so as to match the shapes and positions of the robot and peripheral elements in the actual work area. In that case, a highly accurate simulation can easily be realized under the same conditions as the actual work area.
- a monitoring system having a three-dimensional sensor may be installed to monitor the work area of the robot, and the real environment information acquisition means may acquire the three-dimensional space information of the work area from the monitoring system.
- by using the three-dimensional sensor of the monitoring system, it is not necessary to prepare a dedicated three-dimensional sensor for the safety verification device, so costs can be reduced and convenience improved.
- the comparison means may perform coordinate matching between the three-dimensional model and the three-dimensional space information of the work area based on the position of the three-dimensional sensor of the monitoring system. Since the three-dimensional spatial information is generated from the measurement results of the three-dimensional sensor, it is easy to express it in a coordinate system based on the position of the three-dimensional sensor. Therefore, it is possible to easily match the coordinates of the virtual space (three-dimensional model) and the real space (work area).
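- as a hedged illustration of this coordinate matching (the patent does not specify an algorithm), the sketch below assumes the sensor's installation pose is known as a rotation matrix and a position vector, and expresses world-frame model points in the sensor's camera frame so they can be compared directly with the measured point cloud; all names are illustrative.

```python
import numpy as np

def world_to_camera(rotation, position):
    """Build a 4x4 transform taking world-frame points into the camera frame,
    given the sensor's known installation rotation (3x3) and position (3-vector).
    A camera-frame point c relates to a world point w by w = R @ c + p,
    so c = R.T @ (w - p)."""
    T = np.eye(4)
    T[:3, :3] = rotation.T
    T[:3, 3] = -rotation.T @ position
    return T

def transform_points(points, T):
    """Apply a 4x4 homogeneous transform to an Nx3 array of points."""
    homogeneous = np.hstack([points, np.ones((len(points), 1))])
    return (homogeneous @ T.T)[:, :3]
```

for example, with an identity rotation and a sensor placed at (1, 2, 3), a world point at (2, 2, 3) becomes (1, 0, 0) in the camera frame, i.e. one metre in front of the sensor along its x axis.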
- the comparison means may have a user interface for correcting the three-dimensional space information of the work area before comparison with the three-dimensional model. This makes it possible to remove noise and compensate for missing measurements in the sensor data, improving the accuracy of the comparison.
- a second aspect of the present invention is a safety verification method comprising: a step of acquiring three-dimensional space information representing a work area of a robot; a step of acquiring a three-dimensional model of a virtual space in which the robot and its peripheral elements are arranged; a step of comparing, using the acquired three-dimensional model and the three-dimensional space information of the work area, the configurations of the robot and/or the peripheral elements between the virtual space of the simulation and the actual work area; and a step of outputting the result of the comparing step.
- a third aspect of the present invention provides a program for causing a computer to execute each step of the above safety verification method.
- the present invention may be regarded as a safety verification device or a real environment confirmation device having at least part of the above means.
- the present invention also provides a system comprising a safety verification device and a three-dimensional sensor, a system comprising a safety verification device and a monitoring system, a system comprising a safety verification device and a simulator, or a system comprising a safety verification device, a simulator and a monitoring system.
- the present invention may also be regarded as a safety verification method including at least part of the above processing, as a control method for a safety verification device, as a program for realizing such a method, or as a non-transitory recording medium on which such a program is recorded. It should be noted that the above means and processes can be combined with one another wherever possible to constitute the present invention.
- FIG. 1 is a diagram schematically showing a configuration example of a simulation system including a safety verification device according to one embodiment of the present invention.
- FIG. 2 is a block diagram showing a configuration example of a simulation system.
- FIG. 3 is a flow chart showing the flow of safety verification processing by the safety verification device.
- FIG. 4 is a diagram showing an example of a simulation (virtual environment) and a work area (real environment) in a factory.
- FIG. 5 is a diagram showing an example of a comparison screen.
- FIG. 6 is a diagram showing an example of a comparison screen.
- FIG. 7 is a diagram showing an example of a user interface for correcting three-dimensional spatial information.
- FIG. 1 schematically shows a configuration example of a simulation system 1 including a safety verification device according to one embodiment of the present invention.
- This simulation system 1 is a system for performing teaching (control program development), simulation, safety verification, etc. of an industrial robot (hereinafter referred to as "robot") 21.
- the simulation system 1 includes a safety verification device 10 and a simulator 11.
- the simulation system 1 can communicate with a robot controller 20 for controlling the robot 21 and a monitoring system 30 for monitoring the work area of the robot 21 via a network.
- a user who maintains or manages the robot 21 may use the simulation system 1 to remotely access the robot controller 20 and the monitoring system 30 installed in the factory via the Internet.
- the simulator 11 is an integrated development tool that has a control program development function for the robot 21 and a simulation function using three-dimensional models. That is, using the simulator 11, the user can create a control program for the robot controller 20, execute the control program on an emulator, confirm the motion and arrangement of the robot and its peripheral elements in the virtual space, debug the program, verify operations, verify safety, and so on.
- the created control program can be installed in the robot controller 20 via the network.
- teaching and operation verification can be performed without an actual machine (robot 21 or robot controller 20). Therefore, there are advantages such as the ability to significantly reduce the startup time when equipment is introduced into a factory, and the ability to perform teaching and maintenance by remote control.
- the simulator 11 runs the simulation in a virtual environment in which three-dimensional models (e.g., CAD data) of the robot 21 and its peripheral elements (e.g., workbench, workpieces, tools, parts, other machines, safety fences, etc.) are arranged in a virtual space. If this virtual environment sufficiently matches the real environment of the factory, the validity and reliability of the simulation results are guaranteed. However, in reality, some differences may exist between the virtual environment and the real environment. The existence of critical differences must not be overlooked, but it is not easy for humans to discover every difference between the virtual environment and the real environment.
- the safety verification device 10 provides a function to compare the virtual environment used by the simulator 11 with the real environment in the factory and determine whether there is a difference. Specifically, the work area of the robot 21 is measured with the three-dimensional sensor 31, and three-dimensional space information of the work area is generated and provided to the safety verification device 10 as real environment information. The safety verification device 10 compares this real environment information with the three-dimensional model (virtual environment information) obtained from the simulator 11, and determines whether the configurations of the robot and its peripheral elements (that is, their three-dimensional shapes, positions, sizes, etc.) differ. It then outputs the result of the comparison between the real environment and the virtual environment.
- if a difference is found, a notification may be output to notify the user of the problem.
- if no difference is found, a notification to the effect that the safety as simulated is ensured may be output.
- the safety verification device 10 may make necessary corrections to the simulation information and the control program.
- the user can easily verify whether or not the actual environment is constructed according to the design assumed in the simulation. That is, when there is no difference or the difference is negligible, the operation and safety as designed can be guaranteed. Conversely, if it is found that there is a non-negligible difference, it can lead to the discovery of installation errors, the review of safety measures, the necessity of off-line teaching and re-simulation, and the like.
- FIG. 2 is a block diagram showing a configuration example of the simulation system 1 according to this embodiment.
- the simulation system 1 has a safety verification device 10, a simulator 11, a display device 12, and an input device 13 as main components.
- the safety verification device 10 has a real environment information acquisition unit 100 , a virtual environment information acquisition unit 101 , a comparison unit 102 , an output unit 103 and a simulation data update unit 104 .
- the simulator 11 has a simulation data storage unit 110 that stores simulation data.
- the real environment information acquisition unit 100 acquires 3D space information (sensor data) of the work area measured by the 3D sensor 31 .
- this three-dimensional space information includes information such as the three-dimensional shapes, positions, and sizes of the robot 21 installed in the work area and its surrounding elements (for example, workbench, workpieces, tools, parts, other machines, safety fences, etc.).
- in this embodiment, point cloud data is used as an example of the data format of the three-dimensional space information.
- the virtual environment information acquisition unit 101 acquires a three-dimensional model (simulation data) used for simulation from the simulator 11 .
- This three-dimensional model includes a three-dimensional model of the robot and its surrounding elements.
- the specific data format of the three-dimensional model does not matter; for example, data generated from three-dimensional CAD design information may be used.
- the comparison unit 102 compares the 3D space information (sensor data) of the work area with the 3D model (simulation data) used for the simulation. Then, the comparison unit 102 determines whether or not the configurations of the robot and/or peripheral elements are different between the virtual space of the simulation and the actual work area.
- the output unit 103 outputs the comparison results and determination results obtained by the comparison unit 102 .
- the output destination of the information is the display device 12 provided in the simulation system 1, an external device (for example, another computer, PLC, etc.), and the like.
- the simulation data update unit 104 is a function that updates the 3D model according to the 3D space information (sensor data) based on the comparison results and determination results from the comparison unit 102 .
- the updated three-dimensional model is stored in the simulation data storage unit 110 and used for re-simulation.
- the simulation data storage unit 110 stores data groups used by the simulator 11 for simulation.
- the simulation data may include, for example, a three-dimensional model of the robot and its peripheral elements, a control program for the robot controller, various condition settings during execution of the simulation, and the like.
- the simulator 11 can communicate with the robot controller 20, write a control program to the robot controller 20, and acquire information on the robot controller 20 side.
- the display device 12 is a device for displaying information, and uses, for example, a liquid crystal display or an organic EL display.
- the input device 13 is a device for inputting information, and uses, for example, a keyboard, a mouse, a touch panel, and the like.
- a touch panel display that serves as both the display device 12 and the input device 13 may be used.
- the simulation system 1 may be composed of a personal computer equipped with a processor, memory, storage, etc., for example.
- the processor loads the program stored in the storage into the memory and executes the program to provide the functions of the units shown in FIG.
- the configuration is not limited to this configuration, and some or all of the functions in FIG. 2 may be configured by ASIC, FPGA, or the like, or may be realized by cloud computing or distributed computing.
- the three-dimensional sensor 31 is a device that measures three-dimensional spatial information. Any type of sensor may be used as long as it can measure the three-dimensional shape, position, and size of an object existing within the work area, and either an active or a passive system may be used, based on any principle such as a TOF camera, a stereo camera, structured illumination, or LiDAR. In this embodiment, a TOF sensor (TOF camera) that obtains a range image from the time of flight (TOF) of infrared light is used. Note that TOF methods include a direct type and an indirect type, and a TOF sensor of either type may be used.
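- as an illustrative sketch (not taken from the patent), a range image from such a TOF camera can be back-projected into the point cloud used as three-dimensional space information with a standard pinhole camera model; the intrinsics fx, fy, cx, cy are assumed to come from sensor calibration.

```python
import numpy as np

def depth_to_point_cloud(depth, fx, fy, cx, cy):
    """Back-project an HxW depth image (distance along the optical axis,
    in metres) into an Nx3 camera-frame point cloud via the pinhole model.
    Pixels with zero depth (no return) are dropped."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    x = (u - cx) * depth / fx      # lateral offset from pixel column
    y = (v - cy) * depth / fy      # lateral offset from pixel row
    points = np.stack([x, y, depth], axis=-1).reshape(-1, 3)
    return points[points[:, 2] > 0]
```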
- the three-dimensional sensor 31 of the monitoring system 30 installed in the work area of the robot 21 is used.
- the reason is that by using the three-dimensional sensor 31 of the monitoring system 30, which is an existing facility, there is no need to prepare a dedicated three-dimensional sensor for the safety verification device 10, which reduces costs and improves convenience. In addition, since the three-dimensional sensor 31 of the monitoring system 30 is fixed at an optimum position from which the work area of the robot 21 can be measured, sensor data covering a range that is necessary and sufficient for safety verification (comparison with the virtual environment information) can be reliably obtained.
- furthermore, since the installation position of the three-dimensional sensor 31 (its relative positional relationship with respect to the work area and the robot 21) is known, there is the advantage that coordinate matching with the virtual environment information, described later, is easy to perform.
- in this embodiment, the three-dimensional sensor 31 of the monitoring system 30 is used, but of course a three-dimensional sensor unrelated to the monitoring system 30, or an unfixed three-dimensional sensor, may be used instead. In that case, the work area may be measured from multiple different viewpoints and the measurement results integrated to generate the three-dimensional spatial information. A fixed three-dimensional sensor 31 such as that shown in FIG. 1 inevitably has blind spots (areas that cannot be measured), whereas measurement from multiple viewpoints can be expected to yield three-dimensional spatial information with fewer blind spots.
- FIG. 3 is a flow chart showing the flow of processing by the safety verification device.
- this safety verification is a process of comparing the virtual environment used in the simulation with the actual environment in the factory, checking whether there are any differences between the two, and thereby verifying whether the safety designed during the simulation (during teaching) is also guaranteed on site. It is therefore assumed that, before the process of FIG. 3 is performed, (1) a simulation has been performed by the simulator 11 and the simulation data used at that time has been stored in the simulation data storage unit 110, and (2) the robot and its peripheral elements have been installed in the factory.
- in step S100, the real environment information acquisition unit 100 acquires the three-dimensional space information of the factory work area from the monitoring system 30.
- the real environment information acquisition unit 100 may directly receive data from the monitoring system 30 online, or may read the data of the three-dimensional space information offline or indirectly.
- in step S101, the virtual environment information acquisition unit 101 acquires the three-dimensional model used for the simulation from the simulator 11.
- in step S102, the comparison unit 102 performs coordinate matching between the three-dimensional space information acquired in step S100 and the three-dimensional model acquired in step S101.
- the comparison unit 102 preferably performs coordinate matching based on the position (more specifically, the viewpoint) of the three-dimensional sensor 31 of the monitoring system 30 .
- the measurement results (raw data) of the three-dimensional sensor 31 are generally expressed in a coordinate system (also called camera coordinate system) based on the position (viewpoint) of the three-dimensional sensor 31 .
- in step S103, the comparison unit 102 compares the three-dimensional space information with the three-dimensional model.
- as described above, the three-dimensional space information is provided as point cloud data, for example, while the three-dimensional model is provided as so-called vector data such as CAD data.
- the comparison unit 102 preferably outputs, for each three-dimensional model or each point cloud cluster, one of the following determination results: (1) the configurations match; (2) an object that exists in the three-dimensional model does not exist in the three-dimensional space information; (3) an object that does not exist in the three-dimensional model is included in the three-dimensional space information.
- determination result (3) means that an object not assumed in the simulation exists in the real environment. For example, if the shape of the robot itself differs, or if an object protrudes into the robot's flow line, there is a risk of unintended interference (collision), so this determination result may also indicate a critical problem.
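- one way to produce determination results in the spirit of (2) and (3) — offered as a sketch, not as the patent's actual algorithm — is a mutual nearest-neighbour test: points sampled from the three-dimensional model with no nearby measured point suggest result (2), while measured points far from every model point suggest result (3). The tolerance value is an assumption.

```python
import numpy as np

def nearest_distances(a, b):
    """For each point in a (Nx3), distance to its nearest neighbour in b (Mx3).
    Brute force; fine for small clouds, use a k-d tree for large ones."""
    diff = a[:, None, :] - b[None, :, :]
    return np.sqrt((diff ** 2).sum(axis=-1)).min(axis=1)

def compare_clouds(model_pts, sensor_pts, tol=0.02):
    """Split the clouds into difference portions:
    'missing' ~ determination result (2): in the 3D model, absent in reality;
    'extra'   ~ determination result (3): measured, but not in the 3D model."""
    missing = model_pts[nearest_distances(model_pts, sensor_pts) > tol]
    extra = sensor_pts[nearest_distances(sensor_pts, model_pts) > tol]
    return missing, extra
```

with a suitable tolerance, an empty `missing` and empty `extra` correspond to determination result (1), a match.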
- in step S104, the output unit 103 outputs the comparison result and determination result from the comparison unit 102.
- the user can see the output contents of the output unit 103 to determine whether there is a difference between the real environment and the virtual environment, and if there is a difference, whether it is a critical problem. Then, if necessary, improvements can be made, such as supplementing safety measures that were lacking in the real environment or eliminating objects that may interfere.
- FIG. 4 shows an example of a simulation (virtual environment) and a work area (actual environment) in a factory.
- in the virtual environment, a robot 21v and a workpiece 41v are placed on a workbench 40v, and a safety fence 44v is provided on the left side of the workbench 40v.
- reference numeral 31v indicates the three-dimensional model of the three-dimensional sensor.
- in the real environment, the robot 21 and the workpiece 41 are placed on the workbench 40, the same as in the virtual environment.
- however, a tool 42 is placed in front of the robot 21, and no safety fence is provided on the left side of the workbench 40.
- FIG. 5 is an example of a comparison screen displayed on the display device 12.
- the left side of the screen is the display of simulation data (three-dimensional model), and the right side is the display of sensor data (three-dimensional spatial information).
- on the simulation data (three-dimensional model) side, the safety fence 44v is highlighted. This is the difference corresponding to determination result (2).
- on the sensor data (three-dimensional space information) side, the part corresponding to the tool 42 is highlighted. This is the difference corresponding to determination result (3).
- Fig. 6 is another example of the comparison screen.
- while the three-dimensional model and the three-dimensional space information are displayed side by side on the comparison screen of FIG. 5, they are displayed superimposed on the comparison screen of FIG. 6.
- Such a display makes it easier to compare the shape/arrangement assumed in the simulation with the actual shape/arrangement.
- the difference portion corresponding to determination result (2) (the first difference portion) and the difference portion corresponding to determination result (3) (the second difference portion) are displayed in mutually different manners. Since the response to be taken may differ between determination result (2) and determination result (3), notifying the two separately in this way improves convenience for the user.
- the simulation data updating unit 104 may update the three-dimensional model in accordance with the three-dimensional space information (sensor data). This update process is executed after step S103 in FIG. 3, for example.
- in the example of FIG. 4, the simulation data updating unit 104 may generate a three-dimensional model corresponding to the tool 42 that exists in the real environment and add it to the simulation data. The simulator 11 can then check whether the robot 21 and the tool 42 will interfere by performing a re-simulation using the updated simulation data.
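- a minimal way to realize this update — a sketch under the assumption that a simple box proxy is acceptable for interference checks, since the patent does not prescribe a reconstruction method — is to wrap the measured point cluster for the unmodelled object in a padded axis-aligned bounding box and add that box to the simulation data.

```python
import numpy as np

def cluster_to_bounding_box(points, margin=0.01):
    """Approximate an unmodelled object (e.g. the point cluster measured
    for a tool) by an axis-aligned bounding box, padded by `margin` metres
    so interference checks err on the safe side.
    Returns (min_corner, max_corner) as 3-vectors."""
    points = np.asarray(points, dtype=float)
    return points.min(axis=0) - margin, points.max(axis=0) + margin
```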
- FIG. 7 is an example of a user interface for correcting three-dimensional spatial information provided by the comparison unit 102.
- in the upper part of FIG. 7, three-dimensional space information (point cloud data) and a three-dimensional model (broken lines) are superimposed on the screen.
- when the user operates the mouse pointer to select a region and presses the "Delete" button, the point cloud within the region is deleted. When a region is selected and the "Add" button is pressed, the point cloud within the region is interpolated. By performing such manual corrections, well-ordered three-dimensional spatial information, as shown in the lower part of FIG. 7, can be obtained.
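- the "Delete" operation can be sketched as removing the points inside the user-selected region; the axis-aligned region and the function name below are illustrative assumptions, since the patent only describes the UI behaviour.

```python
import numpy as np

def delete_in_region(points, lo, hi):
    """Remove points lying inside the axis-aligned region [lo, hi]
    (the user's selection), as when pressing the "Delete" button
    to clean noise out of the measured cloud."""
    inside = np.all((points >= lo) & (points <= hi), axis=1)
    return points[~inside]
```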
- the above-described embodiment is merely an example of the configuration of the present invention.
- the present invention is not limited to the specific forms described above, and various modifications are possible within the technical scope of the present invention.
- the comparison screens and highlighting shown in FIGS. 5 and 6 are merely examples, and any form of display may be used as long as the purpose can be achieved.
- in the above embodiment, the safety verification device and the simulator are provided in the same system, but they may be separate devices. Alternatively, the safety verification device and the monitoring system may be provided within the same system.
- [1] a safety verification device (10) characterized by comprising: real environment information acquisition means (100) for acquiring three-dimensional space information representing the work area of a robot (21); virtual environment information acquisition means (101) for acquiring a three-dimensional model (21v, 40v, 41v, 44v) of a virtual space in which the robot and its peripheral elements are arranged; comparison means (102) for comparing, using the acquired three-dimensional model and the three-dimensional space information of the work area, the configurations of at least one of the robot and the peripheral elements between the virtual space of the simulation and the actual work area; and output means (103) for outputting the result of the comparison by the comparison means.
- [2] a safety verification method characterized by having: a step (S100) of acquiring three-dimensional space information representing the work area of a robot; a step (S101) of acquiring a three-dimensional model of a virtual space in which the robot and its peripheral elements are arranged; a step of comparing, using the acquired three-dimensional model and the three-dimensional space information of the work area, the configurations of at least one of the robot and the peripheral elements between the virtual space of the simulation and the actual work area; and a step of outputting the result of the comparing step.
- [4] a safety verification method characterized by having: a step (S100) of acquiring three-dimensional spatial information of the work area obtained by measuring the work area of a robot with a three-dimensional sensor; a step (S101) of acquiring, from a simulator that arranges three-dimensional models of the robot and its peripheral elements in a virtual space and simulates the motion of the robot in the virtual space, the three-dimensional model used for the simulation; a step (S102) of comparing the acquired three-dimensional model with the three-dimensional space information of the work area and determining whether or not there is a difference in the configuration of the robot and/or the peripheral elements between the virtual space of the simulation and the actual work area; and a step (S103) of outputting a notification when there is a difference in the configuration of the robot and/or the peripheral elements between the virtual space of the simulation and the actual work area.
- 1: Simulation system, 10: Safety verification device, 11: Simulator, 12: Display device, 13: Input device, 20: Robot controller, 21: Robot, 21v: Three-dimensional model of robot, 30: Monitoring system, 31: Three-dimensional sensor, 31v: Three-dimensional model of three-dimensional sensor, 40: Workbench, 40v: Three-dimensional model of workbench, 41: Workpiece, 41v: Three-dimensional model of workpiece, 42: Tool, 44v: Three-dimensional model of safety fence
Abstract
This safety verification device has a real environment information acquisition means for acquiring three-dimensional space information representing a work area of a robot, a virtual environment information acquisition means for acquiring a three-dimensional model of a virtual space in which the robot and an element on the periphery thereof are each arranged, a comparison means for using the acquired three-dimensional model and the three-dimensional space information of the work area to compare the configuration of the robot and/or the peripheral element between the virtual space of a simulation and the actual work area, and an output means for outputting the result of comparison by the comparison means.
Description
The present invention relates to a safety verification device, a safety verification method, and a program.
In the FA (factory automation) field, computer simulation is beginning to be used for teaching and verifying the operation of industrial robots. For example, by arranging three-dimensional models of the robot and its peripheral elements (workbench, workpieces, other machines, safety fences, etc.) in a virtual space and running a motion simulation, the robot's movements and its timing with other machines can be verified, and it can be confirmed whether there are any safety issues.
Patent Document 1 discloses an interference determination device that runs a simulation using three-dimensional models of a robot, a workpiece, and peripheral objects (such as trays), and determines whether the moving robot interferes with the workpiece or the peripheral objects.
If motion and safety can be confirmed in advance by simulation using three-dimensional models, the start-up time when introducing the robot (the actual machine) into a factory can be reduced, which is a great advantage. In reality, however, some discrepancy may exist between the virtual environment used in the simulation and the real environment in the factory, and such a discrepancy can become a risk factor. For example, if a safety fence was placed around the robot in the simulation but the fence was forgotten or misplaced in the real environment, the safety guaranteed by the simulation becomes meaningless. There is also a risk of unexpected interference when the shape of the robot's three-dimensional model used in the simulation differs from that of the actual machine. The existence of a critical discrepancy must not be overlooked, yet it is not easy for a human to find every difference between the virtual environment and the real environment.
The present invention has been made in view of the above circumstances, and an object thereof is to provide a technique for easily verifying whether the real environment has been constructed according to the design assumed in the simulation.
To achieve the above object, the present invention adopts the following configurations.
A first aspect of the present invention provides a safety verification device comprising: real environment information acquisition means for acquiring three-dimensional spatial information representing a work area of a robot; virtual environment information acquisition means for acquiring a three-dimensional model of a virtual space in which the robot and its peripheral elements are arranged; comparison means for comparing, using the acquired three-dimensional model and the three-dimensional spatial information of the work area, the configuration of at least one of the robot and the peripheral elements between the virtual space of a simulation and the actual work area; and output means for outputting a result of the comparison by the comparison means.
The three-dimensional spatial information is information representing the real environment (the robot's work area). For example, it may be obtained by measuring the real environment with a three-dimensional sensor, and it includes, for example, the three-dimensional shapes and positions of the robot and its peripheral elements. The three-dimensional model, on the other hand, is not a measurement of the real environment but information representing the virtual environment; for example, it may be the data that the simulator uses for the simulation. The safety verification device compares the two and checks whether there is any difference in the configuration of the robot or the peripheral elements.
This allows the user to easily verify whether the real environment has been constructed according to the design assumed in the simulation. If there is no difference, or only a negligible one, the operation and safety as designed can be guaranteed. Conversely, if a non-negligible difference is found, this can lead to the discovery of installation errors, a review of safety measures, and a decision on whether offline teaching or re-simulation is necessary.
The output means may output, to a display device, a comparison screen of the three-dimensional model and the three-dimensional spatial information of the work area. This makes it possible to check matches and differences between the virtual space of the simulation and the actual work area easily and visually.
The output means may superimpose the three-dimensional model and the three-dimensional spatial information on the comparison screen. Such a comparison screen makes it easy to compare the shapes and arrangement assumed in the simulation with the actual shapes and arrangement.
The output means may output, on the comparison screen, portions where the three-dimensional model and the three-dimensional spatial information differ in a manner that distinguishes them from the other portions. Such a comparison screen makes it possible to grasp, easily and intuitively, the differences between the shapes and arrangement assumed in the simulation and the actual shapes and arrangement.
The output means may output, on the comparison screen, a first difference portion that exists in the three-dimensional model but not in the three-dimensional spatial information, and a second difference portion that does not exist in the three-dimensional model but exists in the three-dimensional spatial information, in mutually different manners. This is because the appropriate response may differ between the case where something assumed in the simulation is actually absent and, conversely, the case where something not assumed in the simulation is actually present.
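As a purely illustrative sketch of how these two kinds of difference portions could be separated (the patent does not prescribe a particular algorithm; the voxel-set approach and all names below are assumptions for illustration):

```python
import numpy as np

def voxelize(points, voxel_size=0.05):
    """Quantize an (N, 3) point array into a set of occupied voxel indices."""
    return set(map(tuple, np.floor(points / voxel_size).astype(int)))

def classify_differences(model_points, scene_points, voxel_size=0.05):
    """Split discrepancies into the two categories described above."""
    model_v = voxelize(model_points, voxel_size)
    scene_v = voxelize(scene_points, voxel_size)
    first = model_v - scene_v    # in the model but absent in reality (e.g. a forgotten fence)
    second = scene_v - model_v   # in reality but absent from the model (an unexpected object)
    return first, second
```

A comparison screen could then render the first set in one color and the second set in another, so that each kind of discrepancy is immediately distinguishable.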
The device may further comprise updating means for updating the three-dimensional model in accordance with the three-dimensional spatial information so that a simulation reflecting the actual work area can be performed. With this configuration, the three-dimensional model for simulation is corrected to match the shapes and positions of the robot and the peripheral elements in the actual work area. A highly accurate simulation under the same conditions as the actual work area can therefore be realized easily.
A monitoring system having a three-dimensional sensor may be installed to monitor the work area of the robot, and the real environment information acquisition means may acquire the three-dimensional spatial information of the work area from the monitoring system. By reusing the three-dimensional sensor of the monitoring system, there is no need to prepare a dedicated three-dimensional sensor for the safety verification device, which reduces cost and improves convenience.
The comparison means may align the coordinates of the three-dimensional model and the three-dimensional spatial information of the work area with reference to the position of the three-dimensional sensor of the monitoring system. Since the three-dimensional spatial information is generated from the measurement results of the three-dimensional sensor, it is easy to express it in a coordinate system referenced to the sensor's position. Coordinate alignment between the virtual space (the three-dimensional model) and the real space (the work area) can therefore be realized simply.
The comparison means may have a user interface for correcting the three-dimensional spatial information of the work area before the comparison with the three-dimensional model. Such a user interface allows erroneous or missing data in the three-dimensional spatial information to be corrected manually, which improves the accuracy and reliability of the subsequent comparison with the three-dimensional model.
A second aspect of the present invention provides a safety verification method comprising: a step of acquiring three-dimensional spatial information representing a work area of a robot; a step of acquiring a three-dimensional model of a virtual space in which the robot and its peripheral elements are arranged; a step of comparing, using the acquired three-dimensional model and the three-dimensional spatial information of the work area, the configuration of at least one of the robot and the peripheral elements between the virtual space of a simulation and the actual work area; and a step of outputting a result of the comparing step.
A third aspect of the present invention provides a program for causing a computer to execute each step of the above safety verification method.
The present invention may also be understood as a safety verification device or a real environment confirmation device having at least some of the above means. The present invention may likewise be understood as a system comprising the safety verification device and a three-dimensional sensor, a system comprising the safety verification device and a monitoring system, a system comprising the safety verification device and a simulator, or a system comprising the safety verification device, a simulator, and a monitoring system. The present invention may further be understood as a safety verification method including at least part of the above processing, a method for controlling a safety verification device, a program for realizing such a method, or a recording medium on which such a program is non-transitorily recorded. Each of the above means and processes can be combined with one another to the extent possible to constitute the present invention.
According to the present invention, it is possible to easily verify whether the real environment has been constructed according to the design assumed in the simulation.
<Application example>
An application example of the present invention will be described with reference to FIG. 1. FIG. 1 schematically shows a configuration example of a simulation system 1 including a safety verification device according to one embodiment of the present invention.
This simulation system 1 is a system for performing teaching (development of a control program), simulation, safety verification, and the like for an industrial robot (hereinafter, "robot") 21. As shown in FIG. 1, the simulation system 1 comprises a safety verification device 10 and a simulator 11. The simulation system 1 can communicate via a network with a robot controller 20 for controlling the robot 21 and with a monitoring system 30 for monitoring the work area of the robot 21. For example, a usage scenario is assumed in which a user who maintains or manages the robot 21 uses the simulation system 1 to remotely access, through the Internet, the robot controller 20 and the monitoring system 30 installed in a factory.
The simulator 11 is an integrated development tool having a function for developing a control program for the robot 21 and a simulation function using three-dimensional models. That is, the user can use the simulator 11 to create a control program for the robot controller 20, execute that program on an emulator, check the motion and arrangement of the robot and its peripheral elements in the virtual space, and perform program debugging, motion verification, safety verification, and the like. The created control program can be installed in the robot controller 20 via the network. By using such a simulator 11, teaching and motion verification are possible without the actual machines (the robot 21 and the robot controller 20). This has advantages such as greatly reducing the start-up time when introducing equipment into a factory and enabling teaching and maintenance by remote operation.
The simulator 11 executes a simulation using a virtual environment in which three-dimensional models (for example, CAD data) of the robot 21 and its peripheral elements (for example, a workbench, workpieces, tools, parts, other machines, safety fences, and the like) are arranged in a virtual space. If this virtual environment sufficiently matches the real environment of the factory, the validity and reliability of the simulation results are guaranteed. In reality, however, cases arise in which some discrepancy exists between the virtual environment and the real environment. The existence of a critical discrepancy must not be overlooked, yet it is not easy for a human to find every difference between the two. In particular, when the person in charge of teaching (the user of the simulator 11) differs from the person in charge of the equipment (the user who starts up the actual machine at the factory), or when the simulator 11 is installed at a remote location away from the factory, comparing the virtual environment with the real environment is extremely difficult.
To solve such problems, the safety verification device 10 provides a function of comparing the virtual environment used by the simulator 11 with the real environment in the factory and determining whether there is any difference. Specifically, the work area of the robot 21 is measured by a three-dimensional sensor 31 to generate three-dimensional spatial information of the work area, which is given to the safety verification device 10 as real environment information. The safety verification device 10 compares this real environment information with the three-dimensional model (virtual environment information) obtained from the simulator 11, and determines whether the configuration of the robot and its peripheral elements (that is, their three-dimensional shapes, positions, sizes, and the like) differs. It then outputs the result of the comparison between the real environment and the virtual environment. For example, if a difference exists between the two, a notification may be output to inform the user of the problem; if there is no difference (or only a difference small enough to be unproblematic), a notification may be output to the effect that safety is ensured as simulated. Furthermore, the safety verification device 10 may make necessary corrections to the simulation information and the control program.
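The flow just described can be summarized in a short, purely illustrative sketch; the function name, the voxel-based comparison, and the message strings below are placeholders for explanation, not part of the disclosed implementation:

```python
import numpy as np

def verify_safety(sensor_cloud, model_points, tolerance=0.05):
    """Illustrative end-to-end flow: compare the measured point cloud (real
    environment) with points sampled from the simulation model (virtual
    environment) and return a human-readable verdict."""
    def voxels(points):
        # Quantize points into occupied voxel cells of size `tolerance`.
        return set(map(tuple, np.floor(np.asarray(points) / tolerance).astype(int)))
    # Symmetric difference: cells occupied in one environment but not the other.
    diff = voxels(model_points) ^ voxels(sensor_cloud)
    if not diff:
        return "Safety ensured as simulated: no difference detected."
    return f"Warning: {len(diff)} differing regions between simulation and reality."
```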
By using such a safety verification device 10, the user can easily verify whether the real environment has been constructed according to the design assumed in the simulation. If there is no difference, or only a negligible one, the operation and safety as designed can be guaranteed. Conversely, if a non-negligible difference is found, this can lead to the discovery of installation errors, a review of safety measures, and a decision on whether offline teaching or re-simulation is necessary.
<Embodiment>
FIG. 2 is a block diagram showing a configuration example of the simulation system 1 according to this embodiment.
The simulation system 1 mainly comprises the safety verification device 10, the simulator 11, a display device 12, and an input device 13. The safety verification device 10 has a real environment information acquisition unit 100, a virtual environment information acquisition unit 101, a comparison unit 102, an output unit 103, and a simulation data update unit 104. The simulator 11 has a simulation data storage unit 110 that stores simulation data.
The real environment information acquisition unit 100 acquires the three-dimensional spatial information (sensor data) of the work area measured by the three-dimensional sensor 31. This three-dimensional spatial information includes the three-dimensional shapes, positions, and sizes of the robot 21 installed in the work area and of the peripheral elements around it (for example, a workbench, workpieces, tools, parts, other machines, safety fences, and the like). Any data format may be used for the three-dimensional spatial information; in this embodiment, point cloud data is used as an example.
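For illustration, a depth image from such a sensor can be back-projected into point cloud data with a standard pinhole camera model; the intrinsic parameters below (fx, fy, cx, cy) are assumed example values, not ones given in the patent:

```python
import numpy as np

def depth_to_point_cloud(depth, fx, fy, cx, cy):
    """Back-project a depth image (in meters) into an (N, 3) point cloud
    expressed in the sensor (camera) coordinate system."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    points = np.stack([x, y, depth], axis=-1).reshape(-1, 3)
    return points[points[:, 2] > 0]  # discard pixels with no valid depth reading
```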
The virtual environment information acquisition unit 101 acquires the three-dimensional model (simulation data) used for the simulation from the simulator 11. This includes the three-dimensional models of the robot and the peripheral elements around it. Any data format may be used for the three-dimensional model; for example, data generated from three-dimensional CAD design information is used.
The comparison unit 102 compares the three-dimensional spatial information (sensor data) of the work area with the three-dimensional model (simulation data) used for the simulation, and determines whether the configuration of the robot and/or the peripheral elements differs between the virtual space of the simulation and the actual work area.
The output unit 103 outputs the comparison results and determination results obtained by the comparison unit 102. The output destinations include the display device 12 of the simulation system 1 and external devices (for example, other computers or PLCs).
The simulation data update unit 104 has a function of updating the three-dimensional model in accordance with the three-dimensional spatial information (sensor data), based on the comparison and determination results of the comparison unit 102. The updated three-dimensional model is stored in the simulation data storage unit 110 and used, for example, when a re-simulation is performed.
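The patent leaves the update mechanism open. As one minimal, hypothetical illustration (an assumption, not the disclosed method), a model element could be translated so that it coincides with the measured position of its real counterpart:

```python
import numpy as np

def update_model_position(model_points, measured_cluster):
    """Translate a model element so that its centroid coincides with the
    centroid of the point cloud cluster measured for the real object."""
    offset = measured_cluster.mean(axis=0) - model_points.mean(axis=0)
    return model_points + offset
```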
The simulation data storage unit 110 stores the data that the simulator 11 uses for simulation. The simulation data may include, for example, the three-dimensional models of the robot and its peripheral elements, the control program for the robot controller, and various condition settings for executing the simulation. The simulator 11 can communicate with the robot controller 20; it can write the control program to the robot controller 20 and acquire information from the robot controller 20 side.
The display device 12 is a device for displaying information; for example, a liquid crystal display or an organic EL display is used. The input device 13 is a device for inputting information; for example, a keyboard, a mouse, or a touch panel is used. A touch panel display serving as both the display device 12 and the input device 13 may also be used.
The simulation system 1 may be configured, for example, as a personal computer comprising a processor, memory, storage, and the like. In that case, the processor loads a program stored in the storage into the memory and executes it, thereby providing the functions of the units shown in FIG. 2. The configuration is not limited to this; some or all of the functions in FIG. 2 may be implemented with an ASIC, an FPGA, or the like, or may be realized by cloud computing or distributed computing.
<Three-dimensional sensor>
The three-dimensional sensor 31 is a device that measures three-dimensional spatial information. Any type of sensor may be used as long as it can measure the three-dimensional shape, position, and size of objects existing in the work area: either an active or a passive method is acceptable, and any principle may be used, such as a TOF camera, a stereo camera, structured light, or LiDAR. In this embodiment, a TOF sensor (TOF camera) is used, which uses infrared light and acquires a range image from the time of flight (TOF) of the light. The TOF method has a direct type and an indirect type, and either may be used; this embodiment uses the indirect type, that is, a TOF sensor that obtains distance information based on the phase difference between the projected light and the reflected light.
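For reference, an indirect TOF sensor of this kind typically converts the measured phase difference Δφ between the projected and reflected light into distance as d = c·Δφ / (4π·f), where f is the modulation frequency. A minimal sketch follows; the 10 MHz modulation frequency is an illustrative value, not one given in the patent:

```python
import math

C = 299_792_458.0  # speed of light in m/s

def itof_distance(phase_rad, mod_freq_hz):
    """Distance for a measured phase shift on an indirect (phase-difference)
    TOF sensor: d = c * dphi / (4 * pi * f)."""
    return C * phase_rad / (4 * math.pi * mod_freq_hz)

# At 10 MHz modulation the unambiguous range is c / (2f), about 15 m;
# a phase shift of pi corresponds to half of that.
print(round(itof_distance(math.pi, 10e6), 2))  # 7.49
```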
In FIGS. 1 and 2, the three-dimensional sensor 31 of the monitoring system 30 installed in the work area of the robot 21 is used. The reason is that by reusing the three-dimensional sensor 31 of the monitoring system 30, which is existing equipment, there is no need to prepare a dedicated three-dimensional sensor for the safety verification device 10, which reduces cost and improves convenience. In addition, since the three-dimensional sensor 31 of the monitoring system 30 is fixed at an optimal position from which the work area of the robot 21 can be measured, sensor data covering a range necessary and sufficient for safety verification (comparison with the virtual environment information) can be obtained reliably. Moreover, because the installation position of the sensor (its positional relationship to the work area and the robot 21) is known, there is the further advantage that coordinate alignment with the virtual environment information, described later, is easy to perform.
Although the three-dimensional sensor 31 of the monitoring system 30 is reused in this embodiment, a three-dimensional sensor unrelated to the monitoring system 30, or an unfixed three-dimensional sensor, may of course be used instead. For example, a three-dimensional sensor that measures while moving, or a plurality of three-dimensional sensors installed at different positions, may be used to measure the work area from a plurality of different viewpoints, and the measurement results may then be integrated to generate the three-dimensional spatial information. A fixed three-dimensional sensor 31 as in FIG. 1 inevitably has blind spots (areas that cannot be measured), but integrating measurement results from a plurality of viewpoints can be expected to yield highly accurate and reliable three-dimensional spatial information.
<Safety verification method>
An example of the safety verification method of this embodiment will be described with reference to FIG. 3, which is a flowchart showing the flow of processing by the safety verification device. This safety verification is a process of comparing the virtual environment used in the simulation with the real environment in the factory and checking whether there is any difference between them, thereby verifying that the safety designed at the time of the simulation (at the time of teaching) is also ensured on site. Therefore, before the processing of FIG. 3 is performed, it is assumed that (1) a simulation has been executed by the simulator 11 and the simulation data used at that time is stored in the simulation data storage unit 110, and (2) the robot, its peripheral elements, and other equipment have been installed in the factory.
In step S100, the real environment information acquisition unit 100 acquires the three-dimensional spatial information of the factory work area from the monitoring system 30. At this time, the real environment information acquisition unit 100 may receive the data directly from the monitoring system 30 online, or may read the three-dimensional spatial information offline or indirectly.
In step S101, the virtual environment information acquisition unit 101 acquires the three-dimensional model used for the simulation from the simulator 11.
In step S102, the comparison unit 102 performs coordinate matching between the three-dimensional space information acquired in step S100 and the three-dimensional models acquired in step S101. At this time, the comparison unit 102 preferably performs the coordinate matching with reference to the position (more precisely, the viewpoint) of the three-dimensional sensor 31 of the monitoring system 30. The measurement results (raw data) of the three-dimensional sensor 31 are generally expressed in a coordinate system based on the position (viewpoint) of the three-dimensional sensor 31 (also called the camera coordinate system). On the other hand, if the position of the three-dimensional sensor 31 is known, it is easy to transform the view and coordinate system of the three-dimensional models to match that viewpoint. Therefore, by performing the coordinate matching with reference to the position of the three-dimensional sensor 31, the coordinates of the three-dimensional space and the three-dimensional models can be matched accurately with simple processing.
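As an illustration of this viewpoint-based coordinate matching, the sketch below transforms model points given in world coordinates into the sensor's camera coordinate frame, assuming the sensor pose is known as a rotation R and a position t. This is a minimal numpy sketch, not the embodiment's actual implementation; the function name and the example pose are illustrative assumptions.

```python
import numpy as np

def world_to_camera(points_world, R, t):
    """Transform Nx3 world-coordinate points into the camera (sensor)
    coordinate frame, given the sensor pose in world coordinates.

    R : 3x3 rotation of the camera in the world frame
    t : 3-vector position of the camera in the world frame
    A world point p maps to R.T @ (p - t) in camera coordinates.
    """
    points_world = np.asarray(points_world, dtype=float)
    return (points_world - t) @ R  # row-wise (p - t) @ R equals R.T @ (p - t)

# Hypothetical example: sensor 2 m above the floor, rotated 180 deg about x
# so that it looks straight down.
R = np.array([[1.0, 0.0, 0.0],
              [0.0, -1.0, 0.0],
              [0.0, 0.0, -1.0]])
t = np.array([0.0, 0.0, 2.0])
model_vertices = np.array([[0.0, 0.0, 0.0],    # a model point on the floor
                           [0.5, 0.5, 1.0]])   # a point 1 m above the floor
cam_pts = world_to_camera(model_vertices, R, t)
```

With the model expressed in the same camera frame as the raw sensor data, the two data sets can be compared point-for-point without any further registration step.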
In step S103, the comparison unit 102 compares the three-dimensional space information with the three-dimensional models. The three-dimensional space information is given, for example, as point cloud data, whereas the three-dimensional models are given as so-called vector data such as CAD data. Although the two data formats differ, if the real environment completely matches the virtual environment, point cloud clusters with the same positions, shapes, and sizes as the three-dimensional models should be formed in the three-dimensional space information. It is therefore possible to compare and judge, at least, whether objects of substantially the same shape and size as the robot and the peripheral elements placed in the virtual space during the simulation exist at similar positions in the actual work area.
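One hedged way to compare a vector-format model against measured point cloud data is to sample points on the model surface and test whether each sample has a measured point nearby. The sketch below assumes the model has already been converted to surface sample points; the tolerance and coverage thresholds are illustrative assumptions, not values from the embodiment.

```python
import numpy as np

def nearest_distances(query_pts, cloud_pts):
    """Brute-force nearest-neighbour distance from each query point to a
    point cloud (adequate for the small illustrative sizes used here)."""
    diff = query_pts[:, None, :] - cloud_pts[None, :, :]
    return np.sqrt((diff ** 2).sum(axis=2)).min(axis=1)

def model_matches_cloud(model_samples, cloud, tol=0.02, coverage=0.9):
    """Judge a model as 'present' in the measured cloud when at least
    `coverage` of its surface samples lie within `tol` metres of some
    measured point."""
    d = nearest_distances(model_samples, cloud)
    return (d <= tol).mean() >= coverage

# Toy example: model samples along a 1 m edge, cloud measured with a
# uniform 5 mm offset standing in for sensor noise.
model_samples = np.stack([np.linspace(0.0, 1.0, 50),
                          np.zeros(50), np.zeros(50)], axis=1)
cloud = model_samples + np.full(3, 0.005)
present = model_matches_cloud(model_samples, cloud)
```

For production-scale clouds a spatial index (k-d tree, voxel grid) would replace the brute-force distance matrix, but the judgment logic stays the same.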
There are three types of judgment results. The comparison unit 102 preferably outputs the judgment results (1) to (3) below, for example, for each three-dimensional model or for each point cloud cluster.
(1) The same object as the three-dimensional model also exists in the three-dimensional space information. In this case, it is confirmed that the real environment matches the virtual environment.
(2) The same object as the three-dimensional model does not exist in the three-dimensional space information. In this case, an object that existed during the simulation is absent from the real environment. If the object in question is a piece of safety equipment such as a fence or a guard plate, this judgment result is a critical problem.
(3) An object that does not exist in the three-dimensional models is included in the three-dimensional space information. In this case, an object not assumed in the simulation exists in the real environment. For example, if the shape of the robot itself differs, or if an object protrudes into the robot's path of motion, unintended interference (collision) may occur, so this judgment result can also be a critical problem.
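A minimal sketch of how the three judgment results could be produced from per-object summaries (centroid and axis-aligned extent) is shown below. The embodiment may use a more elaborate matching criterion; all function names and tolerances here are illustrative assumptions.

```python
import numpy as np

def summarize(pts):
    """Centroid and axis-aligned extent of a point set."""
    pts = np.asarray(pts, dtype=float)
    return pts.mean(axis=0), pts.max(axis=0) - pts.min(axis=0)

def judge(models, clusters, pos_tol=0.05, size_tol=0.05):
    """Return (matched, missing, extra):
    matched - result (1): model indices with a matching cluster
    missing - result (2): model indices with no matching cluster
    extra   - result (3): cluster indices no model accounts for
    A match means centroid and extent agree within the tolerances."""
    matched, missing, used = [], [], set()
    for mi, m in enumerate(models):
        mc, ms = summarize(m)
        hit = None
        for ci, c in enumerate(clusters):
            if ci in used:
                continue
            cc, cs = summarize(c)
            if (np.abs(mc - cc) <= pos_tol).all() and (np.abs(ms - cs) <= size_tol).all():
                hit = ci
                break
        if hit is None:
            missing.append(mi)
        else:
            matched.append(mi)
            used.add(hit)
    extra = [ci for ci in range(len(clusters)) if ci not in used]
    return matched, missing, extra

# Toy scene: the robot is present, the fence model has no counterpart
# (result (2)), and an unexpected tool cluster appears (result (3)).
robot_model = np.array([[0.0, 0.0, 0.0], [1.0, 1.0, 1.0]])
fence_model = np.array([[2.0, 0.0, 0.0], [2.1, 1.0, 1.0]])
robot_cluster = np.array([[0.0, 0.0, 0.0], [1.0, 1.0, 1.0]])
tool_cluster = np.array([[0.3, -0.5, 0.0], [0.5, -0.3, 0.1]])
matched, missing, extra = judge([robot_model, fence_model],
                                [robot_cluster, tool_cluster])
```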
In step S104, the output unit 103 outputs the comparison result and the judgment result from the comparison unit 102. By viewing the output of the output unit 103, the user can judge whether there is a difference between the real environment and the virtual environment and, if so, whether it constitutes a critical problem. Then, as necessary, improvements can be made, such as supplementing safety measures that were lacking in the real environment or removing objects that may cause interference.
<Result output example>
FIG. 4 shows an example of a simulation (virtual environment) and a work area in a factory (real environment). In the virtual environment, a robot 21v and a workpiece 41v are placed on a workbench 40v, and a safety fence 44v is provided on the left side of the workbench 40v. Reference numeral 31v indicates the three-dimensional sensor. In the real environment, the robot 21 and the workpiece 41 are placed on the workbench 40, which is the same as in the virtual environment. However, the real environment differs from the virtual environment in that a tool 42 is placed in front of the robot 21 and no safety fence is provided on the left side of the workbench 40.
FIG. 5 is an example of a comparison screen displayed on the display device 12. The left side of the screen displays the simulation data (three-dimensional models), and the right side displays the sensor data (three-dimensional space information). By displaying the two types of data in views with the same viewpoint, the matches and differences between the virtual space of the simulation and the actual work area can be confirmed easily and visually.
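Rendering both data sets from the same viewpoint can be sketched as projecting each through one shared camera model, so that corresponding objects land on the same pixels in the two panels. The pinhole intrinsics below (f, cx, cy) are illustrative assumptions, not values from the embodiment.

```python
import numpy as np

def project(points_cam, f=500.0, cx=320.0, cy=240.0):
    """Pinhole projection of camera-frame 3-D points (z > 0) to pixel
    coordinates. Using one shared camera for both data sets guarantees
    that the two panels show identical views."""
    p = np.asarray(points_cam, dtype=float)
    z = p[:, 2]
    return np.stack([f * p[:, 0] / z + cx,
                     f * p[:, 1] / z + cy], axis=1)

# The model and the sensor data are projected with the same intrinsics,
# so a matching object occupies the same pixels in the left and right panels.
model_pts = np.array([[0.0, 0.0, 2.0], [0.1, 0.0, 2.0]])
sensor_pts = np.array([[0.0, 0.0, 2.0], [0.1, 0.0, 2.0]])
left_panel = project(model_pts)
right_panel = project(sensor_pts)
```

Any renderer can then draw the two pixel sets side by side; the key point is the shared camera, not the drawing backend.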
On the simulation data screen, the safety fence 44v is highlighted. This is the difference portion corresponding to judgment result (2). On the sensor data screen, on the other hand, the tool 42 is highlighted. This is the difference portion corresponding to judgment result (3). By displaying the difference portions in a manner distinguishable from the other portions in this way, the differences between the shapes and arrangement assumed in the simulation and the actual shapes and arrangement can be grasped easily and intuitively.
FIG. 6 is another example of the comparison screen. Whereas the comparison screen of FIG. 5 displays the three-dimensional models and the three-dimensional space information side by side, the comparison screen of FIG. 6 displays them superimposed. Such a display makes it even easier to compare the shapes and arrangement assumed in the simulation with the actual shapes and arrangement. In addition, on the comparison screen of FIG. 6, the difference portion corresponding to judgment result (2) (the first difference portion) and the difference portion corresponding to judgment result (3) (the second difference portion) are displayed in manners different from each other. Since the required response may differ between judgment results (2) and (3), distinguishing the two in the notification in this way improves convenience for the user.
<Updating simulation data>
When judgment result (3) is obtained, the simulation data updating unit 104 may update the three-dimensional models in accordance with the three-dimensional space information (sensor data). This update processing is executed, for example, after step S103 in FIG. 3.
In the case of the examples shown in FIGS. 5 and 6, the simulation data updating unit 104 may generate a three-dimensional model corresponding to the tool 42 existing in the real environment and add it to the simulation data. The simulator 11 can then check whether the robot 21 and the tool 42 interfere by performing a re-simulation using the updated simulation data.
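One simple way to generate a stand-in model for an unmodelled object such as the tool 42 is to fit a padded axis-aligned bounding box to its measured point cluster. The sketch below assumes the cluster has already been segmented from the cloud; the dictionary model format and the margin value are illustrative placeholders.

```python
import numpy as np

def cluster_to_box_model(cluster_pts, margin=0.01):
    """Build a conservative axis-aligned box model from a measured point
    cluster, padded by `margin` metres, to stand in for an unmodelled
    object in a re-simulation (collision checks err on the safe side)."""
    pts = np.asarray(cluster_pts, dtype=float)
    lo = pts.min(axis=0) - margin
    hi = pts.max(axis=0) + margin
    return {"type": "box", "min": lo, "max": hi}

# Hypothetical measured cluster for the tool found in the real environment
tool_cluster = np.array([[0.30, 0.10, 0.00],
                         [0.35, 0.12, 0.05],
                         [0.32, 0.08, 0.03]])
box = cluster_to_box_model(tool_cluster)
```

Because the box is slightly oversized, a re-simulation that reports no interference with it implies no interference with the real object either.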
<Correction of three-dimensional space information>
Since the three-dimensional space information is generated from the measurement results of the three-dimensional sensor 31, it inevitably contains measurement errors. In addition, no measurement results can be obtained for objects outside the field of view of the three-dimensional sensor 31 or in its blind spots. Therefore, erroneous data and missing data in the three-dimensional space information may be corrected manually before the comparison processing (step S102 in FIG. 3). This makes it possible to improve the accuracy and reliability of the comparison processing in step S102.
FIG. 7 is an example of a user interface, provided by the comparison unit 102, for correcting the three-dimensional space information. The three-dimensional space information (point cloud data) and the three-dimensional model (broken lines) are displayed superimposed on the screen. When the user operates the mouse pointer to select a region and presses the "delete" button, the point cloud within that region is deleted. When the user selects a region and presses the "add" button, the point cloud within that region is interpolated. By performing such manual correction, well-ordered three-dimensional space information as shown in the lower part of FIG. 7 can be obtained.
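The "delete" operation of such a correction UI can be sketched as removing all points inside a user-selected axis-aligned region. The function below is an illustrative assumption standing in for the embodiment's actual interface code.

```python
import numpy as np

def delete_in_box(cloud, box_min, box_max):
    """Remove points inside the user-selected axis-aligned region
    (the 'delete' button of the correction UI)."""
    cloud = np.asarray(cloud, dtype=float)
    inside = ((cloud >= box_min) & (cloud <= box_max)).all(axis=1)
    return cloud[~inside]

cloud = np.array([[0.0, 0.0, 0.0],   # spurious point to remove
                  [1.0, 1.0, 1.0]])  # genuine measurement to keep
cleaned = delete_in_box(cloud,
                        np.array([-0.1, -0.1, -0.1]),
                        np.array([0.1, 0.1, 0.1]))
```

The complementary "add" button would insert interpolated points into the selected region instead of masking them out.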
<Modifications>
The above embodiment is merely an illustrative example of a configuration of the present invention. The present invention is not limited to the specific forms described above, and various modifications are possible within the scope of its technical idea. For example, the comparison screens and highlighting shown in FIGS. 5 and 6 are merely examples, and the display may take any form as long as the purpose is achieved. In the above embodiment, the safety verification device and the simulator are provided in the same system, but they may be separate devices. Alternatively, the safety verification device and the monitoring system may be provided in the same system.
<Appendix>
[1] A safety verification device (10) comprising:
real environment information acquisition means (100) for acquiring three-dimensional space information representing a work area of a robot (21);
virtual environment information acquisition means (101) for acquiring three-dimensional models (21v, 40v, 41v, 44v) of a virtual space in which the robot and its peripheral elements are arranged;
comparison means (102) for comparing, using the acquired three-dimensional models (21v, 40v, 41v, 44v) and the three-dimensional space information of the work area, the configuration of at least one of the robot (21) and the peripheral elements (40, 41, 42) between the virtual space of the simulation and the actual work area; and
output means (103) for outputting a comparison result by the comparison means (102).
[2] A safety verification method comprising:
a step (S100) of acquiring three-dimensional space information representing a work area of a robot;
a step (S101) of acquiring three-dimensional models of a virtual space in which the robot and its peripheral elements are arranged;
a step (S102) of comparing, using the acquired three-dimensional models and the three-dimensional space information of the work area, the configuration of at least one of the robot and the peripheral elements between the virtual space of the simulation and the actual work area; and
a step (S103) of outputting a result of the comparing step.
[3] A safety verification device (10) comprising:
real environment information acquisition means (100) for acquiring three-dimensional space information of a work area of a robot (21), obtained by measuring the work area with a three-dimensional sensor (31);
virtual environment information acquisition means (101) for acquiring the three-dimensional models (21v, 40v, 41v, 44v) used for a simulation from a simulator (11) that places three-dimensional models (21v, 40v, 41v, 44v) of the robot and its peripheral elements in a virtual space and simulates the motion of the robot (21v) in the virtual space;
comparison means (102) for comparing the acquired three-dimensional models (21v, 40v, 41v, 44v) with the three-dimensional space information of the work area and judging whether the configuration of the robot (21) and/or the peripheral elements (40, 41, 42) differs between the virtual space of the simulation and the actual work area; and
output means (103) for outputting a notification when a difference exists in the configuration of the robot and/or the peripheral elements between the virtual space of the simulation and the actual work area.
[4] A safety verification method comprising:
a step (S100) of acquiring three-dimensional space information of a work area of a robot, obtained by measuring the work area with a three-dimensional sensor;
a step (S101) of acquiring the three-dimensional models used for a simulation from a simulator that places three-dimensional models of the robot and its peripheral elements in a virtual space and simulates the motion of the robot in the virtual space;
a step (S102) of comparing the acquired three-dimensional models with the three-dimensional space information of the work area and judging whether the configuration of the robot and/or the peripheral elements differs between the virtual space of the simulation and the actual work area; and
a step (S103) of outputting a notification when a difference exists in the configuration of the robot and/or the peripheral elements between the virtual space of the simulation and the actual work area.
1: Simulation system
10: Safety verification device
11: Simulator
12: Display device
13: Input device
20: Robot controller
21: Robot
21v: Three-dimensional model of the robot
30: Monitoring system
31: Three-dimensional sensor
31v: Three-dimensional model of the three-dimensional sensor
40: Workbench
40v: Three-dimensional model of the workbench
41: Workpiece
41v: Three-dimensional model of the workpiece
42: Tool
44v: Three-dimensional model of the safety fence
Claims (11)
- 1. A safety verification device comprising: real environment information acquisition means for acquiring three-dimensional space information representing a work area of a robot; virtual environment information acquisition means for acquiring three-dimensional models of a virtual space in which the robot and its peripheral elements are arranged; comparison means for comparing, using the acquired three-dimensional models and the three-dimensional space information of the work area, the configuration of at least one of the robot and the peripheral elements between the virtual space of the simulation and the actual work area; and output means for outputting a comparison result by the comparison means.
- 2. The safety verification device according to claim 1, wherein the output means outputs, to a display device, a comparison screen of the three-dimensional models and the three-dimensional space information of the work area.
- 3. The safety verification device according to claim 2, wherein the output means superimposes the three-dimensional models and the three-dimensional space information on the comparison screen.
- 4. The safety verification device according to claim 2 or 3, wherein the output means outputs, on the comparison screen, difference portions between the three-dimensional models and the three-dimensional space information in a manner distinguishable from the other portions.
- 5. The safety verification device according to any one of claims 2 to 4, wherein the output means outputs, on the comparison screen, a first difference portion that exists in the three-dimensional models but not in the three-dimensional space information and a second difference portion that does not exist in the three-dimensional models but exists in the three-dimensional space, in manners different from each other.
- 6. The safety verification device according to any one of claims 1 to 5, further comprising updating means for updating the three-dimensional models in accordance with the three-dimensional space information in order to perform a simulation reflecting the actual work area.
- 7. The safety verification device according to any one of claims 1 to 6, wherein a monitoring system having a three-dimensional sensor is installed to monitor the work area of the robot, and the real environment information acquisition means acquires the three-dimensional space information of the work area from the monitoring system.
- 8. The safety verification device according to claim 7, wherein the comparison means performs coordinate matching between the three-dimensional models and the three-dimensional space information of the work area with reference to the position of the three-dimensional sensor of the monitoring system.
- 9. The safety verification device according to any one of claims 1 to 8, wherein the comparison means has a user interface for correcting the three-dimensional space information of the work area prior to the comparison with the three-dimensional models.
- 10. A safety verification method comprising: a step of acquiring three-dimensional space information representing a work area of a robot; a step of acquiring three-dimensional models of a virtual space in which the robot and its peripheral elements are arranged; a step of comparing, using the acquired three-dimensional models and the three-dimensional space information of the work area, the configuration of at least one of the robot and the peripheral elements between the virtual space of the simulation and the actual work area; and a step of outputting a result of the comparing step.
- 11. A program for causing a computer to execute each step of the safety verification method according to claim 10.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2021-037469 | 2021-03-09 | ||
JP2021037469A JP2022137797A (en) | 2021-03-09 | 2021-03-09 | Safety verification apparatus, safety verification method, and program |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022190545A1 true WO2022190545A1 (en) | 2022-09-15 |
Family
ID=83226232
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2021/047127 WO2022190545A1 (en) | 2021-03-09 | 2021-12-20 | Safety verification device, safety verification method, and program |
Country Status (2)
Country | Link |
---|---|
JP (1) | JP2022137797A (en) |
WO (1) | WO2022190545A1 (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116430795A (en) * | 2023-06-12 | 2023-07-14 | 威海海洋职业学院 | Visual industrial controller and method based on PLC |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2016140958A (en) * | 2015-02-03 | 2016-08-08 | キヤノン株式会社 | Offline teaching device, offline teaching method, and robot system |
WO2020066949A1 (en) * | 2018-09-26 | 2020-04-02 | 日本電産株式会社 | Robot path determination device, robot path determination method, and program |
JP2021000678A (en) * | 2019-06-20 | 2021-01-07 | オムロン株式会社 | Control system and control method |
-
2021
- 2021-03-09 JP JP2021037469A patent/JP2022137797A/en active Pending
- 2021-12-20 WO PCT/JP2021/047127 patent/WO2022190545A1/en active Application Filing
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2016140958A (en) * | 2015-02-03 | 2016-08-08 | キヤノン株式会社 | Offline teaching device, offline teaching method, and robot system |
WO2020066949A1 (en) * | 2018-09-26 | 2020-04-02 | 日本電産株式会社 | Robot path determination device, robot path determination method, and program |
JP2021000678A (en) * | 2019-06-20 | 2021-01-07 | オムロン株式会社 | Control system and control method |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116430795A (en) * | 2023-06-12 | 2023-07-14 | 威海海洋职业学院 | Visual industrial controller and method based on PLC |
CN116430795B (en) * | 2023-06-12 | 2023-09-15 | 威海海洋职业学院 | Visual industrial controller and method based on PLC |
Also Published As
Publication number | Publication date |
---|---|
JP2022137797A (en) | 2022-09-22 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP7419271B2 (en) | Visualizing and modifying operational boundary zones using augmented reality | |
JP6458713B2 (en) | Simulation device, simulation method, and simulation program | |
Lampen et al. | Combining simulation and augmented reality methods for enhanced worker assistance in manual assembly | |
US8849636B2 (en) | Assembly and method for verifying a real model using a virtual model and use in aircraft construction | |
KR100929445B1 (en) | Recording medium including robot simulation apparatus and robot simulation program | |
US9199379B2 (en) | Robot system display device | |
US11135720B2 (en) | Method and system for programming a cobot for a plurality of industrial cells | |
CN103630071A (en) | Methods and systems for inspecting a workpiece | |
Yap et al. | Virtual reality based support system for layout planning and programming of an industrial robotic work cell | |
RU2727136C2 (en) | Simulation method of manipulator movement | |
JP7097251B2 (en) | Construction management system | |
US20230153486A1 (en) | Method and device for simulation | |
EP4221944A1 (en) | Method and system for improved auto-calibration of a robotic cell | |
JP2018008347A (en) | Robot system and operation region display method | |
CN109116807B (en) | Composite reality simulation device and computer-readable medium | |
WO2022190545A1 (en) | Safety verification device, safety verification method, and program | |
US9415512B2 (en) | System and method for enhancing a visualization of coordinate points within a robots working envelope | |
US20240165811A1 (en) | Device for setting safety parameters, teaching device and method | |
CN117203601A (en) | Control of semiconductor manufacturing equipment in a mixed reality environment | |
CA3226486A1 (en) | Method and apparatus for vision-based tool localization | |
JP3560216B2 (en) | Work support device | |
JP7501064B2 (en) | Simulation device, simulation method, and simulation program | |
JP3076841B1 (en) | Teaching program creation method for real environment adaptive robot | |
Hamilton et al. | Progress in standardization for ITER Remote Handling control system | |
KR20210087831A (en) | Portable robot operation method based on virtual sensor and 3-D mesh model |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 21930390; Country of ref document: EP; Kind code of ref document: A1 |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 21930390; Country of ref document: EP; Kind code of ref document: A1 |