WO2022180801A1 - Simulation device, control system, and modeling method - Google Patents
Simulation device, control system, and modeling method
- Publication number
- WO2022180801A1 (application PCT/JP2021/007415)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- model
- simulation
- dimensional
- actual shape
- virtual
- Prior art date
Classifications
- B25J9/1671—Programme controls characterised by programming, planning systems for manipulators characterised by simulation, either to verify existing program or to create and verify new program, CAD/CAM oriented, graphic oriented programming systems
- B25J9/16—Programme controls
- B25J13/08—Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
- B25J19/00—Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
- G06T19/00—Manipulating 3D models or images for computer graphics
- G05B2219/40311—Real time simulation
- G05B2219/40607—Fixed camera to observe workspace, object, workpiece, global
Definitions
- the present disclosure relates to simulation devices, control systems, and modeling methods.
- Japanese Laid-Open Patent Publication No. 2004-100001 discloses a robot simulator that includes a model storage unit storing model information on a robot and obstacles, and an information processing unit that, based on that model information, generates information on a path along which the tip of the robot can move from a start position to an end position while avoiding collisions between the robot and the obstacles.
- the present disclosure provides a simulation device that is effective in improving the reliability of simulation.
- a simulation apparatus includes an actual shape model generation unit that generates, based on measured data, an actual shape model representing a three-dimensional actual shape of a machine system including a robot, and a model correction unit that corrects a simulation model of the machine system based on a comparison between the simulation model and the actual shape model.
- a control system includes the above simulation device, a simulator that simulates the operation of the machine system based on the simulation model, and a control device that controls the machine system based on the simulation results of the simulator.
- a modeling method includes generating, based on measured data, an actual shape model representing a three-dimensional actual shape of a machine system including a robot, and correcting a simulation model of the machine system based on a comparison between the simulation model and the actual shape model.
- FIG. 1 is a schematic diagram illustrating the configuration of an automation system
- FIG. 1 is a schematic diagram illustrating the configuration of a robot
- FIG. 3 is a block diagram illustrating the functional configuration of the simulation device
- FIG. 4 is a diagram illustrating an object to be photographed by a three-dimensional camera
- FIG. 5 is a diagram illustrating a three-dimensional image of the object in FIG. 4
- FIG. 6 is a diagram illustrating an actual shape model obtained by synthesizing three-dimensional images
- FIG. 7 is a diagram illustrating an actual shape model
- FIG. 8 is a diagram illustrating a simulation model
- FIGS. 9 to 15 are diagrams illustrating matching
- FIG. 16 is a diagram illustrating a corrected simulation model
- FIG. 17 is a diagram illustrating an object to be photographed by three-dimensional cameras
- FIG. 18 is a diagram illustrating a real shape model of the imaging target in FIG. 17
- FIG. 19 is a diagram illustrating a preprocessed model of the imaging target in FIG. 17
- FIG. 20 is a block diagram illustrating the hardware configuration of the simulation device
- FIG. 21 is a flow chart illustrating a modeling procedure
- FIG. 22 is a flow chart illustrating a modeling procedure
- the automation system 1 shown in FIG. 1 is a system for operating at least a robot included in a machine system.
- a specific example of the automation system 1 is a production system that operates at least a robot in the machine system to produce products, but the use of the machine system is not necessarily limited to the production of products.
- the automation system 1 includes a machine system 2 and a control system 50.
- Machine system 2 includes a plurality of objects 3 .
- Each of the plurality of objects 3 is a substantial object that occupies part of the three-dimensional real space.
- the multiple objects 3 include at least one controlled object 4 to be controlled and at least one peripheral object 5 .
- At least one controlled object 4 includes at least one robot.
- FIG. 1 shows two robots 4A and 4B as the at least one controlled object 4, and a main stage 5A, substages 5B and 5C, and a frame 5D as the at least one peripheral object 5.
- FIG. 2 is a schematic diagram illustrating the schematic configuration of the robots 4A and 4B.
- the robots 4A and 4B are 6-axis vertical articulated robots, and each include a base portion 11, a swivel portion 12, a first arm 13, a second arm 14, a third arm 17, a tip portion 18, and actuators 41, 42, 43, 44, 45, and 46.
- the base 11 is installed around the conveyor 3A.
- the swivel part 12 is provided on the base part 11 so as to swivel around a vertical axis 21 .
- the first arm 13 is connected to the swivel portion 12 so as to swing about an axis 22 that intersects (for example, is perpendicular to) the axis 21 .
- here, "intersecting" includes a skewed relationship, as in a so-called grade separation.
- the second arm 14 is connected to the tip of the first arm 13 so as to swing about an axis 23 substantially parallel to the axis 22 .
- the second arm 14 includes an arm base 15 and an arm end 16 .
- the arm base 15 is connected to the tip of the first arm 13 and extends along an axis 24 that intersects (for example, orthogonally) the axis 23 .
- Arm end 16 is connected to the tip of arm base 15 so as to pivot about axis 24 .
- the third arm 17 is connected to the tip of the arm end 16 so as to swing about an axis 25 that intersects (for example, is perpendicular to) the axis 24 .
- the distal end portion 18 is connected to the distal end portion of the third arm 17 so as to pivot about an axis 26 that intersects (for example, is perpendicular to) the axis 25 .
- the robots 4A and 4B each have a joint 31 connecting the base 11 and the swivel portion 12, a joint 32 connecting the swivel portion 12 and the first arm 13, a joint 33 connecting the first arm 13 and the second arm 14, a joint 34 connecting the arm base 15 and the arm end 16 within the second arm 14, a joint 35 connecting the arm end 16 and the third arm 17, and a joint 36 connecting the third arm 17 and the tip portion 18.
- the actuators 41, 42, 43, 44, 45, 46 include, for example, electric motors and speed reducers, and drive the joints 31, 32, 33, 34, 35, 36, respectively.
- the actuator 41 swivels the swivel portion 12 about the axis 21
- the actuator 42 swings the first arm 13 about the axis 22
- the actuator 43 swings the second arm 14 about the axis 23
- the actuator 44 pivots the arm end 16 about the axis 24
- the actuator 45 swings the third arm 17 about the axis 25, and the actuator 46 pivots the tip portion 18 about the axis 26.
- the configurations of the robots 4A and 4B can be changed as appropriate.
- the robots 4A and 4B may be 7-axis redundant robots obtained by adding a one-axis joint to the 6-axis vertical articulated robot, or may be so-called SCARA-type articulated robots.
- the main stage 5A supports robots 4A, 4B, substages 5B, 5C, and frame 5D.
- the substage 5B supports an object to be worked by the robot 4A.
- the substage 5C supports an object to be worked by the robot 4B.
- the frame 5D holds various objects (not shown) in the upper space of the main stage 5A. Specific examples of objects held by the frame 5D include environmental sensors such as laser sensors, tools used by the robots 4A and 4B, and the like.
- the configuration of the machine system 2 shown in FIG. 1 is an example.
- the configuration of the machine system 2 can be changed as appropriate as long as it includes at least one robot.
- machine system 2 may include three or more robots.
- the control system 50 controls at least one controlled object 4 included in the machine system 2 based on an operation program prepared in advance.
- the control system 50 may include a plurality of controllers that respectively control the plurality of controlled objects 4 and a higher-level controller that outputs control commands to the plurality of controllers so as to coordinate the plurality of controlled objects 4 .
- FIG. 1 shows controllers 51 and 52 that control the robots 4A and 4B, respectively, and a host controller 53 .
- the host controller 53 outputs control commands to the controllers 51 and 52 so as to coordinate the robots 4A and 4B.
- the control system 50 further includes a simulation device 100.
- a simulation device 100 simulates the state of the machine system 2 .
- Simulating the state of the machine system 2 includes simulating static arrangement relationships of the plurality of objects 3 .
- Simulating the state of the machine system 2 may further include simulating the dynamic positional relationship of the plurality of objects 3 that changes according to the motion of the controlled objects 4 such as the robots 4A and 4B.
- the simulation is useful for evaluating the adequacy of the motions of the robots 4A and 4B based on the motion program before actually operating them. However, if the reliability of the simulation is low, problems such as collisions between the objects 3 may occur when the robots 4A and 4B actually execute a motion, even if that motion was evaluated as appropriate according to the simulation results.
- for example, the motions of the robots 4A and 4B are simulated by kinematic calculations applied to a simulation model that includes arrangement information for the plurality of objects 3, including the robots 4A and 4B, and structure/dimension information for each of the plurality of objects 3.
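As an illustration of the kinematic calculation mentioned above, the following Python sketch computes a robot tip position from joint angles. It is a deliberate simplification, not part of the disclosure: a planar arm with revolute joints stands in for the 6-axis robot, and the function name is hypothetical.

```python
import math

def forward_kinematics_planar(link_lengths, joint_angles):
    """Minimal planar forward kinematics: accumulate the joint angles
    link by link and sum the link vectors to get the tip position."""
    x = y = 0.0
    theta = 0.0
    for length, angle in zip(link_lengths, joint_angles):
        theta += angle
        x += length * math.cos(theta)
        y += length * math.sin(theta)
    return x, y

# A two-link arm with both joints at zero stretches along the x-axis.
tip = forward_kinematics_planar([1.0, 0.5], [0.0, 0.0])
```

In a full simulation, a calculation of this kind would be evaluated against the placement and dimension information of the other object models to detect collisions.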
- in order for the simulation device 100 to improve the reliability of the simulation, it is important to improve the accuracy of the simulation model.
- the simulation device 100 generates an actual shape model representing the three-dimensional actual shape of the machine system 2 based on measured data, and corrects the simulation model of the machine system 2 based on a comparison between the simulation model and the actual shape model. This makes it possible to easily improve the accuracy of the simulation model.
- the simulation apparatus 100 has a simulation model storage unit 111, an actual shape model generation unit 112, and a model correction unit 113 as functional configurations, as shown in FIG.
- a simulation model storage unit 111 stores a simulation model of the machine system 2 .
- the simulation model includes at least location information of the plurality of objects 3 and structure/dimension information of each of the plurality of objects 3 .
- a simulation model is prepared in advance based on design data of the machine system 2 such as three-dimensional CAD data.
- the simulation model may include multiple object models respectively corresponding to multiple objects 3 .
- Each of the plurality of object models includes layout information and structure/dimension information of the corresponding object 3 .
- the placement information of the object 3 includes the position/orientation of the object 3 in a predetermined simulation coordinate system.
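For illustration only, a simulation model of the kind described above (object models carrying placement information and structure/dimension information) might be held in data structures like the following. The class and field names are assumptions, and reducing "structure/dimension information" to a bounding box is a deliberate simplification.

```python
from dataclasses import dataclass, field

@dataclass
class ObjectModel:
    name: str
    # Placement information: position/orientation in the simulation
    # coordinate system.
    position: tuple      # (x, y, z)
    orientation: tuple   # e.g. (roll, pitch, yaw)
    # Structure/dimension information, reduced here to a box size.
    dimensions: tuple    # (dx, dy, dz)

@dataclass
class SimulationModel:
    objects: list = field(default_factory=list)

    def add(self, obj: ObjectModel):
        self.objects.append(obj)

model = SimulationModel()
model.add(ObjectModel("main_stage", (0, 0, 0), (0, 0, 0), (4.0, 3.0, 0.8)))
model.add(ObjectModel("robot_A", (1.0, 0.5, 0.8), (0, 0, 0), (0.5, 0.5, 1.2)))
```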
- the actual shape model generation unit 112 generates an actual shape model representing the three-dimensional actual shape of the machine system 2 based on actual measurement data.
- Actual measurement data is data obtained by actually measuring the machine system 2 in real space.
- a specific example of the measured data is a three-dimensional actual image of the machine system 2 captured by a three-dimensional camera.
- Specific examples of the three-dimensional camera include a stereo camera, a TOF (Time of Flight) camera, and the like.
- the three-dimensional camera may be a three-dimensional laser displacement meter.
- the control system 50 has at least one three-dimensional camera 54, and the real shape model generation unit 112 generates a real shape model based on a three-dimensional real image of the machine system 2 captured by the three-dimensional camera 54.
- the real shape model generation unit 112 may generate a real shape model representing the three-dimensional shape of the surface of the machine system 2 by a point group.
- the real shape model generation unit 112 may generate a real shape model representing the three-dimensional shape of the surface of the machine system 2 with a group of fine polygons.
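As a rough sketch of how a point-group real shape model can arise from a three-dimensional camera, the fragment below back-projects a depth image through a pinhole camera model. The intrinsics `fx`, `fy`, `cx`, `cy` and the function name are illustrative assumptions, not from the disclosure.

```python
import numpy as np

def depth_to_point_cloud(depth, fx, fy, cx, cy):
    """Back-project a depth image (meters) into an N x 3 point cloud
    with a pinhole camera model; zero-depth pixels are dropped."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return points[points[:, 2] > 0]

depth = np.zeros((4, 4))
depth[1, 2] = 2.0  # a single valid pixel measured at 2 m
cloud = depth_to_point_cloud(depth, fx=100.0, fy=100.0, cx=2.0, cy=2.0)
```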
- the control system 50 may have a plurality of three-dimensional cameras 54, and the real shape model generation unit 112 may acquire a plurality of three-dimensional real images from the plurality of three-dimensional cameras 54 and generate a real shape model by combining them.
- for example, the actual shape model generation unit 112 may acquire, from the plurality of three-dimensional cameras 54, a plurality of three-dimensional real images that each include an image of a common object for synthesis, and combine the plurality of three-dimensional real images to generate the real shape model so that the portion of each image corresponding to the object for synthesis is matched to the known shape of that object.
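The synthesis step can be illustrated with a standard rigid-registration sketch. The disclosure does not specify an algorithm, so the following assumes known point correspondences on the object for synthesis and uses the Kabsch method to recover the rotation and translation that map one image's copy of the object onto its known shape; the same transform can then be applied to that whole image.

```python
import numpy as np

def rigid_align(source, target):
    """Kabsch algorithm: rotation R and translation t that best map
    `source` points onto `target` (both N x 3, corresponding rows)."""
    sc, tc = source.mean(axis=0), target.mean(axis=0)
    H = (source - sc).T @ (target - tc)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = tc - R @ sc
    return R, t

# Known shape of the synthesis object, and its appearance (rotated and
# shifted) as measured in one three-dimensional image.
known = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]], dtype=float)
angle = np.pi / 6
Rz = np.array([[np.cos(angle), -np.sin(angle), 0],
               [np.sin(angle),  np.cos(angle), 0],
               [0, 0, 1]])
measured = known @ Rz.T + np.array([0.3, -0.2, 0.5])

R, t = rigid_align(measured, known)
# Applying (R, t) to every point of the image brings it into the common
# frame defined by the known shape of the synthesis object.
aligned = measured @ R.T + t
```

In practice each camera's image would be aligned this way, after which the aligned clouds are concatenated into one real shape model.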
- FIG. 4 is a schematic diagram exemplifying an object to be photographed by the two three-dimensional cameras 54.
- the machine system 2 is represented by objects 6A and 6B with simplified shapes.
- a 3D image 221 is acquired by the 3D camera 54A on the upper left in FIG. 4, and a 3D image 222 is acquired by the 3D camera 54B on the lower right in FIG. 4.
- the three-dimensional image 221 includes the three-dimensional shape of at least the portion of the machine system 2 facing the three-dimensional camera 54A.
- Three-dimensional image 222 includes the three-dimensional shape of at least the portion of machine system 2 facing three-dimensional camera 54B.
- the real shape model generation unit 112 generates the real shape model 220 by combining the three-dimensional image 221 and the three-dimensional image 222 with the object 6B as the object for synthesis described above.
- for example, the real shape model generation unit 112 matches the three-dimensional shape of the object 6B included in the three-dimensional images 221 and 222 with the known three-dimensional shape of the object 6B. Matching here means moving each of the three-dimensional images 221 and 222 so that the three-dimensional shape of the object 6B included in each image fits the known three-dimensional shape of the object 6B.
- the actual shape model generation unit 112 may use any one of the robots 4A and 4B, the main stage 5A, the substage 5B, the substage 5C, and the frame 5D as the object for synthesis, and synthesize the three-dimensional images from the plurality of three-dimensional cameras 54.
- the model correction unit 113 corrects the simulation model based on comparison between the simulation model stored in the simulation model storage unit 111 and the actual shape model generated by the actual shape model generation unit 112 .
- the model correction unit 113 may individually match a plurality of object models to the actual shape model to correct the simulation model. Matching here means correcting the position/orientation of each of a plurality of object models so as to fit the actual shape model.
- the model correction unit 113 may correct the simulation model by repeating a matching process that includes selecting one matching target model from the plurality of object models and matching the matching target model to the actual shape model.
- the model correction unit 113 may match the matching target model with the actual shape model by excluding from the actual shape model a portion that has already matched another object model. In the matching process, the model correction unit 113 may select, as the matching target model, the largest object model among the one or more object models that have not been selected as the matching target model.
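The repeated matching process can be sketched as a greedy loop. The toy Python version below is illustrative only: it uses translation-only matching, reduces every model to a point set, and approximates "largest occupied area" by point count, none of which the disclosure prescribes.

```python
import numpy as np

def match_translation(model_pts, scene_pts, radius=1.0):
    """Toy matching: scene points within `radius` of the model's centroid
    are taken as its match, and the model is shifted onto their centroid."""
    center = model_pts.mean(axis=0)
    mask = np.linalg.norm(scene_pts - center, axis=1) < radius
    shift = scene_pts[mask].mean(axis=0) - center
    return model_pts + shift, mask

def correct_models(object_models, scene_pts):
    """Greedy loop from the text: repeatedly select the largest
    not-yet-selected object model, match it to the actual shape model,
    and exclude the matched points from subsequent matching."""
    remaining = np.ones(len(scene_pts), dtype=bool)
    corrected = {}
    for name, pts in sorted(object_models.items(),
                            key=lambda kv: -len(kv[1])):
        idx = np.where(remaining)[0]
        fitted, mask = match_translation(pts, scene_pts[idx])
        remaining[idx[mask]] = False  # exclude already-matched points
        corrected[name] = fitted
    return corrected

# Two well-separated measured objects, with object models that are
# slightly misplaced relative to the measurement.
stage = np.array([[0., 0, 0], [1, 0, 0], [0, 1, 0], [1, 1, 0]])
robot = np.array([[5., 0, 0], [5, 1, 0]])
scene = np.vstack([stage, robot])
models = {"stage": stage + np.array([0.2, 0.0, 0.0]),
          "robot": robot + np.array([0.1, -0.1, 0.0])}
corrected = correct_models(models, scene)
```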
- the placement of multiple object models is corrected individually.
- the actual shape model may include portions that do not correspond to any of the multiple object models.
- any one of a plurality of object models may include a portion that does not correspond to the actual shape model.
- the simulation device 100 may further include an object addition unit 114 and an object deletion unit 115.
- the object addition unit 114 extracts from the actual shape model a portion that does not match any object model, and adds a new object model to the simulation model based on the extracted portion.
- the object deletion unit 115 extracts portions that do not match the actual shape model from the simulation model, and deletes the extracted portions from the simulation model.
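A minimal sketch of the addition and deletion criteria, assuming both the actual shape model and the object models are reduced to point sets and using a simple distance threshold `tol` in place of whatever matching tolerance an implementation would use (an assumption, not from the disclosure):

```python
import numpy as np

def unmatched_scene_points(scene_pts, model_pts_list, tol=0.1):
    """Measured points farther than `tol` from every object model point:
    candidates for a new object model (object addition unit)."""
    keep = np.ones(len(scene_pts), dtype=bool)
    for mdl in model_pts_list:
        d = np.linalg.norm(scene_pts[:, None, :] - mdl[None, :, :], axis=2)
        keep &= d.min(axis=1) > tol
    return scene_pts[keep]

def unmatched_model_points(model_pts, scene_pts, tol=0.1):
    """Model points with no nearby measured point: candidates for
    deletion from the simulation model (object deletion unit)."""
    d = np.linalg.norm(model_pts[:, None, :] - scene_pts[None, :, :], axis=2)
    return model_pts[d.min(axis=1) > tol]

# A measured point with no model counterpart, and a model point with no
# measured counterpart.
scene = np.array([[0., 0, 0], [1, 0, 0], [3, 0, 0]])
model = np.array([[0., 0, 0], [1, 0, 0], [1, 1, 0]])
extra = unmatched_scene_points(scene, [model])
stale = unmatched_model_points(model, scene)
```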
- FIG. 7 is a diagram illustrating an actual shape model of the machine system 2
- FIG. 8 is a diagram illustrating a simulation model of the machine system 2.
- the actual shape model 210 shown in FIG. 7 includes a portion 211 corresponding to the robot 4A, a portion 212 corresponding to the robot 4B, a portion 213 corresponding to the main stage 5A, a portion 214 corresponding to the substage 5B, a portion 215 corresponding to the substage 5C, and a portion 216 corresponding to the frame 5D.
- the simulation model 310 shown in FIG. 8 includes a robot model 312A corresponding to the robot 4A, a robot model 312B corresponding to the robot 4B, a main stage model 313A corresponding to the main stage 5A, a substage model 313B corresponding to the substage 5B, and a frame model 313D corresponding to the frame 5D. The simulation model 310 does not include a substage model 313C (see FIG. 15) corresponding to the substage 5C.
- the model correction unit 113 first selects the largest model, the main stage model 313A, from among the robot model 312A, the robot model 312B, the main stage model 313A, the substage model 313B, and the frame model 313D. "Large" here means that the occupied area in three-dimensional space is large.
- the model correction unit 113 matches the main stage model 313A with the actual shape model 210, as shown in FIGS. 9 and 10. As shown by the hatched portion in FIG. 10, the main stage model 313A matches the portion 213 of the actual shape model 210 corresponding to the main stage 5A.
- the model correction unit 113 excludes from the actual shape model 210 the portion 213 that already matches the main stage model 313A.
- although the portion 213 is drawn as deleted in FIG. 11, excluding the portion 213 from the actual shape model 210 does not necessarily mean deleting the portion 213 from the actual shape model 210.
- for example, the portion 213 can be left in the actual shape model 210 without being deleted and simply be excluded from the matching targets in subsequent matching processes.
- next, the model correction unit 113 selects the largest remaining model, the substage model 313B, from among the robot model 312A, the robot model 312B, the substage model 313B, and the frame model 313D, and matches it to the real shape model 210.
- the sub-stage model 313B matches the portion 214 of the actual shape model 210 corresponding to the sub-stage 5B.
- the sub-stage model 313B includes a portion 313b that does not match the portion 214, as indicated by the dot-patterned portion in the figure.
- the model correction unit 113 excludes from the actual shape model 210 the portion 214 that already matches the substage model 313B.
- next, the model correction unit 113 selects the largest remaining model, the robot model 312B, from among the robot model 312A, the robot model 312B, and the frame model 313D, and matches the robot model 312B with the actual shape model 210. As shown by the hatched portion in FIG. 13, the robot model 312B matches the portion 212 of the real shape model 210 corresponding to the robot 4B.
- the model correction unit 113 excludes from the actual shape model 210 the portion 212 that already matches the robot model 312B.
- next, the model correction unit 113 selects the larger model, the robot model 312A, from the robot model 312A and the frame model 313D, and matches the robot model 312A with the actual shape model 210. As indicated by the hatched portion in FIG. 14, the robot model 312A matches the portion 211 of the real shape model 210 corresponding to the robot 4A.
- the model correction unit 113 then excludes from the actual shape model 210 the portion 211 that already matches the robot model 312A. Next, the model correction unit 113 selects the frame model 313D and matches the frame model 313D with the actual shape model 210. As shown by the hatched portion in FIG. 15, the frame model 313D matches the portion 216 of the actual shape model 210 corresponding to the frame 5D.
- the object addition unit 114 extracts the portion 215 and adds a substage model 313C corresponding to the substage 5C to the simulation model 310 based on the portion 215, as shown in FIG. 16.
- the object deletion unit 115 extracts the portion 313b and deletes the portion 313b from the simulation model 310.
- with this, the correction of the simulation model by the model correction unit 113, the addition of an object model by the object addition unit 114, and the deletion of unnecessary portions by the object deletion unit 115 are complete.
- when a real shape model is generated based on a three-dimensional real image of the machine system 2 captured by one three-dimensional camera 54, the real shape model may include hidden parts that are not captured by that three-dimensional camera 54. Even when the actual shape model is generated based on a plurality of three-dimensional real images of the machine system 2 captured by a plurality of three-dimensional cameras 54, the actual shape model may include overlapping hidden parts that are not captured by any of the plurality of three-dimensional cameras 54.
- FIG. 17 is a schematic diagram exemplifying an object to be photographed by the two three-dimensional cameras 54.
- in FIG. 17, the machine system 2 is represented by objects 7A, 7B, 7C, and 7D with simplified shapes.
- FIG. 18 shows an actual shape model 230 generated based on a three-dimensional image taken by the left three-dimensional camera 54A in FIG. 17 and a three-dimensional image taken by the right three-dimensional camera 54B in FIG. 17.
- the actual shape model 230 includes a hidden portion 230a that is not captured by the three-dimensional camera 54A, a hidden portion 230b that is not captured by the three-dimensional camera 54B, and an overlapping hidden portion 230c that is not captured by either of the three-dimensional cameras 54A and 54B.
- An overlapping hidden portion 230c is a portion where the hidden portion 230a and the hidden portion 230b overlap.
- if the simulation model does not include hidden parts even though the actual shape model does, the accuracy with which the simulation apparatus 100 matches object models to the actual shape model may decrease.
- therefore, the simulation apparatus 100 may generate a preprocessed model by excluding, from the simulation model, virtual hidden portions corresponding to the hidden portions not captured by the three-dimensional camera 54, and correct the simulation model based on a comparison between the preprocessed model and the actual shape model.
- when generating a real shape model based on the three-dimensional real images of the machine system 2 captured by a plurality of three-dimensional cameras 54, the simulation device 100 may generate a preprocessed model by excluding from the simulation model the virtual overlapping hidden portions corresponding to the overlapping hidden portions that are not captured by any of the plurality of three-dimensional cameras 54, and correct the simulation model based on a comparison between the preprocessed model and the actual shape model.
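The exclusion of overlapping hidden portions can be illustrated with a crude visibility test. The sketch below is built on stated assumptions rather than the disclosed method: the simulation model is point-sampled, and a 2-D angular z-buffer stands in for a real renderer. Points seen by at least one virtual camera are kept; points seen by none (the virtual overlapping hidden portion) are dropped.

```python
import numpy as np

def visible_mask(points, cam_pos, n_bins=64):
    """Crude z-buffer visibility: bin points by bearing from the camera
    (XY-plane only) and keep only the nearest point in each bin."""
    rel = points - cam_pos
    ang = np.arctan2(rel[:, 1], rel[:, 0])
    dist = np.linalg.norm(rel, axis=1)
    bins = ((ang + np.pi) / (2 * np.pi) * n_bins).astype(int) % n_bins
    mask = np.zeros(len(points), dtype=bool)
    for b in np.unique(bins):
        idx = np.where(bins == b)[0]
        mask[idx[np.argmin(dist[idx])]] = True
    return mask

def preprocess(points, cam_positions):
    """Keep points seen by at least one virtual camera; exclude the
    virtual overlapping hidden portion seen by none of them."""
    seen = np.zeros(len(points), dtype=bool)
    for cam in cam_positions:
        seen |= visible_mask(points, np.asarray(cam, dtype=float))
    return points[seen]

# The middle point is occluded from a camera at the origin but visible
# from a second camera on the opposite side.
pts = np.array([[1., 0, 0], [2, 0, 0], [0, 2, 0]])
only_left = preprocess(pts, [[0, 0, 0]])
both = preprocess(pts, [[0, 0, 0], [3, 0, 0]])
```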
- the simulation apparatus 100 may further include a camera position calculation unit 121, a preprocessing unit 122, a resegmentation unit 123, and a preprocessed model storage unit 124.
- the camera position calculation unit 121 calculates the position of a three-dimensional virtual camera so that a three-dimensional virtual image, obtained by photographing the simulation model with the three-dimensional virtual camera corresponding to the three-dimensional camera 54, matches the three-dimensional real image.
- for example, the camera position calculation unit 121 may calculate the position of the three-dimensional virtual camera so as to match the portion of the three-dimensional virtual image corresponding to a predetermined calibration object with the portion of the three-dimensional real image corresponding to the calibration object.
- the camera position calculation unit 121 may set any one of the plurality of objects 3 as an object for calibration, or may set two or more of the plurality of objects 3 as objects for calibration.
- the camera position calculator 121 may use the robot 4A or the robot 4B as the calibration object.
- for example, the camera position calculation unit 121 calculates a three-dimensional virtual image with the three-dimensional virtual camera placed at a predetermined initial position, and then calculates the position of the three-dimensional virtual camera by repeatedly evaluating the difference between the calibration object in the three-dimensional virtual image and the calibration object in the three-dimensional real image and changing the position of the three-dimensional virtual camera until the evaluated difference falls below a predetermined level.
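The evaluate-and-update loop can be sketched under a deliberately simplified observation model: orthographic viewing and known point correspondences on the calibration object. None of these simplifications, nor the update rule, is specified by the disclosure.

```python
import numpy as np

def estimate_camera_position(world_pts, observed_pts, init=(0, 0, 0),
                             step=0.5, tol=1e-6, max_iter=1000):
    """Iteratively move the virtual camera until the virtual view of the
    calibration object matches the observed one. Orthographic toy model:
    a camera at position c observes a world point p as (p - c)."""
    c = np.asarray(init, dtype=float)
    for _ in range(max_iter):
        virtual = world_pts - c              # render with current guess
        residual = (virtual - observed_pts).mean(axis=0)
        if np.linalg.norm(residual) < tol:   # difference below threshold
            break
        c += step * residual                 # move camera, reduce difference
    return c

# Calibration object points and their observation from an unknown camera.
world = np.array([[0., 0, 0], [1, 0, 0], [0, 1, 0]])
true_cam = np.array([1.0, 2.0, 0.5])
observed = world - true_cam
est = estimate_camera_position(world, observed)
```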
- the position of the 3D virtual camera also includes the attitude of the 3D virtual camera.
- the camera position calculation unit 121 may calculate the positions of a plurality of three-dimensional virtual cameras so as to match a plurality of three-dimensional virtual images, obtained by photographing the simulation model with the plurality of three-dimensional virtual cameras respectively corresponding to the plurality of three-dimensional cameras 54, to the plurality of three-dimensional real images.
- the preprocessing unit 122 calculates the virtual hidden portions based on the position of the three-dimensional virtual camera and the simulation model, generates a preprocessed model by excluding the virtual hidden portions from the simulation model, and stores it in the preprocessed model storage unit 124. For example, the preprocessing unit 122 extracts from the simulation model the visible surfaces facing the three-dimensional virtual camera and calculates the portions located behind those visible surfaces as the virtual hidden portions.
- the preprocessing unit 122 may calculate the virtual overlapping hidden portions based on the positions of the plurality of three-dimensional virtual cameras and the simulation model, generate a preprocessed model in which the virtual overlapping hidden portions are excluded from the simulation model, and store it in the preprocessed model storage unit 124.
- FIG. 19 is a diagram exemplifying the preprocessed model 410 generated for the machine system 2 of FIG. 17.
- Preprocessing unit 122 calculates virtual hidden portion 410a corresponding to hidden portion 230a based on the position of three-dimensional virtual camera 321A corresponding to three-dimensional camera 54A in FIG. 17 and the simulation model. Also, the preprocessing unit 122 calculates a virtual hidden portion 410b corresponding to the hidden portion 230b based on the position of the three-dimensional virtual camera 321B corresponding to the three-dimensional camera 54B in FIG. 17 and the simulation model. Furthermore, the preprocessing unit 122 calculates a virtual overlapping hidden portion 410c that is not captured by any of the three-dimensional virtual cameras 321A and 321B. A virtual overlapping hidden portion 410c is a portion where the virtual hidden portion 410a and the virtual hidden portion 410b overlap.
- The preprocessing unit 122 may generate the preprocessed model in the same data format as the actual shape model. For example, when the actual shape model generation unit 112 generates an actual shape model representing the three-dimensional shape of the surface of the machine system 2 as a point group, the preprocessing unit 122 may generate a preprocessed model that likewise represents the three-dimensional shape of the surface of the machine system 2 as a point group. When the actual shape model generation unit 112 generates an actual shape model that expresses the three-dimensional shape of the surface of the machine system 2 with a group of fine polygons, the preprocessing unit 122 may generate a preprocessed model that expresses the three-dimensional shape of the surface of the machine system 2 with a group of fine polygons.
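As an illustration of producing a preprocessed model in the same data format as a point-group actual shape model, a polygon (triangle) model can be converted to a point group by area-weighted surface sampling. This is a common technique sketched here under assumed inputs, not an implementation from the document:

```python
import numpy as np

def sample_mesh_as_points(vertices, triangles, n, rng=None):
    """Convert a triangle model into a point group so it can be compared
    with a point-cloud actual shape model. Triangles are sampled in
    proportion to their area; points are drawn uniformly inside each
    sampled triangle via barycentric coordinates."""
    rng = np.random.default_rng(rng)
    v0, v1, v2 = (vertices[triangles[:, i]] for i in range(3))
    areas = 0.5 * np.linalg.norm(np.cross(v1 - v0, v2 - v0), axis=1)
    tri = rng.choice(len(triangles), size=n, p=areas / areas.sum())
    u, v = rng.random(n), rng.random(n)
    flip = u + v > 1            # fold samples back into the triangle
    u[flip], v[flip] = 1 - u[flip], 1 - v[flip]
    return (v0[tri] + u[:, None] * (v1[tri] - v0[tri])
                    + v[:, None] * (v2[tri] - v0[tri]))
```

Sampling a unit square built from two triangles yields points confined to that square, matching the data format of a scanned point group.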
- The re-dividing unit 123 divides the preprocessed model into a plurality of preprocessed object models respectively corresponding to the plurality of objects 3.
- For example, the re-dividing unit 123 divides the preprocessed model into the plurality of preprocessed object models based on a comparison between each of the plurality of object models stored in the simulation model storage unit 111 and the preprocessed model.
- For example, the re-dividing unit 123 defines the portion of the preprocessed model 410 corresponding to the object model of the object 7A as the preprocessed object model 411 of the object 7A, the portion corresponding to the object model of the object 7B as the preprocessed object model 412 of the object 7B, and the portion corresponding to the object model of the object 7D as the preprocessed object model 414 of the object 7D.
- The model correction unit 113 corrects the simulation model based on a comparison between the preprocessed model stored in the preprocessed model storage unit 124 and the actual shape model generated by the actual shape model generation unit 112.
- The model correction unit 113 matches each of the plurality of object models to the actual shape model based on a comparison between the corresponding preprocessed object model and the actual shape model.
- If the actual shape model does not include hidden parts, or if the effect of the hidden parts on the accuracy of matching the object models to the actual shape model is negligible, it is not mandatory to generate a preprocessed model in which the virtual hidden portions are excluded from the simulation model. Even in such a case, preprocessing may be performed to match the data format of the simulation model with the data format of the actual shape model.
- The simulation device 100 may further include a simulator 125.
- The simulator 125 simulates the operation of the machine system 2 based on the simulation model corrected by the model correction unit 113.
- For example, the simulator 125 simulates the operation of the machine system 2 by kinematics calculations (for example, forward kinematics calculations) that reflect the operation results of the controlled objects 4, such as the robots 4A and 4B, in the simulation model.
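A minimal sketch of the kind of forward kinematics calculation mentioned, for a planar link chain; the patent does not specify the robots' kinematic structure, so the 2D chain, function name, and parameters here are assumptions for illustration:

```python
import numpy as np

def forward_kinematics(link_lengths, joint_angles):
    """Planar forward kinematics: chain 2D homogeneous transforms to get
    each joint position, ending with the end-effector. This is the kind
    of calculation a simulator applies to update a robot model from
    commanded joint angles."""
    T = np.eye(3)
    positions = [T[:2, 2].copy()]
    for L, q in zip(link_lengths, joint_angles):
        c, s = np.cos(q), np.sin(q)
        # rotate by q, then translate L along the rotated x-axis
        T = T @ np.array([[c, -s, L * c],
                          [s,  c, L * s],
                          [0,  0, 1]])
        positions.append(T[:2, 2].copy())
    return np.array(positions)
```

With two unit links and joint angles (90°, 0°), the end-effector lands at (0, 2), as expected for a straight arm pointing up.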
- The simulation device 100 may further include a program generation unit 126.
- The program generation unit 126 (planning support device) supports operation planning of the machine system 2 based on the simulation results from the simulator 125.
- For example, the program generation unit 126 generates an operation program by repeatedly evaluating, based on the simulation results from the simulator 125, an operation program for controlling the control target objects 4 such as the robots 4A and 4B, and correcting the operation program based on the evaluation results.
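The evaluate-and-correct cycle described above can be expressed as a generic loop. All callback names, the iteration cap, and the stopping criterion are hypothetical stand-ins for the simulator 125 and the program generation unit 126:

```python
def refine_program(program, simulate, evaluate, correct,
                   max_iters=100, good_enough=1e-3):
    """Generic evaluate-and-correct loop: simulate the current operation
    program, score the result, and apply a correction until the score is
    acceptable. `simulate`, `evaluate`, and `correct` are caller-supplied
    callbacks standing in for the simulator and the correction step."""
    for _ in range(max_iters):
        result = simulate(program)
        score = evaluate(result)
        if score <= good_enough:
            break
        program = correct(program, result)
    return program
```

For instance, a scalar "program" corrected halfway toward a target each iteration converges to that target within the tolerance.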
- The program generation unit 126 may transmit the generated operation program to the host controller 53 so that the control target objects 4 are controlled based on the generated operation program.
- In this case, the host controller 53 controls the machine system 2 based on the simulation results from the simulator 125.
- FIG. 20 is a block diagram illustrating the hardware configuration of the simulation device 100.
- The simulation device 100 has a circuit 190.
- The circuit 190 includes one or more processors 191, a memory 192, a storage 193, input/output ports 194, and a communication port 195.
- The storage 193 includes a computer-readable storage medium, such as a non-volatile semiconductor memory.
- The storage 193 stores a program for causing the simulation device 100 to execute at least: generating an actual shape model representing the three-dimensional actual shape of the machine system 2 based on actual measurement data; and correcting the simulation model based on a comparison between the simulation model of the machine system 2 and the actual shape model.
- The storage 193 stores a program for causing the simulation device 100 to construct the functional configuration described above.
- The memory 192 temporarily stores the program loaded from the storage medium of the storage 193 and calculation results from the processor 191.
- The processor 191 constitutes each functional block of the simulation device 100 by executing the above program in cooperation with the memory 192.
- The input/output port 194 inputs and outputs information to and from the three-dimensional cameras 54 according to instructions from the processor 191.
- The communication port 195 communicates with the host controller 53 according to instructions from the processor 191.
- The circuit 190 is not necessarily limited to configuring each function by a program.
- For example, the circuit 190 may implement at least some of its functions with a dedicated logic circuit or with an ASIC (Application Specific Integrated Circuit) integrating such circuits.
- Next, the simulation model correction procedure executed by the simulation device 100 will be illustrated.
- This procedure includes generating an actual shape model representing the three-dimensional actual shape of the machine system 2 based on actual measurement data, and correcting the simulation model based on a comparison between the simulation model of the machine system 2 and the actual shape model.
- The simulation device 100 first sequentially executes steps S01, S02, S03, S04, S05, S06, S07, and S08.
- In step S01, the actual shape model generation unit 112 acquires a plurality of three-dimensional real images of the machine system 2 photographed by the plurality of three-dimensional cameras 54.
- In step S02, the actual shape model generation unit 112 recognizes the portion corresponding to the above-described synthesis object in each of the plurality of three-dimensional real images acquired in step S01.
- In step S03, the actual shape model generation unit 112 combines the plurality of three-dimensional real images so that the portions corresponding to the synthesis object in each of the plurality of three-dimensional real images match the known shape of the synthesis object, thereby generating the actual shape model.
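Combining images so that the portions corresponding to a known-shape synthesis object coincide amounts to rigid registration. A standard way to solve it, given point correspondences on the synthesis object, is the Kabsch/Procrustes algorithm; this sketch assumes the correspondences are already available (as the recognition in step S02 would provide) and is not the document's own algorithm:

```python
import numpy as np

def rigid_align(src, dst):
    """Kabsch/Procrustes: find the rotation R and translation t that
    minimize ||R @ src_i + t - dst_i|| over corresponding point pairs,
    e.g. aligning a captured synthesis-object patch to its known shape."""
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)            # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cd - R @ cs
    return R, t
```

Applying a known rotation and translation to a point set and then calling `rigid_align` recovers that exact transform.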
- In step S04, the camera position calculation unit 121 recognizes the portion corresponding to the calibration object in each of the plurality of three-dimensional real images.
- In step S05, the camera position calculation unit 121 executes, for each of the plurality of three-dimensional virtual cameras, a process of moving the three-dimensional virtual camera so as to match the portion of the three-dimensional virtual image corresponding to the calibration object with the portion of the three-dimensional real image corresponding to the calibration object.
- In step S06, the preprocessing unit 122 executes, for each three-dimensional virtual camera, a process of calculating the virtual hidden part of the simulation model that is not captured by that three-dimensional virtual camera, based on the position of the three-dimensional virtual camera and the simulation model.
- In step S07, based on the calculation results for the virtual hidden portions in step S06, the preprocessing unit 122 generates a preprocessed model in which virtual overlapping hidden portions not captured by any of the plurality of three-dimensional virtual cameras are excluded from the simulation model, and stores it in the preprocessed model storage unit 124.
- In step S08, the re-dividing unit 123 divides the preprocessed model stored in the preprocessed model storage unit 124 into a plurality of preprocessed object models respectively corresponding to the plurality of objects 3.
- In step S11, the model correction unit 113 selects, as the matching target model, the largest object model among the one or more object models that have not yet been selected as the matching target model from among the plurality of object models.
- In step S12, the model correction unit 113 matches the matching target model to the actual shape model based on a comparison between the preprocessed object model corresponding to the matching target model and the actual shape model.
- In step S13, the model correction unit 113 excludes the portion of the actual shape model matched to the matching target model from the targets of subsequent matching processes.
- In step S14, the model correction unit 113 confirms whether the matching process has been completed for all object models.
- If it is determined in step S14 that an object model for which matching has not been completed remains, the simulation device 100 returns the process to step S11. Thereafter, the selection of a matching target model and the matching of the matching target model to the actual shape model are repeated until matching is completed for all object models.
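Steps S11 to S14 form a greedy, largest-first matching loop with exclusion of already-matched portions. A compact sketch on point groups follows; `match_fn` is a caller-supplied matcher (e.g. an ICP routine) standing in for the actual matching process, and the exclusion radius is an assumption:

```python
import numpy as np

def greedy_match(object_models, shape_points, match_fn, radius=0.05):
    """Largest-first matching loop: pick the biggest unmatched object
    model, match it against the remaining actual-shape points, then
    remove the matched points so later, smaller models are not
    distracted by them. `match_fn(model, points)` returns the model's
    matched point positions."""
    remaining = shape_points.copy()
    poses = {}
    for name, model in sorted(object_models.items(),
                              key=lambda kv: -len(kv[1])):  # largest first
        matched_pts = match_fn(model, remaining)
        poses[name] = matched_pts
        # exclude matched portions from subsequent matching passes
        d = np.linalg.norm(remaining[:, None, :] - matched_pts[None, :, :], axis=2)
        remaining = remaining[d.min(axis=1) > radius]
    return poses, remaining
```

With an identity matcher, points belonging to each model are peeled off in turn, and only unexplained points survive, ready for step S15.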
- In step S15, the object addition unit 114 extracts from the actual shape model portions that do not match any object model, and adds new object models to the simulation model based on the extracted portions. Also, the object deletion unit 115 extracts portions that do not match the actual shape model from the simulation model, and deletes the extracted portions from the simulation model. This completes the simulation model correction procedure.
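Step S15's two extractions (portions of the actual shape model matching no object model, and portions of the simulation model matching nothing real) can both be expressed as nearest-neighbor residuals on point groups. The tolerance value and function names here are illustrative assumptions:

```python
import numpy as np

def unmatched_residual(shape_points, model_points, tol=0.05):
    """Split the actual-shape point group into the part explained by the
    (already matched) simulation model and the leftover residual. The
    residual is the candidate for a new object model; symmetrically,
    model points with no nearby shape point are deletion candidates."""
    d = np.linalg.norm(shape_points[:, None, :] - model_points[None, :, :], axis=2)
    extra_in_shape = shape_points[d.min(axis=1) > tol]   # addition candidates
    stale_in_model = model_points[d.min(axis=0) > tol]   # deletion candidates
    return extra_in_shape, stale_in_model
```

A real point with no model counterpart becomes an addition candidate; a model point with no real counterpart becomes a deletion candidate.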
- As described above, the simulation device 100 includes the actual shape model generation unit 112, which generates the actual shape model 210 representing the three-dimensional actual shape of the machine system 2 including the robots 4A and 4B based on actual measurement data, and the model correction unit 113, which corrects the simulation model 310 of the machine system 2 based on a comparison between the simulation model 310 and the actual shape model 210.
- According to this simulation device 100, the accuracy of the simulation model 310 can be easily improved. This simulation device 100 is therefore effective in improving the reliability of the simulation.
- The machine system 2 includes a plurality of objects 3 including the robots 4A and 4B, the simulation model 310 includes a plurality of object models respectively corresponding to the plurality of objects 3, and the model correction unit 113 may correct the simulation model 310 by individually matching the plurality of object models to the actual shape model 210. In this case, the simulation model 310 can be corrected with higher accuracy by performing matching to the actual shape model 210 for each of the plurality of object models.
- The model correction unit 113 may correct the simulation model 310 by repeating a matching process that includes selecting one matching target model from the plurality of object models and matching the matching target model to the actual shape model 210. In this case, matching can be performed easily and reliably for each of the plurality of objects.
- In the matching process, the model correction unit 113 may match the matching target model to the actual shape model 210 while excluding from the actual shape model 210 portions already matched to other object models.
- In this case, the matching target model can be matched to the actual shape model 210 without being affected by portions already matched to other object models. The simulation model 310 can therefore be corrected with higher accuracy.
- In the matching process, the model correction unit 113 may select, as the matching target model, the largest object model among the one or more object models not yet selected as the matching target model. In this case, matching is performed in order from the largest object model, and by excluding the portions matched to each object model from the actual shape model 210, it becomes easier to narrow down the portion to be matched with the matching target model in each matching process. The simulation model 310 can therefore be corrected with higher accuracy.
- The simulation device 100 may further include an object addition unit 114 that extracts from the actual shape model 210 portions that do not match any object model and adds new object models to the simulation model 310 based on the extracted portions. In this case, the simulation model 310 can be corrected with higher accuracy.
- The simulation device 100 may further include an object deletion unit 115 that extracts from the simulation model 310 portions that do not match the actual shape model 210 and deletes the extracted portions from the simulation model 310. In this case, the simulation model 310 can be corrected with higher accuracy.
- The actual shape model generation unit 112 generates the actual shape model 230 based on a three-dimensional real image of the machine system 2 photographed by the three-dimensional camera 54, and the simulation device 100 may further include a preprocessing unit 122 that generates a preprocessed model 410 by excluding from the simulation model 310 the virtual hidden portions 410a and 410b corresponding to the hidden portions 230a and 230b that are not captured by the three-dimensional camera 54; the model correction unit 113 may then correct the simulation model 310 based on a comparison between the preprocessed model 410 and the actual shape model 210.
- In this case, the preprocessed model 410, obtained by excluding from the simulation model 310 the portions of the plurality of objects 3 that cannot be represented by the actual shape model 210 because they are not captured by the three-dimensional camera 54, is compared with the actual shape model 230, so the simulation model 310 can be corrected with higher accuracy.
- The actual shape model generation unit 112 generates the actual shape model 230 based on a three-dimensional real image of the machine system 2 photographed by the three-dimensional camera 54, and the simulation device 100 may further include a preprocessing unit 122 that generates a preprocessed model 410 by excluding from the simulation model 310 the virtual hidden portions 410a and 410b corresponding to the hidden portions 230a and 230b of the machine system 2 that are not captured by the three-dimensional camera 54, and a re-dividing unit 123 that divides the preprocessed model 410 into a plurality of preprocessed object models respectively corresponding to the plurality of objects 3; the model correction unit 113 may then match each of the plurality of object models to the actual shape model 210 based on a comparison between the corresponding preprocessed object model and the actual shape model 210. In this case, by improving the matching accuracy for each of the plurality of object models, the simulation model 310 can be corrected with higher accuracy.
- The simulation device 100 may further include a camera position calculation unit 121 that calculates the positions of the three-dimensional virtual cameras 321A and 321B, which correspond to the three-dimensional camera 54, so as to match the three-dimensional virtual images obtained by photographing the simulation model 310 with the three-dimensional virtual cameras 321A and 321B to the three-dimensional real images; the preprocessing unit 122 may then calculate the virtual hidden portions 410a and 410b based on the positions of the three-dimensional virtual cameras 321A and 321B and the simulation model 310. In this case, the simulation model 310 can be corrected with higher accuracy.
- The camera position calculation unit 121 may calculate the positions of the three-dimensional virtual cameras 321A and 321B so as to match the portion of each three-dimensional virtual image corresponding to a predetermined calibration object with the portion of the corresponding three-dimensional real image corresponding to the calibration object. In this case, by limiting the matching between the three-dimensional virtual image and the three-dimensional real image to the portion corresponding to the calibration object, the positions of the three-dimensional virtual cameras 321A and 321B can be calculated more easily.
- The actual shape model generation unit 112 acquires a plurality of three-dimensional real images from the plurality of three-dimensional cameras 54 including the three-dimensional cameras 54A and 54B and combines the plurality of three-dimensional real images to generate the actual shape model 210, and the preprocessing unit 122 may generate the preprocessed model 410 by excluding from the simulation model 310 the virtual overlapping hidden portion 410c corresponding to the overlapping hidden portion 230c that is not captured by either of the three-dimensional cameras 54A and 54B. In this case, the simulation model 310 can be corrected with higher accuracy by reducing the virtual overlapping hidden portion 410c.
- The actual shape model generation unit 112 may acquire, from the plurality of three-dimensional cameras 54 including the three-dimensional cameras 54A and 54B, a plurality of three-dimensional real images containing a common synthesis object, and generate the actual shape model 210 by combining the plurality of three-dimensional real images so that the portion corresponding to the synthesis object in each image matches the known shape of the synthesis object. In this case, the plurality of three-dimensional real images can be easily combined to generate an actual shape model 210 with few hidden parts.
- The simulation device 100 may further include the camera position calculation unit 121, which calculates the positions of the plurality of three-dimensional virtual cameras 321A and 321B, respectively corresponding to the plurality of three-dimensional cameras 54A and 54B, so as to match the plurality of three-dimensional virtual images obtained by photographing the simulation model 310 with those cameras to the plurality of three-dimensional real images; the preprocessing unit 122 may then calculate the virtual overlapping hidden portion 410c based on the positions of the three-dimensional virtual cameras 321A and 321B and the simulation model 310. In this case, the simulation model 310 can be corrected with higher accuracy by associating the virtual overlapping hidden portion 410c with the overlapping hidden portion 230c more accurately.
- The actual shape model generation unit 112 may generate an actual shape model 210 representing the three-dimensional actual shape of the machine system 2 as a point group, and the preprocessing unit 122 may generate a preprocessed model 410 representing the three-dimensional virtual shape of the simulation model 310 as a virtual point group. In this case, the difference between the actual shape model 210 and the preprocessed model 410 can be easily evaluated.
- The actual shape model generation unit 112 may generate an actual shape model 210 that represents the three-dimensional actual shape of the machine system 2 as a point group, the simulation device 100 may further include a preprocessing unit 122 that generates a preprocessed model 410 representing the three-dimensional virtual shape of the simulation model 310 as a virtual point group, and the model correction unit 113 may correct the simulation model 310 based on a comparison between the preprocessed model 410 and the actual shape model 210. In this case, the difference between the actual shape model 210 and the preprocessed model 410 can be easily evaluated.
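One simple way to evaluate the difference between a point-group actual shape model and a virtual point group, as suggested above, is a mean nearest-neighbor distance (one half of a Chamfer-style metric). This is an illustrative metric chosen for the sketch, not one specified by the document:

```python
import numpy as np

def cloud_difference(actual, preprocessed):
    """Score the difference between two point groups as the mean
    nearest-neighbor distance from each actual point to the
    preprocessed virtual point group."""
    d = np.linalg.norm(actual[:, None, :] - preprocessed[None, :, :], axis=2)
    return d.min(axis=1).mean()
```

Identical clouds score zero; the score grows as the virtual point group drifts from the measured one.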
Landscapes
- Engineering & Computer Science (AREA)
- Mechanical Engineering (AREA)
- Robotics (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Software Systems (AREA)
- Computer Graphics (AREA)
- General Engineering & Computer Science (AREA)
- Computer Hardware Design (AREA)
- Human Computer Interaction (AREA)
- Manipulator (AREA)
- Geometry (AREA)
- Computer Vision & Pattern Recognition (AREA)
Abstract
Description
The automation system 1 shown in FIG. 1 is a system that operates at least a robot in a machine system including at least the robot. A specific example of the automation system 1 is a production system that operates at least a robot so as to produce products in the machine system, but the use of the machine system is not necessarily limited to the production of products.
However, if the reliability of the simulation is low, even an operation evaluated as appropriate from the simulation results may cause problems such as collisions between objects 3 when the robots 4A and 4B actually execute it.
The simulation model is prepared in advance based on design data of the machine system 2, such as three-dimensional CAD data. The simulation model may include a plurality of object models respectively corresponding to the plurality of objects 3. Each of the plurality of object models includes placement information and structure/dimension information of the corresponding object 3. The placement information of an object 3 includes the position and orientation of the object 3 in a predetermined simulation coordinate system.
Next, as an example of a modeling method, the simulation model correction procedure executed by the simulation device 100 will be illustrated. This procedure includes generating an actual shape model representing the three-dimensional actual shape of the machine system 2 based on actual measurement data, and correcting the simulation model based on a comparison between the simulation model of the machine system 2 and the actual shape model.
As described above, the simulation device 100 includes the actual shape model generation unit 112, which generates the actual shape model 210 representing the three-dimensional actual shape of the machine system 2 including the robots 4A and 4B based on actual measurement data, and the model correction unit 113, which corrects the simulation model 310 based on a comparison between the simulation model 310 of the machine system 2 and the actual shape model 210.
Claims (19)
- 1. A simulation device comprising: an actual shape model generation unit that generates, based on actual measurement data, an actual shape model representing a three-dimensional actual shape of a machine system including a robot; and a model correction unit that corrects a simulation model of the machine system based on a comparison between the simulation model and the actual shape model.
- 2. The simulation device according to claim 1, wherein the machine system includes a plurality of objects including the robot, the simulation model includes a plurality of object models respectively corresponding to the plurality of objects, and the model correction unit corrects the simulation model by individually matching the plurality of object models to the actual shape model.
- 3. The simulation device according to claim 2, wherein the model correction unit corrects the simulation model by repeating a matching process that includes selecting one matching target model from the plurality of object models and matching the matching target model to the actual shape model.
- 4. The simulation device according to claim 3, wherein, in the matching process, the model correction unit excludes from the actual shape model portions already matched to other object models and matches the matching target model to the actual shape model.
- 5. The simulation device according to claim 4, wherein, in the matching process, the model correction unit selects, as the matching target model, the largest object model among the one or more object models not yet selected as the matching target model.
- 6. The simulation device according to any one of claims 2 to 5, further comprising an object addition unit that, after the matching process has been completed for all of the plurality of object models, extracts from the actual shape model portions not matching any object model and adds a new object model to the simulation model based on the extracted portions.
- 7. The simulation device according to any one of claims 2 to 6, further comprising an object deletion unit that, after the matching process has been completed for all of the plurality of object models, extracts from the simulation model portions not matching the actual shape model and deletes the extracted portions from the simulation model.
- 8. The simulation device according to any one of claims 1 to 7, wherein the actual shape model generation unit generates the actual shape model based on a three-dimensional real image of the machine system photographed by a three-dimensional camera, the simulation device further comprises a preprocessing unit that generates a preprocessed model in which a virtual hidden portion corresponding to a hidden portion not captured by the three-dimensional camera is excluded from the simulation model, and the model correction unit corrects the simulation model based on a comparison between the preprocessed model and the actual shape model.
- 9. The simulation device according to any one of claims 2 to 7, wherein the actual shape model generation unit generates the actual shape model based on a three-dimensional real image of the machine system photographed by a three-dimensional camera, the simulation device further comprises: a preprocessing unit that generates a preprocessed model in which a virtual hidden portion, corresponding to a hidden portion of the machine system not captured by the three-dimensional camera, is excluded from the simulation model; and a re-dividing unit that divides the preprocessed model into a plurality of preprocessed object models respectively corresponding to the plurality of objects, and the model correction unit matches each of the plurality of object models to the actual shape model based on a comparison between the corresponding preprocessed object model and the actual shape model.
- 10. The simulation device according to claim 8 or 9, further comprising a camera position calculation unit that calculates a position of a three-dimensional virtual camera corresponding to the three-dimensional camera so as to match a three-dimensional virtual image, obtained by photographing the simulation model with the three-dimensional virtual camera, to the three-dimensional real image, wherein the preprocessing unit calculates the virtual hidden portion based on the position of the three-dimensional virtual camera and the simulation model.
- 11. The simulation device according to claim 10, wherein the camera position calculation unit calculates the position of the three-dimensional virtual camera so as to match a portion of the three-dimensional virtual image corresponding to a predetermined calibration object with a portion of the three-dimensional real image corresponding to the calibration object.
- 12. The simulation device according to claim 8 or 9, wherein the actual shape model generation unit acquires a plurality of three-dimensional real images from a plurality of three-dimensional cameras including the three-dimensional camera and combines the plurality of three-dimensional real images to generate the actual shape model, and the preprocessing unit generates a preprocessed model in which a virtual overlapping hidden portion, corresponding to an overlapping hidden portion not captured by any of the plurality of three-dimensional cameras, is excluded from the simulation model.
- 13. The simulation device according to claim 12, wherein the actual shape model generation unit acquires, from the plurality of three-dimensional cameras including the three-dimensional camera, a plurality of three-dimensional real images containing an image of a common synthesis object, and generates the actual shape model by combining the plurality of three-dimensional real images so that a portion corresponding to the synthesis object in each of the plurality of three-dimensional real images matches a known shape of the synthesis object.
- 14. The simulation device according to claim 12 or 13, further comprising a camera position calculation unit that calculates positions of a plurality of three-dimensional virtual cameras respectively corresponding to the plurality of three-dimensional cameras so as to match a plurality of three-dimensional virtual images, obtained by photographing the simulation model with the plurality of three-dimensional virtual cameras, to the plurality of three-dimensional real images, wherein the preprocessing unit calculates the virtual overlapping hidden portion based on the positions of the plurality of three-dimensional virtual cameras and the simulation model.
- 15. The simulation device according to any one of claims 8 to 14, wherein the actual shape model generation unit generates the actual shape model representing the three-dimensional actual shape of the machine system as a point group, and the preprocessing unit generates the preprocessed model representing a three-dimensional virtual shape of the simulation model as a virtual point group.
- 16. The simulation device according to any one of claims 1 to 7, wherein the actual shape model generation unit generates the actual shape model representing the three-dimensional actual shape of the machine system as a point group, the simulation device further comprises a preprocessing unit that generates a preprocessed model representing a three-dimensional virtual shape of the simulation model as a virtual point group, and the model correction unit corrects the simulation model based on a comparison between the preprocessed model and the actual shape model.
- 17. The simulation device according to any one of claims 1 to 16, further comprising a simulator that simulates an operation of the machine system based on the simulation model.
- 18. A control system comprising: the simulation device according to claim 17; and a control device that controls the machine system based on a simulation result from the simulator.
- 19. A modeling method comprising: generating, based on actual measurement data, an actual shape model representing a three-dimensional actual shape of a machine system including a robot; and correcting a simulation model of the machine system based on a comparison between the simulation model and the actual shape model.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2023501964A JPWO2022180801A1 (ja) | 2021-02-26 | 2021-02-26 | |
CN202180094494.2A CN116917093A (zh) | 2021-02-26 | 2021-02-26 | 模拟装置、控制系统以及建模方法 |
PCT/JP2021/007415 WO2022180801A1 (ja) | 2021-02-26 | 2021-02-26 | シミュレーション装置、制御システム及びモデリング方法 |
US18/450,406 US20230390921A1 (en) | 2021-02-26 | 2023-08-16 | Simulation model correction of a machine system |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2021/007415 WO2022180801A1 (ja) | 2021-02-26 | 2021-02-26 | シミュレーション装置、制御システム及びモデリング方法 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US18/450,406 Continuation US20230390921A1 (en) | 2021-02-26 | 2023-08-16 | Simulation model correction of a machine system |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022180801A1 true WO2022180801A1 (ja) | 2022-09-01 |
Family
ID=83048701
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2021/007415 WO2022180801A1 (ja) | 2021-02-26 | 2021-02-26 | シミュレーション装置、制御システム及びモデリング方法 |
Country Status (4)
Country | Link |
---|---|
US (1) | US20230390921A1 (ja) |
JP (1) | JPWO2022180801A1 (ja) |
CN (1) | CN116917093A (ja) |
WO (1) | WO2022180801A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2024134801A1 (ja) * | 2022-12-21 | 2024-06-27 | 日本電気株式会社 | 処理装置、制御装置、シミュレーションシステム、処理方法、制御方法、および記録媒体 |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH11104984A (ja) * | 1997-10-06 | 1999-04-20 | Fujitsu Ltd | 実環境情報表示装置及び実環境情報表示処理を実行するプログラムを記録したコンピュータ読み取り可能な記録媒体 |
JP2003150219A (ja) * | 2001-11-12 | 2003-05-23 | Fanuc Ltd | 作業機械のシミュレーション装置 |
WO2007069721A1 (ja) * | 2005-12-16 | 2007-06-21 | Ihi Corporation | 三次元形状データの記憶・表示方法と装置および三次元形状の計測方法と装置 |
JP2016221659A (ja) * | 2015-06-03 | 2016-12-28 | キヤノン株式会社 | ロボットシステム、ロボットシステムの制御方法、プログラム、記録媒体及び物品の製造方法 |
JP2019089201A (ja) * | 2019-03-12 | 2019-06-13 | キヤノン株式会社 | 教示データ作成装置、教示データ作成装置の制御方法及びロボットシステム |
JP2019141943A (ja) * | 2018-02-19 | 2019-08-29 | ファナック株式会社 | ロボットの動作をシミュレーションするシミュレーション装置 |
JP2019191989A (ja) * | 2018-04-26 | 2019-10-31 | キヤノン株式会社 | 仮想視点画像を生成するシステム、方法及びプログラム |
JP2020075338A (ja) * | 2018-11-09 | 2020-05-21 | オムロン株式会社 | ロボット制御装置、シミュレーション方法、及びシミュレーションプログラム |
WO2020235057A1 (ja) * | 2019-05-22 | 2020-11-26 | 日本電気株式会社 | モデル生成装置、モデル生成システム、モデル生成方法、非一時的なコンピュータ可読媒体 |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2024134801A1 (ja) * | 2022-12-21 | 2024-06-27 | 日本電気株式会社 | 処理装置、制御装置、シミュレーションシステム、処理方法、制御方法、および記録媒体 |
Also Published As
Publication number | Publication date |
---|---|
CN116917093A (zh) | 2023-10-20 |
JPWO2022180801A1 (ja) | 2022-09-01 |
US20230390921A1 (en) | 2023-12-07 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN106873550B (zh) | 模拟装置以及模拟方法 | |
US20200338730A1 (en) | Trajectory planning device, trajectory planning method and program | |
US7324873B2 (en) | Offline teaching apparatus for robot | |
JP6576255B2 (ja) | ロボット軌道生成方法、ロボット軌道生成装置、および製造方法 | |
JP6585665B2 (ja) | 仮想オブジェクト表示システム | |
CN110298854B (zh) | 基于在线自适应与单目视觉的飞行蛇形臂协同定位方法 | |
JP3673749B2 (ja) | シミュレーション装置 | |
US7945349B2 (en) | Method and a system for facilitating calibration of an off-line programmed robot cell | |
US20110054685A1 (en) | Robot off-line teaching method | |
US20180036883A1 (en) | Simulation apparatus, robot control apparatus and robot | |
KR101126808B1 (ko) | 다축 제어 기계의 오차 평가 방법 및 장치 | |
JP2001105359A (ja) | ロボットシステム用グラフィック表示装置 | |
JP5490080B2 (ja) | スケルトンモデルの姿勢制御方法,及びプログラム | |
JP6915441B2 (ja) | 情報処理装置、情報処理方法、および情報処理プログラム | |
JP4553437B2 (ja) | 画像検査システム及び制御方法 | |
JP7052250B2 (ja) | 情報処理装置、情報処理方法、および情報処理プログラム | |
US20230390921A1 (en) | Simulation model correction of a machine system | |
JP3415427B2 (ja) | ロボットシミュレーションにおけるキャリブレーション装置 | |
JP2021059012A (ja) | 情報処理装置、情報処理方法及びロボットシステム | |
JP2019193975A (ja) | ロボット軌道生成方法、ロボット軌道生成装置、および製造方法 | |
JP2001216015A (ja) | ロボットの動作教示装置 | |
CN110568818A (zh) | 虚拟对象显示系统 | |
JP7249221B2 (ja) | センサ位置姿勢キャリブレーション装置及びセンサ位置姿勢キャリブレーション方法 | |
JPWO2022180801A5 (ja) | ||
JP2018132847A (ja) | 情報処理装置、情報処理方法、およびプログラム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 21927901 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2023501964 Country of ref document: JP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 202180094494.2 Country of ref document: CN |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 21927901 Country of ref document: EP Kind code of ref document: A1 |