WO2017119088A1 - Robot System and Control Method - Google Patents
Robot System and Control Method
- Publication number
- WO2017119088A1 (PCT/JP2016/050278)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- shape
- basic
- unit
- operation method
- basic shape
- Prior art date
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1612—Programme controls characterised by the hand, wrist, grip control
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1694—Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
- B25J9/1697—Vision controlled systems
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J13/00—Controls for manipulators
- B25J13/08—Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1656—Programme controls characterised by programming, planning systems for manipulators
- B25J9/1664—Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1694—Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/36—Nc in input of data, input key till input tape
- G05B2219/36404—Adapt teached position as function of deviation 3-D, 2-D position workpiece
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/39—Robotics, robotics to robotics hand
- G05B2219/39534—By positioning fingers, dimension of object can be measured
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/40—Robotics, robotics mapping to robotics vision
- G05B2219/40564—Recognize shape, contour of object, extract position and orientation
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y10—TECHNICAL SUBJECTS COVERED BY FORMER USPC
- Y10S—TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y10S901/00—Robots
- Y10S901/02—Arm motion controller
- Y10S901/09—Closed loop, sensor feedback controls arm movement
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y10—TECHNICAL SUBJECTS COVERED BY FORMER USPC
- Y10S—TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y10S901/00—Robots
- Y10S901/46—Sensing device
- Y10S901/47—Optical
Definitions
- the present invention relates to a system and a control method for operating an article by a robot.
- one known approach uses a device that measures the shape of the surrounding environment, such as a stereo camera or a 3D sensor.
- with such a shape measuring device, it is possible to measure the distance to a surrounding entity, measure the three-dimensional shape of the entity, and recognize the shape, position, and orientation of the target article.
- as an operation method using a shape measuring device, there is the method disclosed in Patent Document 1. In this method, the target article is detected from the shape of the surrounding environment obtained by measurement, and the posture of each detected article is recognized. By performing the gripping operation in accordance with the recognized posture, the article can be gripped flexibly even if the posture of each article differs.
- Patent Document 2 and Patent Document 3 disclose methods that enable gripping by storing a gripping method for each posture of the target article and selecting a gripping method based on the actual posture of the target article.
- in these methods, the operator must register in advance the information required to operate the robot on the measured target article, such as the shape of the target article and its gripping position.
- One aspect of the present invention is a robot system comprising: a mechanism unit that operates a target article; a shape measurement unit that measures the shape of an object; a basic motion storage unit that stores basic motions representing basic operations of the mechanism unit; an operation method calculation unit that calculates an operation method by modifying a stored basic motion based on the shape of the object measured by the shape measurement unit; and a control unit that controls the mechanism unit based on the operation method calculated by the operation method calculation unit.
- in one aspect, the robot system further includes: a basic shape storage unit that stores, in pairs, at least one basic shape having a predetermined shape and an operation method for the basic shape by the mechanism unit; a basic shape associating unit that selects, for the shape measured by the shape measurement unit, at least one of the basic shapes stored in the basic shape storage unit and associates it as a selected basic shape; and a basic shape deforming unit that deforms the selected basic shape and the selection operation method (the operation method paired with the selected basic shape) in accordance with the measured shape. In this case, the operation method calculation unit includes a basic motion deforming unit that deforms the basic motion based on the selected basic shape and selection operation method after deformation by the basic shape deforming unit, and calculates the operation method based on the deformed basic motion.
- Another aspect of the present invention is a control method for controlling a mechanism unit by an information processing apparatus including a processing device, a storage device, an input device, and an output device.
- the storage device stores at least one pair of a basic shape indicating a predetermined shape and an operation method for the basic shape by the mechanism unit.
- the processing device selects one basic shape from the stored basic shapes and associates it with the measurement shape input from the input device. Further, the processing device transforms the associated basic shape and the operation method paired with the associated basic shape according to the measurement shape. Furthermore, the processing device calculates the operation method of the mechanism unit based on the deformed basic shape and the operation method, and executes the operation of the mechanism unit based on the operation method.
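The select-associate-deform-execute flow above can be sketched in code. The following is a hypothetical illustration only: the function names, the extent-based selection criterion, and the uniform axis-wise scaling used as the "deformation" are all assumptions for clarity, not the method actually claimed.

```python
# Hypothetical sketch of the claimed control flow: pick the stored basic
# shape closest to the measured shape, then deform it (and its paired
# grip points) onto the measurement. Pure Python; shapes are point lists.

def centroid_extent(points):
    """Centroid and axis-aligned extent of a list of (x, y, z) points."""
    n = len(points)
    c = tuple(sum(p[i] for p in points) / n for i in range(3))
    ext = tuple(max(p[i] for p in points) - min(p[i] for p in points)
                for i in range(3))
    return c, ext

def select_basic_shape(measured, library):
    """Pick the stored {shape, grips} entry whose extent best matches."""
    _, m_ext = centroid_extent(measured)
    def diff(entry):
        _, s_ext = centroid_extent(entry["shape"])
        return sum((m - s) ** 2 for m, s in zip(m_ext, s_ext))
    return min(library, key=diff)

def deform_to_measurement(entry, measured):
    """Scale/translate the basic shape and its grip points onto the data."""
    s_c, s_ext = centroid_extent(entry["shape"])
    m_c, m_ext = centroid_extent(measured)
    scale = tuple(m / s if s else 1.0 for m, s in zip(m_ext, s_ext))
    tf = lambda p: tuple((p[i] - s_c[i]) * scale[i] + m_c[i] for i in range(3))
    return [tf(p) for p in entry["shape"]], [tf(p) for p in entry["grips"]]
```

Note that the same transform is applied to the shape and to the grip points, which is the essential idea: the operation method is deformed together with the basic shape.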
- in addition, the storage device stores a basic motion representing the basic operation of the mechanism unit. When calculating the operation method of the mechanism unit, the processing device modifies the basic motion to match the deformed operation method, and calculates the operation method from the modified basic motion.
- Still another aspect of the present invention is a robot system comprising: a mechanism unit that operates a target article; a basic shape setting unit that sets a basic shape indicating a predetermined shape and an operation method for the basic shape by the mechanism unit; a basic shape storage unit that stores the set basic shapes; a shape measurement unit that measures the shape of an object; a basic shape associating unit that associates, from the basic shapes stored in the basic shape storage unit, at least one shape close to the shape measured by the shape measurement unit; a basic shape deforming unit that deforms the associated basic shape and its operation method in accordance with the measured shape; an operation method calculation unit that calculates an operation method of the mechanism unit based on the deformed basic shape and operation method; and a control unit that executes an operation based on the operation method calculated by the operation method calculation unit.
- Process flow diagram of one embodiment of the present invention
- Overall perspective view of a manipulation system according to an embodiment of the present invention
- Illustration of operation by the basic shape setting unit
- Front view of the basic shape setting screen
- Explanatory drawing of operation of the basic shape synthesis unit
- Perspective view of the shape measuring unit
- Conceptual diagram of the operation target extraction process
- Front view of the basic shape mapping interface
- Conceptual diagram of basic shape deformation processing by local shape transformation
- Configuration block diagram of the operation method calculation unit
- Perspective view of the basic motion deformation process
- Perspective view of the basic motion synthesis process
- Perspective view of the suction manipulator
- Conceptual diagram of basic shape deformation processing for suction manipulators
- Conceptual diagram of the automatic basic shape matching process
- Conceptual diagram of basic shape deformation processing by linear transformation
- Conceptual diagram of operation method deformation processing based on the center of gravity position
- notations such as “first”, “second”, and “third” are attached to identify the constituent elements, and do not necessarily limit the number or order.
- a number for identifying a component is used for each context, and a number used in one context does not necessarily indicate the same configuration in another context. Further, it does not preclude that a component identified by a certain number also functions as a component identified by another number.
- FIG. 1 is a detailed configuration diagram of the robot system of the present embodiment, and shows functional blocks constituting the embodiment and information (data) to be processed.
- a manipulator for holding a target article is mounted on the robot, taking a manipulator robot as an example, and a system for holding an arbitrary target article using a finger mechanism on the manipulator will be described.
- the shape measuring unit 100 measures and outputs measurement data 101 indicating the shape of the object to be grasped.
- a point cloud is handled as an example of the measurement data 101, but it is not limited to a point cloud as long as it is data indicating a shape.
- the basic shape setting unit 102 sets a basic shape prepared in advance by the operator and an operation method indicating how the mechanism unit 115 performs an operation on the basic shape, and stores the basic shape in the basic shape storage unit 103.
- in the present embodiment, an arbitrary shape and operation method are created by deforming a predetermined shape and operation method according to the measurement data; the shape before deformation is referred to as the basic shape.
- Examples of basic shapes include cubes, cylinders, and spheres. Further, if the rough shape of the article to be handled is known, a shape close to the shape of the article may be used as the basic shape.
- the basic shape may be of any type as long as it is a shape before deformation, and is not limited to the above examples. A plurality of basic shapes 104 and operation methods 105 set by the basic shape setting unit 102 are assumed to be stored in the basic shape storage unit 103 as pairs.
- the basic shape associating unit 106 selects, from the basic shapes 104 stored in the basic shape storage unit, the one closest to the measurement data 101.
- the operation method 105 stored as a pair with the basic shape 104 is selected.
- the selected basic shape is referred to as the selected basic shape 107, and the selected operation method as the selection operation method 108.
- the selection may be performed automatically by the system's software, manually by an operator through visual inspection, or by a combination of the two.
- the basic shape deforming unit 109 deforms the selected basic shape 107 so that its shape geometrically matches the measurement data 101, and applies the same deformation to the selection operation method 108.
- the selected basic shape 107 after deformation is referred to as a basic shape 110 after deformation
- the selection operation method 108 after deformation is referred to as an operation method 111 after deformation.
- the operation method calculation unit 112 determines the operation method 113 of the mechanism unit 115 according to the post-deformation operation method 111 and the operation range of the mechanism unit 115. Data such as the operation range of the mechanism unit is stored as a constraint in a storage device accessible by the system. Finally, the control unit 114 controls the mechanism unit 115 based on the operation method 113.
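The constraint step above (restricting the deformed operation method to the operation range of the mechanism unit) might be sketched as follows. This is an assumed illustration: the spherical workspace, its radius, and the function names are not from the patent, which only states that the operation range is stored as a constraint.

```python
import math

def within_workspace(point, base=(0.0, 0.0, 0.0), reach=1.2):
    """True if the point lies inside the mechanism's assumed spherical reach."""
    return math.dist(point, base) <= reach

def feasible_grips(grip_candidates, base=(0.0, 0.0, 0.0), reach=1.2):
    """Keep only the grip points the mechanism unit can actually reach."""
    return [g for g in grip_candidates if within_workspace(g, base, reach)]
```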
- the system shown in FIG. 1, which controls the operation amount of the actuators and other components constituting the mechanism unit 115, can be implemented by a general information processing apparatus including a processing device, a storage device, an input device, and an output device (hereinafter, a server is described as a typical example).
- Functions such as the basic shape setting unit 102, the basic shape associating unit 106, the basic shape deforming unit 109, the operation method calculation unit 112, and the control unit 114 are realized by the processing device executing a program stored in the storage device, performing predetermined processing in cooperation with other hardware.
- a program executed by a server or means for realizing a server function may be referred to as “function”, “means”, “unit”, “module”, or the like.
- Measurement data 101, the basic shape 104, the operation method 105, the selected basic shape 107, the selection operation method 108, the post-deformation basic shape 110, the post-deformation operation method 111, the operation method 113, and so on are stored, temporarily or permanently, in a storage device such as a semiconductor memory or a hard disk.
- the shape measuring unit 100 is configured to input measurement data 101 from a server input device (input interface) to a storage device or a processing device via a wired or wireless path.
- the mechanism unit 115 includes a manipulator described later, and is mechanically driven under the control of the control unit 114 via the server's output device (output interface).
- the server can include a display device such as a monitor and an input device such as a keyboard and a mouse.
- the above system configuration may be implemented by a single server, or any part of the input device, output device, processing device, and storage device may be provided by another server connected via a network.
- functions equivalent to those configured by software can also be realized by hardware such as FPGA (Field Programmable Gate Array) and ASIC (Application Specific Integrated Circuit).
- FIG. 2 is a flowchart showing an outline of the processing flow of the system of this embodiment shown in FIG.
- a set of the basic shape 104 and the operation method 105 is set and stored in advance (S2000). This will be described in detail later with reference to FIGS.
- measurement data 101 of the target article is acquired (S2001). This will be described in detail later with reference to FIGS.
- the measurement data 101 and the basic shape 104 are associated with each other, and a combination of a basic shape and an operation method having a shape close to the measurement data 101 is selected (S2002).
- the selection may be performed automatically by the system or by the operator. This will be described in detail later with reference to FIG.
- the shape measuring unit 100 measures and outputs measurement data 101 indicating the surrounding shape using an optical measuring device or the like.
- the grip target measurement data setting unit 200 may output the measurement data 101 as it is, or the grip target measurement data setting unit 200 may be omitted.
- the basic shape setting unit 102 sets a basic shape prepared in advance by the operator and an operation method indicating how the mechanism unit 115 performs an operation on the basic shape, and stores the basic shape in the basic shape storage unit 103.
- a plurality of operation methods may be set for a predetermined basic shape.
- the basic shape 104 and the operation method 105 can be prepared as data files in advance and stored in the basic shape storage unit 103 via the input interface of the server.
- direct input can be performed from a keyboard or the like which is an input device of the server.
- information necessary for a monitor that is an output device of the server may be displayed as will be described later with reference to FIGS.
- as described above, an arbitrary shape and operation method are created by deforming a predetermined shape and operation method according to the measurement data; the shape before deformation is referred to as the basic shape.
- Examples of basic shapes include cubes, cylinders, and spheres.
- a shape close to the shape of the article may be used as the basic shape.
- the basic shape may be any type as long as it is a shape before deformation, and is not limited to the above example.
- a plurality of basic shapes 104 and operation methods 105 set by the basic shape setting unit 102 are stored in the basic shape storage unit 103 as a pair.
- the basic shape synthesis unit 201 synthesizes two or more types of basic shapes 104 and operation method 105 pairs, creates a new basic shape 104 and operation method 105 pair, and stores them in the basic shape storage unit 103.
- the basic shape associating unit 106 selects, from the basic shapes 104 stored in the basic shape storage unit 103, one that has a predetermined relationship with the measurement data 101.
- the operation method 105 stored as a pair with the selected basic shape 104 is also selected.
- the selected basic shape is referred to as the selected basic shape 107, and the selected operation method as the selection operation method 108.
- the basic shape associating unit 106 can be configured to automatically select one or more basic shapes having the smallest geometric difference from the shape measured by the shape measurement unit.
- the basic shape associating unit 106 may select a plurality of candidates that approximate the measurement data and let the operator choose among them. Alternatively, when the number of basic shapes is small, the basic shape associating unit 106 may list all the basic shapes 104 and let the operator select a similar shape.
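The "smallest geometric difference" ranking could look like the following sketch, where the difference measure is an assumed symmetric mean nearest-point distance between sampled point sets (the patent does not specify the metric), and the top-k candidates are returned for the operator to confirm.

```python
def nearest_point_distance(a, b):
    """Mean distance from each point in a to its nearest point in b."""
    def d2(p, q):
        return sum((pi - qi) ** 2 for pi, qi in zip(p, q))
    return sum(min(d2(p, q) for q in b) ** 0.5 for p in a) / len(a)

def rank_basic_shapes(measured, library, k=3):
    """Return up to k library entries ordered by symmetric shape difference."""
    return sorted(
        library,
        key=lambda e: nearest_point_distance(measured, e["shape"])
        + nearest_point_distance(e["shape"], measured),
    )[:k]
```

With k = 1 this reduces to the fully automatic case; larger k corresponds to the combined automatic/manual selection described above.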
- the basic shape deforming unit 109 deforms the selected basic shape 107 so that its shape geometrically matches the measurement data 101 of the gripping target, and applies the same deformation to the selection operation method 108.
- the selected basic shape 107 after deformation is referred to as a basic shape 110 after deformation
- the selection operation method 108 after deformation is referred to as an operation method 111 after deformation.
- the post-deformation basic shape 110 and the post-deformation operation method 111 may be added to the basic shape storage unit 103 as a pair. If they are stored as additional data, that data can be reused as-is for the same gripping target.
- the operation method calculation unit 112 determines the operation method 113 of the mechanism unit 115 according to the post-deformation operation method 111 and the operation range of the mechanism unit 115.
- when a plurality of operation methods are stored, the post-deformation operation method 111 may likewise include multiple types.
- the operator may select any one.
- the operation method calculation unit 112 may select the operation method 113 or the like closest to the current posture of the mechanism unit 115 by software control.
- the operation method calculation unit 112 determines the operation method 113 by causing the basic motion deformation unit 204 to deform the basic motion 203 stored in the basic motion storage unit 202, based on the post-deformation basic shape 110 and the post-deformation operation method 111.
- the basic motion synthesis unit 205 creates a new basic motion by combining two or more types of basic motions and stores the new basic motion in the basic motion storage unit 202.
- the control unit 114 controls the mechanism unit 115 based on the operation method 113.
- FIG. 1 includes the gripping target measurement data setting unit 200 and the basic shape synthesis unit 201, but either may be omitted: the gripping target measurement data setting unit 200 is unnecessary when the measurement data 101 captures only the shape of the gripping target, and the basic shape synthesis unit 201 is unnecessary when the shape of the object to be grasped can be expressed by deforming the basic shapes stored in the basic shape storage unit 103.
- the basic shape storage unit 103 and the basic operation storage unit 202 are preferably composed of, for example, a nonvolatile semiconductor device or a hard disk.
- in FIG. 1, the basic motion storage unit 202 is built into the operation method calculation unit 112, but it may of course be external to the operation method calculation unit 112.
- the basic shape storage unit 103 and the basic operation storage unit 202 can be configured as separate data files of the same hard disk device.
- the robot system need only be able to access these data files and does not necessarily have to be built in.
- the data file may be in an external server that can be accessed by the robot system via a network.
- the gripping target measurement data setting unit 200, the basic shape synthesis unit 201, the basic motion deformation unit 204, the basic motion synthesis unit 205, and so on likewise perform predetermined processing through the processing device executing a program stored in the storage device, in cooperation with other hardware.
- FIG. 3 shows an overall view of a manipulator system that is one embodiment of the present invention.
- the shape measuring unit 100 measures the article 300 in the environment, and the mechanism unit 115 holds the article 300.
- the mechanism unit 115 has a plurality of finger mechanisms.
- the mechanism unit 115 is not limited to a manipulator that grips and holds an article; it may be a suction manipulator that holds an article by suction. Nor is the operation limited to gripping or suction: it may be any operation that acts on any part of a target object by a predetermined method.
- the mechanism unit 115 can be mounted on the carriage 301 and configured to be movable, for example.
- a server that realizes each unit illustrated in FIG. 1 can be built in the carriage 301 and moved together with the mechanism unit 115.
- the server 3000 can be placed in a remote place as a separate body from the carriage 301.
- an input device 3001, an output device 3002, a processing device 3003, a storage device 3004, and the like are connected by a bus 3005 to realize the configuration illustrated in FIG.
- the control unit 114 of the server 3000 may operate the mechanism unit 115 wirelessly or by wire using the interface 3006.
- the shape measuring unit 100 can also be connected to the server interface 3006 wirelessly or by wire.
- FIG. 4 shows an example of the procedure, controlled by the basic shape setting unit 102, for inputting a predetermined basic shape and setting an operation method for that basic shape.
- a basic shape 400 is created using existing CAD software or the like.
- CAD software is stored in the storage device 3004 of the server and executed by the processing device 3003.
- Creation and input of the basic shape 400 can be performed by a user operating a server output device 3002 such as a monitor and a server input device 3001 such as a keyboard, mouse, and tablet.
- the data of the created basic shape 400 can be stored in a storage device 3004 such as a hard disk.
- as described above, an arbitrary shape and operation method are created by deforming a predetermined shape and operation method according to the measurement data; the shape before deformation is referred to as the basic shape.
- Examples of basic shapes include cubes, cylinders, and spheres. Further, if the rough shape of the article to be handled is known, a shape close to the shape of the article may be used as the basic shape.
- the basic shape may be any type as long as it is a shape before deformation, and is not limited to the above example.
- the basic shape 400 may be in any format as long as it indicates a shape.
- a basic shape 400 is input through a basic shape input unit 401 (for example, an input interface for data from outside the server 3000, part of the input device 3001) and displayed on an operation method display unit 404 (for example, a monitor, part of the output device 3002).
- the operator sets the operation method 403 by the operation method instruction unit (for example, a keyboard or a mouse, and part of the input device 3001) 402 while viewing the basic shape 400 displayed on the operation method display unit 404.
- a plurality of types of operation methods may be set for a predetermined basic shape.
- a position to which a finger mechanism is attached when gripping is set.
- the operation method is not limited to setting the position where the finger mechanism is applied.
- the input basic shape 400 and the set operation method 403 are paired and stored in the basic shape storage unit 103. By performing this multiple times, a plurality of types of combinations of the basic shape 104 and the operation method 105 are stored in the basic shape storage unit.
- the combination of the basic shape and the operation method is not limited to one-to-one, but may be one-to-many.
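The one-to-many pairing of basic shape and operation methods described above could be stored with a simple mapping. This is an illustrative data-structure sketch only; all names are assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class BasicShapeEntry:
    shape_id: str
    vertices: list                                          # geometry of the basic shape
    operation_methods: list = field(default_factory=list)   # one or more grip-point sets

store = {}  # plays the role of the basic shape storage unit

def register(shape_id, vertices, grip_points):
    """Add an operation method, creating the shape entry on first use."""
    entry = store.setdefault(shape_id, BasicShapeEntry(shape_id, vertices))
    entry.operation_methods.append(grip_points)
    return entry
```

Registering the same shape twice with different grip points yields one shape entry paired with two operation methods, i.e. the one-to-many case.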
- a set of the basic shape 104 and the operation method 105 can be prepared in the basic shape storage unit 103.
- the basic shape setting unit 102 in FIG. 1 (or a part other than the basic shape storage unit 103 in FIG. 4) is configured separately from the robot system, and is connected to the robot system only at the time of shipment or maintenance of the robot system.
- the basic shape storage unit 103 can also be configured to store data.
- FIG. 5 shows an example of the interface of the basic shape setting screen controlled by the basic shape setting unit 102.
- the gripping position of the article is set as the operation method.
- the operation method display unit (for example, the monitor) 404 displays the basic shape 400 and the operation method being set (the gripping position in this example).
- the operator operates the pointer 500 using an operation method instruction unit (for example, a keyboard) 402 and instructs the gripping position 403 from the displayed basic shape 400.
- the gripping position 403 instructed with respect to the displayed basic shape 400 is registered and reflected on the operation method display unit 404 as the operation method 105.
- the operation method 105 is defined, and a plurality of types of combinations of the basic shape 104 and the operation method 105 are stored.
- Interface control including keyboard control and monitor display is performed by the processing device executing software of the basic shape setting unit 102 stored in the storage device.
- FIG. 6 shows details of an example of processing of the basic shape synthesis unit 201 that creates a new basic shape and operation method by synthesizing a plurality of basic shapes and operation methods set by the basic shape setting unit 102.
- the basic shape storage unit 103 stores a plurality of basic shapes 104 and operation methods 105 in pairs. First, two or more predetermined basic shapes and operation methods are taken out from these. Here, it is assumed that a pair of basic shape A600 and operation method A601 and a pair of basic shape B602 and operation method B603 are taken out.
- the extracted basic shape A 600 and basic shape B 602 are combined by the shape combining processing unit 604.
- the shapes can be synthesized by addition and subtraction when they are expressed with implicit functions or a CSG (Constructive Solid Geometry) model. The composition position and method can also be instructed from the interface. The method is not limited to the above as long as the shapes can be synthesized.
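The implicit-function addition and subtraction mentioned above can be sketched as follows: each shape is a function that is non-positive inside the shape, union becomes a pointwise min, and subtraction a max against the negated operand. The specific primitives and signs here are illustrative assumptions.

```python
# Implicit-function CSG sketch: f(p) <= 0 means p is inside the shape.

def sphere(center, r):
    return lambda p: sum((pi - ci) ** 2 for pi, ci in zip(p, center)) ** 0.5 - r

def box(lo, hi):
    return lambda p: max(max(l - x, x - h) for x, l, h in zip(p, lo, hi))

def union(f, g):          # shape addition
    return lambda p: min(f(p), g(p))

def difference(f, g):     # shape subtraction (f minus g)
    return lambda p: max(f(p), -g(p))
```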
- the above synthesis processing is performed by the processing device 3003 executing the software of the shape synthesis processing unit 604 stored in the storage device 3004.
- the shape synthesis processing unit 604 outputs the synthesized shape as a basic shape C605.
- The operation method display unit 404 displays the basic shape C605, and the operation method instruction unit 402 sets the operation method C607.
- the setting method by the operation method instruction unit 402 and the display method of the operation method display unit 404 are the same as the methods shown in FIGS. 4 and 5, and the operation method may be set to the shape displayed on the screen.
- the operation method C607 may be a copy of the operation method A601 or the operation method B603.
- the basic shape synthesizing unit 201 can also be configured separately from the robot system main body, like the basic shape setting unit 102.
- FIG. 7 shows a state in which the shape measuring unit 100 measures the surrounding article 300.
- the shape measuring unit 100 can measure the distance to an entity in the irradiation direction by irradiating the laser in the direction of the article (measurement target or gripping target) 300 and measuring the time until the laser reflects and returns.
- the shape measuring unit 100 can measure the shape of the entire surroundings by repeating the irradiation process in all directions. As a measurement result, one point on the surface of the surrounding object can be measured for each irradiation of the laser. By repeating this, the surrounding shape can be measured as a point cloud.
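The time-of-flight measurement above amounts to distance = c·t/2 per pulse, with the beam angles giving one 3D surface point per irradiation. A minimal sketch, assuming a spherical scan geometry (the patent does not fix the coordinate convention):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def tof_to_point(round_trip_s, azimuth_rad, elevation_rad):
    """Convert one pulse's round-trip time and beam angles to an (x, y, z) point."""
    d = C * round_trip_s / 2.0  # one-way distance to the reflecting surface
    return (
        d * math.cos(elevation_rad) * math.cos(azimuth_rad),
        d * math.cos(elevation_rad) * math.sin(azimuth_rad),
        d * math.sin(elevation_rad),
    )
```

Repeating this over all scan directions accumulates the point cloud used as the measurement data 101.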
- Measurement data 101 is obtained as a measurement result of the article 300.
- the shape measuring device is not limited to one using a laser as long as it can measure the shape, and a method using stereo vision, a method using ultrasonic waves, or the like may be used.
- FIG. 8 shows an operation target extraction process in which the gripping target measurement data setting unit 200 extracts only partial measurement data related to the gripping target from the measurement data 101 measured by the shape measuring unit 100.
- since the shape measurement unit 100 measures the entire shape of the surroundings, the shape of the surrounding environment and a plurality of gripping target objects are reflected in the measurement data 101, and only the shape of the object to be grasped must therefore be extracted from among them. If the shape of the object to be grasped is already known, the point cloud of the gripping target can be cut out by geometrically matching the shape stored in advance against the measured point cloud and adopting the closest match.
- In this way, the data can be divided into measurement data 800, measurement data 801, and measurement data 802.
- the point group on an extracted plane may be removed. Even when the objects are not placed on a flat surface, only the placed object can be extracted by recording the shape of the environment with no surrounding objects present and taking the difference from that shape. Furthermore, when the gripping target is not determined in advance, the operator can determine the target article by selecting it from the measurement data 800, 801, and 802. Hereinafter, the gripping target article is described as measurement data 800.
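The background-difference idea above (recording the empty environment and keeping only points far from it) can be sketched with a brute-force nearest-point test; a real system would use a spatial index such as a k-d tree, and the tolerance value is an assumption.

```python
# Sketch of background-difference extraction: keep only measured points that are
# not close to any point recorded for the empty environment. Brute force O(n*m).
def extract_placed_object(measured, background, tol=0.05):
    def near_background(p):
        return any((p[0] - b[0])**2 + (p[1] - b[1])**2 + (p[2] - b[2])**2 <= tol**2
                   for b in background)
    return [p for p in measured if not near_background(p)]
```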
- the gripping target article is not limited to the measurement data 800. When the measurement data 101 measures only the gripping target article, the measurement data 101 may be directly handled in the subsequent processing.
- the above processing is performed by the processing device 3003 executing the software of the gripping target measurement data setting unit 200 stored in the storage device 3004.
- the image shown in FIG. 8 may be presented to the operator so that the operator can select and specify the part corresponding to the article to be gripped.
- FIG. 9 shows a processing example of the basic shape associating unit 106 that determines the basic shape and the operation method corresponding to the measurement data 800 of the gripping target measured by the shape measuring unit 100 and extracted by the gripping target measurement data setting unit 200.
- the measurement data 800 to be grasped and the contents of the basic shape storage unit 103 are displayed.
- the basic shape and operation method pairs 901, 902, 903, and 904 are displayed.
- the operator instructs shape data to be grasped from the displayed measurement data.
- the measurement data 800 is a gripping target.
- the operator selects the pair closest to the measurement data 800 from the pair of basic shape and operation method.
- the pair 901 of the basic shape and the operation method is the closest.
- the basic shape association unit 106 can also make the selection automatically.
- as an algorithm in this case, an algorithm similar to the operation of the shape deforming unit 109 described below can be used. For example, the basic shape is first converted into a mesh shape. The mesh-shaped basic shape is then associated with the measurement data, and the surface of the measurement data closest to each point on the mesh is set as the corresponding coordinate. The selection can then be made based on the distances between the points on the mesh and the corresponding coordinates on the measurement data; as a simple example, the basic shape with the smallest average distance is selected.
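The smallest-average-distance selection described above can be sketched as follows, approximating the closest surface by the closest measured point; names and the dimension-agnostic point representation are illustrative.

```python
# Sketch of automatic basic-shape selection: for each candidate basic shape
# (represented by its mesh vertices), average the distance from each mesh point
# to the nearest measured point, and pick the shape with the smallest average.
import math

def _nearest_dist(p, cloud):
    return min(math.dist(p, q) for q in cloud)

def select_basic_shape(mesh_shapes, measurement):
    def avg_dist(mesh):
        return sum(_nearest_dist(p, measurement) for p in mesh) / len(mesh)
    # Return the index of the best-matching basic shape.
    return min(range(len(mesh_shapes)), key=lambda i: avg_dist(mesh_shapes[i]))
```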
- the above processing is performed by the processing device 3003 executing the software of the basic shape association unit 106 stored in the storage device 3004 and controlling the input device 3001 and the output device 3002.
- FIG. 10 shows basic shape deformation processing in which the shape deforming unit 109 deforms by locally converting the basic shape and the operation method in accordance with the measured measurement data.
- a pair of the basic shape 1000 and the operation method 1001 and the measurement data 800 are input.
- the basic shape 1000 is divided into a plurality of parts with a mesh to obtain a mesh shape 1002.
- the surface of the shape may be divided equally and the points may be connected using Delaunay triangulation or the like.
- the mesh shape 1002 and the measurement data 800 are associated with each other, and a coordinate conversion 1004 from each point on the mesh to the corresponding coordinate is obtained.
- the surface on the measurement data closest to the point on the mesh is set as the corresponding coordinate.
- the rotation / translation amount may be calculated as preprocessing.
- the rotation / translation calculation can be performed by an existing method such as an ICP (Iterative Closest Point) algorithm.
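One core step of the ICP algorithm mentioned above can be sketched with NumPy: pair each source point with its nearest target point, then solve the best-fit rigid rotation and translation in closed form (the Kabsch method). Real ICP repeats this step until convergence; this sketch shows a single iteration.

```python
# Minimal sketch of one ICP iteration (nearest-neighbor correspondence followed
# by a closed-form rigid alignment via SVD, the Kabsch algorithm).
import numpy as np

def icp_step(source, target):
    src = np.asarray(source, float)
    tgt = np.asarray(target, float)
    # Nearest-neighbor correspondences (brute force).
    d = np.linalg.norm(src[:, None, :] - tgt[None, :, :], axis=2)
    corr = tgt[d.argmin(axis=1)]
    # Best-fit rigid transform mapping src onto corr.
    mu_s, mu_c = src.mean(0), corr.mean(0)
    H = (src - mu_s).T @ (corr - mu_c)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:  # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = mu_c - R @ mu_s
    return R, t
```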
- the post-deformation operation method 1006 is obtained from the deformation destination position of the mesh corresponding to the gripping position.
- the basic shape deforming unit 109 may simulate in advance whether a correct operation can be performed when the article to be grasped is operated using the post-deformation operation method, and may output the result.
- when the finger mechanism applies a force of predetermined strength at the post-deformation gripping position, whether the article can be gripped stably without being dropped is calculated by simulation. If the grip is not stable, a gripping position that allows stable gripping may be found by search. This search can be performed by optimizing the operation position through a convergence calculation, such as a gradient method, using the degree of stability as the evaluation value.
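The gradient-method search described above can be sketched with a *hypothetical* stability measure; here, for illustration only, a grip is treated as more stable the closer it is to the object's center of gravity, and the position is improved by finite-difference gradient descent. The actual system evaluates stability by physics simulation instead.

```python
# Sketch of the stability search, with a HYPOTHETICAL stability score: the grip
# point is pushed toward the object's center of gravity by numerical gradient
# descent. The real evaluation would come from a physics simulation.
def optimize_grip(start, center_of_gravity, steps=200, lr=0.1, eps=1e-4):
    def instability(p):  # lower is better (assumed evaluation value)
        return sum((a - b)**2 for a, b in zip(p, center_of_gravity))
    p = list(start)
    for _ in range(steps):
        for i in range(len(p)):
            q = p[:]
            q[i] += eps
            grad = (instability(q) - instability(p)) / eps  # finite difference
            p[i] -= lr * grad
    return p
```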
- the above processing is performed by the processing device 3003 executing the software of the shape deforming unit 109 stored in the storage device 3004.
- FIG. 11 is a block diagram of the operation method calculation unit 112 including data to be processed.
- the operation method calculation unit 112 calculates an operation method 113 for operating the mechanism unit 115 according to the post-deformation basic shape 110 and the post-deformation operation method 111 calculated by the basic shape deformation unit 109.
- a basic operation 1100 indicates a basic operation for the mechanism unit 115 to achieve a target operation.
- the basic operation 1100 is an opening or closing operation for each finger mechanism.
- the basic operation of bending and stretching is defined for each finger mechanism.
- the basic operation 1100 defined in advance is input to the basic operation setting unit 1101 and stored in the basic operation storage unit 202.
- the basic operation is stored for each mechanism, and the basic operation storage unit 202 stores a plurality of basic operations 203.
- the basic operation 1100 is defined for each finger mechanism, the basic operations for the number of finger mechanisms are stored.
- the basic motion deformation unit 204 deforms the basic motion based on the basic motion 203, the post-deformation basic shape 110, and the post-deformation operation method 111.
- the closing degree of the finger mechanism is adjusted so that the post-operation position of the finger mechanism matches the gripping position instructed by the post-deformation operation method. The degree of match can be calculated from the distance between the position obtained from the finger mechanism and the instructed gripping position. This is performed for all finger mechanisms, determining the operation method for each of them. Note that this operation deformation process is not limited to the gripping operation.
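The closing-degree adjustment above can be sketched with hypothetical one-joint finger kinematics (a finger of fixed length pivoting at a base): scan for the closing angle whose fingertip lands closest to the instructed gripping position. The kinematic model is an assumption for illustration.

```python
# Sketch of adjusting the closing degree: with an assumed one-joint finger,
# pick the closing angle minimizing the distance to the instructed grip point.
import math

def fingertip(base, length, angle):
    return (base[0] + length * math.cos(angle), base[1] + length * math.sin(angle))

def best_closing_angle(base, length, grip_pos, steps=1000):
    def dist(angle):
        return math.dist(fingertip(base, length, angle), grip_pos)
    # Scan the closing range [0, pi] and keep the best angle.
    return min((i * math.pi / steps for i in range(steps + 1)), key=dist)
```

The same distance criterion is applied per finger, matching the per-finger treatment described above.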
- the operation method calculation unit 112 may select a deformation result of the basic operation that is closest to the current posture of the mechanism unit 115 and set it as the operation method 113.
- FIG. 12 shows an example of the basic motion deformation process in which the basic motion deformation unit 204 creates the motion method 113 that can deform the basic motion 203 and grip the target article.
- the basic operation 1200, 1201, 1202 of each finger mechanism of the mechanism unit 115 is deformed according to the post-deformation operation position 1204 on the post-deformation basic shape 1203, thereby enabling a gripping operation.
- the basic operation 1205 (dotted arrow) before the deformation is transformed into a post-deformation operation 1206 (solid arrow).
- the amount of deformation at this time can be obtained by calculating the bending degree of the mechanism unit so that the gripping position (post-deformation operation position 1204) is closest to the trajectory of the finger mechanism defined in the basic operation 1200.
- the basic operation 1201 is changed from the basic operation 1207 before the deformation to the post-deformation operation 1208.
- the basic operation 1202 is the same.
- FIG. 13 shows a process for synthesizing the basic action 203 by the basic action synthesis unit 205.
- by synthesizing basic operations, various types of operations can be performed.
- as synthesis processing, a process of selecting one of two or more types of operations and a process of adding the operation amounts are shown.
- a basic operation 1303 and a basic operation 1304 are combined with the finger mechanism 1302 mounted on the mechanism unit 115.
- a basic operation 1303 is selected to obtain a post-synthesis operation 1311.
- a synthesis process for the finger mechanism 1305 will be described.
- since the basic operation 1306 and the basic operation 1307 are stored, the post-synthesis operation 1312 can be performed by selecting the basic operation 1307.
- the combined motion 1311 is obtained by adding these motion directions. Note that which synthesis process is applied to which set of basic operations is not limited to the above.
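The two synthesis modes described above (selecting one basic motion, or adding motion directions component-wise) can be sketched as:

```python
# Sketch of the two motion-synthesis modes: selection of one stored basic
# motion, or component-wise addition of their direction vectors.
def synthesize(motions, mode, index=0):
    if mode == "select":
        return motions[index]
    if mode == "add":
        return [sum(c) for c in zip(*motions)]
    raise ValueError(f"unknown mode: {mode}")
```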
- the control unit 114 grips the target article by controlling a mechanism such as a motor mounted on the mechanism unit 115.
- PRM (Probabilistic Roadmap Method)
- RRT (Rapidly-exploring Random Trees)
- PID (Proportional-Integral-Derivative) control
- the surrounding measurement data measured by the shape measuring unit 100 can be used to calculate a trajectory that passes through gaps between objects so that the mechanism unit 115 does not hit its surroundings while moving. By calculating the trajectory in this way, operation is possible even in a complicated environment.
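The full planners named above (PRM, RRT) are beyond a short sketch, but the underlying collision-awareness idea can be illustrated: sample a candidate straight-line motion and reject it if any sample comes within a clearance radius of the measured point cloud. The function name and sampling density are illustrative.

```python
# Sketch of collision awareness against the measured point cloud: a straight
# candidate segment is sampled and rejected if any sample is too close to a
# measured point. Full planners (PRM/RRT) would use such a check as a primitive.
import math

def segment_clear(a, b, cloud, clearance, samples=50):
    for i in range(samples + 1):
        t = i / samples
        p = tuple(pa + t * (pb - pa) for pa, pb in zip(a, b))
        if any(math.dist(p, q) < clearance for q in cloud):
            return False
    return True
```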
- the basic motion deformation unit 204 may deform the motion in consideration of surrounding measurement data measured by the shape measurement unit 100.
- FIG. 14 shows an example in which a suction manipulator 1400 is mounted on the tip of the mechanism unit 115.
- the suction manipulator touches and sucks a predetermined object at its tip, and grips the article.
- an embodiment for holding various kinds of articles using a suction manipulator will be described.
- FIG. 15 shows the basic shape deformation process for the suction manipulator.
- the basic shape and the operation method 1500 are stored in the basic shape storage unit 103.
- the basic shape and operation method 1500 can be set by the processing described in FIGS.
- the operation method 1502 is paired with the basic shape 1501; similarly, the operation method 1504 is stored as a pair with the basic shape 1503, and the operation method 1506 with the basic shape 1505.
- in each operation method, the number of suction points and the suction position for each point are set.
- the suction manipulator can handle articles of various sizes by changing the number of suction mechanisms used.
- the basic shape closest to the measurement data is selected. This selection is performed by the basic shape association unit 106 described above. In the example of FIG. 15, the basic shape 1501 is assumed to be the closest.
- a deformed basic shape 1508 and a post-deformation operation method 1509 are obtained.
- the basic shape and the operation method are deformed by the basic shape deforming unit 109 described above. Since the suction position on the target article can thereby be determined, gripping can be realized by operating the suction manipulator based on that position and applying suction at the determined suction position 1509.
- the basic shape association unit 106 automates the processing when selecting the basic shape and operation method corresponding to the measurement data.
- in the basic shape associating process shown in FIG. 9, the operator must instruct the basic shape and the operation method corresponding to the measurement data every time a gripping operation is performed.
- with this automation, the gripping operation can be performed without instructing the basic shape and operation method pair corresponding to the measurement data.
- FIG. 16 shows a processing procedure for automating the processing when the basic shape associating unit 106 selects a pair of a basic shape and an operation method corresponding to the measurement data.
- the shape closest to the measurement data 1600 is calculated from the basic shapes 1601, 1602, 1603, and 1604.
- in order to calculate the difference between each basic shape and the measurement data, the basic shape deformation processing by the basic shape deformation unit 109 is executed for each basic shape, and the shape with the smallest difference between the deformed shapes 1605, 1606, 1607, and 1608 and the measurement data 1600 may be calculated.
- the difference can be calculated by using the sum of the distances from each point of the measurement data to the surface of the closest deformed basic shape.
- the method is not limited to the above method as long as it is a method for calculating a difference between shapes.
- in this example, the difference between the deformed shape 1605 and the measurement data 1600 is the smallest, so the basic shape 1601 is selected.
- the basic shape association unit 106 may also store in advance pairs of basic shapes and operation methods corresponding to measurement target articles, and associate the stored pairs with those articles.
- FIG. 17 shows another example of processing of the basic shape deforming unit 109.
- FIG. 17 shows basic shape deformation processing when the basic shape deforming unit 109 performs deformation by linearly converting the basic shape and the operation method according to the measured measurement data 1700.
- a pair of a basic shape 1701 and an operation method 1702 and measurement data 1700 are input.
- the basic shape 1701 is compared with the measurement data 1700, geometric matching is performed, and the best matching conversion amount is obtained.
- the amount of conversion is determined by searching for the rotation, translation, and per-axis enlargement ratio that best match the shapes. As a specific method, the match can be obtained by a full search, a gradient method, or the ICP algorithm.
- whether the shapes match can be evaluated by counting how many of the points in the measurement data 1700 lie on the surface of the deformed basic shape 1704.
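The match score above can be sketched in 2D for brevity: apply a candidate rotation, translation, and per-axis enlargement to surface sample points of the basic shape, then count the measured points that fall within a tolerance of some transformed point. The tolerance value and the 2D restriction are assumptions.

```python
# Sketch of scoring one candidate similarity transform: per-axis scale, then
# rotation and translation, then count measured points near the moved surface.
import math

def transform(points, angle, tx, ty, sx, sy):
    c, s = math.cos(angle), math.sin(angle)
    out = []
    for x, y in points:
        x, y = sx * x, sy * y  # per-axis enlargement
        out.append((c * x - s * y + tx, s * x + c * y + ty))
    return out

def match_score(shape_pts, measured, angle, tx, ty, sx, sy, tol=0.1):
    moved = transform(shape_pts, angle, tx, ty, sx, sy)
    return sum(1 for m in measured if any(math.dist(m, p) <= tol for p in moved))
```

A full search or gradient method would evaluate this score over many candidate transforms and keep the best one.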
- FIG. 18 shows another example of processing of the basic shape deforming unit 109.
- FIG. 18 shows a process of deforming the operation method based on the barycentric position of the measurement data when the basic shape deforming unit 109 deforms the operation method.
- the operation method is also changed in the same manner as the degree of deformation of the basic shape.
- when the basic shape 1800 shown in FIG. 18 is transformed into the deformed basic shape 1806 by this method, the relative distance 1805 from the center-of-gravity position 1801 is reduced and the three gripping positions become aligned; as a result, there is a problem that the operation becomes unstable.
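The barycenter-based deformation of the operation method can be sketched as follows: each gripping position keeps its direction from the center of gravity, and only its relative distance is rescaled. The uniform size ratio is an illustrative assumption.

```python
# Sketch of barycenter-relative deformation of gripping positions: directions
# from the center of gravity are preserved, relative distances are rescaled.
def deform_grips(grips, barycenter, size_ratio):
    out = []
    for g in grips:
        out.append(tuple(b + size_ratio * (gi - b) for gi, b in zip(g, barycenter)))
    return out
```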
- the present invention is not limited to the above-described embodiment, and includes various modifications.
- a part of the configuration of one embodiment can be replaced with the configuration of another embodiment, and the configuration of another embodiment can be added to the configuration of one embodiment.
Landscapes
- Engineering & Computer Science (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Human Computer Interaction (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Orthopedic Medicine & Surgery (AREA)
- Manipulator (AREA)
- Length Measuring Devices With Unspecified Measuring Means (AREA)
Abstract
Description
The system of FIG. 1 can be configured using a general information processing apparatus comprising a processing device, a storage device, an input device, and an output device (hereinafter described using a server as a typical example). The functions of the basic shape setting unit 102, the basic shape association unit 106, the basic shape deformation unit 109, the operation method calculation unit 112, the control unit 114, and so on are realized by the processing device executing programs stored in the storage device, performing the prescribed processing in cooperation with other hardware. In this specification, a program executed by the server, or a means realizing a function of the server, may be referred to as a "function", "means", "unit", "module", and so on.
Returning to FIG. 1, the configuration of the system of this embodiment is described in more detail. First, the shape measuring unit 100 measures and outputs measurement data 101 representing the surrounding shape, using an optical measuring device or the like. Here, when the measurement data 101 also captures the shape of the entire surroundings other than the gripping target, only the partial data corresponding to the gripping target is extracted by the gripping target measurement data setting unit 200. When the measurement data 101 captures only the gripping target, the gripping target measurement data setting unit 200 may output the measurement data 101 as it is, or may be omitted.
Claims (15)
- A robot system comprising:
a mechanism unit that operates an article to be operated;
a shape measuring unit that measures the shape of an object;
a basic operation storage unit that stores basic operations representing basic operations of the mechanism unit;
an operation method calculation unit that calculates an operation method by deforming a stored basic operation based on the shape of the object measured by the shape measuring unit; and
a control unit that executes control of the mechanism unit based on the operation method calculated by the operation method calculation unit.
- The robot system according to claim 1, further comprising:
a basic shape storage unit that stores at least one pair of a basic shape representing a predetermined shape and an operation method by the mechanism unit for that basic shape;
a basic shape association unit that selects at least one of the basic shapes stored in the basic shape storage unit for the shape measured by the shape measuring unit and associates it as a selected basic shape; and
a basic shape deformation unit that deforms the selected basic shape, and a selected operation method that is the operation method paired with the selected basic shape, in accordance with the measured shape,
wherein the operation method calculation unit
comprises a basic operation deformation unit that deforms the basic operation based on the deformed selected basic shape and the deformed selected operation method produced by the basic shape deformation unit, and
calculates the operation method based on the deformed basic operation produced by the basic operation deformation unit.
- The robot system according to claim 1, further comprising
a gripping target measurement data setting unit that extracts the shape of the article to be operated from the shapes of objects measured by the shape measuring unit.
- The robot system according to claim 1, wherein
the basic shape deformation unit
linearly transforms the selected basic shape so that it matches the measured shape of the object.
- The robot system according to claim 1, wherein
the basic shape deformation unit
locally deforms the selected basic shape part by part so that it matches the measured shape of the object.
- The robot system according to claim 1, wherein
the basic shape deformation unit
deforms the selected operation method based on relative distances from the center of gravity of the measured object.
- The robot system according to claim 1, wherein
the control unit
refers to the shape measured by the shape measuring unit and operates the mechanism unit in accordance with the state of surrounding gaps so that the mechanism unit does not collide with its surroundings.
- The robot system according to claim 1, further comprising
a basic shape synthesis unit that combines two or more types of the basic shapes into a new basic shape.
- The robot system according to claim 1, further comprising
a basic operation synthesis unit that combines two or more types of the basic operations into a new basic operation.
- The robot system according to claim 1, wherein
the basic shape association unit
selects, from among the plurality of basic shapes, the one whose geometric difference from the shape measured by the shape measuring unit is smallest.
- The robot system according to claim 1, wherein
the operation method
is defined by positions at which the basic shape is gripped.
- The robot system according to claim 1, wherein
the operation method
is defined by positions at which the basic shape is suctioned and by the number of suction points.
- A control method for controlling a mechanism unit by an information processing apparatus comprising a processing device, a storage device, an input device, and an output device, wherein
the storage device stores at least one pair of a basic shape representing a predetermined shape and an operation method by the mechanism unit for that basic shape,
the processing device selects one basic shape from the stored basic shapes and associates it with a measured shape input from the input device,
the processing device deforms the associated basic shape and the operation method paired with the associated basic shape in accordance with the measured shape,
the processing device calculates an operation method of the mechanism unit based on the deformed basic shape and the deformed operation method, and
the operation of the mechanism unit is executed based on the calculated operation method.
- The control method according to claim 13, wherein
the storage device stores a basic operation representing a basic, simple operation of the mechanism unit, and,
when calculating the operation method of the mechanism unit, the processing device
deforms the basic operation so as to conform to the deformed operation method, and
calculates the operation method from the deformed basic operation.
- The control method according to claim 13, wherein,
when storing pairs of a basic shape and an operation method by the mechanism unit for that basic shape, the storage device stores a plurality of the operation methods paired with one basic shape,
the processing device deforms the plurality of operation methods in accordance with the measured shape, and,
when calculating the operation method of the mechanism unit, the processing device
selects, from among the plurality of operation methods deformed in accordance with the measured shape, the result closest to the current posture of the mechanism unit as the operation method.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2016/050278 WO2017119088A1 (ja) | 2016-01-06 | 2016-01-06 | ロボットシステムおよび制御方法 |
US15/758,540 US10688665B2 (en) | 2016-01-06 | 2016-01-06 | Robot system, and control method |
CN201680045142.7A CN107848117B (zh) | 2016-01-06 | 2016-01-06 | 机器人系统以及控制方法 |
JP2017559981A JP6582061B2 (ja) | 2016-01-06 | 2016-01-06 | ロボットシステムおよび制御方法 |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2016/050278 WO2017119088A1 (ja) | 2016-01-06 | 2016-01-06 | ロボットシステムおよび制御方法 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017119088A1 true WO2017119088A1 (ja) | 2017-07-13 |
Family
ID=59273389
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2016/050278 WO2017119088A1 (ja) | 2016-01-06 | 2016-01-06 | ロボットシステムおよび制御方法 |
Country Status (4)
Country | Link |
---|---|
US (1) | US10688665B2 (ja) |
JP (1) | JP6582061B2 (ja) |
CN (1) | CN107848117B (ja) |
WO (1) | WO2017119088A1 (ja) |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2018008319A (ja) * | 2016-07-11 | 2018-01-18 | 株式会社安川電機 | ロボットシステム、ロボットの制御方法、ロボットコントローラ |
CN111975783A (zh) * | 2020-08-31 | 2020-11-24 | 广东工业大学 | 一种机器人抓取检测方法及系统 |
CN114454174A (zh) * | 2022-03-08 | 2022-05-10 | 江南大学 | 机械臂动作捕捉方法、介质、电子设备及系统 |
JP2022121671A (ja) * | 2017-11-07 | 2022-08-19 | 東芝テック株式会社 | 画像処理システム及び画像処理方法 |
US11433537B2 (en) | 2018-07-17 | 2022-09-06 | Fanuc Corporation | Automatic path generation device |
WO2022250152A1 (ja) * | 2021-05-28 | 2022-12-01 | 京セラ株式会社 | 保持位置決定装置、及び保持位置決定方法 |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH04205318A (ja) * | 1990-11-30 | 1992-07-27 | Honda Motor Co Ltd | プレイバックロボットにおける信号処理システム |
JPH07251391A (ja) * | 1994-03-16 | 1995-10-03 | Tokico Ltd | 工業用ロボット |
JPH08309537A (ja) * | 1994-07-29 | 1996-11-26 | Hitachi Zosen Corp | コルゲート重ね板継ぎ用溶接ロボットにおけるコルゲート部溶接方法 |
JPH1080881A (ja) * | 1996-09-06 | 1998-03-31 | Nippon Steel Corp | 多関節ロボットの制御装置及びその制御方法 |
JP2015089589A (ja) * | 2013-11-05 | 2015-05-11 | ファナック株式会社 | バラ積みされた物品をロボットで取出す装置及び方法 |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
TW277013B (ja) | 1994-07-29 | 1996-06-01 | Hitachi Shipbuilding Eng Co | |
SE526119C2 (sv) * | 2003-11-24 | 2005-07-05 | Abb Research Ltd | Metod och system för programmering av en industrirobot |
WO2006006624A1 (ja) | 2004-07-13 | 2006-01-19 | Matsushita Electric Industrial Co., Ltd. | 物品保持システム、ロボット及びロボット制御方法 |
US7957583B2 (en) * | 2007-08-02 | 2011-06-07 | Roboticvisiontech Llc | System and method of three-dimensional pose estimation |
JP4835616B2 (ja) * | 2008-03-10 | 2011-12-14 | トヨタ自動車株式会社 | 動作教示システム及び動作教示方法 |
JP2011224695A (ja) | 2010-04-19 | 2011-11-10 | Toyota Motor Corp | ロボットの把持制御システム及びロボット |
CN103732363A (zh) * | 2011-08-19 | 2014-04-16 | 株式会社安川电机 | 机器人系统、机器人控制装置、机器人手部及机器人控制方法 |
US8996167B2 (en) * | 2012-06-21 | 2015-03-31 | Rethink Robotics, Inc. | User interfaces for robot training |
JP2014161965A (ja) | 2013-02-26 | 2014-09-08 | Toyota Industries Corp | 物品取り出し装置 |
JP6022393B2 (ja) * | 2013-03-28 | 2016-11-09 | 株式会社神戸製鋼所 | 溶接線情報設定装置、プログラム、自動教示システム、および溶接線情報設定方法 |
2016
- 2016-01-06 JP JP2017559981A patent/JP6582061B2/ja active Active
- 2016-01-06 CN CN201680045142.7A patent/CN107848117B/zh active Active
- 2016-01-06 WO PCT/JP2016/050278 patent/WO2017119088A1/ja active Application Filing
- 2016-01-06 US US15/758,540 patent/US10688665B2/en active Active
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH04205318A (ja) * | 1990-11-30 | 1992-07-27 | Honda Motor Co Ltd | プレイバックロボットにおける信号処理システム |
JPH07251391A (ja) * | 1994-03-16 | 1995-10-03 | Tokico Ltd | 工業用ロボット |
JPH08309537A (ja) * | 1994-07-29 | 1996-11-26 | Hitachi Zosen Corp | コルゲート重ね板継ぎ用溶接ロボットにおけるコルゲート部溶接方法 |
JPH1080881A (ja) * | 1996-09-06 | 1998-03-31 | Nippon Steel Corp | 多関節ロボットの制御装置及びその制御方法 |
JP2015089589A (ja) * | 2013-11-05 | 2015-05-11 | ファナック株式会社 | バラ積みされた物品をロボットで取出す装置及び方法 |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2018008319A (ja) * | 2016-07-11 | 2018-01-18 | 株式会社安川電機 | ロボットシステム、ロボットの制御方法、ロボットコントローラ |
JP2022121671A (ja) * | 2017-11-07 | 2022-08-19 | 東芝テック株式会社 | 画像処理システム及び画像処理方法 |
US11433537B2 (en) | 2018-07-17 | 2022-09-06 | Fanuc Corporation | Automatic path generation device |
DE102019118637B4 (de) | 2018-07-17 | 2022-11-10 | Fanuc Corporation | Automatische pfadgenerierungsvorrichtung |
CN111975783A (zh) * | 2020-08-31 | 2020-11-24 | 广东工业大学 | 一种机器人抓取检测方法及系统 |
CN111975783B (zh) * | 2020-08-31 | 2021-09-03 | 广东工业大学 | 一种机器人抓取检测方法及系统 |
WO2022250152A1 (ja) * | 2021-05-28 | 2022-12-01 | 京セラ株式会社 | 保持位置決定装置、及び保持位置決定方法 |
CN114454174A (zh) * | 2022-03-08 | 2022-05-10 | 江南大学 | 机械臂动作捕捉方法、介质、电子设备及系统 |
Also Published As
Publication number | Publication date |
---|---|
CN107848117B (zh) | 2021-01-15 |
US10688665B2 (en) | 2020-06-23 |
JP6582061B2 (ja) | 2019-09-25 |
JPWO2017119088A1 (ja) | 2018-04-26 |
US20180250827A1 (en) | 2018-09-06 |
CN107848117A (zh) | 2018-03-27 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6582061B2 (ja) | ロボットシステムおよび制御方法 | |
US11548145B2 (en) | Deep machine learning methods and apparatus for robotic grasping | |
JP6640060B2 (ja) | ロボットシステム | |
US9878446B2 (en) | Determination of object-related gripping regions using a robot | |
EP3166084B1 (en) | Method and system for determining a configuration of a virtual robot in a virtual environment | |
US20130054030A1 (en) | Object gripping apparatus, object gripping method, and object gripping program | |
JP2018051704A (ja) | ロボット制御装置、ロボット、及びロボットシステム | |
EP1774443A1 (en) | System and method for simulating human movement using profile paths | |
JP2019018272A (ja) | モーション生成方法、モーション生成装置、システム及びコンピュータプログラム | |
JP6902369B2 (ja) | 提示装置、提示方法およびプログラム、ならびに作業システム | |
JP2018176311A (ja) | 情報処理装置、情報処理方法、プログラム、システム、および物品製造方法 | |
JP2018153873A (ja) | マニピュレータの制御装置、制御方法およびプログラム、ならびに作業システム | |
JP4649554B1 (ja) | ロボット制御装置 | |
JP6456557B1 (ja) | 把持位置姿勢教示装置、把持位置姿勢教示方法及びロボットシステム | |
US20220366660A1 (en) | Method and system for predicting a collision free posture of a kinematic system | |
JP2013182554A (ja) | 把持姿勢生成装置、把持姿勢生成方法及び把持姿勢生成プログラム | |
Jørgensen et al. | Usage of simulations to plan stable grasping of unknown objects with a 3-fingered Schunk hand | |
JP7376318B2 (ja) | アノテーション装置 | |
Mitrouchev et al. | Disassembly process simulation in virtual reality environment | |
CN115107020A (zh) | 训练用于控制机器人的神经网络的装置和方法 | |
JP2020087243A (ja) | 物体計数方法、物体計数装置、および、ロボットシステム | |
Park et al. | Robot-based Object Pose Auto-annotation System for Dexterous Manipulation | |
WO2023150238A1 (en) | Object placement | |
KIKUCHI et al. | Teaching of Grasp/Graspless Manipulation for Industrial Robots by Human Demonstration |
Legal Events
Date | Code | Title | Description
---|---|---|---
 | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 16883591; Country of ref document: EP; Kind code of ref document: A1
 | ENP | Entry into the national phase | Ref document number: 2017559981; Country of ref document: JP; Kind code of ref document: A
 | WWE | Wipo information: entry into national phase | Ref document number: 15758540; Country of ref document: US
 | NENP | Non-entry into the national phase | Ref country code: DE
 | 122 | Ep: pct application non-entry in european phase | Ref document number: 16883591; Country of ref document: EP; Kind code of ref document: A1