WO2020026447A1 - Parameter learning method and work system - Google Patents

Parameter learning method and work system

Info

Publication number
WO2020026447A1
Authority
WO
WIPO (PCT)
Prior art keywords
work
image
parameter
imaging
evaluation
Prior art date
Application number
PCT/JP2018/029287
Other languages
English (en)
Japanese (ja)
Inventor
内田 剛
博史 大池
弘健 江嵜
アヌスヤ ナラサンビ
Original Assignee
株式会社Fuji
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社Fuji
Priority to PCT/JP2018/029287 priority Critical patent/WO2020026447A1/fr
Priority to JP2020534026A priority patent/JP7121127B2/ja
Priority to CN201880096177.2A priority patent/CN112512942B/zh
Publication of WO2020026447A1 publication Critical patent/WO2020026447A1/fr

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B65CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
    • B65GTRANSPORT OR STORAGE DEVICES, e.g. CONVEYORS FOR LOADING OR TIPPING, SHOP CONVEYOR SYSTEMS OR PNEUMATIC TUBE CONVEYORS
    • B65G47/00Article or material-handling devices associated with conveyors; Methods employing such devices
    • B65G47/02Devices for feeding articles or materials to conveyors
    • B65G47/04Devices for feeding articles or materials to conveyors for feeding articles
    • B65G47/12Devices for feeding articles or materials to conveyors for feeding articles from disorderly-arranged article piles or from loose assemblages of articles
    • B65G47/14Devices for feeding articles or materials to conveyors for feeding articles from disorderly-arranged article piles or from loose assemblages of articles arranging or orientating the articles by mechanical or pneumatic means during feeding

Definitions

  • This specification discloses a parameter learning method and a work system.
  • The main object of the present disclosure is to perform parameter learning appropriately without lowering the work efficiency of the operation that unravels the bulk state of the works.
  • This disclosure employs the following means to achieve the above-mentioned main object.
  • A parameter learning method is a method of learning a parameter for controlling an actuator that unravels a bulk state of a plurality of works by a predetermined operation, the method including: an in-operation imaging step of imaging the plurality of works during execution of the predetermined operation; an evaluation step of processing the image captured in the in-operation imaging step to evaluate a separation state of the plurality of works; and a learning step of learning a relationship between the evaluation result of the evaluation step and the parameter in the predetermined operation being executed.
  • The parameter learning method of the present disclosure images a plurality of works during execution of a predetermined operation that unravels their bulk state, processes the captured images to evaluate the separation state of the works, and learns the relationship between the evaluation result and the parameter of the predetermined operation being executed. Because the parameter is learned from images captured during the operation itself, the actuator does not need to interrupt the predetermined operation for learning. The parameter can therefore be learned appropriately without lowering the work efficiency of unraveling the bulk state of the works.
  • FIG. 1 is a configuration diagram illustrating a schematic configuration of a work system.
  • FIG. 2 is a configuration diagram illustrating an outline of a configuration of a work transfer device.
  • FIG. 3 is a partial external view of the work transfer device 20 as viewed from the back side.
  • FIG. 4 is an explanatory diagram showing the electrical connection relationships of the control device 70.
  • FIG. 5 is an explanatory diagram showing how the bulk state of the works W is unraveled.
  • FIG. 6 is a block diagram showing functions of the control device 70.
  • FIG. 7 is a flowchart illustrating an example of a loosening operation control routine.
  • FIG. 8 is a flowchart illustrating an example of the processing at the start of a loosening operation.
  • FIG. 9 is an explanatory diagram showing an example of the method of calculating the differences ΔA and ΔG.
  • FIG. 10 is an explanatory diagram showing an example of the region area A and the center of gravity G.
  • FIG. 1 is a configuration diagram schematically showing the configuration of the work system 10
  • FIG. 2 is a configuration diagram schematically showing the configuration of the work transfer device 20
  • FIG. 3 is a partial external view of the work transfer device 20 as viewed from the back.
  • FIG. 4 is an explanatory diagram showing the electrical connection relationships of the control device 70. In FIGS. 1 and 2, the left-right direction is the X-axis direction, the front-rear direction is the Y-axis direction, and the up-down direction is the Z-axis direction.
  • the work system 10 is a system in which the work W stored in the supply box 12 is transferred to the mounting table T and aligned. As shown in FIG. 1, the work system 10 includes a mounting table transfer device 16, a work transfer device 20, a supply robot 40, and a sampling robot 50. These are installed on the workbench 11.
  • the mounting table transport device 16 has a pair of belt conveyors that are stretched in the left-right direction (X-axis direction) with an interval in the front-rear direction (Y-axis direction).
  • the mounting table T is transported from left to right by a belt conveyor.
  • the supply robot 40 is a robot for taking out the work W as various parts such as mechanical parts and electric parts from the supply box 12 and supplying the work W to the supply area A1 of the work transfer device 20 (see FIG. 2).
  • the supply robot 40 includes a vertical articulated robot arm 41 and an end effector 42.
  • The robot arm 41 includes a plurality of links, a plurality of joints that rotatably or pivotally connect the links, a drive motor 44 (see FIG. 4) that drives each joint, and an encoder 45 (see FIG. 4) that detects the angle of each joint.
  • the plurality of links include a distal link to which the end effector 42 is attached, and a proximal link fixed to the worktable 11.
  • the end effector 42 can hold and release the work W.
  • the end effector 42 can use, for example, a mechanical chuck, a suction nozzle, an electromagnet, or the like, and supplies the work W to the supply area A1 in a bulk state.
  • the collection robot 50 is a robot for collecting the work W in the collection area A2 (see FIG. 2) of the work transfer device 20, transferring the work W to the mounting table T, and aligning the work W.
  • the sampling robot 50 includes a vertical articulated robot arm 51 and an end effector 52.
  • The robot arm 51 includes a plurality of links, a plurality of joints that rotatably or pivotally connect the links, a drive motor 54 (see FIG. 4) that drives each joint, and an encoder 55 (see FIG. 4) that detects the angle of each joint.
  • the plurality of links include a distal link to which the end effector 52 is attached, and a proximal link fixed to the worktable 11. The end effector 52 can hold and release the work W.
  • A camera 53 is also attached to the distal link of the robot arm 51 for imaging the work W conveyed by the work transfer device 20 and the mounting table T conveyed by the mounting table transfer device 16, in order to grasp their positions and states.
  • the work transfer device 20 has a plurality of transfer lanes 21 that can transfer the work W from the supply area A1 to the collection area A2 in the front-rear direction (Y-axis direction).
  • a plurality of supply boxes 12 for accommodating works W to be supplied to each of the plurality of transfer lanes 21 are arranged behind the work transfer device 20.
  • The work transfer device 20 includes a conveyor belt 22 and partitions 25. As shown in FIG. 2, the conveyor belt 22 is stretched over a driving roller 23a and a driven roller 23b. The work W is placed on the upper surface portion 22a (placement portion) of the conveyor belt 22, and the driving roller 23a is rotationally driven by the drive motor 38 (see FIG. 4) to convey the work W in the belt feeding direction. Side walls 24a and 24b are provided on both sides of the conveyor belt 22, and the driving roller 23a and the driven roller 23b are rotatably supported by these side walls. As shown in FIG. 3, the work transfer device 20 has a support plate 28 on the back side of the upper surface portion 22a of the conveyor belt 22.
  • the support plate 28 prevents the conveyor belt 22 from bending due to the weight of the work W placed on the upper surface 22a.
  • openings 28a are formed at positions corresponding to the collection areas A2 of the plurality of transport lanes 21, respectively.
  • a vertical movement device 30 for pushing up the upper surface portion 22a from the back surface and moving the upper surface portion 22a up and down is arranged below each opening 28a.
  • the vertical movement device 30 includes a contact body 31 and a cylinder 32 for vertically moving the contact body 31 so as to penetrate the opening 28a.
  • the cylinder 32 is supported by a support base 29 fixed to the side walls 24a, 24b.
  • the partition 25 is a partition plate that partitions one conveyor belt 22 (upper surface portion 22a) into a plurality of transport lanes 21.
  • the partitions 25 extend in parallel to the side walls 24a and 24b arranged on both sides of the conveyor belt 22, and are arranged at equal intervals so that each of the transport lanes 21 has the same lane width.
  • control device 70 is configured as a known computer including a CPU, a ROM, a HDD, a RAM, an input / output interface, a communication interface, and the like.
  • Various signals from the encoder 45 of the supply robot 40, the encoder 55 of the collection robot 50, the camera 53, the input device 80, and the like are input to the control device 70.
  • The control device 70 outputs various control signals to the drive motor 38 of the work transfer device 20, the vertical movement device 30 (cylinder 32), the drive motor 44 of the supply robot 40, the drive motor 54 of the collection robot 50, the camera 53, the mounting table transfer device 16, and the like.
  • the control device 70 learns parameters for controlling the cylinder 32 of the vertically moving device 30 and can control the cylinder 32 by determining an appropriate parameter based on the learning result.
  • FIG. 5 is an explanatory diagram showing a state where the bulk of the works W is unraveled.
  • In the loosening operation, the cylinder 32 moves the contact body 31 up and down (vibrates it) to break up the lump of works W in a bulk state into a separated state, so that the works W can be easily collected by the collection robot 50.
  • the control device 70 controls the cylinder 32 with parameters suitable for unraveling the work W according to the specifications of the work W, such as the weight, size, shape, and material of the work W.
  • examples of the parameters include an impact force and a vibration frequency when the conveyor belt 22 is pushed up by the vertical movement (vibration) of the contact body 31.
  • FIG. 6 is a block diagram showing functions of the control device 70.
  • the control device 70 includes a parameter learning unit 70A that mainly learns parameters, and a drive control unit 70B that mainly determines appropriate parameters and drives and controls the vertical movement device 30.
  • the parameter learning unit 70A includes a learning model 71, an imaging processing unit 72, an evaluation processing unit 73, and a learning processing unit 74.
  • the imaging processing unit 72 causes the camera 53 to image the work W in a bulk state or the separated state in which the work W is unraveled, and inputs the taken image.
  • the evaluation processing unit 73 processes the captured image, calculates a predetermined evaluation value regarding the separation state of the work W, and evaluates the separation state.
  • The learning processing unit 74 learns, by well-known machine learning, the relationship between the parameter of the unraveling operation being executed and the evaluation result of the evaluation processing unit 73, and constructs a learning model 71 that includes correlations with the specifications of the work W and the like. Examples of the learning method include reinforcement learning and genetic algorithms, and other methods may also be used.
  • the drive control unit 70B includes a parameter determination unit 75 and a drive unit 76.
  • the parameter determining unit 75 determines a parameter corresponding to the specification of the work W or the like using the learning model 71, or appropriately determines an arbitrary parameter.
  • the drive unit 76 controls the cylinder 32 of the vertical movement device 30 based on the parameters determined by the parameter determination unit 75.
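The disclosure leaves the learning scheme open (reinforcement learning, genetic algorithms, or others). As a minimal illustrative sketch only — not the patented method — the interplay of the learning model 71, the learning processing unit 74, and the parameter determination unit 75 could be approximated by an epsilon-greedy scheme that tracks the average evaluation per candidate parameter; the candidate values and epsilon are assumptions introduced for illustration:

```python
import random

class SimpleLearningModel:
    """Illustrative stand-in for learning model 71: keeps the running
    average evaluation for each candidate parameter (e.g. a vibration
    frequency). Epsilon-greedy selection is an assumption; the
    disclosure only requires relating evaluation results to the
    parameter in use."""

    def __init__(self, candidates, epsilon=0.1):
        self.stats = {p: [0.0, 0] for p in candidates}  # score sum, count
        self.epsilon = epsilon

    def learn(self, param, score):
        """Learning step: fold one evaluation result into the model."""
        self.stats[param][0] += score
        self.stats[param][1] += 1

    def determine(self):
        """Parameter determination: mostly exploit the best-scoring
        candidate, occasionally explore another one."""
        if random.random() < self.epsilon:
            return random.choice(list(self.stats))
        return max(self.stats,
                   key=lambda p: self.stats[p][0] / (self.stats[p][1] or 1))
```

With epsilon set to zero the model always returns the candidate with the best average evaluation so far, mirroring the drive control unit 70B choosing "an appropriate parameter based on the learning result".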
  • each control such as the supply control and the transfer control of the work W, the loosening operation control, and the sampling and placement control is sequentially performed.
  • the supply control is performed by collecting the work W from the supply box 12 in accordance with the supply order and controlling the drive of the supply robot 40 so as to supply the work W to the supply area A1 of the corresponding transfer lane 21.
  • the supply order is, for example, an order specified by an operator by operating the input device 80.
  • the transfer control is performed by controlling the drive of the work transfer device 20 so that the work W supplied to the supply area A1 reaches the collection area A2.
  • the loosening operation control is performed by controlling the driving of the cylinder 32 of the vertical movement device 30 corresponding to the collection area A2 in a state where the work W has reached the collection area A2.
  • Sampling and placement control is performed by driving and controlling the sampling robot 50 so as to collect the works W that have been unraveled and separated by the loosening operation control and to align and mount them on the mounting table T.
  • the collection robot 50 is driven and controlled so that the work W in the collection area A2 is captured by the camera 53, and the captured image is processed to collect the selected work W.
  • These controls performed on the work W in each transfer lane 21 may be performed in parallel if they do not affect the control on the work W in the other transfer lanes 21.
  • the details of the loosening operation control will be described based on the loosening operation control routine shown in FIG.
  • the control device 70 determines whether or not it is time to start the loosening operation (S100).
  • The control device 70 determines that it is the start timing when the work W has reached the collection area A2 by the transport control and the vertical movement device 30 is in a drivable state. Note that the control device 70 may also determine that the start timing has been reached when, for example, another loosening operation needs to be performed on the works W remaining after some works W have been collected following a previous loosening operation.
  • the control device 70 executes the process for starting the loosening operation shown in FIG. 8 (S105).
  • The control device 70 first initializes the number n indicating the imaging order to a value of 1 (S200), and causes the camera 53 to capture image 1, the image of number 1, before the start of the loosening operation (S205). Next, the control device 70 processes image 1 to detect the outer edge of the lump region of the works W, and calculates the region area A(1) and the center of gravity G(1) of that region (S210). The works W are supplied in a bulk state to the supply area A1 and transported to the collection area A2.
  • In image 1, a plurality of works W are intertwined to form a lump, and the region of the works W differs in brightness value and the like from the upper surface portion 22a of the conveyor belt 22 that serves as the background.
  • The control device 70 therefore converts image 1 into a grayscale image, detects the boundary between the lump of works W and the upper surface portion 22a as the outer edge of the region of the works W from the grayscale image, calculates the area enclosed by the outer edge as the region area A(n) (here, A(1)), and calculates the position of the center of gravity of the region enclosed by the outer edge as the center of gravity G(n) (here, G(1)).
  • The control device 70 is not limited to using a grayscale image and may use a binarized image. Subsequently, the control device 70 sets the parameters of the loosening operation (S215), starts the loosening operation by driving the cylinder 32 of the vertical movement device 30 based on the set parameters (S220), and ends the processing at the start of the loosening operation. In S215, the control device 70 selects and sets parameters suitable for the current work W from the learning model 71. Note that the control device 70 may appropriately set an arbitrary parameter when selection is difficult, for example, when the work W is new.
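The area and centroid computation of S210 (and S125 later) can be sketched as follows. Thresholding a grayscale image stands in for the boundary detection described above, and the assumed background brightness of the belt surface is an illustrative value, not one from the disclosure:

```python
import numpy as np

def evaluate_region(gray, background_level=200):
    """Return (region area A(n), center of gravity G(n)) for the
    workpiece-lump region, taken here as the pixels darker than the
    conveyor-belt background (an assumption about the contrast)."""
    mask = gray < background_level           # works assumed darker than surface 22a
    area = int(mask.sum())                   # region area A(n), in pixels
    if area == 0:
        return 0, (0.0, 0.0)
    rows, cols = np.nonzero(mask)
    return area, (rows.mean(), cols.mean())  # center of gravity G(n)
```

A binarized input works the same way, matching the note that a binarized image may be used instead of a grayscale one.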
  • the control device 70 determines whether a predetermined timing during the loosening operation has come (S110).
  • The predetermined timing may be a timing each time a predetermined time elapses after starting the loosening operation, or a timing each time the cylinder 32 of the vertical movement device 30 completes a predetermined number of vertical movements. The predetermined timing occurs a plurality of times between the start and end of the loosening operation. If it is determined in S110 that the predetermined timing has come, the control device 70 increments the number n by one (S115) and causes the camera 53 to capture image n, the image of number n, during execution of the loosening operation (S120).
  • Capturing image n during execution of the loosening operation means capturing it without interrupting the continuous vertical movement of the cylinder 32. Since image n is therefore captured while the works W are bouncing in various directions due to the vibration, each work W appears blurred in the image.
  • The control device 70 processes image n to calculate the region area A(n) and the center of gravity G(n) (S125), and determines whether the works W have been sufficiently separated (spread out) to be collectable by the collection robot 50 (S130).
  • The processing of S125 is performed in the same manner as S210 of the processing at the start of the loosening operation, except that image n, in which the works W are blurred, is used. Note that even in the blurred image n, the approximate boundary between the lump of works W and the upper surface portion 22a of the conveyor belt 22 can be detected, so the outer edge of the region of the works W is detected.
  • In S130, the control device 70 determines that the works W have been sufficiently separated when some of the individual works W can be recognized in the processed image n, that is, when the works have reached a state in which they can be collected by the collection robot 50.
  • Alternatively, the control device 70 may predict a region area Ae at which the works W would be sufficiently separated based on, for example, the specifications and number of the works W, and determine that the works W have been sufficiently separated when the region area A(n) calculated in S125 is greater than or equal to the area Ae.
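A minimal sketch of this S130 check under stated assumptions — here Ae is predicted as the summed footprint of the individual works scaled by a coverage margin; the disclosure only says Ae is predicted from the works' specifications and number, so this particular prediction rule is hypothetical:

```python
def sufficiently_separated(area_n, footprint_per_work, num_works, margin=0.8):
    """S130: the works count as sufficiently separated once the measured
    region area A(n) reaches the predicted fully-separated area Ae.
    footprint_per_work and margin are illustrative assumptions."""
    area_e = margin * footprint_per_work * num_works  # predicted area Ae
    return area_n >= area_e
```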
  • The control device 70 calculates the difference ΔA and the difference ΔG as evaluation values to evaluate the separation state (S135).
  • The difference ΔA is calculated as the area difference between the region area A(n) and a reference area.
  • The difference ΔG is calculated as the distance difference between the center of gravity G(n) and a reference center of gravity.
  • FIG. 9 is an explanatory diagram showing an example of a method of calculating the differences ⁇ A and ⁇ G.
  • In the method shown in FIG. 9A, the region area A(1) and the center of gravity G(1) of image 1, captured in the processing at the start of the loosening operation, are used as references, and the difference ΔA (= A(n) − A(1)) and the difference ΔG (= G(n) − G(1)) are calculated.
  • Because image 1 is captured before the start of the loosening operation, it is a clear image in which the works W are stationary and not blurred, so the differences ΔA and ΔG can be calculated more accurately when it is used as the reference.
  • the differences ⁇ A and ⁇ G may appear more prominently as the loosening operation time becomes longer, so that the evaluation becomes easier and learning can be performed more appropriately.
  • In the method shown in FIG. 9B, the region area A(n−1) and the center of gravity G(n−1) of image (n−1), captured at the predetermined timing immediately before the timing at which image n is captured, are used as references to calculate the difference ΔA (= A(n) − A(n−1)) and the difference ΔG (= G(n) − G(n−1)). With this method, evaluation is possible even when a parameter is changed during the loosening operation in the processing described later.
  • The methods of FIGS. 9A and 9B may be selected based on, for example, an operator's designation made by operating the input device 80, or one method may be switched to the other during operation.
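Both reference choices in FIG. 9 produce the same pair of evaluation values; a sketch follows, reading the "distance difference" ΔG as the Euclidean distance between centroids (one plausible interpretation, not stated explicitly in the disclosure):

```python
import math

def evaluation_values(area_n, g_n, area_ref, g_ref):
    """Return (ΔA, ΔG) against a reference image: image 1 for the
    FIG. 9A method, image (n-1) for the FIG. 9B method."""
    delta_a = area_n - area_ref      # ΔA: how much the lump has spread
    delta_g = math.dist(g_n, g_ref)  # ΔG: how far the lump has moved
    return delta_a, delta_g
```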
  • FIG. 10 is an explanatory view showing an example of the area A and the center of gravity G.
  • the area A (1) and the center of gravity G (1) as references used in the method shown in FIG. 9A are indicated by solid lines, and the area A (n) and the center of gravity G (n) of the image n are indicated by dotted lines.
  • FIG. 10A shows a state in which the region area A has increased while the center of gravity G has hardly changed, so the difference ΔG is small and the difference ΔA is large. Such a state means the works W have been unraveled and separated without greatly deviating from the collection area A2, so the evaluation is high.
  • FIG. 10B shows a state in which the center of gravity G has moved while the region area A has hardly changed, so the difference ΔA is small and the difference ΔG is large.
  • In such a state, the position of the works W has shifted as a whole without the lump of works W being sufficiently loosened, so the evaluation is low.
  • Thus, even from image n, which is captured during the unraveling operation and in which the works W are blurred, it is possible to detect the lump region of the works W and evaluate the separation state by simple processing.
  • Although illustration is omitted, if the region area A is large even when the center of gravity G has moved, it can be evaluated that the lump of works W has been unraveled, so the evaluation is higher than that of FIG. 10B.
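One way to fold the FIG. 10 cases into a single score for the learning step is a weighted combination that rewards area growth and penalizes pure centroid shift. The weights below are illustrative assumptions, chosen so that a large ΔA still rates above the FIG. 10B case even when G has moved, as the text describes:

```python
def separation_score(delta_a, delta_g, w_area=1.0, w_shift=0.5):
    """Scalar evaluation of a (ΔA, ΔG) pair: higher means the lump
    spread out (FIG. 10A) rather than merely sliding across the belt
    (FIG. 10B). The weights are assumptions, not disclosed values."""
    return w_area * delta_a - w_shift * delta_g
```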
  • the control device 70 updates the learning model 71 by learning such an evaluation result for the current parameter (S140), and determines whether or not a change to a more suitable parameter is necessary (S145).
  • the control device 70 makes the determination in S145 based on whether or not a parameter more suitable for the current work W can be selected from the updated learning model 71.
  • If the control device 70 determines in S145 that a parameter change is necessary, it changes the parameter to a more suitable one, continues the loosening operation (S150), and returns to S110.
  • If the control device 70 determines that a parameter change is unnecessary because the evaluation of the separation state under the current parameter is relatively high, it continues the loosening operation with the current parameter (S155) and returns to S110.
  • If the control device 70 determines in S130, while performing such processing, that the works W have been sufficiently separated, it stops driving the cylinder 32 of the vertical movement device 30 to end the loosening operation (S160) and returns to S100.
  • the cylinder 32 of the vertically moving device 30 of the present embodiment corresponds to an actuator
  • S120 of the loosening operation control routine in FIG. 7 corresponds to an in-operation imaging step
  • S125 and S135 of the same processing correspond to evaluation steps
  • S140 corresponds to a learning step
  • S205 of the processing at the start of the loosening operation in FIG. 8 corresponds to a pre-operation imaging step
  • S210 of the processing corresponds to an acquisition step.
  • the work system 10 corresponds to a work system
  • the sampling robot 50 corresponds to a robot
  • the camera 53 corresponds to an imaging device
  • the imaging processing unit 72 that executes S120 of the unraveling operation control routine corresponds to an imaging processing unit.
  • the evaluation processing unit 73 executing S125 and S135 of the processing corresponds to an evaluation processing unit
  • the learning processing unit 74 executing S140 of the processing corresponds to a learning processing unit.
  • In the work system 10 described above, the separation state of the plurality of works W is evaluated by processing image n captured during the unraveling operation on the plurality of works W, and the relationship between the evaluation result and the parameter of the operation being executed is learned. Since the vertical movement device 30 therefore does not need to interrupt the loosening operation for learning, learning can be performed appropriately without lowering the work efficiency of the loosening operation.
  • Further, since the separation state of the works W is evaluated with reference to image 1, captured while the works W are stationary before the loosening operation starts, the evaluation values are obtained accurately and learning can be performed more appropriately. Further, since the separation state of the works W can also be evaluated with reference to the immediately preceding image (n−1) for the image n captured at a predetermined timing during the unraveling operation, the evaluation values can be acquired and learning performed without interrupting the loosening operation even if a parameter is changed during the operation.
  • Since the separation state is evaluated using the area A(n) enclosed by the outer edge of the region of the works W, the separation state can be appropriately evaluated by simple processing even if a clear image cannot be obtained. Further, since the separation state is evaluated using both the area A(n) and the center of gravity G(n), it can be evaluated more appropriately.
  • the evaluation is performed using the area A (n) and the center of gravity G (n).
  • the present invention is not limited to this, and only the area A (n) may be used.
  • another evaluation value may be used.
  • For example, the height of the lump of works W may be detected from an image taken from the side and used as an evaluation value.
  • In the embodiment, the bulk state of the works W in image 1 and the separation state of the works W in the immediately preceding image (n−1) are used as references.
  • the present invention is not limited to this. Only one of them may be used.
  • For example, the separation state of the works W in the image captured immediately before a parameter change, or in the image captured immediately after the change, may be used as a reference.
  • In the embodiment, the work transfer device 20 includes a vertical movement device 30 for each transfer lane 21, but the upper surface portions 22a of the plurality of transfer lanes 21 may instead be moved up and down by a single vertical movement device 30.
  • the work transfer device 20 may include a support plate having an opening formed to extend over the plurality of transfer lanes 21.
  • In the embodiment, the work transfer device 20 includes the vertical movement device 30 that moves the collection area A2 of the conveyor belt 22 (upper surface portion 22a) up and down, but a vertical movement device 30 that moves the supply area A1 or another region up and down may also be provided.
  • the cylinder 32 of the up-and-down movement device 30 is exemplified as an actuator for unraveling the bulk state of the works W, but the actuator is not limited to this.
  • the workpiece W may be unraveled by an actuator that reciprocates a leveling member such as a brush in the X-axis direction or the Y-axis direction.
  • the separation state of the workpiece W can be evaluated without interrupting the reciprocation of the actuator by capturing the image n when the leveling member moves to the forward end position or the backward end position.
  • the parameters include the angle at which the leveling member is brought into contact with the workpiece W, the speed of the reciprocating motion, and the like.
  • a leveling member may be attached as the end effector 52 of the collection robot 50, and the collection robot 50 may perform a loosening operation (leveling operation).
  • a parameter for controlling the drive motor 54 as an actuator of the sampling robot 50 may be learned.
  • The actuator that unravels the works by the predetermined operation may be included in a robot that collects the works and performs the predetermined work.
  • the parameter learning method and the work system by the computer of the present disclosure may be configured as follows.
  • In the in-operation imaging step, the plurality of works may be imaged at every predetermined timing during execution of the predetermined operation, and in the evaluation step, the separation state may be evaluated using, as a reference, the separation state obtained by processing the image captured in the immediately preceding in-operation imaging step. In this way, even when the parameter is changed during execution of the predetermined operation, the change in the separation state can be properly grasped, so learning can be performed more appropriately.
  • In the evaluation step, the image may be processed to detect the outer edge of the region where the plurality of works are present, the area of the region may be calculated, and the separation state may be evaluated based on that area. In this case, even if a clear image cannot be obtained because the image is captured during execution of the predetermined operation, the separation state can be appropriately evaluated by simple processing.
  • The gist of a work system of the present disclosure is a work system including: an actuator that unravels a bulk state of a plurality of works by a predetermined operation; a robot that collects the works and performs a predetermined work; an imaging device that captures images; an imaging processing unit that causes the imaging device to image the plurality of works during execution of the predetermined operation; an evaluation processing unit that processes the captured image to evaluate a separation state of the plurality of works; and a learning processing unit that learns a relationship between the evaluation result of the evaluation processing unit and the parameter in the predetermined operation being executed.
  • Like the parameter learning method described above, this work system learns a parameter for controlling an actuator that performs a predetermined operation using an image captured during execution of that operation. The actuator therefore does not need to interrupt the predetermined operation for learning, and the parameter can be learned appropriately without lowering the work efficiency of unraveling the bulk state of the works.
  • a function for realizing each step of the parameter learning method described above may be added to the work system.
  • the present disclosure is applicable to, for example, the manufacturing industry of working systems.
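The reference-based evaluation described above can be sketched in Python as follows. This is an illustrative sketch, not the patent's implementation: the class name `ReferenceEvaluator` and the use of a scalar separation score are assumptions; the point is only that each new image is judged against the separation state obtained from the immediately preceding operation-in-progress imaging step.

```python
class ReferenceEvaluator:
    """Evaluates each new separation score relative to the previous one."""

    def __init__(self):
        self._reference = None  # separation score from the previous image

    def evaluate(self, score):
        """Return the change in separation relative to the reference."""
        # First image of the operation: no reference yet, so no change.
        if self._reference is None:
            self._reference = score
            return 0.0
        change = score - self._reference
        # The current score becomes the reference for the next imaging step,
        # so a parameter changed mid-operation is judged against the state
        # immediately before the change took effect.
        self._reference = score
        return change
```

Because the reference is refreshed at every imaging step, the evaluator tracks incremental improvement rather than absolute separation, which is what makes mid-operation parameter changes comparable.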
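The area-based evaluation of the separation state described above might look like the following minimal sketch. It assumes a binary image (nested lists, 1 = workpiece pixel) and approximates the outer edge of the workpiece region by the convex hull of those pixels; the hull construction (Andrew's monotone chain) and the shoelace area formula are illustrative choices, since the patent does not fix a particular edge-detection algorithm. A larger enclosed area suggests the workpieces have spread apart, i.e., are better separated.

```python
def convex_hull(points):
    """Andrew's monotone-chain convex hull; returns hull vertices in order."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts

    def cross(o, a, b):
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]


def separation_area(binary_image):
    """Area enclosed by the outer edge of the region containing workpieces."""
    points = [(x, y)
              for y, row in enumerate(binary_image)
              for x, v in enumerate(row) if v]
    hull = convex_hull(points)
    if len(hull) < 3:
        return 0.0
    # Shoelace formula for the polygon area of the hull.
    area = 0.0
    for (x1, y1), (x2, y2) in zip(hull, hull[1:] + hull[:1]):
        area += x1 * y2 - x2 * y1
    return abs(area) / 2.0
```

Only a coarse outline is needed, which is why this kind of evaluation remains usable even when the image is blurred by the ongoing operation.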
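The learning step above, relating the parameter in effect to the evaluation result, is left open by the patent; one minimal hypothetical scheme (the class name and greedy selection are my own choices) is a table that averages the evaluations observed under each tried parameter value and prefers the best-performing one:

```python
from collections import defaultdict


class ParameterLearner:
    """Learns which parameter value yields the best separation evaluations."""

    def __init__(self):
        self._totals = defaultdict(float)  # parameter -> summed evaluations
        self._counts = defaultdict(int)    # parameter -> number of samples

    def record(self, parameter, evaluation):
        """Store one (parameter, separation-evaluation) observation."""
        self._totals[parameter] += evaluation
        self._counts[parameter] += 1

    def best_parameter(self):
        """Parameter with the best average evaluation seen so far."""
        return max(self._counts,
                   key=lambda p: self._totals[p] / self._counts[p])
```

Because observations are recorded from images captured during the operation itself, such a learner can accumulate samples without ever pausing the unraveling work.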

Abstract

The invention concerns a parameter learning method for controlling an actuator so that it unravels a bulk state of a plurality of workpieces through a prescribed operation, the method comprising: an operation-in-progress imaging step of imaging the plurality of workpieces during execution of the prescribed operation; an evaluation step of processing an image captured in the operation-in-progress imaging step and evaluating the separation state of the plurality of workpieces; and a learning step of learning the relationship between the evaluation results of the evaluation step and a parameter of the prescribed operation during its execution.
PCT/JP2018/029287 2018-08-03 2018-08-03 Parameter learning method and work system WO2020026447A1 (fr)

Priority Applications (3)

Application Number Priority Date Filing Date Title
PCT/JP2018/029287 WO2020026447A1 (fr) 2018-08-03 2018-08-03 Parameter learning method and work system
JP2020534026A JP7121127B2 (ja) 2018-08-03 2018-08-03 Parameter learning method and work system
CN201880096177.2A CN112512942B (zh) 2018-08-03 2018-08-03 Parameter learning method and work system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2018/029287 WO2020026447A1 (fr) 2018-08-03 2018-08-03 Parameter learning method and work system

Publications (1)

Publication Number Publication Date
WO2020026447A1 true WO2020026447A1 (fr) 2020-02-06

Family

ID=69231559

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/029287 WO2020026447A1 (fr) 2018-08-03 2018-08-03 Parameter learning method and work system

Country Status (3)

Country Link
JP (1) JP7121127B2 (fr)
CN (1) CN112512942B (fr)
WO (1) WO2020026447A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023184034A1 (fr) * 2022-03-31 2023-10-05 Ats Automation Tooling Systems Inc. Systems and methods for feeding workpieces to a manufacturing line

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3172494B2 (ja) * 1997-11-17 2001-06-04 Adept Technology Inc. Impact-type parts feeder
JP2010241592A (ja) * 2009-04-03 2010-10-28 Satoru Kobayashi Vibration-free parts feeder
JP2017030135A (ja) * 2015-07-31 2017-02-09 Fanuc Corporation Machine learning device for learning workpiece pick-up operations, robot system, and machine learning method
WO2018092211A1 (fr) * 2016-11-16 2018-05-24 株式会社Fuji Transfer device and transport system

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS62174617A (ja) * 1986-01-29 1987-07-31 Teijin Eng Kk Powder and granular material weighing method
CA2735512A1 (fr) * 2010-04-01 2011-10-01 Siemens Aktiengesellschaft Method and apparatus for measuring a parameter during the transport of objects to a processing device
WO2013084831A1 (fr) * 2011-12-07 2013-06-13 Kao Corporation Powder application method, and application device and method for manufacturing a heat generating element using the same
CN104085667B (zh) * 2014-06-30 2016-05-25 Hefei Meyer Optoelectronic Technology Inc. Automatic feed adjustment module, method and device therefor, and bulk material foreign matter detection mechanism
WO2016043324A1 (fr) * 2014-09-19 2016-03-24 Ishida Co., Ltd. Dispersion and supply device and combination weighing device
CH711104A2 (de) * 2015-05-18 2016-11-30 Finatec Holding Ag Test method and test system for testing workpieces


Also Published As

Publication number Publication date
JP7121127B2 (ja) 2022-08-17
JPWO2020026447A1 (ja) 2021-08-02
CN112512942B (zh) 2022-05-17
CN112512942A (zh) 2021-03-16

Similar Documents

Publication Publication Date Title
JP6734402B2 (ja) Work machine
JP7163506B2 (ja) Work robot and work system
JP7231706B2 (ja) Work machine
WO2020026447A1 (fr) Parameter learning method and work system
JP7283881B2 (ja) Work system
JP6898374B2 (ja) Motion adjustment device for adjusting the motion of a robot device and motion adjustment method for adjusting the motion of a robot device
EP3205457B1 (fr) Transfer method and transfer apparatus
JP5606424B2 (ja) Component pick-up method and component pick-up system
WO2018092211A1 (fr) Transfer device and transport system
JP6814295B2 (ja) Component supply device and work system
WO2020021643A1 (fr) End effector selection system and selection method
WO2023013056A1 (fr) Workpiece pick-up method and workpiece pick-up system
JP7440635B2 (ja) Robot system
CN114450133A (zh) Robot control system, robot control method, and program
JP7257514B2 (ja) Component mounting system and learning device
JP4846678B2 (ja) Electronic component mounting device
JP6915085B2 (ja) Work machine and gripping position search method
CN111508014B (zh) System for eliminating interference among a plurality of randomly stacked workpieces
TW202027934A (zh) System for eliminating interference among a plurality of randomly stacked workpieces
JP5013124B2 (ja) Spatter removal device and spatter removal method
CN116529028A (zh) Device and method for supplying flexible annular workpieces

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18928452

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2020534026

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18928452

Country of ref document: EP

Kind code of ref document: A1