WO2021049186A1 - Tool inspection system


Info

Publication number
WO2021049186A1
Authority
WO
WIPO (PCT)
Prior art keywords
tool
camera
image
chipping
inspection
Prior art date
Application number
PCT/JP2020/028549
Other languages
English (en)
Japanese (ja)
Inventor
太輔 山﨑
大悟 檜山
正浩 白根
匠 渡辺
Original Assignee
株式会社牧野フライス製作所
Priority date
Filing date
Publication date
Application filed by 株式会社牧野フライス製作所
Publication of WO2021049186A1

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B23 MACHINE TOOLS; METAL-WORKING NOT OTHERWISE PROVIDED FOR
    • B23Q DETAILS, COMPONENTS, OR ACCESSORIES FOR MACHINE TOOLS, e.g. ARRANGEMENTS FOR COPYING OR CONTROLLING; MACHINE TOOLS IN GENERAL CHARACTERISED BY THE CONSTRUCTION OF PARTICULAR DETAILS OR COMPONENTS; COMBINATIONS OR ASSOCIATIONS OF METAL-WORKING MACHINES, NOT DIRECTED TO A PARTICULAR RESULT
    • B23Q17/00 Arrangements for observing, indicating or measuring on machine tools
    • B23Q17/09 Arrangements for observing, indicating or measuring on machine tools for indicating or measuring cutting pressure or for determining cutting-tool condition, e.g. cutting ability, load on tool
    • B23Q17/24 Arrangements for observing, indicating or measuring on machine tools using optics or electromagnetic waves
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B11/30 Measuring arrangements characterised by the use of optical techniques for measuring roughness or irregularity of surfaces

Definitions

  • This application relates to a tool inspection system.
  • Patent Document 1 discloses a system including a machine tool and a robot that supplies a workpiece to the machine tool and takes the workpiece out of the machine tool.
  • A moving part of the robot (for example, an arm) is provided with a camera that captures an image of a tool attached to the machine tool.
  • The robot supplies the unmachined workpiece to the machine tool and takes the machined workpiece out of the machine tool.
  • An image of the tool attached to the machine tool is taken with the camera, and the tool shape is detected from the image by image processing.
  • The amount of tool wear is calculated based on the detected tool shape, and the position of the tool is corrected based on the calculated amount of wear.
  • Patent Document 2 discloses a system for inspecting wear and chipping of a tool tip used in a machine tool.
  • In one configuration, a camera and a lighting device are arranged inside the machine tool and photograph a tool attached to the spindle of the machine tool from a direction perpendicular to the rotation axis of the spindle.
  • In another configuration, a camera and a lighting device are mounted on a robot placed around a tip inspection and replacement workbench outside the machine tool. The robot is taught so that the camera faces the flank of the tip of a tool mounted on the workbench.
  • The lengths of wear and chipping are determined based on the image taken by the camera, and if a calculated length is larger than a reference value, it is determined that the tip needs to be replaced.
  • One aspect of the present disclosure is a tool inspection system for evaluating a tool stored in a machine tool that rotates the tool to machine a workpiece. The tool inspection system comprises: a tool inspection stage for holding the tool; a first camera that captures an image of the tool held on the tool inspection stage from the end face side of the tool; a second camera that captures an image of the tool held on the tool inspection stage from the side surface side of the tool; an unmanned transport vehicle that has a robot arm for holding the tool and transports the tool from the machine tool to the tool inspection stage; and a tool evaluation unit that evaluates at least one of wear or chipping of the tool based on the images of the tool taken by the first camera and the second camera.
  • With this configuration, the image of the tool can be taken both from the end face side of the tool (by the first camera) and from the side surface side of the tool (by the second camera). Therefore, various types of tools can be evaluated efficiently with one tool inspection system.
  • The tool inspection stage may have a slide mechanism for moving in a direction perpendicular to the rotation axis of the tool, and the tool inspection stage may move between a first position, where the first camera and the second camera face the tool, and a second position away from the first position.
  • In this case, the robot arm can attach the tool to the tool inspection stage (and remove it from the stage) at the second position, away from the first camera and the second camera, and the slide mechanism can then return the tool inspection stage to the first position so that an image of the tool can be taken. Therefore, the robot arm and the tool can be prevented from coming into contact with the first camera and the second camera when the tool is attached to or removed from the inspection stage.
  • The tool evaluation unit may perform image processing on the images of the tool taken by the first camera and the second camera based on the result of machine learning using a plurality of past images taken by the first camera and the second camera.
  • In this case, tool wear and chipping are detected by image processing based on machine learning. Therefore, by using a plurality of images evaluated according to a consistent standard (for example, the judgment of a skilled operator) for the machine learning, evaluation with little variation can be performed.
  • According to the present disclosure, tool wear and chipping can be inspected accurately and efficiently.
  • FIG. 4A shows an example of an image of a tool including wear before image processing, and FIG. 4B shows an example of the image after image processing.
  • FIGS. 5A, 5B, 5C, and 5D show examples of images of a tool including no chipping, small chipping, moderate chipping, and large chipping, respectively.
  • FIG. 6 is a flowchart showing the evaluation of tool wear.
  • FIG. 7 is a flowchart showing the evaluation of tool chipping, illustrating a case where the threshold is a medium chipping size.
  • FIG. 1 is a schematic view showing a production system 200 to which the tool inspection system 100 according to the embodiment is applied.
  • In the production system 200, the machine tool 60 machines a workpiece, and at least one of wear or chipping of the tool T used by the machine tool 60 is evaluated by the tool inspection system 100.
  • The tool T is a holder fitted with a cutting tool such as an insert tip or a solid end mill.
  • The production system 200 can include, for example, one or more machine tools 60, the tool inspection system 100, a cabinet 80, and an integration station 90. The production system 200 may also include a main control device (not shown) that controls the entire production system 200, and the main control device may communicate with the control device (local control device) of each component of the production system 200.
  • The production system 200 may further include other components.
  • The machine tool 60 can be any of various types of machine tools in which the tool T is rotated by the spindle 61 to machine the workpiece (for example, a machining center).
  • The machine tool 60 can have a magazine 62 for storing a plurality of tools T.
  • The control device of the machine tool 60 (for example, an NC device and a machine control device) may communicate with, for example, the main control device of the production system 200, the control device 73 of the automatic guided vehicle 70, and the control device 5 of the tool inspection system 100.
  • The cabinet 80 stores, for example, a plurality of holders and cutting tools for the machine tool 60 individually, or a plurality of tools T in which cutting tools have been attached to holders by an operator so that they can be introduced into the machine tool 60.
  • A tool T newly introduced into the production system 200 has its cutting tool attached to a holder by an operator and is stored in the cabinet 80.
  • At the integration station 90, tools T determined by the tool inspection system 100 to require replacement are collected (details will be described later).
  • The tool inspection system 100 inspects at least one of wear or chipping of the cutting tool attached to the tool T stored in and used by the machine tool 60.
  • The tool inspection system 100 can serve one or more machine tools 60.
  • The tool inspection system 100 includes one or more automatic guided vehicles (AGVs) 70 and a tool inspection device 10.
  • The automatic guided vehicle 70 transports an uninspected tool T from the machine tool 60 or the cabinet 80 to the tool inspection stage 1 of the tool inspection device 10, and transports an inspected tool T from the tool inspection device 10 to the machine tool 60, the cabinet 80, or the integration station 90. The automatic guided vehicle 70 may also be configured to transport other articles in the production system 200.
  • The automatic guided vehicle 70 has a vehicle body 71, a robot arm 72, and a control device (local control device) 73.
  • The vehicle body 71 is configured to move between the machine tool 60, the tool inspection device 10, the cabinet 80, and the integration station 90; it can be, for example, a trackless vehicle.
  • The robot arm 72 can be, for example, a multi-axis articulated robot and can include a hand for holding the tool T (and other articles).
  • The control device 73 is configured to control the vehicle body 71 and the robot arm 72, and may communicate with, for example, the main control device of the production system 200, the NC device and machine control device of the machine tool 60, and the control device 5 of the tool inspection device 10.
  • The tool inspection device 10 is arranged on a table 20, for example.
  • The table 20 can also be used by an operator for work (e.g., attaching a cutting tool to a holder constituting the tool T).
  • The tool inspection device 10 has a tool inspection stage 1 (which may also be referred to simply as an "inspection stage" in the present disclosure), a first camera 2, a second camera 3, a housing 4, and a control device 5.
  • FIG. 2 is a schematic perspective view showing the tool inspection device 10 in FIG. 1, and FIG. 3 is a schematic side view showing the tool inspection device 10 of FIG. 2. The control device 5 is not shown in FIGS. 2 and 3.
  • The inspection stage 1 is configured to hold the tool T and includes a main body 11, a positioning mechanism 12, and a slide mechanism 13.
  • The main body 11 can be, for example, a substantially flat plate, and includes a handle 11a configured to be gripped by the robot arm 72 or by an operator.
  • The positioning mechanism 12 is configured such that the tool T is placed on the positioning mechanism 12 with the tip of the tool T facing upward.
  • The positioning mechanism 12 may be fixed on the main body 11, or may be attached to the main body 11 so as to be rotatable about the rotation axis Ot of the tool T in order to adjust the position of the tool T in the rotation direction. In the latter case, the position in the rotation direction may be adjusted by an actuator connected to the control device 5.
  • When the tool T includes a plurality of cutting edges (for example, when a plurality of cutting tools are attached to the tool T), each cutting edge can be brought into position by adjusting the rotational position of the positioning mechanism 12, without re-fixing the tool T on the positioning mechanism 12.
  • The positioning mechanism 12 includes a protrusion 12a that engages with a notch in the holder of the tool T, and is configured to hold the tool T in a predetermined orientation (a predetermined position in the rotation direction) by this engagement. The protrusion 12a may include a further protrusion (not shown) for preventing the tool T from sliding along the protrusion 12a in the radial direction. The tool T can therefore be arranged in a desired orientation at a predetermined position on the inspection stage 1, and the first camera 2 and the second camera 3 can take images of the tool T at that predetermined position and in that desired orientation.
  • The slide mechanism 13 is configured to move the main body 11, and the positioning mechanism 12 on the main body 11, in a direction (third direction) D3 perpendicular to the rotation axis Ot of the tool T.
  • The slide mechanism 13 can include, for example, a pair of linear guides L; the rail La of each linear guide L can be fixed to the bottom wall of the housing 4, and the block Lb can be fixed to the main body 11.
  • The inspection stage 1 is configured to move between a first position P1, where the first camera 2 and the second camera 3 face the tool T (a position where an image of the tool T can be taken), and a second position (not shown) separated from the first position. In FIG. 2, the inspection stage 1 is located at the first position P1.
  • The first camera 2 captures an image of the tool T held on the inspection stage 1 (more specifically, an image of the cutting edge of the tool T) from the end face side of the tool T, i.e., from a direction (first direction) D1 parallel to the rotation axis Ot of the tool T.
  • The first camera 2 may be fixed to the upper wall of the housing 4, or may be attached to the upper wall of the housing 4 so as to slide along the directions D1 and D2. In the latter case, the positions in the directions D1 and D2 may be adjusted, for example, by an actuator connected to the control device 5.
  • The first camera 2 may also be attached to the upper wall of the housing 4 so that its shooting angle can be adjusted, allowing the image of the cutting edge of the tool T to be taken from a direction tilted with respect to the direction D1. In this case, the image can be taken from a direction perpendicular to the cutting edge (for example, for an insert cutter with a variable lead or a solid end mill with corners). Alternatively, the tilt can be dealt with in software (for example, by capturing the image from the same direction during machine learning and during wear/chipping evaluation, or by correcting the captured image).
  • The first camera 2 can include, for example, a CCD or CMOS image sensor, and can further include optical elements such as a lens and a polarizing filter.
  • A ring illumination 21 is attached to the first camera 2 and is configured to emit ring-shaped light, centered on the first camera 2, toward the target tool T. The ring illumination 21 allows the first camera 2 to capture images of the tool T under constant light conditions, independent of the brightness of the surrounding environment and the direction of light from the surroundings. The ring illumination 21 may include, for example, one or more LEDs.
  • The second camera 3 captures an image of the tool T held on the inspection stage 1 (more specifically, an image of the cutting edge of the tool T) from the side surface side of the tool T, i.e., from a direction (second direction) D2 perpendicular to the rotation axis Ot of the tool T.
  • In the present embodiment, the direction D2 is perpendicular to the direction D3; in another embodiment, the direction D2 may be parallel to the direction D3.
  • The second camera 3 may be fixed to the side wall of the housing 4, or may be attached to the side wall of the housing 4 so as to slide along the directions D2 and D1. In the latter case, the positions in the directions D2 and D1 may be adjusted by an actuator connected to the control device 5.
  • The second camera 3 may also be attached to the side wall of the housing 4 so that its shooting angle can be adjusted, allowing the image of the cutting edge of the tool T to be taken from a direction tilted with respect to the direction D2. In this case, the image can be taken from a direction perpendicular to the cutting edge. Alternatively, the tilt can be dealt with in software as described above.
  • The second camera 3 can include, for example, a CCD or CMOS image sensor, and can further include optical elements such as a lens and a polarizing filter.
  • A ring illumination 31 is attached to the second camera 3 and is configured to emit ring-shaped light, centered on the second camera 3, toward the target tool T. The ring illumination 31 allows the second camera 3 to capture images of the tool T under constant light conditions, independent of the brightness of the surrounding environment and the direction of light from the surroundings. The ring illumination 31 may include, for example, one or more LEDs.
  • The housing 4 supports and houses the inspection stage 1, the first camera 2, and the second camera 3, and may be open in the direction in which the inspection stage 1 moves.
  • The housing 4 can include, for example, a frame and plates fixed to the frame; the plates may be transparent so that an operator can see the tool T from the outside.
  • The control device (tool evaluation unit) 5 evaluates at least one of wear or chipping of the tool T (more specifically, of the cutting edge of the tool T) based on the images of the tool T taken by the first camera 2 and the second camera 3. The images obtained by the first camera 2 and the second camera 3 are input to the control device 5.
  • The control device 5 may communicate with the first camera 2 and the second camera 3 by wire or wirelessly, and may be configured to control each component of the tool inspection device 10.
  • The control device 5 may also communicate with, for example, the main control device of the production system 200, the NC device and machine control device of the machine tool 60, and the control device 73 of the automatic guided vehicle 70.
  • The control device 5 can include, for example, a storage device 51 and a processor 52. The control device 5 can further include other components such as a ROM (read-only memory), a RAM (random-access memory), an input device (for example, a mouse, a keyboard, and/or a touch panel), and/or a display device (for example, a liquid crystal display and/or a touch panel); the components of the control device 5 are connected to one another via a bus (not shown) or the like.
  • The control device 5 can be, for example, a computer, a server, a tablet, or the like.
  • The storage device 51 can be, for example, one or more hard disk drives, and may be located at a remote site connected via a network rather than inside the housing of the control device 5.
  • The storage device 51 can store a plurality of past images (training data, or "teacher data") taken from the direction D1 by the first camera 2 and from the direction D2 by the second camera 3, and may store various other data.
  • The processor 52 can be, for example, one or more CPUs or GPUs.
  • The processor 52 is configured to perform machine learning using the plurality of training data stored in the storage device 51. Based on the result of the machine learning, the processor 52 performs image processing on new images of the tool T taken by the first camera 2 and the second camera 3, thereby detecting wear and chipping of the tool T from the new images.
  • Alternatively, the machine learning may be performed by another processor (not shown) independent of the tool inspection system 100, and the processor 52 may be configured to perform image processing on new images based on the result of the machine learning by that other processor.
  • For the machine learning, a neural network (for example, a convolutional neural network) can be used; for example, U-Net, a network having an encoder portion and a decoder portion, can be used.
  • The processor 52 may be further configured to perform various other processes related to the tool inspection device 10. A program for executing each process in the processor 52 can be stored, for example, in the storage device 51.
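  • As a non-limiting illustration (not part of the original disclosure), the following is a minimal Python/PyTorch sketch of U-Net-style inference for detecting the wear region. The network is greatly simplified (a real U-Net adds skip connections between encoder and decoder), and the class name, checkpoint file, and image size are hypothetical.

```python
# Simplified encoder-decoder in the spirit of U-Net; illustrative only.
import torch
import torch.nn as nn

class TinyUNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.enc = nn.Sequential(            # encoder: downsample, extract features
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
        )
        self.dec = nn.Sequential(            # decoder: upsample to input resolution
            nn.ConvTranspose2d(32, 16, 2, stride=2), nn.ReLU(),
            nn.Conv2d(16, 1, 3, padding=1),  # per-pixel wear score
        )

    def forward(self, x):
        return torch.sigmoid(self.dec(self.enc(x)))

model = TinyUNet()
# model.load_state_dict(torch.load("wear_unet.pt"))  # hypothetical checkpoint
model.eval()
with torch.no_grad():
    image = torch.rand(1, 1, 256, 256)   # stand-in for a grayscale camera image
    wear_mask = model(image) > 0.5       # binary map of the wear region A
```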
  • FIG. 4A shows an example of an image of the tool T including wear before image processing, and FIG. 4B shows an example of the image after image processing.
  • The tool T may include a wear region A after use. "Wear" can mean a state in which the cutting edge of the tool T is worn away as the tool T is used.
  • To detect such wear regions A, the processor 52 can use, for example, semantic segmentation as the machine-learning-based image processing.
  • The processor 52 can determine the wear width W based on the image after image processing. The width W can be the distance from the cutting edge to the edge of the wear region in the direction perpendicular to the cutting edge. For example, when the width W is equal to or greater than a predetermined threshold value (for example, 0.2 mm), the processor 52 can determine that the cutting tool of the tool T needs to be replaced.
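  • To make this decision step concrete, here is a short sketch (an illustrative assumption, not taken from the disclosure) of deriving the wear width W from a binary wear mask and applying the 0.2 mm threshold. The pixel-to-millimeter scale is a hypothetical camera calibration, and the mask orientation (cutting edge along the image rows) is assumed.

```python
# Derive wear width W from a segmented wear mask and apply the threshold.
import numpy as np

MM_PER_PIXEL = 0.005       # hypothetical optical calibration
WEAR_THRESHOLD_MM = 0.2    # threshold value named in the text

def wear_width_mm(mask: np.ndarray) -> float:
    # For each position along the (assumed horizontal) cutting edge, count
    # how many pixels the wear region extends away from the edge, and take
    # the maximum extent as the width W.
    per_column_extent = mask.sum(axis=0)
    return float(per_column_extent.max()) * MM_PER_PIXEL

wear_mask = np.zeros((256, 256), dtype=bool)   # stand-in segmentation output
wear_mask[100:150, 40:60] = True               # pretend wear region A
needs_replacement = wear_width_mm(wear_mask) >= WEAR_THRESHOLD_MM
print(wear_width_mm(wear_mask), needs_replacement)   # 0.25 True
```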
  • FIGS. 5A, 5B, 5C, and 5D show examples of images of the tool T including no chipping, small chipping, moderate chipping, and large chipping, respectively.
  • The tool T may include chipping C after use (or if the tool T is defective). "Chipping" can mean a state in which a part of the cutting edge of the tool T is missing.
  • To detect chipping, the processor 52 can use, for example, image recognition as the machine-learning-based image processing, and can determine the presence or absence of chipping C and the size of the chipping C based on this image processing.
  • For example, when the size of the chipping C is equal to or larger than a predetermined threshold (for example, a medium size), the processor 52 may determine that the cutting tool of the tool T needs to be replaced. In another embodiment, if the tool T includes any chipping C, the processor 52 may determine that the cutting tool of the tool T needs to be replaced regardless of the size of the chipping C.
  • The tool inspection system 100 acquires training data and executes machine learning prior to the evaluation of the tool T. Specifically, for the evaluation of wear, referring to FIG. 2, the inspection stage 1 is moved from the first position P1 to the second position using the handle 11a, and a tool T including wear is placed on the positioning mechanism 12 of the inspection stage 1. Subsequently, the inspection stage 1 is returned from the second position to the first position P1 using the handle 11a. These operations may be performed by an operator or by the robot arm 72. Images of the tool T are then taken by the first camera 2 and the second camera 3 and input to the control device 5. For each captured image, the wear region A is set based on a consistent standard (for example, the judgment of a skilled operator).
  • The images and the wear regions A are stored in the storage device 51 as training data.
  • The above operations are repeated for a plurality of tools T including wear. As training data that does not include wear, the above operations may also be executed for one or more tools T that do not include wear.
  • For the evaluation of chipping, the above operations are performed for a plurality of tools T with and without chipping, and the size of the chipping C (including "no chipping") is set for each captured image based on a consistent standard (for example, the judgment of a skilled operator). When wear/chipping is evaluated for different types of tools, the above operations are performed for each tool type.
  • The processor 52 uses the plurality of training data stored in the storage device 51 to perform machine learning in which the input is an image and the output is the wear region A, and machine learning in which the input is an image and the output is the size of the chipping C.
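  • The following is a minimal sketch (an illustrative assumption, not the disclosed implementation) of the two training setups just described: a segmentation task (image to wear region A) and a classification task (image to chipping size). The models are reduced to single layers to keep the sketch short; in practice, a U-Net-style network as noted above would serve as the segmentation model.

```python
# Training sketch for the two tasks; data, models, and hyperparameters are
# stand-ins, not values from the disclosure.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

# Stand-ins for the stored camera images and the operator-set labels.
images = torch.rand(32, 1, 64, 64)
wear_masks = (torch.rand(32, 1, 64, 64) > 0.9).float()     # wear region A
chip_sizes = torch.randint(0, 4, (32,))  # 0 none, 1 small, 2 medium, 3 large

seg_model = nn.Conv2d(1, 1, 3, padding=1)                       # image -> wear map
cls_model = nn.Sequential(nn.Flatten(), nn.Linear(64 * 64, 4))  # image -> chip size
seg_opt = torch.optim.Adam(seg_model.parameters(), lr=1e-3)
cls_opt = torch.optim.Adam(cls_model.parameters(), lr=1e-3)
seg_loss = nn.BCEWithLogitsLoss()
cls_loss = nn.CrossEntropyLoss()

loader = DataLoader(TensorDataset(images, wear_masks, chip_sizes), batch_size=8)
for epoch in range(10):
    for img, mask, size in loader:
        seg_opt.zero_grad()
        seg_loss(seg_model(img), mask).backward()   # task 1: segmentation
        seg_opt.step()
        cls_opt.zero_grad()
        cls_loss(cls_model(img), size).backward()   # task 2: classification
        cls_opt.step()
```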
  • FIG. 6 is a flowchart showing the evaluation of tool wear.
  • First, the automatic guided vehicle 70 places the tool T on the inspection stage 1 (step S100).
  • For example, the usage time of each tool T can be stored in the main control device of the production system 200 (or in the machine control device of each machine tool 60).
  • Based on a command from the main control device of the production system 200 (issued, for example, according to the stored usage time), the automatic guided vehicle 70 takes the corresponding tool T out of the magazine 62 of the machine tool 60 and carries it to the tool inspection device 10.
  • After placing the tool T on the vehicle body 71 (not shown in FIG. 2), the robot arm 72 uses the handle 11a to move the inspection stage 1 from the first position P1 to the second position. Subsequently, the robot arm 72 picks up the tool T from the vehicle body 71, places the tool T on the positioning mechanism 12 of the inspection stage 1, and again uses the handle 11a to return the inspection stage 1 from the second position to the first position P1.
  • Images of the tool T are subsequently taken by the first camera 2 and the second camera 3 (step S102).
  • When the tool T includes a plurality of cutting edges (for example, tips), an image of each cutting edge may be taken. In this case, the positions of the cutting edges may be stored in advance in the storage device 51 according to the type of the tool T, and the processor 52 may read the positions of the cutting edges from the storage device 51 for each tool T to be photographed.
  • In this way, images of the cutting edges can easily be taken even for a tool T having a plurality of cutting edges with non-uniform pitches or leads.
  • The images captured by the first camera 2 and the second camera 3 are input to the control device 5 (step S104).
  • The processor 52 then executes image processing (for example, semantic segmentation) on each image captured by the first camera 2 and the second camera 3 based on the result of the machine learning (step S106), whereby the wear region A is detected.
  • The processor 52 determines the width W of the detected wear region A in each image taken by the first camera 2 and the second camera 3 (step S108). Subsequently, the processor 52 determines whether or not the width W of the wear region A is equal to or greater than the threshold value in each image (step S110).
  • When it is determined in step S110 that the width W of the wear region A is equal to or greater than the threshold value, the processor 52 determines that the cutting tool of the tool T needs to be replaced, and sends a command to the control device 73 of the automatic guided vehicle 70 to carry the tool T to the integration station 90 (step S112). This completes the series of operations. When the width W of the wear region A is determined to be equal to or greater than the threshold value in at least one image taken by the first camera 2 or the second camera 3, the processor 52 can determine that the cutting tool of the tool T needs to be replaced.
  • When the tool T includes a plurality of cutting edges, the processor 52 executes steps S102 to S110 for all the cutting edges; if the width W of the wear region A is determined to be equal to or greater than the threshold value in at least one image of the plurality of cutting edges, the processor 52 can determine that only the corresponding part of the cutting tools of the tool T needs to be replaced. In these cases, the processor 52 may, for example, notify the operator at which cutting edge of the tool T, and by which camera, the wear region A equal to or greater than the threshold value was detected. The notification may be indicated, for example, on the display device of the control device 5 and/or by voice.
  • The cutting tool of a tool T carried to the integration station 90 can, for example, be replaced or reground. The processor 52 may also send a command to the automatic guided vehicle 70 to carry a substitute tool T to the magazine 62 of the corresponding machine tool 60.
  • When it is determined in step S110 that the width W of the wear region A is not equal to or greater than the threshold value in any of the images, the processor 52 determines that the cutting tool of the tool T does not need to be replaced, and sends a command to the control device 73 of the automatic guided vehicle 70 to return the tool T to the magazine 62 of the corresponding machine tool 60 (step S114). This completes the series of operations.
  • The command to the control device 73 of the automatic guided vehicle 70 may be transmitted directly from the control device 5 of the tool inspection device 10, or indirectly via the main control device of the production system 200.
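  • For orientation, the following condenses steps S100 to S114 into a short control-flow sketch; every helper function is a hypothetical stub standing in for the hardware and machine-learning actions described above, not an API from the disclosure.

```python
# Condensed sketch of the wear-evaluation flow (steps S100-S114).
def place_on_stage(tool): print("S100: tool placed on inspection stage 1")
def capture(camera, edge): return f"image({camera}, {edge})"   # S102
def wear_width_mm(image): return 0.15   # pretend S106/S108: segment + measure
def to_integration_station(tool): print("S112: carry to integration station 90")
def to_magazine(tool): print("S114: return to magazine 62")

def evaluate_wear(cutting_edges, cameras=("camera 1", "camera 2"),
                  threshold_mm=0.2):
    place_on_stage("tool T")
    for edge in cutting_edges:            # repeat S102-S110 per cutting edge
        for camera in cameras:            # images from both cameras
            width = wear_width_mm(capture(camera, edge))
            if width >= threshold_mm:     # S110: threshold check
                to_integration_station("tool T")
                return "replace"
    to_magazine("tool T")
    return "keep"

print(evaluate_wear(["edge 1", "edge 2"]))   # -> keep
```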
  • FIG. 7 is a flowchart showing the evaluation of tool chipping.
  • First, the automatic guided vehicle 70 places the tool T on the inspection stage 1 (step S200), and images of the tool T are taken by the first camera 2 and the second camera 3 (step S202). Subsequently, the captured images are input to the control device 5 (step S204). If both wear and chipping of the tool T are evaluated and steps S100 to S104 above have already been executed, steps S200 to S204 are omitted; that is, the evaluation of wear and the evaluation of chipping can be performed based on the same images.
  • The processor 52 then executes image processing (for example, image recognition) on each image captured by the first camera 2 and the second camera 3 based on the result of the machine learning (step S206), whereby the size of the chipping C is determined.
  • The processor 52 determines whether or not the size of the chipping C is medium or larger (step S208).
  • When it is determined in step S208 that the size of the chipping C is medium or larger, the processor 52 determines that the cutting tool of the tool T needs to be replaced, and sends a command to the control device 73 of the automatic guided vehicle 70 to carry the tool T to the integration station 90 (step S210). This completes the series of operations. When the size of the chipping C is determined to be medium or larger in at least one image taken by the first camera 2 or the second camera 3, the processor 52 can determine that the cutting tool of the tool T needs to be replaced.
  • When the tool T includes a plurality of cutting edges, the processor 52 executes steps S202 to S208 for all the cutting edges; if the size of the chipping C is determined to be medium or larger in at least one image of the plurality of cutting edges, the processor 52 can determine that the corresponding cutting tool of the tool T needs to be replaced. In these cases, the processor 52 may, for example, notify the operator at which cutting edge of the tool T, and by which camera, the chipping C of medium size or larger was detected. The notification may be indicated, for example, on the display device of the control device 5 and/or by voice.
  • The cutting tool of a tool T carried to the integration station 90 can, for example, be replaced or reground. The processor 52 may also send a command to the automatic guided vehicle 70 to carry a substitute tool T to the magazine 62 of the corresponding machine tool 60.
  • When it is determined in step S208 that the size of the chipping C is not medium or larger in any of the images, the processor 52 determines that the cutting tool of the tool T does not need to be replaced, and sends a command to the control device 73 of the automatic guided vehicle 70 to return the tool T to the magazine 62 of the corresponding machine tool 60 (step S212). This completes the series of operations.
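  • Analogously to the wear flow, steps S200 to S212 can be condensed as follows; the chipping classifier and the ordered size scale are hypothetical stand-ins for the machine-learning step (S206).

```python
# Condensed sketch of the chipping-evaluation flow (steps S200-S212).
CHIP_SIZES = ("none", "small", "medium", "large")   # ordered severity scale

def classify_chipping(image):     # pretend S206: ML-based image recognition
    return "small"

def evaluate_chipping(images, threshold="medium"):
    limit = CHIP_SIZES.index(threshold)
    for image in images:                             # images from both cameras
        if CHIP_SIZES.index(classify_chipping(image)) >= limit:   # S208
            return "replace"                         # S210: integration station 90
    return "keep"                                    # S212: back to magazine 62

print(evaluate_chipping(["image (camera 1)", "image (camera 2)"]))  # -> keep
```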
  • When both the wear evaluation (FIG. 6) and the chipping evaluation (FIG. 7) are executed, they may be executed sequentially or in parallel.
  • In one embodiment, the processor 52 may determine that the cutting tool of the tool T needs to be replaced only when, for a given cutting edge, the width W of the wear region is equal to or greater than the threshold value and the size of the chipping of that cutting edge is medium or larger.
  • In another embodiment, the processor 52 can determine that the cutting tool of the tool T needs to be replaced when a cutting edge of the tool T corresponds to at least one of a state in which the amount of wear is equal to or greater than a predetermined threshold value or a state in which the size of the chipping is equal to or greater than a predetermined threshold value.
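  • The two combination policies just described amount to an AND versus an OR over the per-edge criteria; a one-function sketch (illustrative only):

```python
# AND policy: replace only when both criteria are met for a cutting edge.
# OR policy: replace when either criterion is met.
def needs_replacement(wear_exceeds: bool, chip_exceeds: bool,
                      policy: str = "or") -> bool:
    if policy == "and":
        return wear_exceeds and chip_exceeds
    return wear_exceeds or chip_exceeds

print(needs_replacement(True, False, policy="and"))  # False
print(needs_replacement(True, False, policy="or"))   # True
```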
  • As described above, in the tool inspection system 100, images of the tool T can be taken by the first camera 2 from the end face side of the tool T (capturing the end face of the tool T) and by the second camera 3 from the side surface side of the tool T (capturing the side surface of the tool T). Depending on the type of the tool T, wear and chipping can occur on the end face, the side surface, or both. Therefore, the tool inspection system 100 can evaluate various types of tools efficiently with one tool inspection system.
  • Also, the inspection stage 1 has the slide mechanism 13 for moving in the direction D3 perpendicular to the rotation axis Ot of the tool T, and the inspection stage 1 is configured to move between the first position P1, where the first camera 2 and the second camera 3 face the tool T, and the second position separated from the first position. Therefore, the robot arm 72 can attach the tool T to the inspection stage 1, and remove the tool T from the inspection stage 1, at the second position, away from the first camera 2 and the second camera 3. This prevents the robot arm 72 and the tool T from coming into contact with the first camera 2 and the second camera 3.
  • Furthermore, the control device 5 is configured to perform image processing on new images of the tool T taken by the first camera 2 and the second camera 3 based on the result of machine learning using a plurality of past images taken by the first camera 2 and the second camera 3, so that wear and chipping of the tool T are detected by image processing based on machine learning. Therefore, by using a plurality of images evaluated according to a consistent standard (for example, the judgment of a skilled operator) for the machine learning, evaluation with little variation can be performed.
  • In the above embodiment, when it is determined that the cutting tool of the tool T needs to be replaced (when "YES" is determined in step S110 of FIG. 6, in step S208 of FIG. 7, or in both), the processor 52 sends a command to the automatic guided vehicle 70 to carry the tool T to the integration station 90.
  • Alternatively or additionally, the processor 52 may notify the operator when it determines that the cutting tool of the tool T needs to be replaced. The notification may be indicated, for example, on the display device of the control device 5 and/or by voice.
  • In the above embodiment, wear and chipping of a tool T used in the machine tool 60 are evaluated. In another embodiment, the tool T to be evaluated may be, for example, an unused tool stored in the cabinet 80 and newly introduced into the production system 200. In this case, whether or not the tool T has an initial defect can be evaluated, and a tool T having an initial defect can be prevented from being introduced into the production system 200.
  • Reference signs: 1 tool inspection stage; 2 first camera; 3 second camera; 5 control device (tool evaluation unit); 13 slide mechanism; 60 machine tool; 61 spindle

Abstract

The tool inspection system (100) of the invention comprises: a tool inspection stage (1) for holding a tool (T); a first camera (2) for capturing, from an end face side of the tool (T), an image of the tool (T) held on the tool inspection stage (1); a second camera (3) for capturing, from a side surface side of the tool (T), an image of the tool (T) held on the tool inspection stage (1); an unmanned transport vehicle (70) having a robot arm (72) for holding the tool (T) and transporting the tool (T) from a machine tool (60) to the tool inspection stage (1); and a tool evaluation unit (5) that evaluates at least one of wear and/or chipping of the tool based on the images of the tool (T) captured by the first camera (2) and the second camera (3).
PCT/JP2020/028549 2019-09-13 2020-07-22 Tool inspection system WO2021049186A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019-167567 2019-09-13
JP2019167567A JP2021043167A (ja) 工具検査システム (Tool inspection system)

Publications (1)

Publication Number Publication Date
WO2021049186A1 (fr) 2021-03-18

Family

ID=74862253

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/028549 WO2021049186A1 (fr) Tool inspection system

Country Status (2)

Country Link
JP (1) JP2021043167A (ja)
WO (1) WO2021049186A1 (fr)


Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115365889A (zh) * 2022-09-17 2022-11-22 杭州鹏润电子有限公司 一种断刀检测方法、系统及存储介质 (Broken-tool detection method, system, and storage medium)


Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2014178150A * 2013-03-13 2014-09-25 Aron Denki Co Ltd 切削工具検査装置 (Cutting tool inspection device)
WO2018092222A1 * 2016-11-16 2018-05-24 株式会社牧野フライス製作所 Machine tool system

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114705690A (zh) * 2022-04-19 2022-07-05 华侨大学 一种刀具视觉自动化检测设备及刀具检测方法 (Tool vision automated inspection device and tool inspection method)

Also Published As

Publication number Publication date
JP2021043167A (ja) 工具検査システム (Tool inspection system) 2021-03-18


Legal Events

Date Code Title Description
121 EP: the EPO has been informed by WIPO that EP was designated in this application (Ref document number: 20863108; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 EP: PCT application non-entry in European phase (Ref document number: 20863108; Country of ref document: EP; Kind code of ref document: A1)