WO2020194444A1 - Processing system - Google Patents

Processing system

Info

Publication number
WO2020194444A1
Authority
WO
WIPO (PCT)
Prior art keywords
processing system
work
detection
processing
powder supply
Prior art date
Application number
PCT/JP2019/012482
Other languages
English (en)
Japanese (ja)
Inventor
壮史 松田
俊光 倉見
Original Assignee
株式会社ニコン (Nikon Corporation)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社ニコン (Nikon Corporation)
Priority to PCT/JP2019/012482
Publication of WO2020194444A1

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B23 MACHINE TOOLS; METAL-WORKING NOT OTHERWISE PROVIDED FOR
    • B23K SOLDERING OR UNSOLDERING; WELDING; CLADDING OR PLATING BY SOLDERING OR WELDING; CUTTING BY APPLYING HEAT LOCALLY, e.g. FLAME CUTTING; WORKING BY LASER BEAM
    • B23K 26/00 Working by laser beam, e.g. welding, cutting or boring
    • B23K 26/02 Positioning or observing the workpiece, e.g. with respect to the point of impact; Aligning, aiming or focusing the laser beam
    • B23K 26/04 Automatically aligning, aiming or focusing the laser beam, e.g. using the back-scattered light
    • B23K 26/046 Automatically focusing the laser beam

Definitions

  • the present invention relates to, for example, the technical field of a processing system for performing a processing operation.
  • Patent Document 1 describes a processing system that performs a processing operation of forming a modeled object by melting a powdery material with an energy beam and then solidifying the melted material.
  • in a processing system, it is a technical problem to suppress the occurrence of an abnormality caused by contact between a member used for performing a processing operation and another object.
  • a processing system is provided that includes a powder supply member that supplies powder to an object from a powder supply port, an irradiation optical system that irradiates the object with an energy beam, and a sensor that acquires relative position information between the powder supply member and the object.
  • a processing system is also provided that includes a powder supply member that supplies powder to an object from a powder supply port, an irradiation optical system that irradiates the object with an energy beam, and a sensor that acquires relative position information between the powder supply member and the object, in which the member provided with the powder supply port extends in a first direction and includes a first portion and a second portion located on the first direction side of the first portion, and the dimension of the second portion along a second direction intersecting the first direction is smaller than the dimension of the first portion along the second direction.
  • FIG. 1 is a cross-sectional view showing the structure of the processing system of the first embodiment.
  • FIG. 2 is a system configuration diagram showing a system configuration of the processing system of the first embodiment.
  • FIG. 3 is a schematic view showing the structure of the first detection device.
  • FIGS. 4A and 4B are schematic diagrams showing the detection principle of the first detection device.
  • FIG. 5 is a schematic view showing the structure of the second detection device.
  • FIG. 6 is a schematic view showing the structure of the third detection device.
  • FIG. 7 is a schematic view showing the structure of the third detection device.
  • FIG. 8 is a schematic view showing the structure of the fourth detection device.
  • FIG. 9 is a schematic view showing the structure of the fifth detection device.
  • FIG. 10 is a schematic view showing the structure of the sixth detection device.
  • FIGS. 11(a) to 11(e) are cross-sectional views showing a state in which light is irradiated and the modeling material is supplied to a certain region on the work.
  • FIGS. 12(a) to 12(c) are cross-sectional views showing the process of forming a three-dimensional structure.
  • FIG. 13 is a cross-sectional view showing the structure of the nozzle member included in the processing system of the second embodiment.
  • FIGS. 14(a) to 14(c) are cross-sectional views showing how the tip member separates from the main body member when stress is applied to the tip member.
  • FIG. 15 is a schematic view showing a modified example of the nozzle member.
  • an embodiment of the processing system will be described using the processing system SYS that performs additional processing on the work W, which is an example of an object.
  • an embodiment of the processing system will be described using the processing system SYS that performs additional processing based on the laser overlay welding method (LMD: Laser Metal Deposition).
  • the additional processing in this case is processing in which the modeling material M supplied to the work W is melted by the processing light EL to form a three-dimensional structure ST that is integrated with or separable from the work W.
  • the laser overlay welding method is also called direct metal deposition, directed energy deposition, laser cladding, laser engineered net shaping, direct light fabrication, laser consolidation, shape deposition manufacturing, wire-feed laser deposition, gas through wire, laser powder fusion, laser metal forming, selective laser powder remelting, laser direct casting, laser powder deposition, laser additive manufacturing, or laser rapid forming.
  • each of the X-axis direction and the Y-axis direction is a horizontal direction (that is, a predetermined direction in the horizontal plane), and the Z-axis direction is a vertical direction (that is, a direction orthogonal to the horizontal plane and, in effect, the direction of gravity).
  • the rotation directions (in other words, the inclination directions) around the X-axis, the Y-axis, and the Z-axis are referred to as the θX direction, the θY direction, and the θZ direction, respectively.
  • the Z-axis direction may be the direction of gravity.
  • the XY plane may be horizontal.
  • first, the machining system SYS of the first embodiment (hereinafter referred to as the machining system SYS1) will be described.
  • FIG. 1 is a cross-sectional view showing an example of the structure of the processing system SYS1 of the first embodiment.
  • FIG. 2 is a system configuration diagram showing an example of the system configuration of the processing system SYS1 of the first embodiment.
  • the processing system SYS1 can form a three-dimensional structure ST (that is, a three-dimensional object that has a size in each of the three dimensions).
  • the processing system SYS1 can form the three-dimensional structure ST on the work W that is the basis for forming the three-dimensional structure ST.
  • the processing system SYS1 can form a three-dimensional structure ST by performing additional processing on the work W.
  • the machining system SYS1 can form the three-dimensional structure ST on the stage 31.
  • the processing system SYS1 can also form the three-dimensional structure ST on an existing structure.
  • the processing system SYS1 may form a three-dimensional structure ST integrated with the existing structure.
  • the operation of forming the three-dimensional structure ST integrated with the existing structure is equivalent to the operation of adding a new structure to the existing structure.
  • the processing system SYS1 may form a three-dimensional structure ST separable from the existing structure.
  • FIG. 1 shows an example in which the work W is an existing structure held by the stage 31. Further, in the following, the description will proceed with reference to an example in which the work W is an existing structure held by the stage 31.
  • the processing system SYS1 can form the three-dimensional structure ST by the laser overlay welding method. That is, it can be said that the processing system SYS1 is a 3D printer that forms an object by using the laminated modeling technique.
  • the laminated modeling technique is also referred to as rapid prototyping, rapid manufacturing, or additive manufacturing.
  • as shown in FIGS. 1 and 2, the processing system SYS1 includes a material supply device 1, a processing device 2, a stage device 3, a light source 4, a gas supply device 5, a housing 6, a control device 7, and a detection device 8. At least a part of each of the processing device 2 and the stage device 3 is housed in the chamber space 63IN inside the housing 6.
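  • for orientation, the component breakdown above can be summarized in code. The following Python sketch only mirrors the reference numerals used in the description; the class and attribute names are assumptions made for illustration and are not defined in the publication.

        from dataclasses import dataclass, field

        @dataclass
        class ProcessingHead:                      # processing head 21
            irradiation_optical_system: str = "irradiation optical system 211"
            material_nozzle: str = "material nozzle 212"

        @dataclass
        class ProcessingDevice:                    # processing device 2
            head: ProcessingHead = field(default_factory=ProcessingHead)
            head_drive_system: str = "head drive system 22"

        @dataclass
        class ProcessingSystemSYS1:
            material_supply_device: str = "material supply device 1"
            processing_device: ProcessingDevice = field(default_factory=ProcessingDevice)
            stage_device: str = "stage device 3 (with stage 31)"
            light_source: str = "light source 4"
            gas_supply_device: str = "gas supply device 5"
            housing: str = "housing 6 (defines chamber space 63IN)"
            control_device: str = "control device 7"
            detection_device: str = "detection device 8"
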
  • the material supply device 1 supplies the modeling material M to the processing device 2.
  • the material supply device 1 supplies a desired amount of the modeling material M corresponding to the amount of the modeling material M required per unit time for the processing device 2 to form the three-dimensional structure ST.
  • the modeling material M is a material that can be melted by irradiation with a processing light EL having a predetermined intensity or higher.
  • as the modeling material M, for example, at least one of a metal material and a resin material can be used.
  • as the modeling material M, other materials different from the metal material and the resin material may also be used.
  • the modeling material M is a powdery material. That is, the modeling material M is a powder.
  • the powder may contain a granular material in addition to the powdery material.
  • the modeling material M may contain, for example, a powder having a particle size within the range of 90 ± 40 micrometers.
  • the average particle size of the powder constituting the modeling material M may be, for example, 75 micrometers or any other size.
  • the modeling material M does not have to be powder, and for example, a wire-shaped modeling material or a gaseous modeling material may be used.
  • the processing system SYS1 may process the modeling material M with an energy beam such as a charged particle beam to form a modeled object.
  • the processing device 2 forms the three-dimensional structure ST using the modeling material M supplied from the material supply device 1.
  • the processing apparatus 2 includes a processing head 21 and a head drive system 22.
  • the processing head 21 includes an irradiation optical system 211 and a material nozzle 212 (that is, a supply system that supplies the modeling material M).
  • the processing head 21 and the head drive system 22 are housed in the chamber space 63IN.
  • at least a part of the processing head 21 and / or the head drive system 22 may be arranged in the external space 64OUT, which is the space outside the housing 6.
  • the external space 64OUT may be a space accessible to the operator of the processing system SYS1.
  • the irradiation optical system 211 is an optical system (for example, a condensing optical system) for emitting processed light EL. It is optically connected to the light source 4 that emits the processed light EL via an optical transmission member (not shown) such as an optical fiber or a light pipe.
  • the irradiation optical system 211 emits processed light EL propagating from the light source 4 via the optical transmission member.
  • the irradiation optical system 211 emits the processing light EL so that the processing light EL advances in the chamber space 63IN.
  • the irradiation optical system 211 emits the processing light EL downward (that is, toward the -Z side) from the irradiation optical system 211.
  • a stage 31 is arranged below the irradiation optical system 211.
  • the irradiation optical system 211 irradiates the work W with the processing light EL.
  • the irradiation optical system 211 can irradiate the processing light EL onto an irradiation area EA, which is set on the work W as the area to be irradiated with the processing light EL (typically, the area on which the light is focused).
  • under the control of the control device 7, the state of the irradiation optical system 211 can be switched between a state in which the irradiation area EA is irradiated with the processing light EL and a state in which the irradiation area EA is not irradiated with the processing light EL.
  • the direction of the processing light EL emitted from the irradiation optical system 211 is not limited to directly below (that is, coincident with the -Z axis direction), and may be, for example, a direction inclined by a predetermined angle with respect to the Z-axis.
  • the irradiation optical system 211 includes an optical member 2111 located closest to the work W side (that is, the stage side) along the optical path of the processed light EL.
  • the optical member 2111 located closest to the work W side may be referred to as a terminal optical member or a final optical member.
  • the irradiation optical system 211 irradiates the work W with the processing light EL via the optical member 2111.
  • the irradiation optical system 211 further includes a holding member 2112 that holds the optical member 2111.
  • the holding member 2112 is arranged at a position where the optical member 2111 can be held.
  • the holding member 2112 may be arranged around the optical member 2111.
  • the holding member 2112 may be arranged around the optical path of the processing light EL that passes through the optical member 2111.
  • the optical member 2111 is arranged on the work W side with respect to the holding member 2112.
  • the holding member 2112 may be arranged on the work W side with respect to the optical member 2111.
  • the material nozzle 212 includes a nozzle member 2121 that supplies the modeling material M.
  • the nozzle member 2121 is a member extending in one direction.
  • the nozzle member 2121 is a tubular member in which a hollow space extending in one direction for the modeling material M to pass through is formed.
  • the material nozzle 212 includes a holding member 2122 that holds the nozzle member 2121. However, the material nozzle 212 does not have to include the holding member 2122.
  • the material nozzle 212 supplies (for example, sprays or ejects) the modeling material M from the supply outlet 2123 formed in the nozzle member 2121.
  • the material nozzle 212 is physically connected to the material supply device 1 which is a supply source of the modeling material M via a pipe (not shown) or the like.
  • the material nozzle 212 supplies the modeling material M supplied from the material supply device 1 via the pipe.
  • the material nozzle 212 may pump the modeling material M supplied from the material supply device 1 via a pipe. That is, the modeling material M from the material supply device 1 and a gas for transportation (for example, an inert gas such as nitrogen or argon) may be mixed and pumped to the material nozzle 212 via a pipe.
  • the purge gas supplied from the gas supply device 5 may be used as the transport gas.
  • the material nozzle 212 is drawn in a tubular shape in FIG. 1.
  • the shape of the material nozzle 212 is not limited to this shape.
  • the material nozzle 212 supplies the modeling material M toward the chamber space 63IN.
  • the material nozzle 212 supplies the modeling material M downward (that is, toward the -Z side) from the material nozzle 212.
  • a stage 31 is arranged below the material nozzle 212. When the work W is mounted on the stage 31, the material nozzle 212 supplies the modeling material M toward the work W.
  • the traveling direction of the modeling material M supplied from the material nozzle 212 is a direction inclined by a predetermined angle (an acute angle, as an example) with respect to the Z-axis direction, but it may also be the -Z side (that is, directly below).
  • the material nozzle 212 is aligned with the irradiation optical system 211 so that the material nozzle 212 supplies the modeling material M toward the irradiation region EA irradiated with the processing light EL by the irradiation optical system 211. That is, the material nozzle 212 and the irradiation optical system 211 are aligned so that the supply region MA, set on the work W as the region to which the modeling material M is supplied, coincides with (or at least partially overlaps) the irradiation region EA.
  • the material nozzle 212 may be aligned so as to supply the modeling material M to the molten pool MP formed by the processing light EL emitted from the irradiation optical system 211.
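  • as a rough illustration of this alignment condition, both regions can be modeled as circles on the work surface and tested for at least partial overlap. The following Python sketch is an assumption made for illustration only (the publication does not specify region shapes or such a function).

        import math

        def regions_overlap(ea_center, ea_radius, ma_center, ma_radius):
            """Return True if the irradiation region EA and the supply region MA,
            modeled here as circles on the work surface, at least partially overlap."""
            gap = math.hypot(ma_center[0] - ea_center[0], ma_center[1] - ea_center[1])
            return gap <= ea_radius + ma_radius

        # example: EA centered at the origin, MA offset by 0.2 mm -> still overlapping
        print(regions_overlap((0.0, 0.0), 0.5, (0.2, 0.0), 0.5))   # True
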
  • the irradiation optical system 211 irradiates the work W with a processed light EL having excellent straightness. Therefore, the irradiation optical system 211 can irradiate the processing light EL at a desired position on the surface of the work W regardless of the size of the distance between the irradiation optical system 211 and the work W.
  • the material nozzle 212 supplies the modeling material M having a physical size toward the work W. In this case, the material M is affected by the state within the chamber space 63IN (eg, the state of airflow) and / or gravity.
  • the supply path of the modeling material M between the material nozzle 212 and the work W may fluctuate under the influence of the state (for example, the airflow state) in the chamber space 63IN and/or gravity. Therefore, depending on the distance between the material nozzle 212 and the work W, the material nozzle 212 may not be able to supply the modeling material M to a desired position on the surface of the work W. Typically, the greater the distance between the material nozzle 212 and the work W, the more likely it is that the material nozzle 212 will not be able to supply the modeling material M to the desired position on the surface of the work W. Therefore, in the first embodiment, as shown in FIG. 1, the material nozzle 212 may be arranged so that the distance between the material nozzle 212 and the work W is relatively short.
  • the material nozzle 212 may be arranged so that the nozzle member 2121 (particularly, the supply port 2123) of the material nozzle 212 is closer to the work W than the optical member 2111 of the irradiation optical system 211.
  • the material nozzle 212 may be arranged so that the distance between the nozzle member 2121 and the work W is shorter than the distance between the optical member 2111 and the work W. Since the work W is held by the stage 31, the material nozzle 212 may be arranged so that the nozzle member 2121 is closer to the stage 31 than the optical member 2111.
  • the material nozzle 212 may be arranged so that the distance between the nozzle member 2121 and the stage 31 is shorter than the distance between the injection unit 213 and the stage 31.
  • the head drive system 22 moves the processing head 21.
  • the head drive system 22 moves the processing head 21 within the chamber space 63IN, for example.
  • the head drive system 22 moves the machining head 21 along at least one of the X-axis, the Y-axis, and the Z-axis.
  • each of the irradiation region EA and the supply region MA moves on the work W along at least one of the X-axis and the Y-axis.
  • the head drive system 22 may move the machining head 21 along at least one rotation direction of the θX direction, the θY direction, and the θZ direction in addition to at least one of the X-axis, the Y-axis, and the Z-axis. In other words, the head drive system 22 may rotate the machining head 21 around at least one of the X-axis, Y-axis, and Z-axis. The head drive system 22 may change the posture of the processing head 21 around at least one of the X-axis, the Y-axis, and the Z-axis.
  • the head drive system 22 includes an actuator such as a motor, for example.
  • the irradiation optical system 211 and the material nozzle 212 may be moved separately.
  • the head drive system 22 may be capable of adjusting at least one of the position of the injection unit 213, the direction of the injection unit 213, the position of the nozzle member 2121, and the direction of the nozzle member 2121.
  • the irradiation region EA where the irradiation optical system 211 irradiates the processing light EL and the supply region MA where the material nozzle 212 supplies the modeling material M can be controlled separately.
  • the stage device 3 includes a stage 31.
  • the stage 31 is housed in the chamber space 63IN.
  • the stage 31 can support the work W.
  • the state of "the stage 31 supporting the work W" here may mean a state in which the work W is directly or indirectly supported by the stage 31.
  • the stage 31 may be able to hold the work W. That is, the stage 31 may support the work W by holding the work W. Alternatively, the stage 31 does not have to be able to hold the work W.
  • the work W may be placed on the stage 31. That is, the stage 31 may support the work W placed on the stage 31. At this time, the work W may be mounted on the stage 31 without being clamped.
  • the "stage 31 supporting the work W" state in the present embodiment may also include a state in which the stage 31 holds the work W and a state in which the work W is placed on the stage 31. Since the stage 31 is housed in the chamber space 63IN, the work W supported by the stage 31 is also housed in the chamber space 63IN. Further, the stage 31 can release the held work W when the work W is held.
  • the irradiation optical system 211 described above emits the processing light EL during at least a part of the period in which the stage 31 supports the work W. Further, the material nozzle 212 described above supplies the modeling material M during at least a part of the period in which the stage 31 supports the work W.
  • a part of the modeling material M supplied by the material nozzle 212 may be scattered or spilled from the surface of the work W to the outside of the work W (for example, around the stage 31). Therefore, the processing system SYS1 may be provided with a recovery device for recovering the scattered or spilled modeling material M around the stage 31.
  • the stage 31 may be provided with a mechanical chuck, a vacuum suction chuck, or the like in order to hold the work W.
  • the stage 31 may be movable by a stage drive system (not shown).
  • the stage drive system may move the stage 31 within the chamber space 63IN, for example.
  • the stage drive system may move the stage 31 along at least one of the X-axis, the Y-axis, and the Z-axis.
  • the irradiation region EA moves on the work W along at least one of the X-axis and the Y-axis.
  • the stage drive system may move the stage 31 along at least one rotation direction of the θX direction, the θY direction, and the θZ direction in addition to at least one of the X-axis, the Y-axis, and the Z-axis.
  • the stage drive system includes an actuator such as a motor, for example.
  • the processing device 2 does not have to include the head drive system 22.
  • the light source 4 emits, for example, at least one of infrared light and ultraviolet light as processed light EL.
  • as the processing light EL, light of other wavelengths, for example light having a wavelength in the visible region, may be used.
  • the processing light EL is a laser beam.
  • the light source 4 may include a laser light source such as a semiconductor laser. Examples of the laser light source include at least one of a laser diode (LD: Laser Diode), a fiber laser, a CO2 laser, a YAG laser, and an excimer laser.
  • the processing light EL does not have to be a laser beam, and the light source 4 may include an arbitrary light source (for example, at least one of an LED (Light Emitting Diode) and a discharge lamp).
  • the gas supply device 5 is a supply source of purge gas for purging the chamber space 63IN.
  • the purge gas contains an inert gas.
  • An example of the inert gas is nitrogen gas or argon gas.
  • the gas supply device 5 supplies purge gas to the chamber space 63IN. As a result, the chamber space 63IN becomes a space purged by the purge gas.
  • the gas supply device 5 may be a cylinder in which an inert gas such as nitrogen gas or argon gas is stored. When the inert gas is nitrogen gas, the gas supply device 5 may be a nitrogen gas generator that generates nitrogen gas from the atmosphere as a raw material.
  • the housing 6 is a storage device that accommodates at least a part of each of the processing device 2 and the stage device 3 in the chamber space 63IN, which is the internal space of the housing 6.
  • the housing 6 includes a partition member 61 that defines a chamber space 63IN.
  • the partition member 61 is a member that separates the chamber space 63IN from the external space 64OUT of the housing 6.
  • the partition member 61 faces the chamber space 63IN via its inner wall 611, and faces the outer space 64OUT through its outer wall 612. In this case, the space surrounded by the partition member 61 (more specifically, the space surrounded by the inner wall 611 of the partition member 61) becomes the chamber space 63IN.
  • the partition member 61 may be provided with a door that can be opened and closed. This door may be opened when the work W is placed on the stage 31 and when the work W and / or the modeled object is taken out from the stage 31, and may be closed during the modeling.
  • the control device 7 controls the operation of the processing system SYS1.
  • the control device 7 may include, for example, a CPU (Central Processing Unit) (or a GPU (Graphics Processing Unit) in addition to or in place of the CPU) and a memory.
  • the control device 7 functions as a device that controls the operation of the machining system SYS1 by the CPU executing a computer program.
  • This computer program is a computer program for causing the control device 7 (for example, the CPU) to perform (that is, execute) the operation described later to be performed by the control device 7. That is, this computer program is a computer program for making the control device 7 function so that the processing system SYS1 performs the operation described later.
  • the computer program executed by the CPU may be recorded in a memory (that is, a recording medium) included in the control device 7, or may be recorded on an arbitrary storage medium (for example, a hard disk or a semiconductor memory) built into the control device 7 or externally attached to the control device 7. Alternatively, the CPU may download the computer program to be executed from a device external to the control device 7 via a network interface.
  • the control device 7 may control the emission mode of the processing light EL by the irradiation optical system 211.
  • the emission mode may include, for example, at least one of the intensity of the processing light EL and the emission timing of the processing light EL.
  • the emission mode may include, for example, the ratio of the length of the emission time of the pulsed light to the emission period of the pulsed light (the so-called duty ratio).
  • the emission mode may also include, for example, the length of the emission time of the pulsed light itself or the emission period itself.
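  • as a numeric illustration of the duty ratio mentioned above (emission time of the pulsed light divided by its emission period), a minimal Python sketch follows; the helper name and example values are assumptions made for illustration only.

        def duty_ratio(on_time_s: float, period_s: float) -> float:
            """Duty ratio = emission time of the pulsed light / emission period."""
            if not 0.0 < on_time_s <= period_s:
                raise ValueError("emission time must be positive and no longer than the period")
            return on_time_s / period_s

        # example: 0.2 ms of emission within a 1.0 ms period gives a duty ratio of 0.2 (20 %)
        print(duty_ratio(0.2e-3, 1.0e-3))   # 0.2
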
  • the control device 7 may control the movement mode of the processing head 21 by the head drive system 22.
  • the movement mode may include, for example, at least one of a movement amount, a movement speed, a movement direction, and a movement timing.
  • the control device 7 may control the supply mode of the modeling material M by the material supply device 1.
  • the supply mode of the modeling material M by the material nozzle 212 is mainly determined by the supply mode of the modeling material M by the material supply device 1. Therefore, controlling the supply mode of the modeling material M by the material supply device 1 can be regarded as equivalent to controlling the supply mode of the modeling material M by the material nozzle 212.
  • the supply mode may include, for example, at least one of a supply amount (particularly, a supply amount per unit time) and a supply timing.
  • the control device 7 does not have to be provided inside the processing system SYS1.
  • the control device 7 may be provided outside the processing system SYS1 as a server or the like.
  • the control device 7 and the processing system SYS1 may be connected by a wired and / or wireless network (or a data bus and / or a communication line).
  • as a wired network, for example, a network using a serial bus type interface represented by at least one of IEEE1394, RS-232x, RS-422, RS-423, RS-485, and USB may be used.
  • a network using a parallel bus interface may be used.
  • a network using an Ethernet (registered trademark) compliant interface represented by at least one of 10BASE-T, 100BASE-TX and 1000BASE-T may be used.
  • a network using radio waves may be used.
  • An example of a network using radio waves is a network conforming to IEEE802.1x (for example, at least one of wireless LAN and Bluetooth®).
  • a network using infrared rays may be used.
  • a network using optical communication may be used.
  • the control device 7 and the processing system SYS1 may be configured so that various types of information can be transmitted and received via the network. Further, the control device 7 may be able to transmit information such as commands and control parameters to the processing system SYS1 via the network.
  • the processing system SYS1 may include a receiving device that receives information such as commands and control parameters from the control device 7 via the network.
  • a part of the control device 7 may be provided inside the processing system SYS1 and a part of the control device 7 may be provided outside the processing system SYS1.
  • recording media for recording the computer program executed by the CPU may include optical discs such as CD-ROM, CD-R, CD-RW, flexible disks, MO, DVD-ROM, DVD-RAM, DVD-R, DVD+R, DVD-RW, DVD+RW, and Blu-ray (registered trademark), magnetic media such as magnetic tape, magneto-optical disks, semiconductor memories such as USB memory, and any other medium capable of storing a program.
  • the recording medium may include a device capable of recording a computer program (for example, a general-purpose device or a dedicated device in which the computer program is implemented in at least one form such as software and firmware).
  • each process or function included in the computer program may be realized by a logical processing block that is realized in the control device 7 when the control device 7 (that is, a computer) executes the computer program, may be realized by hardware such as a predetermined gate array (FPGA or ASIC) included in the control device 7, or may be realized in a form in which logical processing blocks and partial hardware modules realizing part of the hardware coexist.
  • the detection device 8 is a device (for example, a sensor) capable of detecting (acquiring) information regarding the relative positional relationship between two different objects. More specifically, the detection device 8 is a device capable of detecting (acquiring) information regarding the relative positional relationship between the detection target and the approaching target that can move relative to the detection target. The detection device 8 is a device capable of detecting (acquiring) information regarding the relative positional relationship between the detection target and the approaching object that may approach the detection target. Since the structure of the detection device 8 will be described in detail later with reference to FIGS. 3 to 10, detailed description here will be omitted.
  • Information on the relative positional relationship between two different objects may include information on the degree of proximity of two different objects.
  • the detection device 8 may be a device that can detect (acquire) information on the degree of proximity of two different objects. More specifically, the detection device 8 may be a device that can detect (acquire) information on the degree of approach between the detection target and the approach target that can move relative to the detection target. The detection device 8 may be a device that can detect (acquire) information on the degree of approach between the detection target and the approaching object that may approach the detection target.
  • the information on the relative positional relationship between two different objects may include information on whether or not one of the detection object and the approaching object exists within a certain range determined according to the other.
  • the detection device 8 may be a device that can detect (acquire) information for determining whether or not the approaching object exists within a certain range determined according to the detection object (for example, a range within a certain distance from the detection object).
  • the detection device 8 may be a device that can detect (acquire) information for determining whether or not the detection object exists within a certain range determined according to the approaching object (for example, a range within a certain distance from the approaching object).
  • the detection device 8 may detect (acquire) information regarding the distance between the detection target object and the approaching target object.
  • the detection device 8 may detect (acquire) information regarding the contact between the detection object and the approaching object. For example, the detection device 8 may detect information regarding the presence or absence of contact between the detection object and the approaching object.
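  • the kinds of information listed above (a distance, a within-range determination, a contact determination) can be thought of as one sensor record. The following Python sketch is a hypothetical illustration of what a reading from the detection device 8 might carry; it is not an interface defined in the publication.

        from dataclasses import dataclass
        from typing import Optional

        @dataclass
        class ProximityReading:
            """One reading about the detection object / approaching object pair."""
            distance_mm: Optional[float] = None    # measured distance, if available
            within_range: Optional[bool] = None    # approaching object inside a set range?
            in_contact: Optional[bool] = None      # are the two objects in contact?

        # a proximity-sensor style reading: not in contact, 1.8 mm apart, inside the watch range
        reading = ProximityReading(distance_mm=1.8, within_range=True, in_contact=False)
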
  • the detection result of the detection device 8 is output to the control device 7.
  • the control device 7 controls the relative position between the detection target object and the approaching target object based on the detection result of the detection device 8.
  • the control device 7 determines whether or not a predetermined approach condition regarding the degree of approach between the detection target object and the approach target object is satisfied, based on the detection result (acquisition result) of the detection device 8.
  • the approach condition may include a condition that the detection target and the approach target are in contact with each other.
  • the approach condition may include a condition that the distance between the detection target and the approach target is less than a permissible value even though the detection target and the approach target are not in contact with each other. That is, the approach condition may include a condition that the detection target and the approach target, while not in contact with each other, are so close to each other that the distance between them is less than the permissible value.
  • when the control device 7 determines that the approach condition is satisfied, the control device 7 controls the relative position or relative posture of the detection object and the approaching object so as to suppress the occurrence of an abnormality caused by the contact between the detection object and the approaching object (or caused by the approach between the detection object and the approaching object; the same applies hereinafter).
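  • combining the two conditions above, the approach condition is satisfied when the objects are already in contact, or when they are not in contact but are closer than the permissible value. A minimal sketch, assuming the hypothetical ProximityReading record introduced earlier:

        def approach_condition_met(reading: "ProximityReading", allowance_mm: float) -> bool:
            """True if the objects touch, or are not touching but closer than the permissible value."""
            if reading.in_contact:
                return True
            return reading.distance_mm is not None and reading.distance_mm < allowance_mm
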
  • typically, the control device 7 controls the relative position or relative posture of the detection object and the approaching object so as to avoid the occurrence of an abnormality caused by the contact between the detection object and the approaching object.
  • An example of an abnormality that may occur due to contact between a detection object and an approaching object is an abnormality that occurs in at least one of the detection object and the approaching object.
  • Examples of abnormalities that occur in the detection target include at least one of damage to the detection target, destruction of the detection target, failure of the detection target, and misalignment of the detection target from the normal position.
  • Examples of abnormalities that occur in the approaching object include at least one of damage to the approaching object, destruction of the approaching object, failure of the approaching object, and misalignment of the approaching object from the normal position.
  • the control device 7 may control the relative position or the relative posture of the detection target and the approaching object so as to avoid contact between the detection target and the approaching object. In this case, since the contact between the detection target and the approaching object is avoided, no abnormality occurs due to the contact between the detection target and the approaching object.
  • the control device 7 may restrict the operation for changing the relative position or the relative posture of the detection object and the approaching object so that the detection object and the approaching object approach each other.
  • the control device 7 may even prohibit the operation for changing the relative position or the relative posture of the detection target and the approach target so that the detection target and the approach target approach each other. In this case, since the detection target and the approaching object do not come any closer to each other, an abnormality caused by the contact between them is less likely to occur than when the detection target and the approaching object continue to approach each other. That is, the possibility of an abnormality caused by the contact between the detection target object and the approaching target object is reduced.
  • if the approach condition is satisfied before the detection object and the approaching object come into contact with each other, further approach between the detection object and the approaching object is restricted before contact occurs, so no abnormality occurs due to contact between the detection object and the approaching object.
  • if the approach condition is satisfied after the detection object and the approaching object have come into contact with each other, further approach between the detection object and the approaching object is restricted while they are in contact.
  • if the two objects were allowed to approach further while in contact, the force applied from the detection object to the approaching object and/or from the approaching object to the detection object would increase, which could lead to damage to the detection object and/or the approaching object; restricting further approach suppresses this.
  • the operation of restricting the operation for changing the relative position or the relative posture of the detection object and the approaching object so that the detection object and the approaching object approach each other can be regarded as equivalent to the operation of controlling the relative position or the relative posture of the detection object and the approaching object so that they do not approach each other.
  • when the approach condition is satisfied, the relative position or the relative posture of the detection target and the approach target may be controlled so that further approach between the detection target and the approach target is avoided.
  • in this case, for the same reason as in the case where the operation for changing the relative position so that the detection target and the approach target come closer to each other is restricted, an abnormality caused by the contact between the detection object and the approaching object is less likely to occur. If further approach between the detection object and the approaching object is avoided in a state where the detection object and the approaching object are not in contact with each other, the detection object and the approaching object do not come into contact with each other at all.
  • the operation of controlling the relative position or the relative posture of the detection target and the approaching object so as to avoid further approach between them can be said to be a specific example of the operation of controlling the relative position so as to avoid contact between the detection target and the approaching object. Further, if further approach between the detection object and the approaching object is avoided, the detection object and the approaching object do not approach each other. Therefore, the operation of controlling the relative position of the detection target and the approaching object so as to avoid further approach between them can also be said to be a specific example of the operation of limiting the operation for changing the relative position or the relative posture of the detection object and the approaching object so that they approach each other.
  • when the approach condition is satisfied, the control device 7 may control the relative position or the relative posture of the detection object and the approaching object so that the detection object and the approaching object are separated from each other.
  • in this case, an abnormality caused by contact between the detection target and the approaching object is less likely to occur as compared with the case where the detection target and the approaching object are not separated from each other.
  • if the approach condition is satisfied before the detection object and the approaching object come into contact with each other, the detection object and the approaching object are separated from each other before contact occurs, so no abnormality occurs due to contact between them.
  • if the approach condition is satisfied after the detection object and the approaching object have come into contact with each other, the detection object and the approaching object that are in contact are separated from each other. For this reason, an abnormality caused by the contact is less likely to occur as compared with the case where the detection target and the approaching object continue to be in contact with each other (or approach each other further).
  • the operation of controlling the relative position or the relative posture of the detection object and the approaching object so that they are separated from each other can be said to be a specific example of the operation of controlling the relative position or the relative posture so as to avoid contact between the detection object and the approaching object. Further, when the detection target and the approach target are separated from each other, the detection target and the approach target do not approach each other. Therefore, the operation of controlling the relative position or the relative posture of the detection target and the approaching object so that they are separated from each other can also be said to be a specific example of the operation of limiting the operation for changing the relative position or the relative posture of the detection object and the approaching object so that they approach each other.
  • when the approach condition is satisfied, the control device 7 may control the relative position of the detection object and the approaching object so that the relative position or the relative posture of the detection target and the approach target at the time when the approach condition is satisfied is maintained (that is, so that the relative position is not changed).
  • in this case, for the same reason as in the case where the operation for changing the relative position so that the detection target and the approach target come closer to each other is restricted, an abnormality caused by the contact between the detection target and the approaching object is less likely to occur.
  • the operation of controlling the relative position so that the relative position between the detection target and the approaching object is maintained can be said to be a specific example of the operation of controlling the relative position or the relative posture so as to avoid contact between the detection object and the approaching object. Further, if the relative position or the relative posture of the detection object and the approaching object is maintained, the detection object and the approaching object do not approach each other. Therefore, the operation of controlling the relative position or the relative posture so that the relative position between the detection target and the approaching object is maintained can also be said to be a specific example of the operation of limiting the operation for changing the relative position or the relative posture of the detection object and the approaching object so that they approach each other.
  • the control device 7 may stop the operation of the position change device capable of changing the relative position between the detection target object and the approach target object.
  • for example, when the detection target is the processing head 21, as will be described later, the head drive system 22 can change the relative position between the processing head 21, which is the detection target, and the approaching object. In this case, the control device 7 may stop the operation of the head drive system 22 when the approach condition is satisfied.
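  • a minimal control sketch of this behavior follows: while the processing head 21 moves, the control device polls the detection device and, once the approach condition is met, stops the drive and optionally commands a separating move. The HeadDrive and FakeSensor classes, their method names, and the numeric values are assumptions made for illustration only; they reuse the hypothetical ProximityReading and approach_condition_met sketches above.

        class HeadDrive:
            """Hypothetical stand-in for the head drive system 22."""
            def stop(self) -> None:
                print("head drive stopped")
            def separate(self, distance_mm: float) -> None:
                print(f"retracting head by {distance_mm} mm")

        class FakeSensor:
            """Simulated detection device 8: the gap shrinks by 0.5 mm per poll."""
            def __init__(self, start_mm: float = 3.0) -> None:
                self.gap = start_mm
            def read(self) -> "ProximityReading":
                self.gap -= 0.5
                return ProximityReading(distance_mm=self.gap,
                                        within_range=self.gap < 5.0,
                                        in_contact=self.gap <= 0.0)

        def supervise_motion(sensor, drive, allowance_mm: float, separate: bool = False) -> None:
            """Poll the detection device and react once the approach condition is satisfied."""
            while True:
                reading = sensor.read()
                if approach_condition_met(reading, allowance_mm):
                    drive.stop()                         # maintain the relative position
                    if separate:
                        drive.separate(distance_mm=1.0)  # or move the objects apart
                    break

        supervise_motion(FakeSensor(), HeadDrive(), allowance_mm=1.0)
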
  • when the operation of the position changing device is stopped, for the same reason as in the case where the operation for changing the relative position so that the detection object and the approaching object come closer to each other is restricted, an abnormality caused by the contact between the detection object and the approaching object is less likely to occur.
  • the operation of stopping the operation of the position changing device can be said to be a specific example both of the operation of controlling the relative position so as to avoid contact between the detection object and the approaching object and of the operation of limiting the operation for changing the relative position so that the detection object and the approaching object approach each other.
  • when the head drive system 22, which is an example of the position changing device, moves the machining head 21 along at least one of the θX direction, the θY direction, and the θZ direction, in other words, when it changes the posture of the machining head 21, the head drive system 22 can change the relative posture between the processing head 21, which is the detection target, and the approaching object. Similar to the above, the control device 7 may stop the operation of the head drive system 22 when the approach condition is satisfied.
  • in this case, compared to the case where the operation of the position changing device is not stopped (in particular, the case where the position changing device operates so that the detection object and the approaching object come closer to each other), an abnormality caused by the contact between the detection object and the approaching object is less likely to occur.
  • the operation of stopping the operation of the position changing device can be said to be a specific example both of the operation of controlling the relative posture so as to avoid contact between the detection object and the approaching object and of the operation of limiting the operation for changing the relative posture so that the detection object and the approaching object come close to each other.
  • the detection device 8 may be a device capable of acquiring relative position information of two different objects (detection object and approaching object).
  • the relative position information of two different objects may include information about the distance or interval between the two different objects. Further, the relative position information of two different objects may include information on whether or not the other object (approaching object) exists within a certain range from one object (detection object). Further, the relative position information of two different objects may include information regarding the position of one object (detection object) and the position of the other object (approaching object). Further, the relative position information of two different objects may include information on the contact between one object (detection object) and the other object (approaching object).
  • the control device 7 may stop the operation of the two different objects (detection object and approaching object) when the distance or interval between them is within a predetermined distance or interval, or may perform control so that the two different objects move away from each other.
  • the control device 7 may stop the operation of the two different objects (detection object and approaching object) when the other object (approaching object) exists within a certain range from one object (detection object), or may perform control so that the two different objects move away from each other.
  • the control device 7 may stop the operation of the two different objects (detection object and approaching object) when the position of one object (detection object) and the position of the other object (approaching object) have a specific relationship, or may perform control so that the two different objects move away from each other.
  • the control device 7 may stop the operation of the two different objects (detection object and approaching object) when one object (detection object) and the other object (approaching object) come into contact with each other, or may perform control so that the two different objects move away from each other.
  • the detection device 8 may be a device capable of measuring (acquiring) the distance between two different objects (detection object and approaching object).
  • an example of the detection target is the processing head 21 (that is, at least a part of the processing apparatus 2). This is because, as described above, the processing head 21 can be moved by the head drive system 22.
  • the detection device 8 may detect information on the positional relationship between the machining head 21 and an approaching object that may approach the machining head 21.
  • a work W is an example of an approaching object that may approach the machining head 21 when the machining head 21 is moving. In this case, the detection device 8 may detect information regarding the positional relationship between the machining head 21 and the work W.
  • a stage 31 is another example of an approaching object that may approach the machining head 21 in a situation where the machining head 21 is moving. In this case, the detection device 8 may detect information regarding the positional relationship between the processing head 21 and the stage 31.
  • the relative position information of the two different objects (detection object and approaching object) described above may include information on the relative posture relationship between the two different objects (detection object and approaching object).
  • the material nozzle 212 included in the processing head 21 is arranged at a position closer to at least one of the work W and the stage 31 than the irradiation optical system 211 included in the processing head 21.
  • the material nozzle 212 can be said to be one of the members closest to at least one of the work W and the stage 31 among the members constituting the processing head 21 (further, the processing apparatus 2). Therefore, the material nozzle 212 is more likely to come into contact with at least one of the work W and the stage 31 than the irradiation optical system 211.
  • the detection device 8 may detect information regarding the positional relationship between the material nozzle 212 and at least one of the work W and the stage 31.
  • the detection device 8 may detect information regarding the positional relationship between the nozzle member 2121 of the material nozzle 212 and at least one of the work W and the stage 31.
  • the description will proceed with reference to an example in which the detection device 8 detects information regarding the positional relationship between the material nozzle 212 (particularly, the nozzle member 2121) and the work W.
  • the control device 7 controls the relative position between the material nozzle 212 and the work W so as to suppress the occurrence of an abnormality caused by the contact between the material nozzle 212 and the work W. Specifically, the relative position between the material nozzle 212 and the work W is changed by the head drive system 22, which can move the processing head 21 including the material nozzle 212. Therefore, under the control of the control device 7, the head drive system 22 controls the relative position between the material nozzle 212 and the work W so as to suppress the occurrence of an abnormality caused by the contact between the material nozzle 212 and the work W.
  • the control device 7 may control the relative position between the material nozzle 212 and the work W so that the machining operation for forming the three-dimensional structure ST is performed while the material nozzle 212 is away from the work W.
  • in particular, the control device 7 may control the relative position between the material nozzle 212 and the work W so that the machining operation for forming the three-dimensional structure ST is performed in a state where the nozzle member 2121 of the material nozzle 212 (particularly, the supply outlet 2123 at its tip) is separated from the work W.
  • alternatively, the control device 7 may control the relative position between the material nozzle 212 and the work W so as to suppress the occurrence of an abnormality caused by the contact between the material nozzle 212 and the work W by controlling the stage drive system.
  • the processing system SYS1 may use, as the detection device 8, at least one of the first detection device 8a, the second detection device 8b, the third detection device 8c, the fourth detection device 8d, the fifth detection device 8e, and the sixth detection device 8f. Therefore, in the following, the structures of the first detection device 8a to the sixth detection device 8f will be described in order.
  • FIG. 3 is a schematic view showing the structure of the first detection device 8a.
  • FIGS. 4A and 4B are schematic diagrams showing the detection principle of the first detection device 8a.
  • At least a part of the first detection device 8a is arranged in the material nozzle 212 which is the detection target. More specifically, at least a part of the first detection device 8a is arranged on the nozzle member 2121 of the material nozzle 212.
  • the iron core 81a and the coil 83a, which will be described later, constituting the first detection device 8a are arranged in the material nozzle 212. Therefore, at least a part of the first detection device 8a moves in accordance with the movement of the material nozzle 212. In other words, at least a part of the first detection device 8a moves with the material nozzle 212.
  • the first detection device 8a is a proximity sensor. Therefore, the first detection device 8a can detect (acquire) information on the positional relationship between the first detection device 8a and the work W without contacting the work W, which is the approaching object. Specifically, the first detection device 8a can detect whether or not the work W exists within a certain range from the first detection device 8a without contacting the work W. Further, since the first detection device 8a is arranged on the material nozzle 212, the closer the first detection device 8a is to the work W, the closer the material nozzle 212 also is to the work W.
  • the operation of detecting (acquiring) information regarding the positional relationship between the first detection device 8a and the work W can therefore be regarded as equivalent to the operation of detecting (acquiring) information regarding the positional relationship between the material nozzle 212 and the work W. That is, the first detection device 8a can detect information regarding the positional relationship between the material nozzle 212 and the work W. Specifically, the first detection device 8a can detect whether or not the work W exists within a certain range from the material nozzle 212. In this case, as the approach condition, a condition that the material nozzle 212 and the work W are not in contact with each other and the distance between the material nozzle 212 and the work W is less than a permissible value may be used.
  • the permissible value may be set to a constant value in advance, or may be set to a value according to the moving speed of the processing head 21, for example. As an example, when the moving speed of the machining head is slow, the permissible value may be set small, and when the moving speed is fast, the permissible value may be set large. Further, this permissible value may be variable regardless of the moving speed of the processing head 21.
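  • As a non-limiting illustration, the idea of a speed-dependent permissible value can be sketched as follows. This is a minimal Python sketch; the function name, constants, and linear relationship are assumptions chosen only for illustration, not values from the embodiment.

```python
def permissible_distance(head_speed_mm_s: float,
                         base_mm: float = 0.5,
                         gain_s: float = 0.01,
                         max_mm: float = 5.0) -> float:
    """Return an allowable nozzle-to-work distance threshold.

    A faster-moving processing head may warrant a larger margin, because
    the same detection latency covers more travel; all constants here are
    illustrative only.
    """
    return min(max_mm, base_mm + gain_s * head_speed_mm_s)

# Example: a slowly moving head tolerates a smaller margin than a fast one.
print(permissible_distance(10.0))   # ~0.6 mm
print(permissible_distance(200.0))  # ~2.5 mm
```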
  • When the first detection device 8a is a proximity sensor, the first detection device 8a detects (acquires) information on the positional relationship between the material nozzle 212 and the work W in a state where the material nozzle 212 and the work W are not in contact with each other. That is, the first detection device 8a detects (acquires) information regarding the positional relationship between the material nozzle 212 and the work W during a period in which the material nozzle 212 and the work W are not in contact with each other. In this case, the first detection device 8a is arranged at a position where information regarding the positional relationship between the material nozzle 212 and the work W can be detected (acquired) when the material nozzle 212 and the work W are not in contact with each other.
  • For example, the first detection device 8a may be arranged at the position of the nozzle member 2121 closest to the work W side (that is, the stage 31 side).
  • the first detection device 8a may detect the presence or absence of contact between the material nozzle 212 and the work W.
  • When the first detection device 8a detects the presence or absence of contact between the material nozzle 212 and the work W, a condition that the material nozzle 212 and the work W are in contact with each other may be used as the approach condition.
  • the type of proximity sensor used as the first detection device 8a is not particularly limited.
  • an inductive proximity sensor may be used as the first detection device 8a.
  • a capacitance type proximity sensor may be used as the first detection device 8a.
  • a magnetic proximity sensor may be used as the first detection device 8a.
  • FIG. 3 shows an example in which an inductive proximity sensor is used as the first detection device 8a.
  • the first detection device 8a includes an iron core 81a, a support member 82a, a coil 83a, and a detection circuit 84a.
  • the iron core 81a is fixed to the material nozzle 212 via the support member 82a.
  • the iron core 81a is fixed to the outer surface of the material nozzle 212 via the support member 82a.
  • the coil 83a is wound around the iron core 81a.
  • a current is supplied to the coil 83a from the detection circuit 84a.
  • a magnetic field is generated from the coil 83a.
  • an eddy current is generated in the work W by electromagnetic induction.
  • the impedance of the coil 83a changes due to the eddy current.
  • the detection circuit 84a detects information on the positional relationship between the material nozzle 212 and the work W by detecting this change in impedance.
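  • As a rough illustration of this detection principle, the impedance-change judgment could be expressed as in the following minimal Python sketch; the function name, the baseline comparison, and the threshold value are assumptions for illustration and do not describe the actual detection circuit 84a.

```python
def work_is_near(measured_impedance_ohm: float,
                 baseline_impedance_ohm: float,
                 threshold_ohm: float = 2.0) -> bool:
    """Judge proximity of the work W from the change in coil impedance.

    Eddy currents induced in a nearby conductive work change the effective
    impedance of the coil; a deviation from the baseline beyond a threshold
    is treated as "work W within range".  The threshold is illustrative.
    """
    return abs(measured_impedance_ohm - baseline_impedance_ohm) > threshold_ohm
```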
  • When the inductive proximity sensor shown in FIG. 3 is used as the first detection device 8a, it is difficult for the first detection device 8a to detect information regarding the positional relationship between the material nozzle 212 and the work W if no eddy current flows through the work W. Therefore, in this case, a work made of a metal material (particularly, a conductor material) is used as the work W so that an eddy current flows.
  • Since the modeling material M supplied from the material nozzle 212 is also a metal material, the modeling material M supplied from the material nozzle 212 may affect the impedance of the coil 83a.
  • the detection circuit 84a may detect a change from the impedance in a state where the work W is located within a range not affected by the magnetic field from the coil 83a.
  • the first detection device 8a is arranged on the material nozzle 212.
  • the first detection device 8a may be arranged at a position different from that of the material nozzle 212.
  • the first detection device 8a may be arranged on the work W or the stage 31.
  • When the inductive proximity sensor is used as the first detection device 8a, the first detection device 8a includes the iron core 81a. However, the first detection device 8a does not have to include the iron core 81a. Even in this case, the first detection device 8a can detect the degree of proximity between the material nozzle 212 and the work W as long as the coil 83a is provided.
  • FIG. 5 is a schematic view showing the structure of the second detection device 8b.
  • the same components as the components included in the first detection device 8a described above are designated by the same reference numerals, and detailed description thereof will be omitted.
  • The second detection device 8b differs from the first detection device 8a in that the coil 83a is wound around the material nozzle 212 (particularly, the nozzle member 2121) instead of around the iron core 81a. That is, in the second detection device 8b, the material nozzle 212 (particularly, the nozzle member 2121), which is used as a member for supplying the modeling material M, is also used as the iron core. Therefore, the second detection device 8b does not have to include the iron core 81a. Further, the second detection device 8b does not have to include the support member 82a for fixing the iron core 81a to the material nozzle 212. However, in this case, the material nozzle 212 (particularly, the nozzle member 2121) is made of a metal material (particularly, a conductor material). Other features of the second detection device 8b may be the same as those of the first detection device 8a.
  • FIGS. 6 and 7 are schematic views showing the structure of the third detection device 8c.
  • The third detection device 8c detects (acquires) information on the contact state between the material nozzle 212 (particularly, the nozzle member 2121; the same applies hereinafter) and the work W as information on the positional relationship between the material nozzle 212 and the work W. More specifically, the third detection device 8c detects information regarding the presence or absence of contact between the material nozzle 212 and the work W. For example, the third detection device 8c may electrically detect the presence or absence of contact between the material nozzle 212 and the work W. For example, the third detection device 8c may magnetically detect the presence or absence of contact between the material nozzle 212 and the work W.
  • In the example shown in FIG. 6, the third detection device 8c electrically detects the presence or absence of contact between the material nozzle 212 and the work W. Specifically, the third detection device 8c detects the presence or absence of contact between the material nozzle 212 and the work W by detecting the presence or absence of electrical continuity between the material nozzle 212 and the work W. In this case, the third detection device 8c detects information on the state of electrical continuity between the material nozzle 212 and the work W as information on the positional relationship between the material nozzle 212 and the work W. However, when the third detection device 8c detects the presence or absence of electrical continuity between the material nozzle 212 and the work W, each of the material nozzle 212 and the work W is made of a metal material (particularly, a conductor material).
  • the third detection device 8c includes wiring 81c, wiring 82c, and a detection circuit 83c.
  • the wiring 81c is electrically connected to the material nozzle 212.
  • the wiring 82c is electrically connected to the work W. Either one of the wirings 81c and 82c may be electrically connected to the electrical ground of the processing system SYS1. Either one of the wirings 81c and 82c may be grounded.
  • the detection circuit 83c detects whether or not an electrically closed circuit is formed via the wirings 81c and 82c, and thus whether or not the material nozzle 212 and the work W are in contact with each other (that is, whether or not there is electrical continuity).
  • That is, the detection circuit 83c can detect the presence or absence of contact between the material nozzle 212 and the work W by detecting whether or not an electrically closed circuit is formed via the wirings 81c and 82c.
  • the wiring 82c may be electrically connected to the stage 31 holding the work W.
  • the detection circuit 83c can detect the presence or absence of contact between the material nozzle 212 and the work W.
  • electrical contacts connected to the wiring 82c may be provided at one or more positions on the upper surface of the stage 31.
  • The material nozzle 212 is physically connected to the material supply device 1, which is the supply source of the modeling material M, via a pipe or the like (not shown). If this pipe is made of metal, the detection circuit 83c may erroneously detect the presence or absence of contact between the material nozzle 212 and the work W due to the influence of the pipe. For example, when the wiring 82c connected to the work W is connected to the ground of the machining system SYS1 and the pipe connected to the material nozzle 212 is also connected to the ground of the machining system SYS1, it can be said that the material nozzle 212 and the work W are substantially electrically conductive with each other via the pipe.
  • In this case, the detection circuit 83c may erroneously detect that a closed circuit is formed through the wirings 81c and 82c even though the material nozzle 212 and the work W are not in contact with each other. Therefore, the material nozzle 212 (particularly, the portion of the material nozzle 212 that is electrically connected to the wiring 81c) may be electrically insulated from the pipe that connects the material nozzle 212 and the material supply device 1. On the other hand, when detecting the presence or absence of contact between the pipe and the work W, the pipe and the material nozzle 212 (particularly, the portion of the material nozzle 212 that is electrically connected to the wiring 81c) may be electrically connected.
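  • As a rough sketch of how such a continuity reading might be used, the following minimal Python example requires several consecutive closed-circuit readings before reporting contact; the callable name, the sample count, and the debouncing approach are assumptions for illustration, not a description of the detection circuit 83c itself.

```python
def contact_detected(read_closed_circuit, samples: int = 5) -> bool:
    """Report contact between the material nozzle and the work.

    `read_closed_circuit` is a hypothetical callable wrapping the detection
    circuit: it returns True while a closed circuit is formed through the
    two wirings.  Requiring several consecutive positive readings is one
    simple way to avoid reacting to a spurious conduction path (for
    example, through a metal supply pipe that has not been insulated).
    """
    return all(read_closed_circuit() for _ in range(samples))
```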
  • FIG. 8 is a schematic view showing the structure of the fourth detection device 8d.
  • the same components as the components included in the third detection device 8c described above are designated by the same reference numerals, and detailed description thereof will be omitted.
  • The fourth detection device 8d differs from the third detection device 8c, which detects information regarding the contact between the material nozzle 212 and the work W, in that it detects information regarding the contact state between the contact member 84d arranged on the material nozzle 212 and the work W. That is, the fourth detection device 8d detects information on the relative positional relationship between the material nozzle 212 and the work W when a part of the fourth detection device 8d comes into contact with the work W. Other features of the fourth detection device 8d may be the same as those of the third detection device 8c.
  • the contact member 84d is a metal member.
  • the wiring 81c is electrically connected to the contact member 84d.
  • the contact member 84d is fixed to the material nozzle 212 via the support member 85d.
  • the contact member 84d may be aligned with respect to the material nozzle 212 so that the contact member 84d and the work W come into contact with each other when the material nozzle 212 and the work W come into contact with each other.
  • the contact member 84d may be aligned with respect to the material nozzle 212 so that the material nozzle 212 and the contact member 84d come into contact with the work W at the same time.
  • the contact member 84d may be aligned with respect to the material nozzle 212 so that the contact member 84d and the work W come into contact with each other before the material nozzle 212 and the work W come into contact with each other.
  • FIG. 8 shows an example in which the contact member 84d is aligned with respect to the material nozzle 212 so that the contact member 84d and the work W come into contact with each other before the material nozzle 212 and the work W come into contact with each other.
  • the fourth detection device 8d detects the presence or absence of contact between the contact member 84d and the work W.
  • The principle by which the fourth detection device 8d detects the presence or absence of contact between the contact member 84d and the work W may be the same as the principle by which the third detection device 8c detects the presence or absence of contact between the material nozzle 212 and the work W. That is, the fourth detection device 8d may detect the presence or absence of contact between the contact member 84d and the work W by detecting whether or not an electrically closed circuit is formed via the wiring 81c connected to the contact member 84d and the wiring 82c connected to the work W.
  • As a result, the control device 7 can specify whether or not the material nozzle 212 and the work W are so close to each other (or, in some cases, already in contact with each other) that the contact member 84d and the work W are in contact with each other.
  • In this case, the condition that the material nozzle 212 and the work W are so close to each other (or, in some cases, already in contact with each other) that the contact member 84d and the work W are in contact with each other may be used as the above-mentioned approach condition. That is, the condition that the contact member 84d and the work W are in contact with each other may be used as the above-mentioned approach condition.
  • In the example shown in FIG. 8, the contact member 84d is singular, but the fourth detection device 8d may include a plurality of contact members 84d.
  • FIG. 9 is a schematic view showing the structure of the fifth detection device 8e.
  • the fifth detection device 8e optically detects information regarding the positional relationship between the material nozzle 212 and the work W. Specifically, the fifth detection device 8e detects information regarding the positional relationship between the material nozzle 212 and the work W using the detection light ML.
  • the fifth detection device 8e includes a light transmission system 81e and a light receiving system 82e in order to detect information regarding the positional relationship between the material nozzle 212 and the work W using the detection light ML.
  • Each of the light transmitting system 81e and the light receiving system 82e is arranged on the processing head 21. Therefore, the positional relationship between the processing head 21, the light transmitting system 81e, and the light receiving system 82e is fixed regardless of the movement of the processing head 21. That is, the positional relationship between each of the irradiation optical system 211 and the material nozzle 212 and each of the light transmission system 81e and the light receiving system 82e is fixed regardless of the movement of the processing head 21.
  • the light transmission system 81e emits the detection light ML.
  • the light transmission system 81e emits the detection light ML toward the work W.
  • The light transmission system 81e is aligned with respect to the irradiation optical system 211 so that the irradiation region LA of the detection light ML on the surface of the work W (for example, the modeling surface MS) is separated from the irradiation region EA of the processing light EL on the surface of the work W (for example, the modeling surface MS).
  • For example, the light transmission system 81e is aligned with respect to the irradiation optical system 211 so that the irradiation region LA of the detection light ML is located at a position separated from the irradiation region EA of the processing light EL along at least one of the X-axis direction and the Y-axis direction.
  • For example, the light transmission system 81e may be aligned with respect to the irradiation optical system 211 so that the irradiation region LA is located in front of the irradiation region EA along the moving direction of the processing light EL on the surface of the work W (for example, the modeling surface MS).
  • Alternatively, the light transmission system 81e may be aligned with respect to the irradiation optical system 211 so that the irradiation region LA is located behind the irradiation region EA along the moving direction of the processing light EL on the surface of the work W (for example, the modeling surface MS). Further, a plurality of irradiation regions LA of the detection light ML may be provided, and the irradiation regions LA may be positioned in front of and behind the irradiation region EA along the moving direction of the processing light EL. In this case, a plurality of fifth detection devices 8e may be provided. When there are a plurality of irradiation regions LA of the detection light ML, the plurality of irradiation regions LA may be switched and used according to the moving direction of the processing light EL.
  • The light receiving system 82e receives the detection light ML from the work W (that is, the detection light ML reflected by the work W). That is, the light receiving system 82e receives the detection light ML via the work W. The light receiving system 82e receives the detection light ML from the irradiation region LA. The light receiving result by the light receiving system 82e is output to the control device 7.
  • the control device 7 calculates the position of the irradiation region LA in the Z-axis direction based on the light receiving result by the light receiving system 82e. That is, the control device 7 calculates the position of the work W in the Z-axis direction (particularly, the position of the surface of the work W in the Z-axis direction) based on the light receiving result by the light receiving system 82e.
  • The information regarding the position of the work W in the Z-axis direction calculated based on the light receiving result by the light receiving system 82e contains information regarding the position of the work W with respect to the processing head 21 in the Z-axis direction.
  • Therefore, the control device 7 can calculate information on the relative positional relationship between the machining head 21 and the work W based on the information on the position of the work W in the Z-axis direction calculated based on the light receiving result by the light receiving system 82e. As a result, the control device 7 can calculate information regarding the relative positional relationship between the material nozzle 212 and the work W.
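  • The calculation can be pictured with the following minimal Python sketch. The oblique-incidence geometry, the linear conversion factor, and all names (spot_offset_mm, nozzle_length_mm, and so on) are assumptions introduced only to illustrate how a light receiving result could be turned into a nozzle-to-work gap; they are not taken from the embodiment.

```python
def work_surface_z(spot_offset_mm: float,
                   sensitivity_mm_per_mm: float = 0.5,
                   reference_z_mm: float = 0.0) -> float:
    """Estimate the Z position of the detection-light irradiation region.

    `spot_offset_mm` is a hypothetical lateral displacement of the
    detection light measured on the light receiving system; with an
    oblique-incidence geometry the displacement scales with the height of
    the work surface, so a calibrated factor converts it to Z.
    """
    return reference_z_mm + sensitivity_mm_per_mm * spot_offset_mm


def nozzle_work_gap(head_z_mm: float, nozzle_length_mm: float,
                    surface_z_mm: float) -> float:
    """Distance between the nozzle tip and the estimated work surface."""
    return (head_z_mm - nozzle_length_mm) - surface_z_mm
```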
  • FIG. 10 is a schematic view showing the structure of the sixth detection device 8f.
  • the sixth detection device 8f optically detects information regarding the positional relationship between the material nozzle 212 and the work W.
  • the sixth detection device 8f optically detects the information regarding the positional relationship between the material nozzle 212 and the work W by a method different from that of the fifth detection device 8e.
  • the sixth detection device 8f includes a camera (that is, an image pickup device) 81f.
  • the camera 81f captures the material nozzle 212.
  • the camera 81f images the nozzle member 2121 of the material nozzle 212 (particularly, the supply outlet 2123 at its tip).
  • The camera 81f is aligned with respect to the material nozzle 212 so that the imaging range of the camera 81f includes at least a portion of the nozzle member 2121 (particularly, the supply outlet 2123 at the tip of the nozzle member 2121). Further, the camera 81f images the work W in addition to the material nozzle 212. Therefore, the camera 81f is aligned with respect to the material nozzle 212 so that the imaging range of the camera 81f includes at least a part of the work W.
  • the image capture result of the camera 81f (that is, the image captured by the camera 81f) is output to the control device 7.
  • the control device 7 calculates information on the positional relationship between the material nozzle 212 and the work W based on the image captured by the camera 81f. As described above, the camera 81f captures both the material nozzle 212 and the work W. Therefore, the control device 7 can calculate information regarding the positional relationship between the material nozzle 212 and the work W by analyzing the image captured by the camera 81f.
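  • As one non-limiting way to picture such an analysis, the gap could be estimated from pixel coordinates as in the following minimal Python sketch; the pixel coordinates, the calibration factor, and the analysis step that produces them are assumptions for illustration only.

```python
def gap_from_image(nozzle_tip_row: int, work_surface_row: int,
                   mm_per_pixel: float) -> float:
    """Estimate the nozzle-to-work gap from one camera image.

    The two row indices are hypothetical pixel coordinates of the nozzle
    tip (supply outlet) and of the work surface directly below it,
    obtained by any suitable image-analysis step (edge detection,
    template matching, etc.); mm_per_pixel comes from a prior camera
    calibration.  Rows are assumed to increase downward in the image.
    """
    return max(0.0, (work_surface_row - nozzle_tip_row) * mm_per_pixel)
```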
  • the camera 81f as an example of the sixth detection device 8f may image the work W without imaging the nozzle member 2121.
  • the camera 81f may image the three-dimensional structure ST formed on the work W.
  • In this case, the control device 7 may determine whether or not the dimensions of the three-dimensional structure ST are as designed, or whether or not there is an abnormality in the modeling process. If not (for example, if the dimensions of the three-dimensional structure ST are not as designed, or if there is an abnormality in the modeling process), the control device 7 may output warning information or may stop the operation of the machining system SYS1.
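  • A minimal sketch of such a decision, with hypothetical dimension lists and a tolerance chosen only for illustration, might look as follows.

```python
def check_build(measured_dims_mm, designed_dims_mm,
                tolerance_mm: float = 0.2, anomaly: bool = False) -> dict:
    """Decide whether to warn and stop, in the spirit of the control device 7.

    Any dimensional deviation beyond the tolerance, or a reported modeling
    anomaly, triggers warning output and a request to stop the machine.
    All values and names here are illustrative assumptions.
    """
    out_of_spec = any(abs(m - d) > tolerance_mm
                      for m, d in zip(measured_dims_mm, designed_dims_mm))
    stop = out_of_spec or anomaly
    return {"warning": stop, "stop_machine": stop}
```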
  • a plurality of sixth detection devices 8f may be provided.
  • a plurality of sixth detection devices 8f may be arranged so that the work W can be imaged from different directions.
  • Each of the first detection device 8a to the sixth detection device 8f described above is only a specific example of the detection device 8. Therefore, it goes without saying that any detection device having a structure different from those of the first detection device 8a to the sixth detection device 8f may be used as the detection device 8.
  • a distance sensor capable of measuring the distance between the material nozzle 212 and the work W may be used as the detection device 8.
  • The distance sensor may be arranged on the material nozzle 212, or may be arranged on a member (for example, the work W or the stage 31) different from the material nozzle 212.
  • a contact sensor capable of detecting the presence or absence of contact with an object may be used as the detection device 8.
  • the contact sensor is arranged on the material nozzle 212, but may be arranged on a member (for example, work W or stage 31) different from the material nozzle 212.
  • Each of the first detection device 8a to the sixth detection device 8f described above may detect (acquire) information on the distance between the material nozzle 212 and the work W, information on the positional relationship between the material nozzle 212 and the work W, or information on the degree of proximity between the material nozzle 212 and the work W.
  • Alternatively, each of them may detect (acquire) information on the distance between the material nozzle 212 and the three-dimensional modeled object ST, information on the positional relationship between the material nozzle 212 and the three-dimensional modeled object ST, or information on the degree of proximity between the material nozzle 212 and the three-dimensional modeled object ST.
  • the processing system SYS1 forms the three-dimensional structure ST on the work W based on the three-dimensional model data (for example, CAD (Computer Aided Design) data) of the three-dimensional structure ST to be formed.
  • As the three-dimensional model data, at least one of the measurement data of a three-dimensional object measured by a measuring device (not shown) provided in the processing system SYS1 and the measurement data of a three-dimensional shape measuring machine provided separately from the processing system SYS1 may be used.
  • An example of a three-dimensional shape measuring machine is a contact-type three-dimensional coordinate measuring machine having a probe that can move with respect to the work W and can contact the work W.
  • An example of a three-dimensional shape measuring machine is a non-contact type three-dimensional measuring machine.
  • Examples of the non-contact type three-dimensional measuring machine include at least one of a pattern projection type three-dimensional measuring machine, an optical cutting type three-dimensional measuring machine, a time-of-flight type three-dimensional measuring machine, a moire topography type three-dimensional measuring machine, a holographic interference type three-dimensional measuring machine, a CT (Computed Tomography) type three-dimensional measuring machine, and an MRI (Magnetic Resonance Imaging) type three-dimensional measuring machine.
  • As the three-dimensional model data, for example, data in the STL (Stereo Lithography) format, the VRML (Virtual Reality Modeling Language) format, the AMF (Additive Manufacturing File Format), the IGES (Initial Graphics Exchange Specification) format, the VDA-FS (automotive manufacturer surface interface) format, the HP/GL (Hewlett-Packard Graphics Language) format, the bitmap format, and the like can be used.
  • In order to form the three-dimensional structure ST, the processing system SYS1 sequentially forms, for example, a plurality of layered partial structures (hereinafter referred to as "structural layers") SL arranged along the Z-axis direction.
  • That is, the processing system SYS1 sequentially forms the plurality of structural layers SL obtained by slicing the three-dimensional structure ST along the Z-axis direction.
  • As a result, the three-dimensional structure ST, which is a laminated structure in which the plurality of structural layers SL are laminated, is formed.
  • the flow of the operation of forming the three-dimensional structure ST by forming the plurality of structural layers SL one by one in order will be described.
  • When forming each structural layer SL, under the control of the control device 7, the processing system SYS1 sets the irradiation region EA in a desired region on the modeling surface MS corresponding to the surface of the work W or the surface of a formed structural layer SL, and irradiates the irradiation region EA with the processing light EL from the irradiation optical system 211.
  • the region occupied by the processed light EL emitted from the irradiation optical system 211 on the modeling surface MS may be referred to as an irradiation region EA.
  • the focus position (that is, the condensing position) of the processed light EL coincides with the modeling surface MS.
  • As a result, a molten pool MP (that is, a pool of metal melted by the processing light EL) is formed in the desired region on the modeling surface MS by the processing light EL emitted from the irradiation optical system 211.
  • the processing system SYS1 sets a supply region MA in a desired region on the modeling surface MS under the control of the control device 7, and supplies the modeling material M to the supply region MA from the material nozzle 212.
  • the processing system SYS1 supplies the modeling material M from the material nozzle 212 to the molten pool MP.
  • the modeling material M supplied to the molten pool MP melts.
  • When the molten pool MP is no longer irradiated with the processing light EL as the processing head 21 moves, the modeling material M melted in the molten pool MP is cooled and solidifies.
  • the solidified modeling material M is deposited on the modeling surface MS. That is, a modeled object is formed by the deposit of the solidified modeling material M.
  • Such a series of modeling processes, including the formation of the molten pool MP by irradiation with the processing light EL, the supply of the modeling material M to the molten pool MP, the melting of the supplied modeling material M, and the solidification of the molten modeling material M, is repeated while the processing head 21 is moved relative to the modeling surface MS along the XY plane.
  • That is, when the processing head 21 moves relative to the modeling surface MS, the irradiation region EA also moves relative to the modeling surface MS. Therefore, the series of modeling processes is repeated while moving the irradiation region EA relative to the modeling surface MS along the XY plane (that is, in the two-dimensional plane).
  • At this time, the processing light EL is selectively irradiated onto the irradiation region EA set in the region on the modeling surface MS where the modeled object is to be formed, and is not irradiated onto the region on the modeling surface MS where the modeled object is not to be formed (it can also be said that the irradiation region EA is not set in the region where the modeled object is not to be formed). That is, the processing system SYS1 moves the irradiation region EA along a predetermined movement locus on the modeling surface MS, and irradiates the modeling surface MS with the processing light EL at a timing according to the distribution mode of the region where the modeled object is to be formed.
  • the mode of distribution of the region where the modeled object is to be formed may be referred to as a distribution pattern or a pattern of the structural layer SL.
  • the molten pool MP also moves on the modeling surface MS along the movement locus according to the movement locus of the irradiation region EA.
  • the molten pool MP is sequentially formed on the modeling surface MS in the portion of the region along the movement locus of the irradiation region EA that is irradiated with the processing light EL.
  • Further, the supply region MA also moves on the modeling surface MS along a movement locus according to the movement locus of the irradiation region EA.
  • As a result, the structural layer SL, which corresponds to an aggregate of the modeled objects made of the solidified modeling material M, is formed on the modeling surface MS. That is, the structural layer SL corresponding to an aggregate of the modeled objects formed on the modeling surface MS in a pattern according to the movement locus of the molten pool MP (that is, the structural layer SL having a shape corresponding to the movement locus of the molten pool MP in a plan view) is formed.
  • Alternatively, the modeling material M may be supplied to the irradiation region EA while the irradiation region EA is irradiated with the processing light EL having an intensity that does not form the molten pool MP.
  • the irradiation area EA is moved with respect to the modeling surface MS, but the modeling surface MS may be moved with respect to the irradiation area EA.
  • The processing system SYS1 repeatedly performs the operation for forming such a structural layer SL under the control of the control device 7 based on the three-dimensional model data. Specifically, first, the three-dimensional model data is sliced at a stacking pitch to create slice data. Note that data obtained by partially modifying this slice data according to the characteristics of the processing system SYS1 may be used. The processing system SYS1 performs the operation for forming the first structural layer SL#1 on the modeling surface MS corresponding to the surface of the work W based on the three-dimensional model data corresponding to the structural layer SL#1, that is, the slice data corresponding to the structural layer SL#1.
  • The processing system SYS1 may operate using information on the tool path, which is the locus of the irradiation region EA (supply region MA) passing through the region where the structural layer SL#1 exists in the slice data corresponding to the structural layer SL#1. As a result, the structural layer SL#1 is formed on the modeling surface MS as shown in FIG. 12(a). After that, the processing system SYS1 sets the surface (that is, the upper surface) of the structural layer SL#1 as a new modeling surface MS, and then forms the second structural layer SL#2 on the new modeling surface MS. In order to form the structural layer SL#2, the control device 7 first controls the head drive system 22 so that the machining head 21 moves along the Z-axis.
  • Specifically, the control device 7 controls the head drive system 22 so that the irradiation region EA and the supply region MA are set on the surface of the structural layer SL#1 (that is, the new modeling surface MS).
  • For example, the machining head 21 is moved toward the +Z side so that the focus position of the processing light EL coincides with the new modeling surface MS.
  • After that, under the control of the control device 7, the processing system SYS1 forms the structural layer SL#2 on the surface of the structural layer SL#1 based on the slice data corresponding to the structural layer SL#2, by the same operation as the operation of forming the structural layer SL#1.
  • As a result, the structural layer SL#2 is formed as shown in FIG. 12(b).
  • the same operation is repeated until all the structural layers SL constituting the three-dimensional structure ST to be formed on the work W are formed.
  • the three-dimensional structure ST is formed by the laminated structure in which a plurality of structural layers SL are laminated.
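  • The layer-by-layer sequence described above can be summarized in the following minimal Python sketch. The object and method names (model.slice, controller.set_modeling_surface, controller.form_layer) are hypothetical stand-ins for slice-data generation, head positioning along the Z-axis, and execution of one structural layer along its tool path; the sketch only illustrates the ordering of the steps.

```python
def build_structure(model, stacking_pitch_mm: float, controller) -> None:
    """Schematic layer-by-layer build loop for a three-dimensional structure.

    `model.slice(pitch)` is assumed to return one slice-data object per
    structural layer; the `controller` wraps head positioning and the
    molten-pool/material-supply operation along the tool path.
    """
    slice_data_list = model.slice(stacking_pitch_mm)  # one entry per structural layer SL
    for index, slice_data in enumerate(slice_data_list):
        # set the modeling surface for this layer and refocus the processing light
        controller.set_modeling_surface(index * stacking_pitch_mm)
        # form structural layer SL#(index + 1) along its tool path
        controller.form_layer(slice_data)
```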
  • As described above, in the processing system SYS1, when a predetermined approach condition regarding the degree of approach between the material nozzle 212 (or an arbitrary detection object) and the work W (or an arbitrary approach object) is satisfied, the relative position between the material nozzle 212 and the work W can be controlled so as to suppress the occurrence of an abnormality caused by the contact between the material nozzle 212 and the work W. Therefore, the occurrence of an abnormality caused by the contact between the material nozzle 212 and the work W is appropriately suppressed. As a result, the machining system SYS1 can appropriately perform additional machining on the work W.
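  • A minimal control-loop sketch of this idea in Python follows; the detector and drive wrappers, the None-means-contact convention, and the allowed gap are assumptions introduced only to illustrate the approach-condition check, not the actual control device 7.

```python
def monitor_approach(detector, drive, allowed_gap_mm: float = 1.0) -> None:
    """Keep the nozzle away from the work during machining.

    `detector.gap_mm()` and `drive.retract()` are hypothetical wrappers
    around a detection device and a head drive system.  When the approach
    condition is satisfied (gap below the permissible value, or contact
    reported as None), the relative position is changed so that the
    nozzle and the work separate.
    """
    gap = detector.gap_mm()
    if gap is None or gap < allowed_gap_mm:  # approach condition satisfied
        drive.retract()                      # e.g. move the processing head toward +Z
```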
  • Next, the machining system SYS2 of the second embodiment will be described.
  • The processing system SYS2 of the second embodiment differs from the processing system SYS1 of the first embodiment described above in the shape of the material nozzle 212 (particularly, the outer shape of the nozzle member 2121).
  • Other features of the processing system SYS2 may be the same as the processing system SYS1. Therefore, in the following, the nozzle member 2121e included in the processing system SYS2 of the second embodiment will be described with reference to FIG.
  • FIG. 13 is a cross-sectional view showing the structure of the nozzle member 2121e included in the processing system SYS2 of the second embodiment.
  • the nozzle member 2121e includes a main body member 21211e, a tip member 21212e, and a connecting member 21213e.
  • the main body member 21211e is a member fixed to the main body of the processing head 21 directly or via another member.
  • the tip member 21212e is a member located on the -Z side of the main body member 21211e in the direction in which the nozzle member 2121e extends (the Z-axis direction in the example shown in FIG. 13).
  • the tip member 21212e is a member arranged at a position closer to the stage 31 than the main body member 21211e.
  • the tip member 21212e is a member arranged at a position closer to the work W held by the stage 31 than the main body member 21211e.
  • the tip member 21212e is a member that comes into contact with the work W before the main body member 21211e when the nozzle member 2121e and the work W come into contact with each other.
  • the tip member 21212e is a member that first comes into contact with the work W when the nozzle member 2121e and the work W come into contact with each other.
  • the connecting member 21213e is a member that connects the main body member 21211e and the tip member 21212e.
  • the connecting member 21213e is a member located on the -Z side of the main body member 21211e in the direction in which the nozzle member 2121e extends.
  • the connecting member 21213e is a member located on the + Z side of the tip member 21212e in the direction in which the nozzle member 2121e extends.
  • a supply pipe 21251e is formed inside the main body member 21211e along the direction in which the nozzle member 2121e extends.
  • a supply pipe 21252e is formed inside the tip member 21212e.
  • a supply pipe 21253e is formed inside the connecting member 21213e.
  • the supply pipe 21251e is connected to the supply pipe 21253e.
  • the supply pipe 21253e is connected to the supply pipe 21252e. Therefore, the supply pipes 21251e to 21253e form a series of supply pipes 2125e.
  • The modeling material M supplied from the material supply device 1 is supplied from the nozzle member 2121e to the work W through the supply pipe 2125e and the supply outlet 2123. Therefore, the pipe connecting the material supply device 1 and the nozzle member 2121e (that is, the pipe for supplying the modeling material M) is connected to the supply pipe 2125e (particularly, the supply pipe 21251e of the main body member 21211e).
  • the connecting member 21213e is a member that forms a notch 2124e between the main body member 21211e and the tip member 21212e.
  • That is, the connecting member 21213e includes a portion whose dimension in a direction intersecting the direction in which the nozzle member 2121e extends is smaller than the corresponding dimensions of the main body member 21211e and the tip member 21212e. This portion constitutes the notch 2124e.
  • the notch 2124e may be referred to as a notch or a dent.
  • This notch 2124e is mainly used to separate the tip member 21212e from the main body member 21211e when an external force is applied to the tip member 21212e.
  • With reference to FIGS. 14(a) to 14(c), a state in which the tip member 21212e separates from the main body member 21211e when an external force is applied to the tip member 21212e will be described.
  • FIGS. 14(a) to 14(c) are cross-sectional views showing how the tip member 21212e separates from the main body member 21211e when an external force is applied to the tip member 21212e.
  • When an external force is applied from the work W to the tip member 21212e, the connecting member 21213e is deformed (typically, plastically deformed). Further, when the degree of deformation of the connecting member 21213e exceeds a permissible amount due to the continued application of the external force from the work W to the connecting member 21213e via the tip member 21212e, the connecting member 21213e breaks as shown in FIG. 14(c). As a result, the tip member 21212e connected to the main body member 21211e via the connecting member 21213e is separated from the main body member 21211e.
  • In this way, in the second embodiment, the tip member 21212e is separated from the main body member 21211e by the external force applied to the tip member 21212e. Therefore, even if the nozzle member 2121e and the work W (or an arbitrary approaching object) come into contact with each other, the separation of the tip member 21212e causes the nozzle member 2121e to change from a state of being in contact with the work W to a state of not being in contact with the work W.
  • Therefore, as compared with the case where the tip member 21212e does not separate, the occurrence of an abnormality caused by the contact between the nozzle member 2121e and the work W is more likely to be suppressed. In particular, damage to the work W or the modeled object is more likely to be suppressed.
  • the processing system SYS2 does not have to include the detection device 8.
  • The processing system SYS2 does not have to control the relative position between the material nozzle 212 (or an arbitrary detection object) and the work W (or an arbitrary approach object) based on the detection result of the detection device 8. That is, the processing system SYS2 does not have to control the relative position between the material nozzle 212 and the work W so as to suppress the occurrence of an abnormality caused by the contact between the material nozzle 212 (or an arbitrary detection object) and the work W (or an arbitrary approach object).
  • the material nozzle 212 is used as an example of the object to be detected.
  • the object to be detected may be an object different from the material nozzle 212.
  • the irradiation optical system 211 may be a detection target.
  • the detection device 8 may detect the relative positional relationship between the irradiation optical system 211 and the work W (or any other approaching object).
  • In this case, when a predetermined approach condition regarding the degree of proximity between the irradiation optical system 211 and the work W (or any other approach object) is satisfied, the processing system SYS1 may control the relative position between the irradiation optical system 211 and the work W (or any other approach object) so as to suppress the occurrence of an abnormality caused by contact between the irradiation optical system 211 and the work W (or any other approach object).
  • the irradiation optical system 211 includes an optical member 2111 and a holding member 2112.
  • the optical member 2111 may be a detection target.
  • the detection device 8 may detect the relative positional relationship between the optical member 2111 and the work W (or any other approaching object).
  • In this case, when a predetermined approach condition regarding the degree of proximity between the optical member 2111 and the work W (or any other approach object) is satisfied, the processing system SYS1 may control the relative position between the optical member 2111 and the work W (or any other approach object) so as to suppress the occurrence of an abnormality caused by contact between the optical member 2111 and the work W (or any other approach object).
  • When the holding member 2112 is arranged closer to the work W than the optical member 2111, the holding member 2112 may be the object to be detected.
  • the detection device 8 may detect the relative positional relationship between the holding member 2112 and the work W (or any other approaching object).
  • In this case, when a predetermined approach condition regarding the degree of approach between the holding member 2112 and the work W (or any other approach object) is satisfied, the processing system SYS1 may control the relative position between the holding member 2112 and the work W (or any other approach object) so as to suppress the occurrence of an abnormality caused by contact between the holding member 2112 and the work W (or any other approach object).
  • When the holding member 2112 is also used as a part of the nozzle member 2121 and the holding member 2112 is provided with the supply outlet 2123, the holding member 2112 may likewise be the object to be detected.
  • each of the work W and the stage 31 is used as an example of the approaching object.
  • the approaching object may be an object different from the work W and the stage 31.
  • a member arranged around at least one of the work W and the stage 31 may be an approaching object.
  • the housing 6 (for example, the partition member 61) may be an object to be approached.
  • the processing device 2 melts the modeling material M by irradiating the modeling material M with the processing light EL.
  • the processing apparatus 2 may melt the modeling material M by irradiating the modeling material M with an arbitrary energy beam.
  • the processing device 2 may include a beam irradiation device capable of irradiating an arbitrary energy beam in addition to or in place of the irradiation optical system 211.
  • Any energy beam includes, but is not limited to, a charged particle beam such as an electron beam, an ion beam, or an electromagnetic wave.
  • the processing system SYS can form the three-dimensional structure ST by the laser overlay welding method.
  • However, the processing system SYS may form the three-dimensional structure ST from the modeling material M by another method capable of forming the three-dimensional structure ST by irradiating the modeling material M with the processing light EL (or an arbitrary energy beam).
  • Examples of other methods include a powder bed fusion method (Powder Bed Fusion) such as a powder sintering laminated molding method (SLS: Selective Laser Sintering), a binder jetting method (Binder Jetting), and a laser metal fusion method (LMF: Laser Metal Fusion).
  • Alternatively, the processing system SYS may form the three-dimensional structure ST by an arbitrary additional processing method different from the method of forming the three-dimensional structure ST by irradiating the modeling material M with the processing light EL (or an arbitrary energy beam).
  • In the above description, the processing system SYS forms the three-dimensional structure ST by supplying the modeling material M from the material nozzle 212 toward the irradiation region EA which the irradiation optical system 211 irradiates with the processing light EL.
  • the processing system SYS may form the three-dimensional structure ST by supplying the modeling material M from the material nozzle 212 without irradiating the processing light EL from the irradiation optical system 211.
  • For example, the processing system SYS may form the three-dimensional structure ST by spraying the modeling material M onto the modeling surface MS from the material nozzle 212 to melt the modeling material M on the modeling surface MS and then solidifying the melted modeling material M.
  • For example, the processing system SYS may form the three-dimensional structure ST by blowing a gas containing the modeling material M onto the modeling surface MS from the material nozzle 212 at an ultra-high speed to melt the modeling material M on the modeling surface MS and then solidifying the molten modeling material M.
  • For example, the processing system SYS may form the three-dimensional structure ST by spraying the heated modeling material M onto the modeling surface MS from the material nozzle 212 to melt the modeling material M on the modeling surface MS and then solidifying the molten modeling material M.
  • When such methods are used, the processing system SYS (particularly, the processing head 21) does not have to include the irradiation optical system 211.
  • The processing system SYS may perform, in addition to or instead of the additional processing, removal processing capable of removing at least a part of an object such as the work W by irradiating the object with the processing light EL (or an arbitrary energy beam).
  • The processing system SYS may perform, in addition to or in place of at least one of the additional processing and the removal processing, marking processing capable of forming a mark (for example, letters, numbers, or figures) on at least a part of an object such as the work W by irradiating the object with the processing light EL (or an arbitrary energy beam). Even in this case, the above-mentioned effects can be enjoyed.
  • [Appendix 1] A processing system comprising: a processing device that performs a processing operation using an energy beam; a detection device that detects the degree of approach between an object and a target member that is at least a part of the processing device; and a position changing device that changes the relative position between the object and the target member, wherein the position changing device changes the relative position so as to avoid contact between the object and the target member when a predetermined approach condition regarding the degree of approach is satisfied.
  • [Appendix 2] A processing system comprising: a processing device that performs a processing operation using an energy beam; a detection device that detects the degree of approach between an object and a target member that is at least a part of the processing device; and a position changing device that changes the relative position between the object and the target member, wherein the position changing device limits an operation for changing the relative position in a direction in which the object and the target member approach each other when a predetermined approach condition regarding the degree of approach is satisfied.
  • Appendix 3 The processing system according to Appendix 1 or 2, wherein the approach condition includes a contact condition that the object and the target member come into contact with each other.
  • [Appendix 4] The processing system according to any one of Appendices 1 to 3, wherein the approach condition includes a proximity condition that, although the object and the target member are not in contact with each other, the object and the target member are so close to each other that the distance between the object and the target member is less than a predetermined value.
  • [Appendix 5] The processing system according to any one of Appendices 1 to 4, wherein the position changing device changes the relative position so as to avoid further approach between the object and the target member when the approach condition is satisfied.
  • the position changing device changes the relative position so that the object and the target member are separated from each other when the approach condition is satisfied.
  • the position changing device changes the relative position so as to reduce the possibility that an abnormal state of the object and/or the processing device occurs due to a collision between the object and the target member when the approach condition is satisfied.
  • Appendix 12 The processing system according to any one of Appendix 1 to 11, wherein at least a part of the processing apparatus is also used as at least a part of the detection apparatus.
  • Appendix 13 The processing system according to any one of Appendix 1 to 12, wherein at least a part of the target member is also used as at least a part of the detection device.
  • [Appendix 14] A processing system comprising: a processing device that performs a processing operation using an energy beam; and a detection device that detects the degree of approach between an object and a target member that is at least a part of the processing device, wherein at least a part of the target member is also used as at least a part of the detection device.
  • [Appendix 15] The processing system according to any one of Appendices 12 to 14, wherein the detection device includes a coil and an iron core, and at least a part of the target member is also used as the iron core.
  • Appendix 16 The processing system according to Appendix 15, wherein the coil is wound around at least a part of the target member that is also used as the iron core.
  • [Appendix 21] The processing system according to Appendix 20, wherein the target member includes at least a part of the supply device.
  • [Appendix 22] The processing system according to any one of Appendix 1 to 21, wherein the detection device can detect the presence or absence of contact between the object and the target member as the degree of approach.
  • [Appendix 23] The processing system according to any one of Appendix 1 to 22, wherein the detection device can detect an index value relating to a distance between the object and the target member as the degree of approach.
  • [Appendix 24] The processing system according to Appendix 23, wherein the detection device can detect an index value related to the distance during a period in which the object and the target member are not in contact with each other.
  • [Appendix 25] The processing system according to any one of Supplementary note 1 to 24, wherein the detection device detects the degree of proximity between the object and the target member without contacting the object.
  • [Appendix 26] The processing system according to Appendix 25, wherein the detection device includes a proximity sensor.
  • Appendix 27] The processing system according to any one of Appendix 1 to 26, wherein the detection device is arranged on the target member.
  • Appendix 28] The processing system according to any one of Appendix 1 to 27, wherein the detection device optically detects the position of the object.
  • the detection device images the object.
  • the processing apparatus irradiates the object to be processed with the energy beam to perform the processing operation on the object to be processed.
  • The processing system according to any one of Appendices 1 to 32, wherein the irradiation device irradiates the object to be processed with the energy beam to perform the processing operation on the object to be processed, the processing system further comprising a mounting device on which the object to be processed is mounted, and wherein the object includes the mounting device.
  • [Appendix 34] The processing system according to any one of the above Appendices, wherein the target member includes a fixing member fixed to the processing apparatus, a non-fixing member not fixed to the processing apparatus, and a connecting member connecting the fixing member and the non-fixing member.
  • [Appendix 35] A processing system comprising: a processing device that performs a processing operation using an energy beam; and a position changing device that changes the relative position between an object and a target member that is at least a part of the processing device, wherein the target member includes a fixing member fixed to the processing apparatus, a non-fixing member not fixed to the processing apparatus, and a connecting member connecting the fixing member and the non-fixing member.
  • Appendix 36 The processing system according to Appendix 34 or 35, wherein the connecting member forms a notch between the fixed member and the non-fixed member.
  • the connecting member can be broken due to stress applied to the connecting member from the outside of the target member via the non-fixing member.
  • A processing system comprising: a processing device that performs a processing operation using an energy beam; a detection device that detects the degree of approach between an object and a target member that is at least a part of the processing device; a position changing device that changes the relative position between the object and the target member; and a receiver that receives a control signal for controlling the position changing device so as to perform an operation of changing the relative position to avoid contact between the object and the target member when a predetermined approach condition regarding the degree of approach is satisfied.
  • A processing system comprising: a processing device that performs a processing operation using an energy beam; a detection device that detects the degree of approach between an object and a target member that is at least a part of the processing device; and a position changing device that changes the relative position between the object and the target member, wherein, when a predetermined approach condition regarding the degree of approach is satisfied, a control signal for prohibiting execution of an operation for changing the relative position in a direction in which the object and the target member approach each other is transmitted.
  • [Appendix 42] In a processing system that performs additional processing on an object, a powder supply member that supplies powder to the object from a powder supply port and a sensor for acquiring relative position information between the powder supply member and the object are provided.
  • The member provided with the powder supply port extends in a first direction and includes a first portion and a second portion located on the first direction side of the first portion.
  • [Appendix 43] A processing method for processing an object to be processed by using the processing system according to any one of Appendices 1 to 42.
  • [Appendix 44] A processing method including: detecting the degree of proximity between an object and a target member that is at least a part of a processing device; and, when a predetermined approach condition regarding the degree of proximity is satisfied, changing the relative position between the object and the target member so as to avoid contact between the object and the target member.
  • [Appendix 45] A processing method including: detecting the degree of proximity between an object and a target member that is at least a part of a processing device; and, when a predetermined approach condition regarding the degree of proximity is satisfied, prohibiting execution of an operation that changes the relative position of the object and the target member in a direction in which they approach each other.
  • [Appendix 46] A processing method in which an object is processed using an energy beam, the method including: supplying powder to the object from a powder supply port; irradiating the object with the energy beam; and acquiring relative position information between a member provided with the powder supply port and the object.
  • [Appendix 47] A processing method in which an object is processed using an energy beam, the method including: supplying powder to the object from a powder supply port; irradiating the object with the energy beam; and acquiring relative position information between a member provided with the powder supply port and the object, wherein the member provided with the powder supply port extends in a first direction and includes a first portion and a second portion located on the first direction side of the first portion.
  • [Appendix 49] A processing method of performing additional processing on an object, the method including: supplying powder to the object from a powder supply port; and acquiring relative position information between a member provided with the powder supply port and the object, wherein the member provided with the powder supply port extends in a first direction and includes a first portion and a second portion located on the first direction side of the first portion.
  • [Appendix 50] A computer program that causes a computer to execute the processing method according to any one of Appendices 43 to 49.
  • [Appendix 51] A computer program that causes a computer to execute a processing method including: detecting the degree of proximity between an object and a target member that is at least a part of a processing device; and, when a predetermined approach condition regarding the degree of proximity is satisfied, changing the relative position between the object and the target member so as to avoid contact between the object and the target member.
  • [Appendix 52] A computer program that causes a computer to execute a processing method including: detecting the degree of proximity between an object and a target member that is at least a part of a processing device; and, when a predetermined approach condition regarding the degree of proximity is satisfied, prohibiting execution of an operation that changes the relative position of the object and the target member in a direction in which they approach each other.
  • [Appendix 53] A recording medium on which the computer program according to any one of Appendices 50 to 52 is recorded.
  • A control device that controls a machining system including a processing device that performs a processing operation using an energy beam, a detection device that detects the degree of proximity between an object and a target member that is at least a part of the processing device, and a position changing device that changes the relative position between the object and the target member, wherein the control device controls the position changing device so as to perform a relative position changing operation for avoiding contact between the object and the target member when a predetermined approach condition regarding the degree of proximity is satisfied.
  • A control device that controls a machining system including a processing device that performs a processing operation using an energy beam, a detection device that detects the degree of proximity between an object and a target member that is at least a part of the processing device, and a position changing device that changes the relative position between the object and the target member, wherein, when a predetermined approach condition regarding the degree of proximity is satisfied, the control device performs control to prohibit execution of an operation that changes the relative position of the object and the target member in a direction in which they approach each other.
  • A control device that controls a processing system that processes an object using an energy beam, wherein the processing system is provided with a powder supply member that supplies powder to the object from a powder supply port, an irradiation optical system that irradiates the object with the energy beam, a sensor for acquiring relative position information between the powder supply member and the object, and a position changing device for changing the positional relationship between the object and the member, and the control device controls the changing operation of the positional relationship by the position changing device by using an output from the sensor.
  • A control device that controls a processing system that processes an object using an energy beam, wherein the processing system is provided with a powder supply member that supplies powder to the object from a powder supply port, an irradiation optical system that irradiates the object with the energy beam, a sensor for acquiring relative position information between the powder supply member and the object, and a position changing device for changing the positional relationship between the object and the member, the member provided with the powder supply port extends in a first direction and includes a first portion and a second portion located on the first direction side of the first portion, the dimension of the second portion along a second direction intersecting the first direction is smaller than the dimension of the first portion along the second direction, and the control device controls the changing operation of the positional relationship by the position changing device by using an output from the sensor.
  • The present invention is not limited to the above-described embodiment and can be modified as appropriate within the scope of the claims and within a range not contrary to the gist or idea of the invention that can be read from the entire specification; processing systems, processing methods, computer programs, recording media, receiving devices, and control devices accompanied by such modifications are also included in the technical scope of the present invention.
  • Description of reference signs: SYSTEM: processing system; 1: material supply device; 2: processing device; 21: processing head; 211: irradiation optical system; 212: material nozzle; 22: head drive system; 31: stage; W: work; M: modeling material; SL: structural layer; MS: modeling surface; EA: irradiation area; MA: supply area; MP: molten pool; EL: processing light
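The collision-avoidance behavior recited in the appendices above can be summarized as: measure the degree of proximity between the object and the target member, and once a predetermined approach condition is met, either command a relative position change that opens the gap or refuse any further motion that would close it. The following control-loop sketch is a minimal, hypothetical Python illustration of that logic; the class names, the move_relative interface, the 5 mm threshold, and the 10 mm retract step are all assumptions made for this example and are not taken from the publication.

    # Hypothetical sketch of the approach-condition handling described in the appendices.
    # All names, interfaces and numeric values are illustrative assumptions.
    from dataclasses import dataclass

    @dataclass
    class ProximityReading:
        gap_mm: float  # measured distance between the object and the target member

    class CollisionGuard:
        """Evaluates the predetermined approach condition and reacts to it."""

        def __init__(self, stage, approach_threshold_mm=5.0, retract_step_mm=10.0):
            self.stage = stage                      # position changing device (e.g. stage or head drive)
            self.threshold_mm = approach_threshold_mm
            self.retract_step_mm = retract_step_mm

        def approach_condition_met(self, reading):
            return reading.gap_mm <= self.threshold_mm

        def filter_motion(self, reading, requested_step_mm):
            """Return the motion step that may actually be executed.

            A negative requested step means the object and the target member
            would move closer together; a positive step moves them apart.
            """
            if not self.approach_condition_met(reading):
                return requested_step_mm  # normal operation
            # Avoidance variant (Appendices 44/51): change the relative position to open the gap.
            self.stage.move_relative(self.retract_step_mm)
            # Prohibition variant (Appendices 45/52): never execute a step that closes the gap further.
            return max(requested_step_mm, 0.0)

A possible use is to pass every motion command through filter_motion, so that a call such as filter_motion(ProximityReading(gap_mm=3.2), -1.0) retracts the head and returns 0.0 instead of the requested approach step.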

Landscapes

  • Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Engineering & Computer Science (AREA)
  • Plasma & Fusion (AREA)
  • Mechanical Engineering (AREA)

Abstract

The invention relates to a processing system that uses an energy beam to perform a processing operation on an object, and that comprises: a powder supply member that supplies powder to the object from a powder supply opening; an irradiation optical system that irradiates the object with the energy beam; and a sensor that acquires relative position information for the powder supply member and the object.
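As a rough illustration of how the relative position information mentioned in the abstract might be used, the sketch below converts a hypothetical distance-sensor reading taken near the powder supply member into a gap between the powder supply port and the work surface, and checks it against a nominal stand-off distance. Every name and number here (the 12 mm sensor-to-port offset, the 8 mm working distance, the 0.5 mm tolerance) is an assumption made for illustration and is not taken from the publication.

    # Hypothetical illustration: deriving a port-to-work gap from a sensor reading.
    NOZZLE_TIP_OFFSET_MM = 12.0   # assumed offset from the sensor datum to the powder supply port
    WORKING_DISTANCE_MM = 8.0     # assumed nominal stand-off for powder supply and irradiation

    def port_to_work_gap_mm(sensor_distance_mm):
        """Relative position information: gap between the powder supply port and the work."""
        return sensor_distance_mm - NOZZLE_TIP_OFFSET_MM

    def needs_correction(sensor_distance_mm, tolerance_mm=0.5):
        """True if the positional relationship should be adjusted by a position changing device."""
        return abs(port_to_work_gap_mm(sensor_distance_mm) - WORKING_DISTANCE_MM) > tolerance_mm

    # Example: a 19.7 mm reading gives a 7.7 mm gap, within 0.5 mm of the nominal 8 mm.
    assert needs_correction(19.7) is False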
PCT/JP2019/012482 2019-03-25 2019-03-25 Système de traitement WO2020194444A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2019/012482 WO2020194444A1 (fr) 2019-03-25 2019-03-25 Système de traitement

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2019/012482 WO2020194444A1 (fr) 2019-03-25 2019-03-25 Système de traitement

Publications (1)

Publication Number Publication Date
WO2020194444A1 true WO2020194444A1 (fr) 2020-10-01

Family

ID=72609292

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/012482 WO2020194444A1 (fr) 2019-03-25 2019-03-25 Système de traitement

Country Status (1)

Country Link
WO (1) WO2020194444A1 (fr)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS6127194A (ja) * 1984-07-16 1986-02-06 Mitsubishi Electric Corp Laser beam machine
JPS62109888U (fr) * 1985-12-27 1987-07-13
JPS63104796A (ja) * 1986-10-21 1988-05-10 Mitsubishi Electric Corp Machining head for laser beam machine
JPH01157788A (ja) * 1987-12-16 1989-06-21 Mitsubishi Electric Corp Machining head collision preventing device for laser machining apparatus
JP2013119098A (ja) * 2011-12-07 2013-06-17 Hitachi Ltd Laser cladding apparatus and laser cladding method

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7374390B1 (ja) Defect estimation device, numerical control device, additive manufacturing device, and defect estimation method

Similar Documents

Publication Publication Date Title
JP7226339B2 (ja) Processing apparatus, processing method, computer program, recording medium, and control device
CN112867579B (zh) Additive manufacturing apparatus and additive manufacturing method
JP7380769B2 (ja) Processing apparatus and processing method, machining method, and modeling apparatus and modeling method
EP3560635A1 (fr) Additive manufacturing system with movable sensors
WO2022163148A1 (fr) Processing system
WO2022018853A1 (fr) Processing system
WO2020194444A1 (fr) Processing system
CN114450584A (zh) Laminated modeling system
JP2023080120A (ja) Machining system, machining method, computer program, recording medium, and control device
WO2021149683A1 (fr) Processing system
JP7468614B2 (ja) Machining system
JP7400803B2 (ja) Machining system and machining method
WO2022157914A1 (fr) Processing method
WO2021019644A1 (fr) Processing system, processing method, control device, computer program, recording medium, and processing apparatus
JP2022115799A (ja) Machining system
CN113939394B (zh) Modeling unit
WO2020194448A1 (fr) Building system
WO2020194445A1 (fr) Processing system
US20230158607A1 (en) Processing system
EP4177001A1 (fr) Processing system and optical device
JP7201064B2 (ja) Processing apparatus and processing method
TW201946712A (zh) Modeling system and modeling method
WO2022157931A1 (fr) Forming system

Legal Events

121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 19920700; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 19920700; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: JP)