WO2023032662A1 - Information processing device and machine tool - Google Patents
Information processing device and machine tool
- Publication number
- WO2023032662A1 (PCT/JP2022/031001)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- captured image
- unit
- display
- camera
- image
- Prior art date
Classifications
- B23Q11/00 — Accessories fitted to machine tools for keeping tools or parts of the machine in good working condition or for cooling work; safety devices specially combined with, arranged in, or specially adapted for use in connection with, machine tools
- B23Q11/0042 — Devices for removing chips
- B23Q11/10 — Arrangements for cooling or lubricating tools or work
- B23Q17/00 — Arrangements for observing, indicating or measuring on machine tools
- B23Q17/24 — Arrangements for observing, indicating or measuring on machine tools using optics or electromagnetic waves
- B23Q17/2409 — Arrangements for indirect observation of the working space using image recording means, e.g. a camera
- B23Q17/248, B23Q17/249 — Arrangements using special electromagnetic means or methods, in particular image analysis (e.g. for radar, infrared or array camera images)
- G06T5/00, G06T5/80 — Image enhancement or restoration; geometric correction
- G06T7/00, G06T7/0002, G06T7/0004, G06T7/001 — Image analysis; inspection of images (flaw detection); industrial image inspection using an image reference approach
- G06T7/10, G06T7/11 — Segmentation; edge detection; region-based segmentation
- G06T7/30, G06T7/33 — Determination of transform parameters for the alignment of images (image registration) using feature-based methods
- G06T2200/24 — Image data processing involving graphical user interfaces [GUIs]
- G06T2207/20021 — Dividing an image into blocks, subimages or windows
- G06T2207/20081 — Training; learning
- G06T2207/20084 — Artificial neural networks [ANN]
- G06T2207/20092 — Interactive image processing based on input by the user
- G06T2207/30108, G06T2207/30164 — Industrial image inspection; workpiece; machine component
- H04N7/18, H04N7/183 — Closed-circuit television [CCTV] systems for receiving images from a single remote source
Definitions
- the present invention relates to an information processing device that processes captured images used in machine tools.
- Chips are generated when workpieces are processed by machine tools. If a large amount of chips accumulate in the processing chamber, it becomes difficult to continue processing. For this reason, conventionally, the operator has taken measures such as periodically stopping the operation of the machine tool and removing the chips using an air blower or the like. However, such manual chip removal reduces the operating efficiency of the machine tool.
- An aspect of the present invention is an information processing device that processes a captured image used in a machine tool that images an imaging area with a camera and injects fluid toward a target based on the captured image.
- This information processing device includes a reference information storage unit that stores in advance a line along the contour of a specific area of the machine tool in a display image, a correction unit that rotates the captured image in the rotation direction about the optical axis of the camera so that the contour of the specific area shown in the captured image is aligned with the line, and a display control unit that causes a display unit to display the captured image corrected by the correction unit.
- Another aspect of the present invention is an information processing device that processes a captured image used in a machine tool that captures an image of an imaging area with a camera and ejects fluid toward a target based on the information of the captured image.
- This information processing device includes a display control unit that displays a line along the contour of a specific area of the machine tool at a specific position on a display unit, and a correction unit that, while the captured image is displayed superimposed on the line on the display unit, rotates the captured image so that the contour of the specific area of the machine tool shown in the image is aligned with the line.
- The display control unit then causes the display unit to display the captured image corrected by the correction unit.
- Yet another aspect of the present invention is a machine tool.
- This machine tool includes a camera for imaging an imaging area set in a machining chamber, an information processing device for processing the captured image, and an injection unit for injecting fluid toward a target based on the processed captured image.
- The information processing device includes a reference information storage unit that stores in advance a line along the contour of the specific area set in the machining chamber as it appears in the display image, a correction unit that rotates the captured image in the rotation direction about the optical axis of the camera so that the contour of the specific area shown in the captured image is aligned with the line, and a display control unit that displays the captured image corrected by the correction unit on the display unit.
- According to the present invention, even if there are restrictions on the angle adjustment of the camera in the machine tool, the imaging area displayed on the screen can be matched with the imaging area recognized by the software.
- FIG. 1 is a perspective view showing the appearance of a machine tool according to the embodiment. FIG. 2 is a hardware configuration diagram of the machine tool. FIG. 3 is a perspective view showing the configuration inside the machining chamber. FIG. 4 is a diagram schematically showing the structure of the camera.
- FIG. 5 is a functional block diagram of the information processing device. FIG. 6 is a block diagram showing the configuration of the recognition unit.
- FIGS. 7 to 10 are diagrams showing the method of correcting the screen display. FIG. 11 is a diagram showing an example of a screen operated by the operator.
- FIG. 12 is a flowchart showing the flow of the correction process. FIG. 13 is a flowchart schematically showing the flow of cleaning control. FIG. 14 is a diagram showing the configuration of an information processing system according to a modification. FIG. 15 is a diagram showing an image processing method according to a modification.
- FIG. 1 is a perspective view showing the appearance of the machine tool according to the embodiment.
- the machine tool 1 is configured as a multitasking machine that processes a workpiece into a desired shape while appropriately exchanging tools.
- a machine tool 1 is provided with a processing chamber 2 inside a device housing.
- the processing chamber 2 is provided with a processing device for processing a work.
- An operation panel 4 for operating the processing apparatus is provided on the front surface of the apparatus housing.
- FIG. 2 is a hardware configuration diagram of the machine tool 1.
- the machine tool 1 includes an information processing device 100 , a machining control device 102 , a machining device 104 , a tool changing section 106 , a tool storage section 108 and an imaging section 110 .
- the machining control device 102 functions as a numerical control unit and outputs control signals to the machining device 104 according to a machining program (NC program).
- the processing device 104 processes a workpiece by moving a tool spindle (not shown; hereinafter simply referred to as “spindle”) according to instructions from the machining control device 102 .
- the processing device 104 includes a mechanism for driving the spindle, a liquid reservoir 112 for storing coolant, and a liquid injection section 114 for injecting coolant.
- the coolant is used as cutting oil for removing heat and lubricating the tools and workpieces during machining, and is also used as a cleaning liquid for removing chips scattered in the machining chamber 2 .
- the liquid injection section 114 includes a pump that draws up coolant from the liquid reservoir 112, a nozzle that injects the coolant, and an actuator that drives the nozzle.
- the information processing device 100 includes an operation panel 4, and outputs control commands to the processing control device 102 based on operator input.
- the information processing apparatus 100 also controls the screen displayed on the monitor of the operation panel 4 according to the operator's operation input.
- the tool storage section 108 stores tools.
- the tool changer 106 corresponds to a so-called ATC (Automatic Tool Changer), takes out a tool from the tool storage part 108 according to a change instruction from the machining control device 102, and replaces the tool on the spindle with the taken out tool.
- the imaging unit 110 is, for example, a camera equipped with an imaging device such as a CCD or CMOS, and images an imaging area set in the processing chamber 2 .
- As the imaging area, an area in which chips generated by machining the workpiece are expected to exist is set in advance.
- the angle of view of the camera is set so that the distribution and accumulation of chips can be grasped over a wide range in the processing chamber 2 .
- the imaging unit 110 outputs the captured image to the information processing device 100 .
- FIG. 3 is a perspective view showing the configuration inside the processing chamber 2.
- FIG. 3(A) shows a state seen obliquely from above
- FIG. 3(B) shows a state seen obliquely from below.
- the processing chamber 2 is surrounded by four side surfaces, and a main shaft 10 is provided on one side surface so as to be vertically and horizontally movable.
- the main shaft 10 has a horizontal rotating shaft, and a tool T is coaxially attached to its tip.
- a side surface facing the main shaft 10 in the axial direction has a revolving door 12 .
- a support plate 14 extends horizontally from the revolving door 12 .
- the revolving door 12 is a door that can rotate around a vertical axis.
- a table 16 is provided below the support plate 14 .
- a pallet 18 is detachably attached to the table 16 , and a work is placed and fixed on the pallet 18 .
- the table 16 is movable in the axial direction of the main shaft 10 and rotatable in the horizontal plane. By rotating the table 16, the work on the pallet 18 can be rotated.
- the workpiece approaches or separates from the tool T by linearly driving the table 16 . That is, by controlling the rotation and movement of the table 16 and the movement of the spindle 10, the workpiece can be processed into a desired shape.
- the support plate 14 is fitted with the pallet 18 at the position where the table 16 is farthest from the spindle 10 .
- By rotating the revolving door 12 in this state, the support plate 14 separates the pallet 18 from the table 16 and rotates together with the pallet 18.
- In this way, the pallet 18 carrying the machined workpiece can be carried out of the machining chamber 2, and a pallet 18 to which the next workpiece to be machined is fixed can be carried into the machining chamber 2.
- a chip conveyor 20 is provided below the table 16 and the spindle 10 for conveying chips to the outside of the processing chamber 2 .
- the table 16 moves above the chip conveyor 20 .
- A chute 22 is provided below the table 16.
- The chute 22 guides chips that flow down from above during cleaning onto the chip conveyor 20.
- The bottom surfaces located on both sides of the table 16 are slopes 24, which are inclined downward toward the chute 22 so that chips scattered during machining flow easily to the chute 22.
- the upper part of the processing chamber 2 is shielded by a cover 26 (ceiling).
- a cover 26 is provided with a plurality of nozzles 28 for supplying coolant.
- the nozzle 28 is configured to be replaceable together with the cover 26 .
- the nozzle 28 constitutes a liquid injection section 114 and is connected to the liquid storage section 112 via piping, valves, pumps and the like (not shown) (see FIG. 2).
- the nozzle 28 is configured to be three-dimensionally rotatable. By rotating the nozzle 28, the injection direction of the coolant can be controlled. By specifying the direction of the nozzle 28 and driving the pump, the coolant can be injected toward the target in the processing chamber 2 . Chips generated by machining the workpiece are washed away by the coolant and carried out of the machining chamber 2 by the chip conveyor 20 .
- Although two nozzles 28 are installed in this embodiment, the number of nozzles can be set as appropriate.
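As a rough illustration of the geometry involved in pointing a ceiling-mounted nozzle 28 at a target position in the machining chamber, the following Python sketch converts a target coordinate into pan/tilt angles. It is not part of the patent; the coordinate frame, function names, and the assumption of a straight-line jet are all illustrative.

```python
import math

def nozzle_angles(nozzle_xyz: tuple[float, float, float],
                  target_xyz: tuple[float, float, float]) -> tuple[float, float]:
    """Pan/tilt angles (degrees) that point a ceiling nozzle at a floor target.

    Purely geometric; ignores the ballistic arc of the coolant jet.
    """
    dx = target_xyz[0] - nozzle_xyz[0]
    dy = target_xyz[1] - nozzle_xyz[1]
    dz = target_xyz[2] - nozzle_xyz[2]          # negative when the target is below the nozzle
    pan = math.degrees(math.atan2(dy, dx))
    tilt = math.degrees(math.atan2(dz, math.hypot(dx, dy)))
    return pan, tilt
```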
- a plurality of cameras 30 for capturing images of the inside of the processing chamber 2 from above are also installed in the upper part of the processing chamber 2 . That is, camera 30 is fixed to machine tool 1 . In this embodiment, two cameras 30 are attached to the side wall slightly below the cover 26 in the processing chamber 2 . In this embodiment, the camera 30 cannot be integrated with the cover 26 due to wiring reasons.
- the camera 30 constitutes an image capturing unit 110, and captures an image of the machining status of the workpiece by the tool T, and an image of chips generated by the machining (see FIG. 2). Since two cameras 30 are provided, an area that cannot be captured by one camera 30 can be captured by the other camera 30 .
- the imaging unit 110 outputs the captured image to the information processing device 100 .
- FIG. 4 is a diagram schematically showing the structure of the camera 30.
- the camera 30 of this embodiment has a structure in which rotation about its optical axis L is restricted. That is, the camera 30 is rotatable around the yaw axis L1 orthogonal to the optical axis L and around the pitch axis L2 orthogonal to the optical axis L and the yaw axis L1, but cannot be rotated around the optical axis L. . Therefore, if the coordinates of the imaging area displayed on the screen deviate from the coordinates of the imaging area recognized by the software due to an installation error of the camera 30 or the like, manual adjustment becomes difficult. Therefore, the information processing apparatus 100 executes correction processing for eliminating the deviation of the coordinates when the operator adjusts the angle of the camera.
- FIG. 5 is a functional block diagram of the information processing device 100.
- Each component of the information processing device 100 is implemented by hardware, including arithmetic units such as a CPU (Central Processing Unit) and various computer processors, storage devices such as memory and storage, and the wired or wireless communication lines connecting them, together with software that is stored in the storage devices and supplies processing instructions to the arithmetic units.
- a computer program may consist of a device driver, an operating system, various application programs located in their higher layers, and a library that provides common functions to these programs.
- Each block described below represents a functional block rather than a hardware configuration.
- the information processing device 100 includes a user interface processing unit 130 , a data processing unit 132 and a data storage unit 134 .
- the user interface processing unit 130 receives an operation input from an operator, and is in charge of user interface processing such as image display.
- the data processing unit 132 executes various processes based on data acquired by the user interface processing unit 130 and data stored in the data storage unit 134 .
- Data processing unit 132 also functions as an interface for user interface processing unit 130 and data storage unit 134 .
- the data storage unit 134 stores various programs and setting data.
- the user interface processing section 130 includes an input section 140 and an output section 142 .
- the input unit 140 receives an operation input from an operator via the touch panel of the monitor on the operation panel 4 or the like.
- the output unit 142 includes a display unit 144 that displays an image or the like on the screen of the monitor.
- the output unit 142 provides various types of information to the operator through its display.
- a display unit 144 displays an image captured by the imaging unit 110 .
- Data processing unit 132 includes acquisition unit 150 , grid setting unit 152 , detection unit 154 , recognition unit 156 , correction unit 158 , display control unit 160 and ejection control unit 162 .
- Acquisition unit 150 acquires an image captured by imaging unit 110 .
- The grid setting unit 152 divides (partitions) the captured image into a plurality of grids in order to determine (analyze) the presence of a predetermined physical quantity (chips) in the imaging area.
- the grid has a specific shape (square in this embodiment, but any geometric shape is acceptable) (details will be described later).
- an area divided by a plurality of grids in a captured image will also be referred to as a "grid area”.
- an image composed of a plurality of grids is also called a "grid image”.
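The patent does not give an implementation, but the grid division described above can be pictured with a short Python sketch that slices a captured frame into the square cells ("grid areas") used for per-cell chip determination. The 28 × 21 cell count is taken from the embodiment described later; everything else (names, array layout) is an assumption.

```python
import numpy as np

def split_into_grid(image: np.ndarray, cols: int = 28, rows: int = 21):
    """Divide a captured frame (H x W x C) into rows x cols rectangular cells.

    Returns a dict mapping (row, col) -> sub-image, standing in for the
    "grid areas" analysed per cell for chip accumulation.
    """
    h, w = image.shape[:2]
    cell_h, cell_w = h // rows, w // cols
    cells = {}
    for r in range(rows):
        for c in range(cols):
            y0, x0 = r * cell_h, c * cell_w
            cells[(r, c)] = image[y0:y0 + cell_h, x0:x0 + cell_w]
    return cells
```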
- the detection unit 154 detects an operator's operation input via the input unit 140 . Although the details will be described later, the operator can grasp the accumulation state of chips in the processing chamber 2 by referring to the captured image displayed on the display unit 144 . The operator can specify the cleaning range (spray range) of the coolant by specifying the area of the captured image via the touch panel. The detection unit 154 detects the instruction input by the operator as the indicated position in the captured image. The detection unit 154 may detect the pointing position based on the grid area created by the grid setting unit 152 .
- The recognition unit 156 automatically recognizes chips based on the grid areas set in the captured image, determines whether chips exist in each grid area, and determines the amount of chips present. These determinations are also referred to as "chip determination". If the recognition unit 156 determines that chips are present in a grid area, it recognizes the position on the captured image corresponding to that grid area as a chip accumulation position. When a chip accumulation position is recognized, an automatic detection signal is output to the injection control unit 162.
- the automatic detection signal includes at least information regarding the predetermined location where debris has been recognized in the captured image.
- the correction unit 158 executes correction processing for matching the coordinates of the imaging area by the camera 30 with the coordinates used for the coolant control command based on the operator's operation. Since this correction process is based on the premise that there is an installation error in the camera 30 , it is performed when the camera 30 is installed in the processing chamber 2 . The details will be described later.
- the display control unit 160 causes the display unit 144 to display the image captured by the camera 30 .
- the display control unit 160 displays an operation screen for the correction and displays the captured image after correction.
- the grid setting unit 152 divides the captured image into grids after correction.
- the display control unit 160 causes the display unit 144 to display the corrected captured image and the grid so as to overlap each other.
- the recognizer 156 analyzes the presence of targets within the grid. An example of a display screen related to this correction will be described in detail later.
- the injection control unit 162 outputs a coolant injection command toward the target position to the machining control device 102 when cleaning control is performed.
- This injection command includes information specifying the position to inject the coolant (information specifying the injection route, etc.).
- the machining control device 102 receives this injection command, drives the liquid injection unit 114, and controls injection of the coolant.
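The exact format of the injection command is not specified in the patent. The hypothetical Python structure below merely illustrates the idea stated above: the command carries the positions to be washed and an injection route. All names and fields are assumptions.

```python
from dataclasses import dataclass

@dataclass
class InjectionCommand:
    """Illustrative payload sent to the machining controller: which grid cells
    to wash and in what order (the injection route)."""
    route: list[tuple[int, int]]   # ordered (col, row) targets
    duration_s: float              # spray time per target

def command_for(targets: list[tuple[int, int]], duration_s: float = 2.0) -> InjectionCommand:
    # a naive route: visit targets left-to-right, top-to-bottom
    return InjectionCommand(route=sorted(targets, key=lambda t: (t[1], t[0])),
                            duration_s=duration_s)
```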
- This cleaning control is to wash away chips in the processing chamber 2 with coolant (that is, to wash the inside of the processing chamber 2), and in this embodiment, an automatic cleaning mode or a manual cleaning mode can be selected.
- In the automatic cleaning mode, the injection control unit 162 automatically sets the coolant injection position based on the automatic detection signal output by the recognition unit 156 and outputs an injection command.
- In the manual cleaning mode, the position for injecting the coolant is set based on the operator's area specification detected by the detection unit 154, and an injection command is output.
- the data storage unit 134 stores various programs including a correction processing program for executing the correction processing described above and a cleaning control program for executing cleaning control, as well as various data necessary for these processing.
- Data storage unit 134 includes correction information storage unit 146 and reference information storage unit 148 .
- The correction information storage unit 146 stores the captured image acquired by the acquisition unit 150 during the correction process, the grid areas (grid image) created by the grid setting unit 152, the position information of locations where chips have been recognized and information on the amount of chips, the position information detected by the detection unit 154, and the like.
- the correction information storage unit 146 also stores correction information (calibration data) acquired in a correction process, which will be described later.
- the reference information storage unit 148 stores an image for guiding the operator's operation input in the correction process.
- the data storage unit 134 also functions as a work area when arithmetic processing is performed.
- FIG. 6 is a block diagram showing the configuration of the recognition unit 156. As shown in FIG.
- the recognition unit 156 includes a model learning unit 41 , a calculation unit 43 and a determination unit 44 .
- the model learning unit 41 creates a learning model.
- The learning model is a model that, for a given grid area, can calculate and output the probability that the area corresponds to each of predetermined items related to chips.
- This learning model can be created by, for example, using pairs of input data and output data as teacher data and inputting them into a CNN (convolutional neural network) in advance for learning.
- the input data can be the grid area
- the output data can be information about the presence and amount of chips in the grid area.
- the data storage unit 134 includes a model storage unit 42.
- the model storage unit 42 stores learning models for automatically determining the presence or absence of chips.
- the learning model is read into the calculator 43 as needed.
- The calculation unit 43 calculates, on a grid-by-grid basis and based on the captured image, the probability that each grid area corresponds to predetermined chip-related items. Specifically, using the learning model trained by the model learning unit 41, it calculates the probabilities that a grid area corresponds to each of three items: many chips (class 2), few chips (class 1), or no chips (class 0). These classes indicate the degree of presence of the material (such as chips) recognized in the imaging area.
- the determination unit 44 determines to which of classes 0 to 2 the chips in the grid area belong, based on the probability calculated by the calculation unit 43 for the input grid area.
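As a hedged illustration of the CNN-based chip determination described above (class 0: no chips, class 1: few, class 2: many), the sketch below uses PyTorch. The architecture, layer sizes, and function names are assumptions; the patent only states that a convolutional neural network is trained in advance on pairs of grid-area images and chip-amount labels.

```python
import torch
import torch.nn as nn

class ChipClassifier(nn.Module):
    """Toy 3-class CNN (class 0: no chips, 1: few, 2: many) for one grid cell."""
    def __init__(self, num_classes: int = 3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        z = self.features(x).flatten(1)
        return self.head(z)          # raw logits per class

def chip_class(model: nn.Module, cell: torch.Tensor) -> int:
    """Return the most probable chip class for a single cell tensor (C x H x W)."""
    with torch.no_grad():
        probs = torch.softmax(model(cell.unsqueeze(0)), dim=1)
    return int(probs.argmax(dim=1))
```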
- When the determination unit 44 determines that chips are present in a grid area (that is, determines class 2 or class 1), an automatic detection signal including position information on the captured image corresponding to the position of that grid area in the grid image is output to the display control unit 160 and the injection control unit 162.
- the recognizing unit 156 automatically recognizes chips based on the captured image captured by the imaging unit 110 during or after the processing of the workpiece, and can perform automatic cleaning by injecting coolant.
- the automatic cleaning may be performed periodically, or may be performed by giving some instructions such as instructions from the operator.
- the camera 30 of the present embodiment is restricted in rotation around the optical axis, so there is a limit to its adjustment. Therefore, if left as it is, there is a possibility of causing a setting error in the target of the coolant injection control. Therefore, in the present embodiment, the correction unit 158 rotates the image captured by the camera 30 to correct the target setting error. The details will be described below.
- FIGS. 7 to 10 are diagrams showing the method of correcting the screen display.
- Since a plurality of cameras 30 are provided in this embodiment, each camera 30 is corrected; for convenience, the correction of one of them is described as an example. The other cameras 30 are handled in the same way, so their description is omitted.
- the camera 30 is installed in the processing chamber 2 so that the predetermined imaging area is included in the angle of view.
- The imaging area includes areas where chips are expected to scatter and accumulate.
- the captured image P0 shown in FIG. 7 is displayed on the screen (reference screen 170) of the display unit 144 when the camera 30 is installed in the processing chamber 2.
- When the operator's operation input shifts the device to a maintenance mode (described later) for correcting the screen display, the display control unit 160 displays the baseline BL (thick line) and the auxiliary line AL (dash-dotted line) superimposed on the captured image P0.
- the baseline BL is a line set to follow the outline of the specific area SA in the processing chamber 2 when the screen is in the aligned state, and is stored in the reference information storage unit 148 in advance.
- the specific area SA is an area surrounded by a boundary line formed by the slope 24 and the edge of the pallet 18 .
- the auxiliary line AL is a cross-shaped auxiliary line passing through the center C of the captured image P0 in this embodiment.
- the processing chamber 2 is provided with a marker M on which the center C overlaps when the screens are aligned. The marker M is attached to the center setting position preset in the imaging area.
- Here, the center C of the captured image P0 is shifted from the marker M, and the outline of the specific area SA is shifted from the baseline BL, so the screen is not in the aligned state. The operator therefore first manually adjusts the angle of the camera 30 within the possible range to bring the screen closer to alignment.
- Specifically, the operator rotates the camera 30 about the yaw axis L1 and the pitch axis L2 so that the center C of the captured image P0 overlaps the marker M, as shown in FIG. 8(A). Up to this point, the adjustment can easily be performed manually. In the illustrated example, however, the outline of the specific area SA is still shifted from the baseline BL at this stage. If the camera 30 could be rotated about the optical axis L, this deviation could be eliminated, but in this embodiment the rotation of the camera 30 about the optical axis L is restricted, as described above.
- Therefore, the correction unit 158 rotates the captured image P0 around the center C (that is, in the rotation direction about the optical axis L of the camera 30) so that the contour of the specific area SA is aligned with the baseline BL.
- As a result, the screen is brought into the aligned state, and the target setting error in the coolant injection control is corrected.
- However, because the captured image P0 is now tilted with respect to the reference screen 170, the screen display may look unnatural.
- Moreover, the grid area GA is set from the beginning based on an ideal state in which there is no installation error of the camera 30.
- In this embodiment, the aspect ratio (screen ratio) of the captured image P0 (see the dotted line) is 4:3, and the number of grids is 28 × 21. For this reason, if the captured image P0 is rotated about the optical axis, grid cells may be in excess or missing in the peripheral portion of the captured image P0. If each grid cell does not correspond to image data, the degree of chip accumulation cannot be determined accurately.
- the display control unit 160 extracts an image of a specific shape by cutting out the outer peripheral portion from the captured image P0 after rotating the angle of view.
- the image extracted at this time will be referred to as "extracted image P1".
- the "specific shape” is a rectangular shape having the same aspect ratio (4:3) as the captured image P0.
- The "cutting" referred to here may be performed by erasing the outer peripheral portion of the captured image P0 and keeping the image portion of the judgment area JA as the data of the extracted image P1, or by extracting the judgment area JA as the data of the extracted image P1 without erasing the outer peripheral portion.
- the display control unit 160 cuts out the outer peripheral portion from the captured image P0 in units of at least one of rows and columns of the grid.
- In the illustrated example, the rotation angle θ of the captured image P0 is set to 3 degrees, and the outer peripheral portion in which the grid cells no longer fully overlap the image is clipped. As a result, the number of grids becomes 26 × 19.
- If the extracted image P1 were displayed as it is, it would be smaller than the reference screen 170, and the blank space formed around it could look unnatural. Therefore, as shown in FIG. 10(A), the extracted image P1 is enlarged to a preset display size and displayed on the reference screen 170. Here, the image is enlarged to the same scale as the original captured image P0. As a result, as shown in FIG. 10(B), the extracted image P1 has the same size as the reference screen 170 and is displayed so that the sides of its rectangle lie at the same positions as the sides of the rectangle of the original captured image P0.
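A minimal sketch of the correction pipeline described above, assuming OpenCV: the captured frame is rotated about its centre, the outer grid rows/columns that may no longer be fully covered are cut off (a simplified stand-in for the judgment area JA), and the result is scaled back to the display size. The margin of one grid row/column per side mirrors the 28 × 21 → 26 × 19 example; the parameter names are illustrative.

```python
import cv2
import numpy as np

def correct_frame(frame: np.ndarray, angle_deg: float,
                  grid: tuple[int, int] = (28, 21),
                  margin_cells: int = 1) -> np.ndarray:
    """Rotate the captured frame about its centre, cut the outer grid rows and
    columns that may lack image data after rotation, and scale back to full size."""
    h, w = frame.shape[:2]
    m = cv2.getRotationMatrix2D((w / 2, h / 2), angle_deg, 1.0)
    rotated = cv2.warpAffine(frame, m, (w, h))
    cell_w, cell_h = w // grid[0], h // grid[1]
    x0, y0 = margin_cells * cell_w, margin_cells * cell_h
    judged = rotated[y0:h - y0, x0:w - x0]       # analogue of the judgment area JA
    return cv2.resize(judged, (w, h))            # enlarge to the preset display size
```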
- The recognition and determination of substances (chips) by the recognition unit 156 are performed in the grid area set as described above (number of grids: 26 × 19).
- In other words, the image displayed on the screen by the display control unit 160 may be a cut-out portion of the captured image P0.
- With the above correction, the coordinates of the imaging area displayed on the reference screen 170 can be matched with the coordinates of the imaging area recognized by the software, and the coolant can be correctly jetted toward the designated target position.
- That is, the position of the grid set by the grid setting unit 152 matches the position of the grid that the operator recognizes from the image.
- The position indicated by the operator via the touch panel therefore corresponds to one of the grid cells, and the area of the image covered by that grid cell is set as the coolant injection target (target position).
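A sketch of how a touch position on the displayed image could be mapped to the grid cell that becomes the coolant injection target, assuming the 26 × 19 grid of the corrected display; the mapping and names are illustrative, not taken from the patent.

```python
def touched_cell(x: int, y: int, frame_size: tuple[int, int],
                 grid: tuple[int, int] = (26, 19)) -> tuple[int, int]:
    """Map a touch coordinate on the displayed image to the (col, row) grid cell
    that becomes the coolant injection target."""
    w, h = frame_size
    cols, rows = grid
    col = min(int(x / (w / cols)), cols - 1)
    row = min(int(y / (h / rows)), rows - 1)
    return col, row
```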
- FIG. 11 is a diagram showing an example of a screen operated by an operator.
- a screen 200 shown in FIG. 11A is displayed on the monitor of the operation panel 4 .
- the screen 200 includes the above-described reference screen 170 and operation screen 202 side by side.
- the operation screen 202 is provided with an automatic cleaning button 210 , a manual cleaning button 212 , a cleaning path adjustment button 214 and a detail setting button 216 .
- the automatic cleaning button 210 is selected when executing the automatic cleaning mode.
- The manual cleaning button 212 is selected when executing the manual cleaning mode.
- the cleaning path adjustment button 214 is selected when adjusting the cleaning path with coolant.
- When the detail setting button 216 is selected, a maintenance button 220, an auxiliary line display button 222, and an image rotation operation section 224 are displayed as shown.
- When the maintenance button 220 is selected by the operator, the maintenance mode is entered and the baseline BL is displayed superimposed on the captured image P0. When the auxiliary line display button 222 is turned on, the auxiliary line AL is also displayed.
- While looking at the reference screen 170, the operator adjusts the yaw angle and pitch angle of the camera 30 so that the center C of the captured image P0 overlaps the marker M, as described above. After that, the rotation angle θ of the captured image P0 can be adjusted by touching the + button or - button of the image rotation operation section 224.
- The correction unit 158 rotates the captured image P0 clockwise by 0.1 degrees each time the + button is touched and counterclockwise by 0.1 degrees each time the - button is touched. As shown in FIG. 11(B), when the auxiliary line display button 222 is turned off, the auxiliary line AL is hidden.
- FIG. 12 is a flowchart showing the flow of correction processing. This process is executed when the operator selects the maintenance button 220 .
- the display control unit 160 displays the captured image P0 on the reference screen 170 as a maintenance screen (S10), and displays the baseline BL so as to overlap it (S12).
- If the auxiliary line display button 222 is on (Y in S14), the display control unit 160 displays the auxiliary line AL (S16); if it is off (N in S14), the auxiliary line AL is hidden (S18).
- When a rotation operation is input via the image rotation operation section 224 (S20), the display control unit 160 rotates the captured image P0 accordingly (S22).
- When the confirmation button is selected (Y in S24), the correction unit 158 calculates the judgment area JA (S26). That is, according to the inclination of the captured image P0, an area whose image is reliably contained within the grid area GA is set as the judgment area JA.
- This determination area JA functions as an "AI inference area" in which chip determination is performed on a grid-by-grid basis.
- the display control unit 160 extracts the extracted image P1 by cutting out the peripheral portion from the captured image P0 based on the judgment area JA (S28), adjusts (enlarges) the extracted image P1, and displays it on the reference screen 170. (S30).
- the correction unit 158 stores this series of correction information as calibration data in the correction information storage unit 146 (S32).
- This correction information includes the set angle (rotational angle ⁇ ) of the captured image P0, the setting of the determination area JA, the enlargement ratio (set magnification) of the extracted image P1, and the like. If the confirmation button is not selected (N of S24), the processing of S26-S32 is skipped.
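The stored correction information could be represented by a small record such as the following sketch; the field names are assumptions, but the contents mirror the set angle, judgment area, and magnification listed above.

```python
from dataclasses import dataclass

@dataclass
class CalibrationData:
    """Correction information persisted after the maintenance procedure."""
    rotation_deg: float                        # set angle of the captured image (e.g. 3.0)
    judgment_area: tuple[int, int, int, int]   # x0, y0, x1, y1 of the AI inference area
    magnification: float                       # enlargement ratio of the extracted image
```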
- When a predetermined maintenance end condition is satisfied, such as the operator selecting another button (Y in S34), the display control unit 160 terminates the display of the maintenance screen (S36). If the maintenance end condition is not satisfied (N in S34), the process returns to S14.
- FIG. 13 is a flow chart schematically showing the flow of cleaning control.
- When cleaning control is started in either the automatic cleaning mode or the manual cleaning mode, the acquisition unit 150 acquires the captured image P0 (S40).
- the correction unit 158 reads the correction information (calibration data) stored in the correction information storage unit 146 (S42), and internally reflects the correction process described above on the captured image P0.
- The correction unit 158 rotates the captured image P0 by the set angle (S44), sets the judgment area (S46), and extracts the extracted image (S48). The extracted image is then enlarged by the set magnification (S50) and displayed on the screen as the captured image (S52). When the operator gives an instruction to display the chip accumulation state (Y in S54), the display control unit 160 displays the grid image superimposed on the captured image (S56) and further displays the chip accumulation state (S58). The chip accumulation state is displayed by color coding or the like according to the class determined by the recognition unit 156.
- When the operator designates an injection position (S60), the detection unit 154 detects this.
- The injection control unit 162 then sets a coolant injection route based on the injection position (S62) and outputs an injection command to the machining control device 102 so that coolant is injected along that route (S64).
- When a predetermined cleaning-mode end condition is satisfied (Y in S66), the display control unit 160 terminates the display of the cleaning operation screen (S68). If the cleaning-mode end condition is not satisfied (N in S66), the process returns to S40.
- the machine tool has been described above based on the embodiment.
- In this embodiment, when there is an installation error in mounting the camera 30 in the machining chamber 2, the operator can finely adjust the angle (yaw angle and pitch angle) of the camera 30 while checking the captured image P0.
- Although the camera 30 cannot, due to its structure, adjust its rotation (roll angle) about the optical axis, this can be corrected by rotating the captured image P0. That is, according to this embodiment, even if there are restrictions on the angle adjustment of the camera 30 in the machine tool 1, the imaging area displayed on the screen can be matched with the imaging area recognized by the software. Therefore, even when the operator instructs chip washing with coolant based on the captured image displayed on the screen, the instructed washing position and the coolant injection position coincide, and the accuracy of the cleaning control can be kept high.
- In particular, the correction unit 158 performs correction by rotating the captured image in the rotation direction about the optical axis of the camera so that the outline of the specific area of the machine tool shown in the captured image is aligned with the preset line.
- the grid setting unit 152 divides the grid based on the corrected captured image, and the display control unit 160 displays the corrected captured image and the grid in an overlapping manner on the display unit. This makes it possible to match the imaging area displayed on the screen (grid display) with the imaging area recognized by the software. Since the position of the grid set in the captured image and the position of the grid recognizable by the operator from the image match, it is possible to accurately jet the fluid toward the target within the grid.
- In addition, the extracted image P1 is obtained by cutting out the peripheral portion of the captured image P0 tilted by the rotation and is enlarged to a preset display size for display. The captured image displayed on the screen therefore keeps a rectangular frame that matches the screen. As a result, the presence or absence of correction (that is, individual differences between installed cameras) does not significantly change the appearance of the captured image, and the corrected image does not look unnatural to the operator.
- In the above embodiment, the machine tool 1 was described as a multi-tasking machine, but it may instead be a turning center or a machining center. It may also be an additive manufacturing machine that processes a material (for example, metal powder) while melting it with a laser. In that case, the material scattered during processing becomes the "physical quantity" recognized by the recognition unit.
- In the above embodiment, the coolant was exemplified as the fluid injected for chip removal. However, a liquid (cleaning liquid) other than coolant, or a gas such as air, may be used instead. In the latter case, a gas injection section for injecting the gas is provided instead of the liquid injection section.
- In the above embodiment, the rotation angle θ of the captured image P0 in the correction process was set to 3 degrees (FIG. 8(B)), but the rotation angle is not limited to this value and may be set as appropriate.
- In the above embodiment, the information processing device 100 is an internal computer of the machine tool 1 and is configured integrally with the operation panel 4, but it may be configured separately from the operation panel 4. In that case, the information processing device may use the monitor of the operation panel as a remote desktop to function as its display unit. Alternatively, the information processing device may be an external computer connected to the machine tool, such as a general laptop PC (Personal Computer) or a tablet computer.
- FIG. 14 is a diagram showing the configuration of an information processing system according to a modification.
- Components identical to those of the above embodiment are given the same reference numerals, and duplicate description is omitted.
- the information processing device is installed outside the machine tool. That is, the information processing system includes a machine tool 301 , a data processing device 310 and a data storage device 312 .
- the data processing device 310 functions as an "information processing device".
- the machine tool 301 includes a machining control device 102 , a machining device 104 , a tool changer 106 , a tool storage 108 , a data processor 330 , an operation panel 304 and an imaging unit 110 .
- Data processing unit 330 includes communication unit 332 , grid setting unit 152 , detection unit 154 , recognition unit 156 and injection control unit 162 .
- Communication unit 332 has a receiving unit and a transmitting unit, and is in charge of communication with external devices including data processing device 310 and data storage device 312 .
- the operation panel 304 includes a user interface processing section 130 and a data processing section 334 .
- the data processing unit 334 outputs a control command to the processing control device 102 based on the operation input by the operator. Further, the screen displayed on the display unit 144 is controlled according to the operation input by the operator.
- the data processing device 310 includes a communication section 320 , a correction section 158 and a display control section 160 .
- the communication unit 320 has a receiving unit and a transmitting unit and takes charge of communication with the machine tool 301 .
- Data storage device 312 includes correction information storage section 146 and reference information storage section 148 .
- the data storage device 312 is wired to the data processing device 310 in this modification, but may be wirelessly connected. In other variations, data storage device 312 may be incorporated as part of data processing device 310 .
- the functions of the information processing apparatus 100 in the above embodiment are realized by dividing them into the inside and the outside of the machine tool. With such a configuration, it is possible to obtain the same effects as those of the above-described embodiment.
- the size and shape of the grid may be configured to be changeable as needed. Also in this case, it is preferable to obtain an extracted image by cutting out the outer peripheral portion of the captured image in units of grids.
- the information processing apparatus may include a drive control section that controls the first rotation about the yaw axis and the second rotation about the pitch axis in connection with the angle control of the camera.
- the drive control unit controls the first rotation and the second rotation so that the optical axis of the camera is positioned at a center setting position (marker M) preset in the imaging area prior to correction by the correction unit. .
- In the above embodiment, a configuration in which the camera 30 cannot rotate about the optical axis at all was exemplified as the restricted state of rotation about the optical axis of the camera 30.
- However, the camera 30 may be rotatable about the optical axis by a predetermined amount, with the rotation angle being limited.
- The camera 30 itself may also have a rotatable structure whose rotation is restricted in order to avoid interference with surrounding structures.
- In the above embodiment, the configuration in which the center C of the captured image P0 coincides with the optical axis L of the camera 30 and the captured image P0 is rotated about the optical axis L was illustrated.
- However, the captured image may be rotated independently of the optical axis.
- In that case as well, the correction unit rotates the captured image so that the contour of the specific area is aligned with the baseline; specifically, it is preferable to rotate the image about an axis that is orthogonal to the plane of the captured image and passes through the center of the captured image.
- FIG. 15 is a diagram showing an image processing method according to a modification.
- In this modification, the captured image shown in the upper part of FIG. 15 can be corrected as shown in the lower part of FIG. 15. That is, the grid setting unit 152 performs a process of superimposing grid lines at preset positions on the captured image received from the imaging unit 110 (grid-line division processing).
- When such processing is performed, the overlap between the grid lines and the captured image P10 may be as shown in the upper diagram of FIG. 15: in the outermost rows and columns there are grid areas for which not all four grid-line sides lie over the captured image P10.
- In such a case, the display control unit 160 can perform processing that ignores grid areas whose four sides do not all overlap the captured image and displays only the grid areas whose four sides all overlap the captured image P10, scaled to the size of the reference screen (lower part of FIG. 15).
- Alternatively, the grid setting unit 152 may detect that there is a grid area whose four sides do not all overlap the captured image P10 (or in which all four sides do not appear) and adjust the mesh of grid lines by moving the entire set of grid lines up, down, left, or right so that the four sides overlap the captured image P10.
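A sketch of the cell-filtering idea in this modification: only grid cells whose four sides lie entirely over the captured image are kept for display and determination. The grid origin, cell size, and function name are illustrative assumptions.

```python
def fully_covered_cells(img_w: float, img_h: float,
                        origin: tuple[float, float], cell: float,
                        cols: int, rows: int) -> list[tuple[int, int]]:
    """Keep only the grid cells whose four sides overlap the captured image;
    partially covered border cells are ignored, as in the modification above."""
    ox, oy = origin
    keep = []
    for c in range(cols):
        for r in range(rows):
            x0, y0 = ox + c * cell, oy + r * cell
            x1, y1 = x0 + cell, y0 + cell
            if x0 >= 0 and y0 >= 0 and x1 <= img_w and y1 <= img_h:
                keep.append((c, r))
    return keep
```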
Abstract
Description
図1は、実施形態に係る工作機械の外観を表す斜視図である。
工作機械1は、工具を適宜交換しながらワークを所望の形状に加工する複合加工機として構成されている。工作機械1は、装置筐体の内部に加工室2が設けられる。加工室2には、ワークを加工する加工装置が設けられる。装置筐体の前面には、加工装置を操作するための操作盤4が設けられる。
工作機械1は、情報処理装置100、加工制御装置102、加工装置104、工具交換部106、工具格納部108および撮像部110を含む。加工制御装置102は、数値制御部として機能し、加工プログラム(NCプログラム)にしたがって加工装置104に制御信号を出力する。加工装置104は、加工制御装置102からの指示にしたがって工具主軸(図示略:以下、単に「主軸」という)を動かしてワークを加工する。
図3(A)に示すように、加工室2は、四つの側面に囲まれており、その一側面において主軸10が上下および左右に移動可能に設けられている。主軸10は、水平方向の回転軸を有し、先端に工具Tが同軸状に取り付けられる。主軸10と軸線方向に対向する側面は旋回扉12を有する。旋回扉12から水平に支持プレート14が延出している。旋回扉12は、鉛直方向の軸を中心に回転できる扉である。
本実施形態のカメラ30は、その光軸L周りの回転が規制された構造を有する。すなわち、カメラ30は、光軸Lに直交するヨー軸L1周りと、光軸Lおよびヨー軸L1に直交するピッチ軸L2周りには回動自在であるが、光軸L周りには回動できない。このため、カメラ30の取り付け誤差などにより、画面に表示される撮像エリアの座標と、ソフトウェアが認識する撮像エリアの座標とがずれた場合、手作業での調整は困難となる。そこで、情報処理装置100は、作業者によるカメラの角度調整に際し、それらの座標のずれを解消するための補正処理を実行する。
情報処理装置100の各構成要素は、CPU(Central Processing Unit)および各種コンピュータプロセッサなどの演算器、メモリやストレージといった記憶装置、それらを連結する有線または無線の通信線を含むハードウェアと、記憶装置に格納され、演算器に処理命令を供給するソフトウェアによって実現される。コンピュータプログラムは、デバイスドライバ、オペレーティングシステム、それらの上位層に位置する各種アプリケーションプログラム、また、これらのプログラムに共通機能を提供するライブラリによって構成されてもよい。以下に説明する各ブロックは、ハードウェア単位の構成ではなく、機能単位のブロックを示している。
取得部150は、撮像部110により撮像された画像を取得する。グリッド設定部152は、撮像エリアにおける所定の物理量(切屑)の存在を判定(解析)するために撮像画像を複数のグリッドに区分(分割)する。グリッドは、特定形状(本実施形態では正方形であるが、幾何学的形状であればよい)を有する(詳細後述)。以下、撮像画像において複数のグリッドにより区分される領域を「グリッド領域」ともいう。また、複数のグリッドにより構成される画像を「グリッド画像」ともいう。
図6は、認識部156の構成を表すブロック図である。
認識部156は、モデル学習部41、算出部43および判定部44を備える。
既に述べたように、上述した清掃制御システムを導入する場合、画面に表示される撮像エリアの座標と、ソフトウェアが認識する撮像エリアの座標とがずれる可能性がある。このため、本実施形態では、工作機械1の使用に先立って作業者(オペレータ)がカメラ30の角度を調整するなど、そのずれを解消する作業を行う。なお、説明の便宜上、両座標が一致した状態を「画面の整合状態」とも表現する。
なお、本実施形態ではカメラ30が複数設けられるため、それぞれのカメラ30について補正が行われるが、便宜上、その一つについての補正を例に説明する。他のカメラ30については同様であるので、説明を省略する。
操作盤4のモニタには、図11(A)に示す画面200が表示される。画面200は、上述した参照画面170と操作画面202とを横並びに含む。操作画面202には、自動洗浄ボタン210、手動洗浄ボタン212、洗浄経路調整ボタン214、詳細設定ボタン216が設けられる。
本処理は、オペレータによるメンテナンスボタン220の選択を契機に実行される。
表示制御部160は、メンテナンス画面として参照画面170に撮像画像P0を表示させ(S10)、ベースラインBLを重ねるように表示させる(S12)。
When cleaning control is started in either the automatic cleaning mode or the manual cleaning mode, the acquisition unit 150 acquires the captured image P0 (S40). The correction unit 158 then reads the correction information (calibration data) stored in the correction information storage unit 146 (S42) and applies the above-described correction process to the captured image P0 as internal processing.
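A minimal sketch of this S40-S42 flow, assuming the stored correction information reduces to a single roll angle kept in a JSON file; the file name, the key, and the reuse of `correct_roll` from the earlier sketch are illustrative assumptions.

```python
import json

def load_and_apply_correction(raw_frame, path="calibration.json"):
    """Read stored correction information and apply it before recognition runs."""
    with open(path, "r", encoding="utf-8") as f:
        calibration = json.load(f)
    roll_deg = calibration.get("roll_deg", 0.0)  # recorded during camera setup
    return correct_roll(raw_frame, roll_deg)     # see the earlier rotation sketch
```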
In the present embodiment, when there is a mounting error in the installation of the camera 30 in the machining chamber 2, the worker (operator) can finely adjust the angle (yaw angle, pitch angle) of the camera 30 while checking the captured image P0. The camera 30 structurally cannot adjust its rotation about the optical axis (roll angle), but this can be corrected by rotating the captured image P0. That is, according to the present embodiment, even if there are constraints on the angle adjustment of the camera 30 in the machine tool 1, the imaging area displayed on the screen can be brought into agreement with the imaging area recognized by the software. Therefore, even when the operator instructs cleaning of chips with coolant on the basis of the captured image displayed on the screen, the instructed cleaning position matches the coolant injection position. In other words, the accuracy of the cleaning control can be kept high.
In the above embodiment, the machine tool 1 has been described as a multi-tasking machine, but it may be a turning center or a machining center. It may also be an additive manufacturing machine that processes material (for example, metal powder) while melting it with a laser. In that case, the material scattered during machining becomes the "physical quantity" recognized by the substance recognition unit.
In this modification, the information processing device is installed outside the machine tool. That is, the information processing system includes a machine tool 301, a data processing device 310, and a data storage device 312. The data processing device 310 functions as the "information processing device".
In this modification, the captured image shown in the upper part of FIG. 15 can be corrected as shown in the lower part of FIG. 15. That is, the grid setting unit 152 performs a process (grid line division process) of superimposing grid lines at preset positions on the captured image received from the imaging unit 110. When this process is performed, the overlap between the grid lines and the captured image P10 may turn out as shown in the upper diagram of FIG. 15. In the data obtained by superimposing the grid lines on the captured image P10, there are grid areas in the outermost periphery overlapping the captured image P10 for which the grid lines of all four sides do not appear. In such a case, the display control unit 160 can perform a process of ignoring the grid areas whose four sides do not all overlap the captured image and displaying only the grid sections whose four sides all overlap the captured image P10, fitted to the size of the reference screen (lower part of FIG. 15). Alternatively, the grid setting unit 152 may detect that there is a grid area whose four sides do not all overlap the captured image P10 (or whose four sides do not all appear) and perform a process of adjusting the mesh of grid lines by moving the entire grid up, down, left, or right so that the four sides overlap the captured image P10.
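A minimal sketch of the first option (dropping the incompletely covered cells), assuming a binary mask marks which pixels of the display area are actually covered by the rotated captured image; the mask construction, the cell size, and the function name are assumptions for illustration only.

```python
import numpy as np

def fully_covered_cells(valid_mask: np.ndarray, cell: int = 64):
    """Return (row, col) indices of grid areas whose whole extent lies on the image."""
    h, w = valid_mask.shape[:2]
    kept = []
    for row in range(h // cell):
        for col in range(w // cell):
            patch = valid_mask[row * cell:(row + 1) * cell,
                               col * cell:(col + 1) * cell]
            if patch.all():  # every pixel, and hence all four sides, is covered
                kept.append((row, col))
    return kept

# The mask itself could be obtained by warping an all-ones image with the same
# rotation that was applied to the captured frame, e.g.
# mask = cv2.warpAffine(np.ones(frame.shape[:2], np.uint8), matrix, (w, h)) > 0
```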
Claims (9)
- An information processing device for processing a captured image, used in a machine tool that images an imaging area with a camera and injects a fluid toward a target on the basis of the captured image, the information processing device comprising:
a reference information storage unit that stores in advance a line following the contour of a specific area of the machine tool in a display image;
a correction unit that rotates the captured image in the direction of rotation about the optical axis of the camera so that the contour of the specific area shown in the captured image is aligned with the line; and
a display control unit that causes a display unit to display the captured image corrected by the correction unit.
- An information processing device for processing a captured image, used in a machine tool that images an imaging area with a camera and injects a fluid toward a target on the basis of information of the captured image, the information processing device comprising:
a display control unit that controls display so that a line following the contour of a specific area of the machine tool is displayed at a specific position on a display unit; and
a correction unit that, in a state in which the captured image and the line are displayed superimposed on the display unit, rotates the captured image so that the contour of the specific area of the machine tool captured in the captured image is aligned with the line,
wherein the display control unit causes the display unit to display the captured image corrected by the correction unit.
- The information processing device according to claim 1, wherein the camera is restricted from rotating about the optical axis, and
the correction unit corrects a setting error of the target by rotating the captured image.
- The information processing device according to claim 1, further comprising an input unit that receives an operation input from an operator,
wherein the display control unit displays the line superimposed on the captured image prior to the correction by the correction unit, and
the correction unit rotates the captured image in accordance with the operation input of the operator.
- The information processing device according to claim 4, wherein the display control unit displays an auxiliary line passing through the optical axis superimposed on the captured image in accordance with an operation input of the operator.
- The information processing device according to claim 1, further comprising a drive control unit that controls, with respect to angle control of the camera, a first rotation about a yaw axis orthogonal to the optical axis and a second rotation about a pitch axis orthogonal to the optical axis and the yaw axis,
wherein the drive control unit controls the first rotation and the second rotation, prior to the correction by the correction unit, so that the optical axis of the camera is positioned at a center setting position set in advance in the imaging area.
- The information processing device according to claim 1, wherein the display control unit causes the display unit to display a rectangular image obtained by cutting off a peripheral portion of the captured image after the angle of view has been rotated.
- The information processing device according to claim 1, further comprising:
a grid setting unit that partitions the captured image of the imaging area into a plurality of grids;
a recognition unit that recognizes, on a grid-by-grid basis, a predetermined physical quantity in the imaging area on the basis of the captured image; and
an injection control unit that controls an injection unit that injects the fluid,
wherein the injection control unit sets the target on the basis of the physical quantity recognized within the imaging area.
- A machine tool comprising:
a camera that images an imaging area set in a machining chamber;
an information processing device that processes the captured image; and
an injection unit that injects a fluid toward a target on the basis of the processed captured image,
wherein the information processing device includes:
a reference information storage unit that stores in advance a line following the contour of the specific area set in the machining chamber in a display image;
a correction unit that rotates the captured image in the direction of rotation about the optical axis of the camera so that the contour of the specific area shown in the captured image is aligned with the line; and
a display control unit that causes a display unit to display the captured image corrected by the correction unit.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202280059787.1A CN117897256A (zh) | 2021-09-06 | 2022-08-17 | Information processing device and machine tool |
EP22864247.6A EP4393640A1 (en) | 2021-09-06 | 2022-08-17 | Information processing device and machine tool |
US18/594,937 US20240202890A1 (en) | 2021-09-06 | 2024-03-04 | Information processing device and machine tool |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2021-144424 | 2021-09-06 | ||
JP2021144424A JP7066905B1 (ja) | 2021-09-06 | 2021-09-06 | 情報処理装置および工作機械 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US18/594,937 Continuation US20240202890A1 (en) | 2021-09-06 | 2024-03-04 | Information processing device and machine tool |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2023032662A1 true WO2023032662A1 (ja) | 2023-03-09 |
Family
ID=81600643
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2022/031001 WO2023032662A1 (ja) | 2021-09-06 | 2022-08-17 | 情報処理装置および工作機械 |
Country Status (5)
Country | Link |
---|---|
US (1) | US20240202890A1 (ja) |
EP (1) | EP4393640A1 (ja) |
JP (1) | JP7066905B1 (ja) |
CN (1) | CN117897256A (ja) |
WO (1) | WO2023032662A1 (ja) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP7143053B1 (ja) | 2022-06-09 | 2022-09-28 | Dmg森精機株式会社 | 情報処理装置およびプログラム |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2010045733A (ja) * | 2008-08-18 | 2010-02-25 | Sony Corp | 画像処理装置、画像処理方法、プログラム、および撮像装置 |
JP2014099832A (ja) * | 2012-11-16 | 2014-05-29 | Xacti Corp | カメラ |
JP2014165554A (ja) * | 2013-02-21 | 2014-09-08 | Brother Ind Ltd | 制御装置およびコンピュータプログラム |
JP2018024094A (ja) * | 2017-11-14 | 2018-02-15 | ファナック株式会社 | 工作機械の洗浄システム |
JP2019030917A (ja) * | 2017-08-04 | 2019-02-28 | オークマ株式会社 | 加工屑検出装置および工作機械 |
JP6887033B1 (ja) | 2020-02-12 | 2021-06-16 | Dmg森精機株式会社 | 画像処理装置、工作機械および画像処理方法 |
JP6921354B1 (ja) * | 2021-05-24 | 2021-08-18 | Dmg森精機株式会社 | 情報処理装置 |
JP2021144424A (ja) | 2020-03-11 | 2021-09-24 | 株式会社野村総合研究所 | コンピュータプログラム |
2021
- 2021-09-06 JP JP2021144424A patent/JP7066905B1/ja active Active
2022
- 2022-08-17 WO PCT/JP2022/031001 patent/WO2023032662A1/ja active Application Filing
- 2022-08-17 EP EP22864247.6A patent/EP4393640A1/en active Pending
- 2022-08-17 CN CN202280059787.1A patent/CN117897256A/zh active Pending
2024
- 2024-03-04 US US18/594,937 patent/US20240202890A1/en active Pending
Patent Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2010045733A (ja) * | 2008-08-18 | 2010-02-25 | Sony Corp | 画像処理装置、画像処理方法、プログラム、および撮像装置 |
JP2014099832A (ja) * | 2012-11-16 | 2014-05-29 | Xacti Corp | カメラ |
JP2014165554A (ja) * | 2013-02-21 | 2014-09-08 | Brother Ind Ltd | 制御装置およびコンピュータプログラム |
JP2019030917A (ja) * | 2017-08-04 | 2019-02-28 | オークマ株式会社 | 加工屑検出装置および工作機械 |
JP2018024094A (ja) * | 2017-11-14 | 2018-02-15 | ファナック株式会社 | 工作機械の洗浄システム |
JP6887033B1 (ja) | 2020-02-12 | 2021-06-16 | Dmg森精機株式会社 | 画像処理装置、工作機械および画像処理方法 |
JP2021144424A (ja) | 2020-03-11 | 2021-09-24 | 株式会社野村総合研究所 | コンピュータプログラム |
JP6921354B1 (ja) * | 2021-05-24 | 2021-08-18 | Dmg森精機株式会社 | 情報処理装置 |
Also Published As
Publication number | Publication date |
---|---|
US20240202890A1 (en) | 2024-06-20 |
JP7066905B1 (ja) | 2022-05-13 |
CN117897256A (zh) | 2024-04-16 |
JP2023037692A (ja) | 2023-03-16 |
EP4393640A1 (en) | 2024-07-03 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP2082850B1 (en) | Generating device of processing robot program | |
JP5725796B2 (ja) | 工具の測定方法及び測定装置、並びに工作機械 | |
JP5832083B2 (ja) | 工具寸法の測定方法及び測定装置 | |
US8175861B2 (en) | Machining simulation method and machining simulation apparatus | |
WO2023032662A1 (ja) | 情報処理装置および工作機械 | |
US11048231B2 (en) | Beam tool pathing for 3D compound contours using machining path surfaces to maintain a single solid representation of objects | |
US20100063615A1 (en) | Machining status monitoring method and machining status monitoring apparatus | |
JP5404450B2 (ja) | 加工状況監視装置 | |
US9902070B2 (en) | Robot system and robot control method for adjusting position of coolant nozzle | |
US20230330812A1 (en) | Autonomous modification of waterjet cutting systems | |
CN110405529B (zh) | 数值控制装置 | |
JP2022180272A (ja) | 情報処理装置 | |
CN114746213B (zh) | 显示装置、机床及液体的放出方法 | |
WO2022250052A1 (ja) | 情報処理装置およびプログラム | |
WO2021192890A1 (ja) | 工作機械、工作機械の制御方法、および、工作機械の制御プログラム | |
EP3511120B1 (en) | Machine tool provided with display device | |
WO2023032876A1 (ja) | 情報処理装置および工作機械 | |
JP7405899B2 (ja) | 情報処理装置および工作機械 | |
JP7014918B1 (ja) | 工作機械 | |
JP7143053B1 (ja) | 情報処理装置およびプログラム | |
WO2022163459A1 (en) | Image processing device and machine tool | |
JP6946587B1 (ja) | 画像処理装置および工作機械 | |
JP4908375B2 (ja) | 電子部品装着方法及び電子部品装着装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 22864247 Country of ref document: EP Kind code of ref document: A1 |
WWE | Wipo information: entry into national phase |
Ref document number: 202280059787.1 Country of ref document: CN |
WWE | Wipo information: entry into national phase |
Ref document number: 2022864247 Country of ref document: EP |
ENP | Entry into the national phase |
Ref document number: 2022864247 Country of ref document: EP Effective date: 20240325 |
NENP | Non-entry into the national phase |
Ref country code: DE |