WO2023032876A1 - Information processing device, and machine tool - Google Patents

Information processing device, and machine tool

Info

Publication number
WO2023032876A1
Authority
WO
WIPO (PCT)
Prior art keywords
captured image
unit
grid
area
display
Prior art date
Application number
PCT/JP2022/032317
Other languages
French (fr)
Japanese (ja)
Inventor
絢一郎 奥野
Original Assignee
Dmg森精機株式会社
Priority date
Filing date
Publication date
Priority claimed from JP2022072847A external-priority patent/JP7405899B2/en
Application filed by Dmg森精機株式会社
Publication of WO2023032876A1 publication Critical patent/WO2023032876A1/en

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B23 MACHINE TOOLS; METAL-WORKING NOT OTHERWISE PROVIDED FOR
    • B23Q DETAILS, COMPONENTS, OR ACCESSORIES FOR MACHINE TOOLS, e.g. ARRANGEMENTS FOR COPYING OR CONTROLLING; MACHINE TOOLS IN GENERAL CHARACTERISED BY THE CONSTRUCTION OF PARTICULAR DETAILS OR COMPONENTS; COMBINATIONS OR ASSOCIATIONS OF METAL-WORKING MACHINES, NOT DIRECTED TO A PARTICULAR RESULT
    • B23Q11/00 Accessories fitted to machine tools for keeping tools or parts of the machine in good working condition or for cooling work; Safety devices specially combined with or arranged in, or specially adapted for use in connection with, machine tools
    • B23Q17/00 Arrangements for observing, indicating or measuring on machine tools
    • B23Q17/24 Arrangements for observing, indicating or measuring on machine tools using optics or electromagnetic waves
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00 Geometric image transformation in the plane of the image
    • G06T3/40 Scaling the whole image or part thereof

Definitions

  • the present invention relates to an information processing device that processes captured images used in machine tools.
  • Chips are generated when workpieces are processed by machine tools. If a large amount of chips accumulate in the processing chamber, it becomes difficult to continue processing. For this reason, conventionally, the operator has taken measures such as periodically stopping the operation of the machine tool and removing the chips using an air blower or the like. However, such manual chip removal reduces the operating efficiency of the machine tool.
  • An aspect of the present invention is an information processing device.
  • This information processing apparatus includes an acquisition unit that acquires a captured image of an imaging area captured by an imaging unit installed in a machine tool, a grid setting unit that divides the captured image into grids each having a size including a plurality of pixels, and a display control unit that displays, on a reference screen of a display unit, an extraction region which is a part of the captured image, extracted by cutting out the outer peripheral portion of the captured image in units of rows and columns of the grid arrangement.
  • the display control unit causes the display unit to display the extraction area, which is smaller than the frame of the reference screen by cutting the outer periphery from the captured image, so that the contour of the extraction area expands toward the frame of the reference screen.
  • This machine tool includes an imaging unit that captures an image of an imaging area set in a processing chamber, a display unit that displays an image captured by the imaging unit, and a grid setting unit that divides the captured image into grids each having a size including a plurality of pixels. and a display control unit for displaying, on a reference screen of a display unit, an extraction region which is a part of the captured image extracted by cutting out the outer peripheral portion of the captured image in units of rows and columns of the grid arrangement.
  • the display control unit causes the display unit to display the extraction area, which is smaller than the frame of the reference screen by cutting the outer periphery from the captured image, so that the contour of the extraction area expands toward the frame of the reference screen.
  • According to the present invention, it is possible to display the captured image of the imaging unit installed in the machine tool so that the operator does not feel uncomfortable.
  • FIG. 1 is a perspective view showing the appearance of the machine tool according to the embodiment.
  • FIG. 2 is a hardware configuration diagram of the machine tool.
  • FIG. 3 is a perspective view showing the configuration inside the processing chamber.
  • FIG. 4 is a diagram schematically showing the structure of the camera.
  • FIG. 5 is a functional block diagram of the information processing device.
  • FIG. 6 is a block diagram showing the configuration of the recognition unit.
  • FIGS. 7 to 10 are diagrams showing the method of correcting the screen display.
  • FIG. 11 is a diagram showing an example of a screen operated by the operator.
  • FIG. 12 is a flowchart showing the flow of the correction processing.
  • FIG. 13 is a flowchart schematically showing the flow of the cleaning control.
  • FIG. 14 is a diagram showing the correction processing when the rotation angle θ of the captured image is 3.3 degrees.
  • FIG. 15 is a diagram showing the configuration of an information processing system according to a modification.
  • FIG. 16 is a diagram showing an image processing method according to a modification.
  • FIG. 1 is a perspective view showing the appearance of the machine tool according to the embodiment.
  • the machine tool 1 is configured as a multitasking machine that processes a workpiece into a desired shape while appropriately exchanging tools.
  • a machine tool 1 is provided with a processing chamber 2 inside a device housing.
  • the processing chamber 2 is provided with a processing device for processing a work.
  • An operation panel 4 for operating the processing apparatus is provided on the front surface of the apparatus housing.
  • FIG. 2 is a hardware configuration diagram of the machine tool 1.
  • the machine tool 1 includes an information processing device 100 , a machining control device 102 , a machining device 104 , a tool changing section 106 , a tool storage section 108 and an imaging section 110 .
  • the machining control device 102 functions as a numerical control unit and outputs control signals to the machining device 104 according to a machining program (NC program).
  • the processing device 104 processes a workpiece by moving a tool spindle (not shown; hereinafter simply referred to as “spindle”) according to instructions from the machining control device 102 .
  • the processing device 104 includes a mechanism for driving the spindle, a liquid reservoir 112 for storing coolant, and a liquid injection section 114 for injecting coolant.
  • the coolant is used as cutting oil for removing heat and lubricating the tools and workpieces during machining, and is also used as a cleaning liquid for removing chips scattered in the machining chamber 2 .
  • The liquid injection section 114 includes a pump that draws up coolant from the liquid reservoir 112, a nozzle that injects the coolant, and an actuator that drives the nozzle.
  • the information processing device 100 includes an operation panel 4, and outputs control commands to the processing control device 102 based on operator input.
  • the information processing apparatus 100 also controls the screen displayed on the monitor of the operation panel 4 according to the operator's operation input.
  • the tool storage section 108 stores tools.
  • the tool changer 106 corresponds to a so-called ATC (Automatic Tool Changer), takes out a tool from the tool storage part 108 according to a change instruction from the machining control device 102, and replaces the tool on the spindle with the taken out tool.
  • the imaging unit 110 is, for example, a camera equipped with an imaging device such as a CCD or CMOS, and images an imaging area set in the processing chamber 2 .
  • an imaging area an area in which chips generated by machining the workpiece are assumed to exist is set in advance.
  • the angle of view of the camera is set so that the distribution and accumulation of chips can be grasped over a wide range in the processing chamber 2 .
  • the imaging unit 110 outputs the captured image to the information processing device 100 .
  • FIG. 3 is a perspective view showing the configuration inside the processing chamber 2.
  • FIG. 3(A) shows a state seen obliquely from above
  • FIG. 3(B) shows a state seen obliquely from below.
  • the processing chamber 2 is surrounded by four side surfaces, and a main shaft 10 is provided on one side surface so as to be vertically and horizontally movable.
  • the main shaft 10 has a horizontal rotating shaft, and a tool T is coaxially attached to its tip.
  • a side surface facing the main shaft 10 in the axial direction has a revolving door 12 .
  • a support plate 14 extends horizontally from the revolving door 12 .
  • the revolving door 12 is a door that can rotate around a vertical axis.
  • a table 16 is provided below the support plate 14 .
  • a pallet 18 is detachably attached to the table 16 , and a work is placed and fixed on the pallet 18 .
  • the table 16 is movable in the axial direction of the main shaft 10 and rotatable in the horizontal plane. By rotating the table 16, the work on the pallet 18 can be rotated.
  • the workpiece approaches or separates from the tool T by linearly driving the table 16 . That is, by controlling the rotation and movement of the table 16 and the movement of the spindle 10, the workpiece can be processed into a desired shape.
  • the support plate 14 is fitted with the pallet 18 at the position where the table 16 is farthest from the spindle 10 .
  • By rotating the revolving door 12 in this state, the support plate 14 separates the pallet 18 from the table 16 and rotates together with the pallet 18.
  • the pallet 18 on which the work has been processed can be carried out from the processing chamber 2 and the pallet 18 to which the work to be processed next is fixed can be carried into the processing chamber 2 .
  • a chip conveyor 20 is provided below the table 16 and the spindle 10 for conveying chips to the outside of the processing chamber 2 .
  • the table 16 moves above the chip conveyor 20 .
  • A chute 22 is provided below the table 16.
  • The chute 22 guides the chips that flow down from above during cleaning onto the chip conveyor 20.
  • The bottom surfaces on both sides of the table 16 are slopes 24, which are inclined downward toward the chute 22 so that chips scattered during machining flow easily toward it.
  • the upper part of the processing chamber 2 is shielded by a cover 26 (ceiling).
  • a cover 26 is provided with a plurality of nozzles 28 for supplying coolant.
  • the nozzle 28 is configured to be replaceable together with the cover 26 .
  • the nozzle 28 constitutes a liquid injection section 114 and is connected to the liquid storage section 112 via piping, valves, pumps and the like (not shown) (see FIG. 2).
  • the nozzle 28 is configured to be three-dimensionally rotatable. By rotating the nozzle 28, the injection direction of the coolant can be controlled. By specifying the direction of the nozzle 28 and driving the pump, the coolant can be injected toward the target in the processing chamber 2 . Chips generated by machining the workpiece are washed away by the coolant and carried out of the machining chamber 2 by the chip conveyor 20 .
  • Although two nozzles 28 are installed in this embodiment, the number can be set as appropriate.
  • a plurality of cameras 30 for capturing images of the inside of the processing chamber 2 from above are also installed in the upper part of the processing chamber 2 .
  • two cameras 30 are attached to the side wall slightly below the cover 26 in the processing chamber 2 .
  • the camera 30 cannot be integrated with the cover 26 due to wiring reasons.
  • the camera 30 constitutes an image capturing unit 110, and captures an image of the machining status of the workpiece by the tool T, and an image of chips generated by the machining (see FIG. 2). Since two cameras 30 are provided, an area that cannot be captured by one camera 30 can be captured by the other camera 30 .
  • the imaging unit 110 outputs the captured image to the information processing device 100 .
  • FIG. 4 is a diagram schematically showing the structure of the camera 30.
  • The camera 30 of this embodiment has a structure in which rotation about its optical axis L is restricted. That is, the camera 30 is rotatable around the yaw axis L1 orthogonal to the optical axis L and around the pitch axis L2 orthogonal to the optical axis L and the yaw axis L1, but cannot be rotated around the optical axis L. Therefore, if the coordinates of the imaging area displayed on the screen deviate from the coordinates of the imaging area recognized by the software due to an installation error of the camera 30 or the like, manual adjustment becomes difficult. Therefore, the information processing apparatus 100 executes correction processing for eliminating the deviation of the coordinates when the operator adjusts the angle of the camera.
  • FIG. 5 is a functional block diagram of the information processing device 100.
  • Each component of the information processing apparatus 100 is implemented by hardware, including computing units such as a CPU (Central Processing Unit) and various computer processors, storage devices such as memory and storage, and wired or wireless communication lines connecting them, and by software that is stored in the storage devices and supplies processing instructions to the computing units.
  • a computer program may consist of a device driver, an operating system, various application programs located in their higher layers, and a library that provides common functions to these programs.
  • Each block described below represents a functional block rather than a hardware configuration.
  • the information processing device 100 includes a user interface processing unit 130 , a data processing unit 132 and a data storage unit 134 .
  • the user interface processing unit 130 receives an operation input from an operator, and is in charge of user interface processing such as image display.
  • the data processing unit 132 executes various processes based on data acquired by the user interface processing unit 130 and data stored in the data storage unit 134 .
  • Data processing unit 132 also functions as an interface for user interface processing unit 130 and data storage unit 134 .
  • the data storage unit 134 stores various programs and setting data.
  • the user interface processing section 130 includes an input section 140 and an output section 142 .
  • the input unit 140 receives an operation input from an operator via the touch panel of the monitor on the operation panel 4 or the like.
  • the output unit 142 includes a display unit 144 that displays an image or the like on the screen of the monitor.
  • the output unit 142 provides various types of information to the operator through its display.
  • a display unit 144 displays an image captured by the imaging unit 110 .
  • Data processing unit 132 includes acquisition unit 150 , grid setting unit 152 , detection unit 154 , recognition unit 156 , correction unit 158 , display control unit 160 and ejection control unit 162 .
  • Acquisition unit 150 acquires an image captured by imaging unit 110 .
  • The grid setting unit 152 divides (partitions) the captured image into a plurality of grids in order to determine the presence of a predetermined physical quantity (chips) in the imaging area.
  • the grid has a specific shape (square in this embodiment, but any geometric shape is acceptable) (details will be described later).
  • an area divided by a plurality of grids in a captured image will also be referred to as a "grid area”.
  • an image composed of a plurality of grids is also called a "grid image”.
  • the detection unit 154 detects an operator's operation input via the input unit 140 . Although the details will be described later, the operator can grasp the accumulation state of chips in the processing chamber 2 by referring to the captured image displayed on the display unit 144 . The operator can specify the cleaning range (spray range) of the coolant by specifying the area of the captured image via the touch panel. The detection unit 154 detects the instruction input by the operator as the indicated position in the captured image. The detection unit 154 may detect the pointing position based on the grid area created by the grid setting unit 152 .
  • The recognition unit 156 automatically recognizes chips based on the grid areas set in the captured image, determines whether chips exist in each grid area, and determines the amount of chips present. These determinations are also referred to as "chip determinations". If the recognition unit 156 determines that chips are present in a grid area, it recognizes the position on the captured image corresponding to that grid area as the accumulation position of the chips. When the chip accumulation position is recognized, an automatic detection signal is output to the injection control section 162.
  • the automatic detection signal includes at least information regarding the predetermined location where debris has been recognized in the captured image.
  • the correction unit 158 executes correction processing for matching the coordinates of the imaging area by the camera 30 with the coordinates used for the coolant control command based on the operator's operation. Since this correction process is based on the premise that there is an installation error in the camera 30 , it is performed when the camera 30 is installed in the processing chamber 2 . The details will be described later.
  • the display control unit 160 causes the display unit 144 to display the image captured by the camera 30 .
  • the display control unit 160 displays an operation screen for the correction and displays the captured image after correction. An example of a display screen related to this correction will be described in detail later.
  • the injection control unit 162 outputs a coolant injection command toward the target position to the machining control device 102 when cleaning control is performed.
  • This injection command includes information specifying the position to inject the coolant (information specifying the injection route, etc.).
  • the machining control device 102 receives this injection command, drives the liquid injection unit 114, and controls injection of the coolant.
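  • Purely as an illustration (the patent does not specify a data format), the injection command carrying the target position or route could be represented roughly as follows; the field names and the `send_to_controller` function are hypothetical stand-ins, not part of the disclosed system.

```python
from dataclasses import dataclass, field
from typing import List, Tuple


@dataclass
class InjectionCommand:
    """Hypothetical payload from the injection control unit (162) to the
    machining control device (102), which drives the liquid injection unit (114)."""
    target_grid: Tuple[int, int]                                     # (row, col) of the grid section to clean
    route: List[Tuple[float, float]] = field(default_factory=list)   # nozzle sweep path in image coordinates


def send_to_controller(cmd: InjectionCommand) -> None:
    # Stand-in for the actual interface to the machining control device.
    print(f"inject coolant at grid {cmd.target_grid} along {len(cmd.route)} waypoints")


send_to_controller(InjectionCommand(target_grid=(12, 7), route=[(960.0, 820.0), (1100.0, 860.0)]))
```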
  • This cleaning control is to wash away chips in the processing chamber 2 with coolant (that is, to wash the inside of the processing chamber 2), and in this embodiment, an automatic cleaning mode or a manual cleaning mode can be selected.
  • In the automatic cleaning mode, the injection control unit 162 automatically sets the coolant injection position based on the automatic detection signal output by the recognition unit 156, and outputs an injection command.
  • In the manual cleaning mode, the position for injecting the coolant is set based on the operator's area specification detected by the detection unit 154, and an injection command is output.
  • the data storage unit 134 stores various programs including a correction processing program for executing the correction processing described above and a cleaning control program for executing cleaning control, as well as various data necessary for these processing.
  • Data storage unit 134 includes correction information storage unit 146 and reference information storage unit 148 .
  • The correction information storage unit 146 stores the captured image acquired by the acquisition unit 150 during the correction process, the grid area (grid image) created by the grid setting unit 152, position information of locations where chips are recognized, information on the amount of chips, position information detected by the detection unit 154, and the like.
  • the correction information storage unit 146 also stores correction information (calibration data) acquired in a correction process, which will be described later.
  • the reference information storage unit 148 stores an image for guiding the operator's operation input in the correction process.
  • the data storage unit 134 also functions as a work area when arithmetic processing is performed.
  • FIG. 6 is a block diagram showing the configuration of the recognition unit 156.
  • the recognition unit 156 includes a model learning unit 41 , a calculation unit 43 and a determination unit 44 .
  • the model learning unit 41 creates a learning model.
  • The learning model is a model that, when a grid area is input, can calculate and output the probability that the grid area corresponds to each of predetermined items related to chips.
  • This learning model can be created, for example, by using pairs of input data and output data as training data and feeding them into a CNN (convolutional neural network) in advance for learning.
  • The input data can be the grid area, and the output data can be information about the presence and amount of chips in that grid area.
  • the data storage unit 134 includes a model storage unit 42.
  • the model storage unit 42 stores learning models for automatically determining the presence or absence of chips.
  • the learning model is read into the calculator 43 as needed.
  • The calculation unit 43 calculates, on a grid-by-grid basis, the probability that a grid matches each of predetermined items related to chips, based on the captured image. Specifically, using the learning model learned by the model learning unit 41, the calculation unit 43 calculates, for each grid area, the probabilities that it corresponds to each of three items: many chips (class 2), few chips (class 1), and no chips (class 0). These classes indicate the degree of presence of material (such as chips) recognized in the imaging area.
  • the determination unit 44 determines to which of classes 0 to 2 the chips in the grid area belong, based on the probability calculated by the calculation unit 43 for the input grid area.
  • When the determination unit 44 determines that there are chips in a grid area (that is, determines that it is class 2 or class 1), an automatic detection signal, including position information on the captured image corresponding to the position of that grid area in the grid image, is output to the display control unit 160 and the injection control unit 162.
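  • To make the per-grid determination flow concrete, here is a rough Python sketch, not the patent's actual implementation: `classify_grid` is a hypothetical stand-in for the learned CNN and returns probabilities for classes 0 to 2; grid sections judged class 1 or 2 are collected with their positions, which would go into the automatic detection signal.

```python
from typing import Callable, List, Tuple

import numpy as np

GRID = 128  # pixels per grid edge in the embodiment (128 x 128)


def detect_chip_grids(
    image: np.ndarray,
    classify_grid: Callable[[np.ndarray], np.ndarray],
) -> List[Tuple[int, int, int]]:
    """Return (row, col, chip_class) for every grid section judged to contain chips.

    `classify_grid` stands in for the learned CNN: given one GRID x GRID patch
    it returns probabilities for [class 0: none, class 1: few, class 2: many].
    """
    rows, cols = image.shape[0] // GRID, image.shape[1] // GRID
    detections = []
    for r in range(rows):
        for c in range(cols):
            patch = image[r * GRID:(r + 1) * GRID, c * GRID:(c + 1) * GRID]
            chip_class = int(np.argmax(classify_grid(patch)))
            if chip_class >= 1:
                # Grid position, which maps to a position on the captured image
                # reported to the display control and injection control units.
                detections.append((r, c, chip_class))
    return detections
```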
  • the recognizing unit 156 automatically recognizes chips based on the captured image captured by the imaging unit 110 during or after the processing of the workpiece, and can perform automatic cleaning by injecting coolant.
  • the automatic cleaning may be performed periodically, or may be performed by giving some instructions such as instructions from the operator.
  • the camera 30 of the present embodiment is restricted in rotation around the optical axis, so there is a limit to its adjustment. Therefore, if left as it is, there is a possibility of causing a setting error in the target of the coolant injection control. Therefore, in the present embodiment, the correction unit 158 rotates the image captured by the camera 30 to correct the target setting error. The details will be described below.
  • FIGS. 7 to 10 are diagrams showing the method of correcting the screen display.
  • Correction for one camera 30 will be described here; since the other camera 30 is handled in the same way, description thereof is omitted.
  • the camera 30 is installed in the processing chamber 2 so that the predetermined imaging area is included in the angle of view.
  • the imaging area includes areas where swarf scattering and deposition is expected.
  • the captured image P0 shown in FIG. 7 is displayed on the screen (reference screen 170) of the display unit 144 when the camera 30 is installed in the processing chamber 2.
  • When the operation input by the operator shifts to a maintenance mode (described later) for correcting the screen display, the display control unit 160 superimposes and displays the baseline BL (thick line) and the auxiliary line AL (chain-dotted line) on the captured image P0.
  • the baseline BL is a line set to follow the outline of the specific area SA in the processing chamber 2 when the screen is in the aligned state, and is stored in the reference information storage unit 148 in advance.
  • the specific area SA is an area surrounded by a boundary line formed by the slope 24 and the edge of the pallet 18 .
  • the auxiliary line AL is a cross-shaped auxiliary line passing through the center C of the captured image P0 in this embodiment.
  • the processing chamber 2 is provided with a marker M on which the center C overlaps when the screens are aligned. The marker M is attached to the center setting position preset in the imaging area.
  • In this state, the center C of the captured image P0 is shifted from the marker M, and the outline of the specific area SA is shifted from the baseline BL; the screen is therefore not in the aligned state. The operator therefore first manually adjusts the angle of the camera 30 within the possible range to bring the screen closer to the aligned state.
  • the operator rotates the camera 30 around the yaw axis L1 and the pitch axis L2 so that the center C of the captured image P0 overlaps the marker M as shown in FIG. 8(A). Up to this point, it can be easily performed manually. However, in the illustrated example, the outline of the specific area SA is shifted from the baseline BL even at this stage. Therefore, if the camera 30 can be rotated around the optical axis L, the deviation can be eliminated. However, in this embodiment, the rotation of the camera 30 around the optical axis L is restricted as described above.
  • Therefore, the correction unit 158 rotates the captured image P0 around the center C (that is, in the direction of rotation about the optical axis L of the camera 30) so that the contour of the specific area SA is aligned with the baseline BL. As a result, the screen is brought into the aligned state, and the target setting error in the coolant injection control is corrected.
  • However, since the captured image P0 is tilted with respect to the reference screen 170 at this time, the screen display may look unnatural to the operator.
  • the grid area GA is set based on an ideal state in which there is no installation error of the camera 30.
  • the aspect ratio (screen ratio) of the captured image P0 (see dotted line) is 4:3, and the number of grids is 28 ⁇ 21. More specifically, each area surrounded by intersections of a plurality of grid lines arranged in parallel in the vertical direction and a plurality of grid lines arranged in parallel in the horizontal direction constitutes one section of the grid.
  • Each grid line is displayed parallel to the frame of the reference screen 170 (display screen).
  • The reference screen 170 may be partitioned so that its frame and the grid lines overlap, or, as illustrated, a form in which the grid lines are displayed slightly inside the frame may be used. Since the aspect ratio of the frame of the reference screen 170 is also 4:3, the outer frame of the captured image P0 contacts the frame of the reference screen 170 in this ideal state.
  • the captured image P0 of this embodiment is an image having 3584 ⁇ 2688 pixels. It is divided into square grids that are easy to understand visually. Then, in this embodiment, it can be divided into 28 ⁇ 21 grids, and 128 ⁇ 128 pixels are present in each grid.
  • Each grid on the reference screen 170 includes a plurality of pixels forming the captured image P0. In the above ideal state, the size of the grid area GA and the size of the captured image P0 are the same. Therefore, the number of pixels of the grid area GA and the number of pixels of the captured image P0 match, and the grid area GA includes 3584 ⁇ 2688 pixels.
  • both the reference screen 170 and the captured image P0 are rectangular (rectangular), and the aspect ratio is 4:3.
  • the grid area GA is positioned so that the edges of the grid are in contact with the inside of the frame of the reference screen 170 .
  • the number of pixels per grid can be appropriately set within a range of, for example, 50 ⁇ 50 to 250 ⁇ 250.
  • One grid is preferably set to a size that allows accumulated chips to be identified by image analysis.
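  • As a minimal arithmetic check of the grid division described above, using the embodiment's example numbers (the grid size itself is configurable):

```python
def grid_counts(width_px: int, height_px: int, grid_px: int) -> tuple[int, int]:
    """Number of whole grid columns and rows that fit in the image."""
    return width_px // grid_px, height_px // grid_px


# Embodiment example: a 3584 x 2688 pixel image with 128 x 128 pixel grid sections.
cols, rows = grid_counts(3584, 2688, 128)
assert (cols, rows) == (28, 21)                      # 28 x 21 grid sections
assert cols * 128 == 3584 and rows * 128 == 2688     # the grids exactly tile the image
```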
  • the display control unit 160 extracts a region of a specific shape by cutting out the peripheral portion of the captured image P0 after rotating the angle of view.
  • the region of the image extracted at this time will be referred to as "extracted region P1".
  • the extraction area P1 is an image corresponding to the determination area JA and is part of the captured image P0.
  • the "specific shape” is a rectangular shape having the same aspect ratio (4:3) as the captured image P0.
  • In the illustrated example, the first to fourth grids from the right have an occupied area ratio of 50% or less, and the fifth to tenth grids have an occupied area ratio of more than 50% and 80% or less.
  • the extraction area P1 is set in consideration of the area occupied by each grid.
  • the outer perimeter cut in the captured image P0 is a grid row or grid column including a grid in which the captured image P0 occupies 50% or less of the grid section.
  • The "cutting" referred to here may be performed by erasing the outer peripheral portion of the captured image P0 and leaving the image portion of the determination area JA as the data of the extraction region P1, or the determination area JA may be extracted as the data of the extraction region P1 without erasing the outer peripheral portion.
  • the display control unit 160 cuts out the outer peripheral portion from the captured image P0 in units of at least one of rows and columns of the grid.
  • the rotation angle ⁇ of the captured image P0 is set to 3 degrees.
  • In this case, portions that only slightly overlap the grid sections are clipped.
  • the number of grids in the grid area GA (extraction area P1) is 26 ⁇ 19. Since the number of pixels per grid remains unchanged at 128 ⁇ 128, the number of pixels in the grid area GA is 3328 ⁇ 2432.
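  • The following sketch illustrates one possible reading of the cutting rule described above: estimate, by point sampling, how much of each grid section the rotated image covers, then peel off outer grid rows and columns whose current outermost ring contains a section covered 50% or less. The peeling-from-the-outside interpretation and the sampling approach are assumptions made for illustration, not the patent's stated algorithm.

```python
import math

W, H, GRID = 3584, 2688, 128            # embodiment: image size and grid size
COLS, ROWS = W // GRID, H // GRID       # 28 x 21 grid sections


def coverage(col: int, row: int, theta_deg: float, n: int = 8) -> float:
    """Estimate the fraction of grid section (col, row) covered by the image
    after it is rotated by theta_deg about the image center."""
    t = math.radians(theta_deg)
    cx, cy = W / 2.0, H / 2.0
    inside = 0
    for i in range(n):
        for j in range(n):
            x = col * GRID + (i + 0.5) * GRID / n
            y = row * GRID + (j + 0.5) * GRID / n
            # Rotate the sample point back by -theta and test it against the
            # un-rotated image rectangle.
            dx, dy = x - cx, y - cy
            u = cx + dx * math.cos(t) + dy * math.sin(t)
            v = cy - dx * math.sin(t) + dy * math.cos(t)
            if 0.0 <= u <= W and 0.0 <= v <= H:
                inside += 1
    return inside / (n * n)


def cut_outer_rows_cols(theta_deg: float) -> tuple[int, int]:
    """Peel outer grid rows/columns while the current outer ring contains a
    section covered 50% or less (one possible reading of the cutting rule)."""
    c0, c1, r0, r1 = 0, COLS - 1, 0, ROWS - 1

    def has_bad(cells):
        return any(coverage(c, r, theta_deg) <= 0.5 for c, r in cells)

    while True:
        top = has_bad([(c, r0) for c in range(c0, c1 + 1)])
        bottom = has_bad([(c, r1) for c in range(c0, c1 + 1)])
        left = has_bad([(c0, r) for r in range(r0, r1 + 1)])
        right = has_bad([(c1, r) for r in range(r0, r1 + 1)])
        if not (top or bottom or left or right):
            break
        r0, r1 = r0 + top, r1 - bottom
        c0, c1 = c0 + left, c1 - right
    return c1 - c0 + 1, r1 - r0 + 1


cols, rows = cut_outer_rows_cols(3.0)
print(cols, "x", rows, "grids remain,", cols * GRID, "x", rows * GRID, "pixels")
# The embodiment's text reports 26 x 19 grids (3328 x 2432 pixels) for 3 degrees.
```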
  • If the extraction area P1, which is a part of the captured image P0, is displayed as it is, it will be smaller than the reference screen 170, and a margin may be formed around it, which may give the operator a sense of incongruity.
  • Alternatively, if the captured image P0 is displayed in a form that includes the extraction region P1 (that is, the tilted image as it is), the divisions of the grid do not correspond to the reference screen 170, which may also make the operator feel uncomfortable. Therefore, as shown in FIG. 10(A), the extraction area P1 is enlarged to a preset display size and displayed on the reference screen 170.
  • In the example of FIG. 10(A), the extraction area P1 is enlarged to the same scale as the original captured image P0. That is, the extraction region P1, which is smaller than the frame of the reference screen 170 because the outer periphery has been cut from the captured image P0, is enlarged and displayed so that its outer frame approaches the frame of the reference screen 170.
  • the extraction area P1 is enlarged while maintaining the number of pixels. Therefore, the number of pixels of the image displayed within the frame of the reference screen 170 is reduced from 3584 ⁇ 2688 of the captured image P0 to 3328 ⁇ 2432 of the extraction region P1.
  • After enlargement, the extraction area P1 has substantially the same size as the reference screen 170 and is displayed so that the sides of its rectangle are at the same positions as the sides of the rectangle of the original captured image P0. That is, the extraction area P1 is enlarged so that the ends of the grid in the grid area GA (that is, the grid area after the outer periphery is cut) approach the frame of the reference screen 170.
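  • One simple way to realize this enlargement is a uniform magnification that stops as soon as one pair of sides reaches the frame, which is consistent with the behavior described for the modification (FIG. 14) further below; the uniform-scale choice is an assumption for illustration, not a quoted formula from the patent.

```python
def display_scale(ext_w: int, ext_h: int, screen_w: int, screen_h: int) -> float:
    """Uniform magnification that enlarges the extraction region until the
    first pair of its sides touches the reference screen frame."""
    return min(screen_w / ext_w, screen_h / ext_h)


# Embodiment example: 26 x 19 grids of 128 px shown on a 3584 x 2688 reference screen.
s = display_scale(3328, 2432, 3584, 2688)
print(round(s, 4), round(3328 * s), "x", round(2432 * s))
# About 1.0769: the width reaches 3584 while the height stays just inside the frame.
```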
  • the recognition and determination of substances (chips) by the recognition unit 156 are performed in the initially set grid area (the number of grids: 26 ⁇ 19).
  • the screen display by the display control unit 160 may be a part of the cutout of the captured image P0.
  • As a result, the coordinates of the imaging area displayed on the reference screen 170 can be matched with the coordinates of the imaging area recognized by the software, and the coolant can be correctly jetted toward the target position.
  • the position of the grid set by the grid setting unit 152 and the position of the grid recognized by the operator from the image are matched.
  • the position indicated by the operator via the touch panel corresponds to one of the grids, and the area of the image where the grids overlap is set as the coolant injection target (target position).
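  • As an illustration of how an operator's touch on the displayed (cut and enlarged) image could be mapped back to a grid section used as the injection target, here is a hedged sketch; the crop offsets and scale correspond to the correction described above, and the function name and parameters are hypothetical.

```python
def touch_to_grid(touch_x: float, touch_y: float,
                  scale: float, crop_left_px: int, crop_top_px: int,
                  grid_px: int = 128) -> tuple[int, int]:
    """Map a touch position on the reference screen back to the (row, col) of
    the grid section in original captured-image coordinates."""
    # Undo the display enlargement, then add back the cut outer margin.
    x = touch_x / scale + crop_left_px
    y = touch_y / scale + crop_top_px
    return int(y // grid_px), int(x // grid_px)


# Example: one grid row/column (128 px) cut on the top and left, display scale 1.0769.
row, col = touch_to_grid(900.0, 700.0, 1.0769, 128, 128)
print(row, col)   # grid section that would be set as the coolant injection target
```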
  • FIG. 11 is a diagram showing an example of a screen operated by an operator.
  • a screen 200 shown in FIG. 11A is displayed on the monitor of the operation panel 4 .
  • the screen 200 includes the above-described reference screen 170 and operation screen 202 side by side.
  • the operation screen 202 is provided with an automatic cleaning button 210 , a manual cleaning button 212 , a cleaning path adjustment button 214 and a detail setting button 216 .
  • the automatic cleaning button 210 is selected when executing the automatic cleaning mode.
  • a manual wash button 212 is selected when executing the manual wash mode.
  • the cleaning path adjustment button 214 is selected when adjusting the cleaning path with coolant.
  • When the detail setting button 216 is selected, a maintenance button 220, an auxiliary line display button 222, and an image rotation operation section 224 are displayed as shown.
  • When the maintenance button 220 is selected by the operator, the maintenance mode is entered, and the baseline BL is displayed superimposed on the captured image P0. When the auxiliary line display button 222 is turned on, the auxiliary line AL is also displayed.
  • While looking at the reference screen 170, the operator adjusts the yaw angle and pitch angle of the camera 30 so that the center C of the captured image P0 overlaps the marker M as described above. After that, by touching the + button or - button of the image rotation operation section 224, the operator can adjust the rotation angle θ of the captured image P0.
  • The correction unit 158 rotates the captured image P0 clockwise by 0.1 degrees each time the + button is touched by the operator, and rotates it counterclockwise by 0.1 degrees each time the - button is touched. As shown in FIG. 11(B), when the auxiliary line display button 222 is turned off, the auxiliary line AL is hidden.
  • FIG. 12 is a flowchart showing the flow of correction processing. This process is executed when the operator selects the maintenance button 220 .
  • the display control unit 160 displays the captured image P0 on the reference screen 170 as a maintenance screen (S10), and displays the baseline BL so as to overlap it (S12).
  • If the auxiliary line display button 222 is on (Y of S14), the display control unit 160 displays the auxiliary line AL (S16). If the auxiliary line display button 222 is off (N of S14), the auxiliary line AL is hidden (S18).
  • When the operator operates the image rotation operation section 224, the display control unit 160 rotates the captured image P0 (S22).
  • When the operator selects the confirmation button (Y of S24), the correction unit 158 calculates the determination area JA (S26). That is, an area in which the image is reliably included in the grid area GA is set as the determination area JA according to the inclination of the captured image P0.
  • This determination area JA functions as an "AI inference area" in which chip determination is performed on a grid-by-grid basis.
  • Based on the determination area JA, the display control unit 160 extracts the extraction area P1 by cutting out the outer periphery from the captured image P0 (S28), adjusts (enlarges) the extraction area P1, and displays it on the reference screen 170 (S30).
  • the correction unit 158 stores this series of correction information as calibration data in the correction information storage unit 146 (S32).
  • This correction information includes the set angle (rotational angle ⁇ ) of the captured image P0, the setting of the determination area JA, the enlargement ratio (set magnification) of the extraction area P1, and the like. If the confirmation button is not selected (N of S24), the processing of S26-S32 is skipped.
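  • Purely as an illustration of what the stored correction information might look like in code, the following record groups the set angle, the determination area, and the set magnification; the field names and value layout are assumptions, not the patent's data format.

```python
from dataclasses import dataclass


@dataclass
class CalibrationData:
    """Hypothetical shape of the correction information saved in S32."""
    rotation_deg: float          # set angle of the captured image (rotation angle theta)
    ja_rows: tuple[int, int]     # first and last grid row of the determination area JA
    ja_cols: tuple[int, int]     # first and last grid column of the determination area JA
    magnification: float         # set magnification used to display the extraction region


# Example values consistent with the 3-degree embodiment (26 x 19 grids remaining).
calib = CalibrationData(rotation_deg=3.0, ja_rows=(1, 19), ja_cols=(1, 26), magnification=1.0769)
```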
  • When a predetermined maintenance end condition is satisfied, such as when the operator selects another button (Y of S34), the display control unit 160 terminates the display of the maintenance screen (S36). If the maintenance end condition is not satisfied (N of S34), the process returns to S14.
  • FIG. 13 is a flow chart schematically showing the flow of cleaning control.
  • the acquisition unit 150 acquires the captured image P0 (S40).
  • the correction unit 158 reads the correction information (calibration data) stored in the correction information storage unit 146 (S42), and internally reflects the correction process described above on the captured image P0.
  • The correction unit 158 rotates the captured image P0 by the set angle (S44), sets the determination region (S46), and extracts the extraction image (S48). Subsequently, the extracted image is enlarged by the set magnification (S50) and displayed on the screen as the captured image (S52). Then, when the operator gives an instruction to display the chip accumulation state (Y of S54), the display control unit 160 displays the grid image superimposed on the captured image (S56), and further displays the chip accumulation state (S58). This chip accumulation state is displayed by means such as color coding according to the class determined by the recognition unit 156.
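  • Putting the S44 to S52 steps together, a rough NumPy sketch of applying the stored calibration to each captured frame might look as follows; nearest-neighbour resampling is used only to keep the sketch self-contained and runnable, and it is not the patent's stated resampling method. The `CalibrationData` record is the hypothetical one sketched above.

```python
import numpy as np


def rotate_nearest(img: np.ndarray, deg: float) -> np.ndarray:
    """Rotate an image about its center (nearest-neighbour, clamped borders)."""
    h, w = img.shape[:2]
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    t = np.deg2rad(deg)
    ys, xs = np.indices((h, w))
    u = cx + (xs - cx) * np.cos(t) + (ys - cy) * np.sin(t)
    v = cy - (xs - cx) * np.sin(t) + (ys - cy) * np.cos(t)
    u = np.clip(np.rint(u).astype(int), 0, w - 1)
    v = np.clip(np.rint(v).astype(int), 0, h - 1)
    return img[v, u]


def resize_nearest(img: np.ndarray, out_h: int, out_w: int) -> np.ndarray:
    """Nearest-neighbour resize used here only as a stand-in for real scaling."""
    h, w = img.shape[:2]
    ys = (np.arange(out_h) * h) // out_h
    xs = (np.arange(out_w) * w) // out_w
    return img[ys][:, xs]


def apply_calibration(frame: np.ndarray, calib: "CalibrationData",
                      grid_px: int = 128) -> np.ndarray:
    """S44: rotate by the set angle; S46/S48: keep the determination area
    (whole grid rows/columns); S50: enlarge by the set magnification."""
    rotated = rotate_nearest(frame, calib.rotation_deg)
    r0, r1 = calib.ja_rows
    c0, c1 = calib.ja_cols
    cropped = rotated[r0 * grid_px:(r1 + 1) * grid_px,
                      c0 * grid_px:(c1 + 1) * grid_px]
    out_h = round(cropped.shape[0] * calib.magnification)
    out_w = round(cropped.shape[1] * calib.magnification)
    return resize_nearest(cropped, out_h, out_w)
```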
  • When the operator designates a coolant injection position on the displayed image, the detection unit 154 detects this.
  • the injection control unit 162 sets a coolant injection route based on the injection position (S62), and outputs an injection command to the machining control device 102 to inject the coolant based on the injection route (S64).
  • When a predetermined cleaning mode end condition is satisfied (Y of S66), the display control unit 160 terminates the display of the cleaning operation screen (S68). If the cleaning mode end condition is not satisfied (N of S66), the process returns to S40.
  • The machine tool 1 has been described above based on the embodiment.
  • According to the embodiment, the operator can finely adjust the angle (yaw angle, pitch angle) of the camera 30 while checking the captured image P0.
  • Although the camera 30 cannot adjust its rotation (roll angle) around the optical axis due to its structure, this can be compensated for by rotating the captured image P0. That is, according to the present embodiment, even if there are restrictions on the angle adjustment of the camera 30 in the machine tool 1, it is possible to match the imaging area displayed on the screen with the imaging area recognized by the software. Therefore, even when the operator gives an instruction to wash away chips with coolant based on the captured image displayed on the screen, the instructed cleaning position and the coolant injection position match. That is, it is possible to maintain high accuracy of the cleaning control.
  • In the embodiment, the extraction region P1 is extracted by cutting out the outer peripheral portion from the captured image P0 tilted by rotation, and is enlarged to a preset display size and displayed. Therefore, the captured image displayed on the screen can be matched to the screen while maintaining its rectangular frame shape. Consequently, the presence or absence of correction (that is, individual differences among installed cameras) does not significantly change the appearance of the captured image, and the corrected image does not make the operator feel uncomfortable.
  • the machine tool 1 is described as a multitasking machine, but it may be a turning center or a machining center. It may also be an additional processing machine that processes a material (for example, metal powder) while melting it with a laser. In that case, the material that scatters during processing becomes the "physical quantity" recognized by the substance recognition unit.
  • In the above embodiment, the coolant is exemplified as the fluid that is injected for chip removal.
  • Alternatively, a liquid (cleaning liquid) other than coolant, or a gas such as air, may be used.
  • In that case, a gas injection section for injecting the gas is provided instead of the liquid injection section.
  • In the above embodiment, the rotation angle θ of the captured image P0 was set to 3 degrees in the correction process (FIG. 8(B)).
  • FIG. 14 is a diagram showing correction processing when the rotation angle ⁇ of the captured image P0 is 3.3 degrees.
  • In this modification, the number of pixels of the captured image P0 is 3584 × 2688.
  • The number of pixels of the reference screen 170 (display screen) is 3337 × 2500, which is smaller than the number of pixels of the captured image P0.
  • When the rotation angle θ of the captured image P0 with respect to the reference screen 170 is 3.3 degrees, the entire captured image P0 cannot fit within the reference screen 170.
  • the peripheral portion of the captured image P0 is cut off, and a partial region of the captured image P0 that fits within the reference screen 170 as the maximum rectangle is extracted as an extraction region P1.
  • the extraction region P1 is a residual partial image obtained by cutting the outer peripheral portion of the captured image P0 after rotation.
  • In this case, the outline of the extraction area P1 overlaps the outline of the grid area GA. That is, of the grids displayed on the reference screen 170 and superimposed on the captured image P0, the grid area GA is the area made up only of grid sections that are entirely included, that is, the grid area is left in units of rows and columns of the grid arrangement. In this case, the number of grids is 26 × 19.
  • This grid area GA is the area in which predetermined determination processing (such as chip determination using the learning model) is performed in grid units. In this modification, this grid area GA is defined as the extraction area P1. Since the extraction area P1 is inside the reference screen 170, the number of pixels of the extraction area P1, 3328 × 2432, is smaller than the number of pixels of the reference screen 170.
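  • For the FIG. 14 numbers, a quick consistency check shows why 26 × 19 whole grid sections (3328 × 2432 pixels) are what fits inside the 3337 × 2500 pixel display; this floor-division view is only an illustration of the stated sizes and ignores the small additional effect of the 3.3 degree rotation.

```python
SCREEN_W, SCREEN_H, GRID = 3337, 2500, 128

cols = SCREEN_W // GRID   # 26 whole grid columns fit (27 * 128 = 3456 > 3337)
rows = SCREEN_H // GRID   # 19 whole grid rows fit   (20 * 128 = 2560 > 2500)
print(cols, rows, cols * GRID, rows * GRID)   # 26 19 3328 2432
```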
  • The extraction area P1, which has become smaller than the frame of the reference screen 170 by cutting the outer periphery of the captured image P0 in this way, is displayed so that the outline of the extraction area P1 expands toward the frame of the reference screen 170 (see FIG. 10).
  • In this modification, the display control unit 160 stops enlarging the extraction region P1 when the upper outline of the extraction region P1 reaches a position where it overlaps or touches the frame of the reference screen 170.
  • In other words, the extraction region P1 is enlarged so that one of the two pairs of parallel sides forming its outline is in contact with the frame of the reference screen 170 and the other is positioned inside the frame of the reference screen 170.
  • The extraction and enlargement of the extraction region P1 by cutting out the outer periphery of the captured image P0 may take the form of hiding the outer peripheral portion and then enlarging the remainder, or the form in which the outer peripheral portion is simply no longer displayed on the reference screen 170 as a result of enlarging the image.
  • the information processing device 100 is used as the internal computer of the machine tool 1, and an example is shown in which it is configured integrally with the operation panel 4, but it may be configured independently of the operation panel 4. In that case, the information processing apparatus may use the monitor of the control panel as a remote desktop to function as a display unit. Alternatively, the information processing device may be an external computer connected to the machine tool. In that case, the information processing device may be a general laptop PC (Personal Computer) or a tablet computer.
  • FIG. 15 is a diagram showing the configuration of an information processing system according to a modification.
  • Components similar to those of the above embodiment are given the same reference symbols, and redundant description is omitted.
  • the information processing device is installed outside the machine tool. That is, the information processing system includes a machine tool 301 , a data processing device 310 and a data storage device 312 .
  • the data processing device 310 functions as an "information processing device".
  • the machine tool 301 includes a machining control device 102 , a machining device 104 , a tool changer 106 , a tool storage 108 , a data processor 330 , an operation panel 304 and an imaging unit 110 .
  • Data processing unit 330 includes communication unit 332 , grid setting unit 152 , detection unit 154 , recognition unit 156 and injection control unit 162 .
  • Communication unit 332 has a receiving unit and a transmitting unit, and is in charge of communication with external devices including data processing device 310 and data storage device 312 .
  • the operation panel 304 includes a user interface processing section 130 and a data processing section 334 .
  • the data processing unit 334 outputs a control command to the processing control device 102 based on the operation input by the operator. Further, the screen displayed on the display unit 144 is controlled according to the operation input by the operator.
  • the data processing device 310 includes a communication section 320 , a correction section 158 and a display control section 160 .
  • the communication unit 320 has a receiving unit and a transmitting unit and takes charge of communication with the machine tool 301 .
  • Data storage device 312 includes correction information storage section 146 and reference information storage section 148 .
  • the data storage device 312 is wired to the data processing device 310 in this modification, but may be wirelessly connected. In other variations, data storage device 312 may be incorporated as part of data processing device 310 .
  • In this modification, the functions of the information processing apparatus 100 in the above embodiment are realized by being divided between the inside and the outside of the machine tool. With such a configuration, it is possible to obtain the same effects as those of the above-described embodiment.
  • the size and shape of the grid may be configured to be changeable as needed. Also in this case, it is preferable to obtain an extracted image by cutting out the outer peripheral portion of the captured image in units of grids.
  • a drive control section may be provided that controls the first rotation about the yaw axis and the second rotation about the pitch axis.
  • The drive control unit controls the first rotation and the second rotation so that the optical axis of the camera is positioned at a center setting position (marker M) preset in the imaging area, prior to the correction by the correction unit.
  • the configuration in which the camera 30 cannot rotate around the optical axis is exemplified as the restricted state around the optical axis of the camera 30 .
  • the camera 30 may be rotated around the optical axis by a predetermined amount, but the angle of rotation may be limited.
  • the camera 30 itself may have a rotatable structure, but its rotation may be restricted to avoid interference with surrounding structures.
  • the configuration in which the center C of the captured image P0 and the optical axis L of the camera 30 are aligned and the captured image P0 is rotated around the optical axis L is illustrated.
  • the captured image may be rotated independently of the optical axis.
  • the correction unit rotates the captured image so that the contour of the specific area is aligned with the baseline. Specifically, it is preferable to rotate around an orthogonal axis that is orthogonal to the plane of the captured image and passes through the center of the captured image.
  • FIG. 16 is a diagram showing an image processing method according to a modification.
  • For example, the captured image shown in the upper part of FIG. 16 can be corrected as shown in the lower part of FIG. 16. That is, the grid setting unit 152 performs processing (grid line division processing) to superimpose grid lines at preset positions on the captured image received from the imaging unit 110.
  • In this case, the overlapping relationship between the grid lines and the captured image P10 may be as shown in the upper diagram of FIG. 16.
  • In this case, the display control unit 160 can perform processing that ignores grid areas whose four sides do not overlap the captured image P10 and displays only the grid areas whose four sides overlap the captured image P10, in accordance with the size of the reference screen (lower part of FIG. 16).
  • Alternatively, the grid setting unit 152 may detect that there is a grid area whose four sides do not overlap the captured image P10 (or a grid area whose four sides do not appear), and perform a process of adjusting the grid by moving the entire mesh of grid lines up, down, left, or right so that the four sides overlap the captured image P10.
  • chips present in the imaging area were exemplified as an object to be imaged by the imaging unit, but there is no particular limitation as long as it is a physical quantity such as a material or contaminant generated along with processing.
  • In the above embodiment, a part of the captured image is extracted by cutting the outer periphery after the captured image is rotated, and the extraction area is displayed such that its outline expands toward the frame of the reference screen.
  • In a modification, a part of the captured image may be extracted by cutting off the outer periphery without rotating the captured image, and the extraction area may likewise be displayed so that its contour expands toward the frame of the reference screen.
  • In that case as well, the extracted area may be enlarged and displayed in the same manner so that the operator does not feel uncomfortable as a result of the clipping.
  • In the above embodiment, the cut outer peripheral portion includes grids in which the captured image occupies 50% or less of the grid section.
  • Since the size of the grid sections is constant and the determination target area is the same, the operator can operate while viewing the screen without feeling discomfort. Moreover, because the size of the grids is the same, the internal structure of the machine tool also appears at the same size, so the operator does not feel uncomfortable. When the system is used for the purpose of recognizing the accumulation state of chips, this effect becomes even more pronounced.

Abstract

An information processing device according to one embodiment of the present invention comprises: an acquisition unit that acquires a captured image of an image capture area, captured by an image capturing unit which is disposed on a machine tool; a grid setting unit that divides the captured image into a grid large enough to contain multiple pixels; and a display control unit that, taking the arranged rows and columns of the grid as units, causes a reference screen of a display unit to display an extraction region, which is a portion of the captured image that has been extracted by cutting out the outer peripheral portion of the captured image. The display control unit causes the display unit to display the extraction region, which is smaller than the frame of the reference screen as a result of cutting out the outer peripheral portion from the captured image, so that the contour of the extraction region is enlarged toward the frame of the reference screen.

Description

Information processing device and machine tool
The present invention relates to an information processing device that processes captured images used in machine tools.
Chips are generated when workpieces are processed by machine tools. If a large amount of chips accumulate in the processing chamber, it becomes difficult to continue processing. For this reason, conventionally, the operator has taken measures such as periodically stopping the operation of the machine tool and removing the chips using an air blower or the like. However, such manual chip removal reduces the operating efficiency of the machine tool.
Therefore, in recent years, a system has been proposed in which chips are automatically removed by installing a camera in the processing chamber and analyzing the captured image (see Patent Document 1). In this system, the scattering position of chips is specified based on the analysis result, and control (hereinafter also referred to as "cleaning control") is performed to target the scattering position and discharge coolant. A technology has also been proposed in which the chip scattering range is superimposed on a captured image displayed on a monitor screen, and the operator can specify the coolant emission range on the screen. The software for cleaning control is preset with a corresponding relationship between the coordinates of the area captured by the camera and the coordinates used for the coolant control command.
Japanese Patent No. 6887033
By the way, when such a system is newly introduced, it is also assumed that it will be retrofitted to an existing machine tool. In that case, there is a possibility that the captured image will not be displayed at the angle at which it should originally be seen, due to mounting errors or individual differences of the retrofitted camera. As a countermeasure, correcting the display angle of the captured image is conceivable, for example, but tilting the angle of view of the captured image may also give the operator a sense of incongruity.
An aspect of the present invention is an information processing device. This information processing apparatus includes an acquisition unit that acquires a captured image of an imaging area captured by an imaging unit installed in a machine tool, a grid setting unit that divides the captured image into grids each having a size including a plurality of pixels, and a display control unit that displays, on a reference screen of a display unit, an extraction region which is a part of the captured image, extracted by cutting out the outer peripheral portion of the captured image in units of rows and columns of the grid arrangement. The display control unit causes the display unit to display the extraction area, which is smaller than the frame of the reference screen by cutting the outer periphery from the captured image, so that the contour of the extraction area expands toward the frame of the reference screen.
Another aspect of the present invention is a machine tool. This machine tool includes an imaging unit that captures an image of an imaging area set in a processing chamber, a display unit that displays an image captured by the imaging unit, a grid setting unit that divides the captured image into grids each having a size including a plurality of pixels, and a display control unit that displays, on a reference screen of the display unit, an extraction region which is a part of the captured image extracted by cutting out the outer peripheral portion of the captured image in units of rows and columns of the grid arrangement. The display control unit causes the display unit to display the extraction area, which is smaller than the frame of the reference screen by cutting the outer periphery from the captured image, so that the contour of the extraction area expands toward the frame of the reference screen.
 本発明によれば、工作機械に設置される撮像部の撮像画像をオペレータに違和感を与えないように表示できる。 According to the present invention, it is possible to display the captured image of the imaging unit installed in the machine tool so that the operator does not feel uncomfortable.
実施形態に係る工作機械の外観を表す斜視図である。It is a perspective view showing the appearance of the machine tool according to the embodiment.
工作機械のハードウェア構成図である。It is a hardware configuration diagram of the machine tool.
加工室内の構成を表す斜視図である。It is a perspective view showing the configuration inside the processing chamber.
カメラの構造を模式的に示す図である。It is a diagram schematically showing the structure of the camera.
情報処理装置の機能ブロック図である。It is a functional block diagram of the information processing device.
認識部の構成を表すブロック図である。It is a block diagram showing the configuration of the recognition unit.
画面表示の補正方法を表す図である。It is a diagram showing a method of correcting the screen display.
画面表示の補正方法を表す図である。It is a diagram showing a method of correcting the screen display.
画面表示の補正方法を表す図である。It is a diagram showing a method of correcting the screen display.
画面表示の補正方法を表す図である。It is a diagram showing a method of correcting the screen display.
オペレータが操作する画面の一例を表す図である。It is a diagram showing an example of a screen operated by the operator.
補正処理の流れを表すフローチャートである。It is a flowchart showing the flow of the correction processing.
清掃制御の流れを概略的に表すフローチャートである。It is a flowchart schematically showing the flow of the cleaning control.
撮像画像の回転角θを3.3度としたときの補正処理を表す図である。It is a diagram showing the correction processing when the rotation angle θ of the captured image is set to 3.3 degrees.
変形例に係る情報処理システムの構成を表す図である。It is a diagram showing the configuration of an information processing system according to a modification.
変形例に係る画像処理方法を表す図である。It is a diagram showing an image processing method according to a modification.
 以下、図面を参照しつつ、本発明の一実施形態について説明する。
 図1は、実施形態に係る工作機械の外観を表す斜視図である。
 工作機械1は、工具を適宜交換しながらワークを所望の形状に加工する複合加工機として構成されている。工作機械1は、装置筐体の内部に加工室2が設けられる。加工室2には、ワークを加工する加工装置が設けられる。装置筐体の前面には、加工装置を操作するための操作盤4が設けられる。
An embodiment of the present invention will be described below with reference to the drawings.
FIG. 1 is a perspective view showing the appearance of the machine tool according to the embodiment.
The machine tool 1 is configured as a multitasking machine that machines a workpiece into a desired shape while exchanging tools as appropriate. The machine tool 1 has a processing chamber 2 inside its housing. The processing chamber 2 is provided with a processing device for machining the workpiece. An operation panel 4 for operating the processing device is provided on the front surface of the housing.
 図2は、工作機械1のハードウェア構成図である。
 工作機械1は、情報処理装置100、加工制御装置102、加工装置104、工具交換部106、工具格納部108および撮像部110を含む。加工制御装置102は、数値制御部として機能し、加工プログラム(NCプログラム)にしたがって加工装置104に制御信号を出力する。加工装置104は、加工制御装置102からの指示にしたがって工具主軸(図示略:以下、単に「主軸」という)を動かしてワークを加工する。
FIG. 2 is a hardware configuration diagram of the machine tool 1.
The machine tool 1 includes an information processing device 100 , a machining control device 102 , a machining device 104 , a tool changing section 106 , a tool storage section 108 and an imaging section 110 . The machining control device 102 functions as a numerical control unit and outputs control signals to the machining device 104 according to a machining program (NC program). The processing device 104 processes a workpiece by moving a tool spindle (not shown; hereinafter simply referred to as “spindle”) according to instructions from the machining control device 102 .
 加工装置104は、主軸を駆動する機構のほか、クーラントを貯留する液体貯留部112、およびクーラントを噴射する液体噴射部114を備える。クーラントは、加工時における工具およびワークの除熱や潤滑のための切削油として用いられるが、加工室2内に飛散した切屑を除去するための洗浄液としても用いられる。液体噴射部114は、液体噴射部114からクーラントをくみ上げるポンプと、クーラントを噴射するノズルと、ノズルを駆動するアクチュエータを備える。 The processing device 104 includes, in addition to the mechanism that drives the spindle, a liquid reservoir 112 that stores coolant and a liquid injection section 114 that injects the coolant. The coolant is used as cutting oil for removing heat from and lubricating the tool and workpiece during machining, and is also used as a cleaning liquid for removing chips scattered in the machining chamber 2. The liquid injection section 114 includes a pump that draws up the coolant from the liquid reservoir 112, a nozzle that injects the coolant, and an actuator that drives the nozzle.
 情報処理装置100は、操作盤4を含み、オペレータの操作入力に基づいて加工制御装置102に制御指令を出力する。情報処理装置100は、また、オペレータの操作入力に応じて操作盤4のモニタに表示される画面を制御する。工具格納部108は工具を格納する。工具交換部106は、いわゆるATC(Automatic Tool Changer)に対応し、加工制御装置102からの交換指示にしたがって、工具格納部108から工具を取り出し、主軸にある工具と取り出した工具とを交換する。 The information processing device 100 includes an operation panel 4, and outputs control commands to the processing control device 102 based on operator input. The information processing apparatus 100 also controls the screen displayed on the monitor of the operation panel 4 according to the operator's operation input. The tool storage section 108 stores tools. The tool changer 106 corresponds to a so-called ATC (Automatic Tool Changer), takes out a tool from the tool storage part 108 according to a change instruction from the machining control device 102, and replaces the tool on the spindle with the taken out tool.
 撮像部110は、例えば、CCDやCMOSなどの撮像素子を備えたカメラであり、加工室2内に設定された撮像エリアを撮像する。「撮像エリア」として、ワークの加工により発生する切屑の存在が想定される領域が予め設定される。加工室2内の広い範囲で切屑の分布や堆積状況が把握できるよう、カメラの画角が設定されている。撮像部110は、撮像した画像を情報処理装置100へ出力する。 The imaging unit 110 is, for example, a camera equipped with an imaging device such as a CCD or CMOS, and images an imaging area set in the processing chamber 2 . As the "imaging area", an area in which chips generated by machining the workpiece are assumed to exist is set in advance. The angle of view of the camera is set so that the distribution and accumulation of chips can be grasped over a wide range in the processing chamber 2 . The imaging unit 110 outputs the captured image to the information processing device 100 .
 図3は、加工室2内の構成を表す斜視図である。図3(A)は斜め上方からみた様子を示し、図3(B)は斜め下方からみた様子を示す。
 図3(A)に示すように、加工室2は、四つの側面に囲まれており、その一側面において主軸10が上下および左右に移動可能に設けられている。主軸10は、水平方向の回転軸を有し、先端に工具Tが同軸状に取り付けられる。主軸10と軸線方向に対向する側面は旋回扉12を有する。旋回扉12から水平に支持プレート14が延出している。旋回扉12は、鉛直方向の軸を中心に回転できる扉である。
FIG. 3 is a perspective view showing the configuration inside the processing chamber 2. FIG. 3(A) shows the chamber viewed obliquely from above, and FIG. 3(B) shows it viewed obliquely from below.
As shown in FIG. 3(A), the processing chamber 2 is surrounded by four side surfaces, and a main shaft 10 is provided on one side surface so as to be vertically and horizontally movable. The main shaft 10 has a horizontal rotating shaft, and a tool T is coaxially attached to its tip. A side surface facing the main shaft 10 in the axial direction has a revolving door 12 . A support plate 14 extends horizontally from the revolving door 12 . The revolving door 12 is a door that can rotate around a vertical axis.
 支持プレート14の下方にテーブル16が設けられている。テーブル16にはパレット18が着脱可能に取り付けられ、パレット18にワークが載置され固定される。ワークを固定したパレット18を複数用意しておくことで、パレット18の変更によりワークを変更でき、時間の効率化を図ることができる。 A table 16 is provided below the support plate 14. A pallet 18 is detachably attached to the table 16, and a workpiece is placed and fixed on the pallet 18. By preparing a plurality of pallets 18 with workpieces fixed to them, the workpiece can be changed simply by changing the pallet 18, which saves time.
 テーブル16は、主軸10の軸線方向に移動可能であり、また水平面内で回転できる。テーブル16を回転駆動することで、パレット18上のワークを回転させることができる。テーブル16を直線駆動することで、ワークが工具Tに近接又は離間する。すなわち、テーブル16の回転および移動と、主軸10の移動を制御することにより、ワークを所望に形状に加工できる。 The table 16 is movable in the axial direction of the main shaft 10 and rotatable in the horizontal plane. By rotating the table 16, the work on the pallet 18 can be rotated. The workpiece approaches or separates from the tool T by linearly driving the table 16 . That is, by controlling the rotation and movement of the table 16 and the movement of the spindle 10, the workpiece can be processed into a desired shape.
 テーブル16が主軸10から最も離間する位置において、支持プレート14がパレット18と嵌合する。この状態で旋回扉12を回転させることで、支持プレート14がパレット18をテーブル16から分離させ、パレット18と一体に回転する。それにより、ワークの加工が終了したパレット18を加工室2から搬出するとともに、次に加工するワークが固定されたパレット18を加工室2に搬入できる。 The support plate 14 is fitted with the pallet 18 at the position where the table 16 is farthest from the spindle 10 . By rotating the revolving door 12 in this state, the support plate 14 separates the pallet 18 from the table 16 and rotates together with the pallet 18 . As a result, the pallet 18 on which the work has been processed can be carried out from the processing chamber 2 and the pallet 18 to which the work to be processed next is fixed can be carried into the processing chamber 2 .
 テーブル16および主軸10の下方には、切屑を加工室2の外に搬送するためのチップコンベア20が設けられている。テーブル16は、チップコンベア20の上方を移動する。テーブル16の下方にはシュータ22が設けられている。シュータ22は、洗浄によって上方から流れてくる切屑をチップコンベア20上に導く。 A chip conveyor 20 for conveying chips out of the machining chamber 2 is provided below the table 16 and the spindle 10. The table 16 moves above the chip conveyor 20. A chute 22 is provided below the table 16. The chute 22 guides chips washed down from above onto the chip conveyor 20.
 加工室2においてテーブル16の両サイドに位置する底面は斜面24となっており、加工中に飛散した切屑がシュータ22へ流れやすくなるよう、シュータ22へ向けて下向きに傾斜している。 In the machining chamber 2, the bottom surfaces located on both sides of the table 16 are slopes 24, which are inclined downward toward the chute 22 so that chips scattered during machining flow easily toward the chute 22.
 図3(B)に示すように、加工室2の上部はカバー26(天井)により遮蔽されている。カバー26には、クーラントを供給するための複数のノズル28が設置される。本実施形態では、ノズル28がカバー26ごと交換可能な構造とされている。ノズル28は、液体噴射部114を構成し、図示しない配管、バルブおよびポンプ等を介して液体貯留部112に接続される(図2参照)。ノズル28は、三次元的に回転可能に構成されている。ノズル28を回転させることで、クーラントの噴射方向を制御できる。ノズル28の向きを特定してポンプを駆動することにより、加工室2内の目標に向けてクーラントを噴射できる。ワークの加工により生じた切屑はクーラントにより洗い流され、チップコンベア20により加工室2の外に搬出される。本実施形態では2つのノズル28を設置しているが、その数については適宜設定できる。 As shown in FIG. 3(B), the upper part of the processing chamber 2 is shielded by a cover 26 (ceiling). A cover 26 is provided with a plurality of nozzles 28 for supplying coolant. In this embodiment, the nozzle 28 is configured to be replaceable together with the cover 26 . The nozzle 28 constitutes a liquid injection section 114 and is connected to the liquid storage section 112 via piping, valves, pumps and the like (not shown) (see FIG. 2). The nozzle 28 is configured to be three-dimensionally rotatable. By rotating the nozzle 28, the injection direction of the coolant can be controlled. By specifying the direction of the nozzle 28 and driving the pump, the coolant can be injected toward the target in the processing chamber 2 . Chips generated by machining the workpiece are washed away by the coolant and carried out of the machining chamber 2 by the chip conveyor 20 . Although two nozzles 28 are installed in this embodiment, the number can be set as appropriate.
 加工室2の上部にはまた、加工室2内を上方から撮像する複数のカメラ30が設置されている。本実施形態では、加工室2におけるカバー26のやや下方の側壁に2つのカメラ30が取り付けられる。本実施形態では、カメラ30は配線等の都合によりカバー26と一体にできない構造とされている。カメラ30は、撮像部110を構成し、工具Tによるワークの加工状況を撮像するとともに、加工により生じた切屑を撮像する(図2参照)。カメラ30が2つ設けられることで、一方のカメラ30では撮像できない領域を他方のカメラ30で撮像できる。撮像部110は、撮像した画像を情報処理装置100へ出力する。 A plurality of cameras 30 for capturing images of the inside of the processing chamber 2 from above are also installed in the upper part of the processing chamber 2 . In this embodiment, two cameras 30 are attached to the side wall slightly below the cover 26 in the processing chamber 2 . In this embodiment, the camera 30 cannot be integrated with the cover 26 due to wiring reasons. The camera 30 constitutes an image capturing unit 110, and captures an image of the machining status of the workpiece by the tool T, and an image of chips generated by the machining (see FIG. 2). Since two cameras 30 are provided, an area that cannot be captured by one camera 30 can be captured by the other camera 30 . The imaging unit 110 outputs the captured image to the information processing device 100 .
 図4は、カメラ30の構造を模式的に示す図である。
 本実施形態のカメラ30は、その光軸L周りの回転が規制された構造を有する。すなわち、カメラ30は、光軸Lに直交するヨー軸L1周りと、光軸Lおよびヨー軸L1に直交するピッチ軸L2周りには回動自在であるが、光軸L周りには回動できない。このため、カメラ30の取り付け誤差などにより、画面に表示される撮像エリアの座標と、ソフトウェアが認識する撮像エリアの座標とがずれた場合、手作業での調整は困難となる。そこで、情報処理装置100は、作業者によるカメラの角度調整に際し、それらの座標のずれを解消するための補正処理を実行する。
FIG. 4 is a diagram schematically showing the structure of the camera 30.
The camera 30 of this embodiment has a structure in which rotation about its optical axis L is restricted. That is, the camera 30 is rotatable around the yaw axis L1 orthogonal to the optical axis L and around the pitch axis L2 orthogonal to the optical axis L and the yaw axis L1, but cannot be rotated around the optical axis L. Therefore, if the coordinates of the imaging area displayed on the screen deviate from the coordinates of the imaging area recognized by the software due to an installation error of the camera 30 or the like, manual adjustment becomes difficult. For this reason, the information processing apparatus 100 executes correction processing for eliminating this coordinate deviation in connection with the operator's adjustment of the camera angle.
 図5は、情報処理装置100の機能ブロック図である。
 情報処理装置100の各構成要素は、CPU(Central Processing Unit)および各種コンピュータプロセッサなどの演算器、メモリやストレージといった記憶装置、それらを連結する有線または無線の通信線を含むハードウェアと、記憶装置に格納され、演算器に処理命令を供給するソフトウェアによって実現される。コンピュータプログラムは、デバイスドライバ、オペレーティングシステム、それらの上位層に位置する各種アプリケーションプログラム、また、これらのプログラムに共通機能を提供するライブラリによって構成されてもよい。以下に説明する各ブロックは、ハードウェア単位の構成ではなく、機能単位のブロックを示している。
FIG. 5 is a functional block diagram of the information processing device 100.
Each component of the information processing apparatus 100 is realized by hardware, including arithmetic units such as a CPU (Central Processing Unit) and various computer processors, storage devices such as memory and storage, and wired or wireless communication lines connecting them, and by software that is stored in the storage devices and supplies processing instructions to the arithmetic units. A computer program may consist of device drivers, an operating system, various application programs located in layers above them, and libraries that provide common functions to these programs. Each block described below represents a functional block rather than a hardware configuration.
 情報処理装置100は、ユーザインタフェース処理部130、データ処理部132およびデータ格納部134を含む。ユーザインタフェース処理部130は、オペレータからの操作入力を受け付けるほか、画像表示など、ユーザインタフェースに関する処理を担当する。データ処理部132は、ユーザインタフェース処理部130により取得されたデータおよびデータ格納部134に格納されているデータに基づいて各種処理を実行する。データ処理部132は、ユーザインタフェース処理部130およびデータ格納部134のインタフェースとしても機能する。データ格納部134は、各種プログラムと設定データを格納する。 The information processing device 100 includes a user interface processing unit 130 , a data processing unit 132 and a data storage unit 134 . The user interface processing unit 130 receives an operation input from an operator, and is in charge of user interface processing such as image display. The data processing unit 132 executes various processes based on data acquired by the user interface processing unit 130 and data stored in the data storage unit 134 . Data processing unit 132 also functions as an interface for user interface processing unit 130 and data storage unit 134 . The data storage unit 134 stores various programs and setting data.
 ユーザインタフェース処理部130は、入力部140および出力部142を含む。入力部140は、操作盤4におけるモニタのタッチパネル等を介してオペレータの操作入力を受け付ける。出力部142は、モニタの画面に画像等を表示する表示部144を含む。出力部142は、その表示によりオペレータに各種情報を提供する。表示部144は、撮像部110による撮像画像を表示する。 The user interface processing section 130 includes an input section 140 and an output section 142 . The input unit 140 receives an operation input from an operator via the touch panel of the monitor on the operation panel 4 or the like. The output unit 142 includes a display unit 144 that displays an image or the like on the screen of the monitor. The output unit 142 provides various types of information to the operator through its display. A display unit 144 displays an image captured by the imaging unit 110 .
 データ処理部132は、取得部150、グリッド設定部152、検知部154、認識部156、補正部158、表示制御部160および噴射制御部162を含む。
 取得部150は、撮像部110により撮像された画像を取得する。グリッド設定部152は、撮像エリアにおける所定の物理量(切屑)の存在を判定するために撮像画像を複数のグリッドに区分(分割)する。グリッドは、特定形状(本実施形態では正方形であるが、幾何学的形状であればよい)を有する(詳細後述)。以下、撮像画像において複数のグリッドにより区分される領域を「グリッド領域」ともいう。また、複数のグリッドにより構成される画像を「グリッド画像」ともいう。
Data processing unit 132 includes acquisition unit 150 , grid setting unit 152 , detection unit 154 , recognition unit 156 , correction unit 158 , display control unit 160 and ejection control unit 162 .
The acquisition unit 150 acquires the image captured by the imaging unit 110. The grid setting unit 152 divides (partitions) the captured image into a plurality of grids in order to determine the presence of a predetermined physical quantity (chips) in the imaging area. Each grid has a specific shape (a square in this embodiment, but any geometric shape may be used), as described later in detail. Hereinafter, the area of the captured image divided by the plurality of grids is also referred to as the "grid area", and an image composed of the plurality of grids is also referred to as a "grid image".
 検知部154は、入力部140を介したオペレータの操作入力を検知する。詳細については後述するが、オペレータは、表示部144に表示された撮像画像を参照することで、加工室2における切屑の堆積状況を把握できる。オペレータは、そのうえでタッチパネルを介してその撮像画像の領域を指定することで、クーラントによる清掃範囲(噴射範囲)を指示できる。検知部154は、このオペレータによる指示入力を撮像画像における指示位置として検知する。検知部154は、グリッド設定部152によって作成されたグリッド領域に基づいた指示位置を検知してもよい。 The detection unit 154 detects an operator's operation input via the input unit 140 . Although the details will be described later, the operator can grasp the accumulation state of chips in the processing chamber 2 by referring to the captured image displayed on the display unit 144 . The operator can specify the cleaning range (spray range) of the coolant by specifying the area of the captured image via the touch panel. The detection unit 154 detects the instruction input by the operator as the indicated position in the captured image. The detection unit 154 may detect the pointing position based on the grid area created by the grid setting unit 152 .
 認識部156は、撮像画像に設定されたグリッド領域に基づいて自動的に切屑を認識し、そのグリッド領域において切屑が存在しているか否かを判定し、また、存在する切屑の量を判定する。これらの判定を「切屑判定」ともいう。認識部156は、グリッド領域に切屑があると判定した場合、グリッド領域に対応する撮像画像上の位置を切屑の堆積位置として認識する。切屑の堆積位置を認識すると、自動検知信号を噴射制御部162へ出力する。自動検知信号は、撮像画像において切屑が堆積していると認識された所定位置に関する情報を少なくとも含む。 The recognition unit 156 automatically recognizes chips based on the grid areas set in the captured image, determines whether chips exist in each grid area, and determines the amount of chips present. These determinations are also referred to as "chip determination". When the recognition unit 156 determines that chips are present in a grid area, it recognizes the position on the captured image corresponding to that grid area as a chip accumulation position. When a chip accumulation position is recognized, the recognition unit 156 outputs an automatic detection signal to the injection control unit 162. The automatic detection signal includes at least information on the predetermined position in the captured image at which chips are recognized as having accumulated.
 補正部158は、清掃制御の実行に先立って、オペレータの操作に基づき、カメラ30による撮像エリアの座標と、クーラントの制御指令に用いる座標とを整合させるための補正処理を実行する。この補正処理は、カメラ30の取り付け誤差があることを前提とするものであるため、加工室2にカメラ30を設置した際に行われる。その詳細については後述する。 Prior to execution of cleaning control, the correction unit 158 executes correction processing for matching the coordinates of the imaging area by the camera 30 with the coordinates used for the coolant control command based on the operator's operation. Since this correction process is based on the premise that there is an installation error in the camera 30 , it is performed when the camera 30 is installed in the processing chamber 2 . The details will be described later.
 表示制御部160は、カメラ30による撮像画像を表示部144に表示させる。補正部158による補正がなされる場合、表示制御部160は、その補正のための操作画面を表示させ、また補正後の撮像画像を表示させる。この補正に関わる表示画面の例については後に詳述する。 The display control unit 160 causes the display unit 144 to display the image captured by the camera 30 . When correction is performed by the correction unit 158, the display control unit 160 displays an operation screen for the correction and displays the captured image after correction. An example of a display screen related to this correction will be described in detail later.
 噴射制御部162は、清掃制御がなされる場合に加工制御装置102に対し、目標位置に向けたクーラントの噴射指令を出力する。この噴射指令には、クーラントを噴射する位置を特定する情報(噴射経路を特定する情報など)が含まれる。加工制御装置102は、この噴射指令を受けて液体噴射部114を駆動し、クーラントの噴射を制御する。 The injection control unit 162 outputs a coolant injection command toward the target position to the machining control device 102 when cleaning control is performed. This injection command includes information specifying the position to inject the coolant (information specifying the injection route, etc.). The machining control device 102 receives this injection command, drives the liquid injection unit 114, and controls injection of the coolant.
 この清掃制御は、クーラントにより加工室2内の切屑を洗い流すもの(つまり加工室2内を洗浄するもの)であり、本実施形態では、自動洗浄モードと手動洗浄モードが選択可能である。噴射制御部162は、自動洗浄モードにおいては、認識部156が出力した自動検知信号に基づいてクーラントを噴射する位置を自動的に設定し、噴射指令を出力する。また、手動洗浄モードにおいては、検知部154が検知したオペレータの領域指定に基づいてクーラントを噴射する位置を設定し、噴射指令を出力する。 This cleaning control is to wash away chips in the processing chamber 2 with coolant (that is, to wash the inside of the processing chamber 2), and in this embodiment, an automatic cleaning mode or a manual cleaning mode can be selected. In the automatic cleaning mode, the injection control unit 162 automatically sets the coolant injection position based on the automatic detection signal output by the recognition unit 156, and outputs an injection command. Further, in the manual cleaning mode, the position for injecting the coolant is set based on the operator's area specification detected by the detection unit 154, and an injection command is output.
 データ格納部134は、上述した補正処理を実行するための補正処理プログラム、清掃制御を実行するための清掃制御プログラムを含む各種プログラムのほか、これらの処理に必要な各種データを格納する。データ格納部134は、補正情報格納部146および基準情報格納部148を含む。補正情報格納部146は、補正処理に際して取得部150で取得された撮像画像、グリッド設定部152で作成されたグリッド領域(グリッド画像)、認識部156で切屑が存在していると認識された所定位置の情報および切屑の量に関する情報、検知部154で検知された位置情報などを格納する。補正情報格納部146は、また、後述する補正処理にて取得した補正情報(キャリブレーションデータ)を格納する。基準情報格納部148は、補正処理においてオペレータの操作入力をガイドするための画像を格納する。データ格納部134は、また、演算処理が行われる際のワークエリアとしても機能する。 The data storage unit 134 stores various programs, including a correction processing program for executing the correction processing described above and a cleaning control program for executing the cleaning control, as well as various data necessary for these processes. The data storage unit 134 includes a correction information storage unit 146 and a reference information storage unit 148. The correction information storage unit 146 stores the captured image acquired by the acquisition unit 150 during the correction processing, the grid area (grid image) created by the grid setting unit 152, information on the predetermined positions at which the recognition unit 156 recognized the presence of chips and on the amount of those chips, position information detected by the detection unit 154, and the like. The correction information storage unit 146 also stores correction information (calibration data) acquired in the correction processing described later. The reference information storage unit 148 stores images for guiding the operator's operation input in the correction processing. The data storage unit 134 also functions as a work area when arithmetic processing is performed.
 ここで、認識部156による切屑認識の概要について説明する。
 図6は、認識部156の構成を表すブロック図である。
 認識部156は、モデル学習部41、算出部43および判定部44を備える。
Here, an overview of chip recognition by the recognition unit 156 will be described.
FIG. 6 is a block diagram showing the configuration of the recognition unit 156.
The recognition unit 156 includes a model learning unit 41 , a calculation unit 43 and a determination unit 44 .
 モデル学習部41は、学習モデルを作成する。学習モデルは、入力としてグリッド設定部152で作成されたグリッド領域の一つを入力すると、当該グリッド領域内の切屑に関する所定の項目のうちどれに該当するか、その確率を算出して出力できるモデルである。この学習モデルは、例えば対となる入力データと出力データを教師データとし、予めCNN(畳み込みニューラルネットワーク)に入力して学習させることで作成できる。本実施形態においては、入力データにグリッド領域を、出力データに当該グリッド領域における切屑の有無および量に関する情報を用いることができる。 The model learning unit 41 creates a learning model. When one of the grid areas created by the grid setting unit 152 is given as an input, the learning model calculates and outputs the probability that the grid area falls under each of predetermined items related to chips. This learning model can be created, for example, by using pairs of input data and output data as training data and feeding them into a CNN (convolutional neural network) in advance for learning. In this embodiment, a grid area can be used as the input data, and information on the presence and amount of chips in that grid area can be used as the output data.
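As an illustration of the kind of learning model described here, the following is a minimal sketch of a three-class grid classifier, assuming PyTorch and 128×128-pixel RGB grid crops (the grid size given later in this document); the network structure, layer sizes, and names are illustrative assumptions, not the patent's actual model.

```python
# Minimal sketch of a per-grid chip classifier (assumption: PyTorch, 128x128 RGB grid crops).
import torch
import torch.nn as nn

class GridChipClassifier(nn.Module):
    """Outputs scores for class 0 (no chips), class 1 (few chips), class 2 (many chips)."""
    def __init__(self, num_classes: int = 3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),   # 128 -> 64
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),  # 64 -> 32
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),  # 32 -> 16
        )
        self.classifier = nn.Linear(64 * 16 * 16, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, 3, 128, 128) grid crops; returns raw class scores (logits).
        h = self.features(x)
        return self.classifier(h.flatten(start_dim=1))
```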
 データ格納部134はモデル記憶部42を含む。モデル記憶部42は、切屑の有無を自動的に判定する学習モデルを記憶する。学習モデルは、必要に応じて算出部43に読み込まれる。 The data storage unit 134 includes a model storage unit 42. The model storage unit 42 stores learning models for automatically determining the presence or absence of chips. The learning model is read into the calculator 43 as needed.
 算出部43は、撮像画像に基づき、グリッド単位で切屑に関する所定項目の該当確率を算出する。具体的には、算出部43は、モデル学習部41で学習した学習モデルを用いて、グリッド領域について「切屑が多い(クラス2)」、「切屑が少ない(クラス1)」、「切屑がない(クラス0)」という3項目のどれに該当するかに関する確率を算出する。これらのクラスは、撮像エリアで認識された物質(切屑等)の存在度合いを示す。 Based on the captured image, the calculation unit 43 calculates, on a grid-by-grid basis, the probability that each grid falls under predetermined items related to chips. Specifically, using the learning model trained by the model learning unit 41, the calculation unit 43 calculates the probability that a grid area corresponds to each of three items: "many chips (class 2)", "few chips (class 1)", and "no chips (class 0)". These classes indicate the degree of presence of the substance (chips, etc.) recognized in the imaging area.
 判定部44は、入力されたグリッド領域に関して算出部43が算出した確率から、当該グリッド領域の切屑がクラス0~2のどれに該当するかを判定する。判定部44は、グリッド領域に切屑があると判定(つまり、クラス2またはクラス1であると判定)した場合、グリッド画像における当該グリッド領域の位置に対応した撮像画像上の位置情報を含む自動検知信号を表示制御部160および噴射制御部162に出力する。 From the probabilities calculated by the calculation unit 43 for the input grid area, the determination unit 44 determines to which of classes 0 to 2 the chips in that grid area correspond. When the determination unit 44 determines that chips are present in the grid area (that is, determines class 2 or class 1), it outputs an automatic detection signal, including position information on the captured image corresponding to the position of that grid area in the grid image, to the display control unit 160 and the injection control unit 162.
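A sketch of how the calculation and determination above might look per grid follows, reusing the hypothetical GridChipClassifier sketched above: softmax turns the raw scores into the class probabilities, argmax selects the class, and any grid judged class 1 or 2 contributes its position to the detection result.

```python
import torch

def judge_grids(model: torch.nn.Module, grid_batch: torch.Tensor):
    """grid_batch: (N, 3, 128, 128) tensor of grid crops, ordered row-major over the grid area.

    Returns a list of (grid_index, class_id) for grids judged to contain chips (class 1 or 2).
    """
    model.eval()
    with torch.no_grad():
        probs = torch.softmax(model(grid_batch), dim=1)   # per-grid probabilities of classes 0..2
        classes = probs.argmax(dim=1)                      # chip determination per grid
    return [(i, int(c)) for i, c in enumerate(classes) if int(c) >= 1]
```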
 このようにして、認識部156は、ワークの加工中、または加工終了後、撮像部110で撮像された撮像画像に基づいて自動的に切屑を認識し、クーラントを噴射する自動洗浄を行うことができる。当該自動洗浄は、定期的に実行されてもよいし、作業者による指示など何らかの指示を与えることで実行されてもよい。 In this manner, the recognition unit 156 automatically recognizes chips based on the captured image taken by the imaging unit 110 during or after machining of the workpiece, so that automatic cleaning by injecting coolant can be performed. The automatic cleaning may be executed periodically, or may be executed in response to some instruction, such as an instruction from the operator.
 次に、本実施形態における補正方法について詳細に説明する。
 既に述べたように、上述した清掃制御システムを導入する場合、画面に表示される撮像エリアの座標と、ソフトウェアが認識する撮像エリアの座標とがずれる可能性がある。このため、本実施形態では、工作機械1の使用に先立って作業者(オペレータ)がカメラ30の角度を調整するなど、そのずれを解消する作業を行う。なお、説明の便宜上、両座標が一致した状態を「画面の整合状態」とも表現する。
Next, the correction method in this embodiment will be described in detail.
As already described, when the above-described cleaning control system is introduced, there is a possibility that the coordinates of the imaging area displayed on the screen and the coordinates of the imaging area recognized by the software may deviate. Therefore, in the present embodiment, prior to using the machine tool 1, the worker (operator) adjusts the angle of the camera 30 to eliminate the deviation. For convenience of explanation, the state in which the two coordinates match is also expressed as "screen matching state".
 図4に関連して説明したように、本実施形態のカメラ30は、光軸周りの回転が規制されているため、その調整には限界がある。このため、そのままではクーラントの噴射制御の目標に設定誤差を生じさせる可能性がある。そこで本実施形態では、補正部158がカメラ30による撮像画像を回転させることで目標の設定誤差を補正する。以下、その詳細について説明する。 As described with reference to FIG. 4, the camera 30 of the present embodiment is restricted in rotation around the optical axis, so there is a limit to its adjustment. Therefore, if left as it is, there is a possibility of causing a setting error in the target of the coolant injection control. Therefore, in the present embodiment, the correction unit 158 rotates the image captured by the camera 30 to correct the target setting error. The details will be described below.
 図7~図10は、画面表示の補正方法を表す図である。
 なお、本実施形態ではカメラ30が複数設けられるため、それぞれのカメラ30について補正が行われるが、便宜上、その一つについての補正を例に説明する。他のカメラ30については同様であるので、説明を省略する。
FIGS. 7 to 10 are diagrams showing the method of correcting the screen display.
In this embodiment, since a plurality of cameras 30 are provided, the correction is performed for each camera 30; for convenience, the correction for one of them will be described as an example. The other cameras 30 are treated in the same way, so their description is omitted.
 カメラ30は、予め定める撮像エリアが画角に入るように加工室2に設置される。撮像エリアは、切屑の飛散および堆積が予測されるエリアを含む。ここでは、加工室2にカメラ30が設置されたときに、表示部144の画面(参照画面170)に図7に示す撮像画像P0が表示されたとする。オペレータの操作入力により、画面表示を補正するためのメンテナンスモード(後述)に移行されると、表示制御部160は、撮像画像P0に重ねてベースラインBL(太線)および補助線AL(一点鎖線)を表示させる。 The camera 30 is installed in the processing chamber 2 so that a predetermined imaging area falls within its angle of view. The imaging area includes areas where chips are expected to scatter and accumulate. Here, it is assumed that the captured image P0 shown in FIG. 7 is displayed on the screen of the display unit 144 (reference screen 170) when the camera 30 is installed in the processing chamber 2. When a maintenance mode (described later) for correcting the screen display is entered in response to the operator's operation input, the display control unit 160 displays the baseline BL (thick line) and the auxiliary line AL (dash-dotted line) superimposed on the captured image P0.
 ベースラインBLは、画面が整合状態にあるときに加工室2における特定エリアSAの輪郭に沿うよう設定されたラインであり、基準情報格納部148に予め記憶されている。本実施形態では、斜面24およびパレット18のエッジによる境界線で囲まれる領域を特定エリアSAとしている。補助線ALは、本実施形態では撮像画像P0の中心Cを通る十字補助線である。加工室2には画面が整合状態にあるときにその中心Cが重なるマーカMが付されている。マーカMは、撮像エリアに予め設定された中心設定位置に付される。 The baseline BL is a line set so as to follow the contour of a specific area SA in the processing chamber 2 when the screen is in the aligned state, and is stored in advance in the reference information storage unit 148. In this embodiment, the area surrounded by the boundary lines formed by the slopes 24 and the edges of the pallet 18 is used as the specific area SA. The auxiliary line AL is, in this embodiment, a cross-shaped auxiliary line passing through the center C of the captured image P0. A marker M, on which the center C lies when the screen is in the aligned state, is provided in the processing chamber 2. The marker M is placed at a center setting position set in advance in the imaging area.
 図示の状態では、撮像画像P0の中心CがマーカMからずれている。また、特定エリアSAの輪郭がベースラインBLからずれている。このため、画面は整合状態にない。そこで、オペレータは、まず手作業でカメラ30の角度を可能な範囲で調整し、画面の整合状態に近づける。 In the illustrated state, the center C of the captured image P0 is shifted from the marker M. Also, the outline of the specific area SA is shifted from the baseline BL. Therefore, the screen is not in alignment. Therefore, the operator first manually adjusts the angle of the camera 30 within a possible range to bring the screens closer to matching.
 すなわち、オペレータがカメラ30をヨー軸L1周りおよびピッチ軸L2周りに回動させることで、図8(A)に示すように、撮像画像P0の中心CがマーカMに重なるようにする。ここまでは手作業で容易に行える。しかし、図示の例ではこの段階でも特定エリアSAの輪郭がベースラインBLからずれている。このため、カメラ30を光軸L周りに回動させることができればずれは解消できるが、本実施形態では上述のように、カメラ30の光軸L周りの回動が規制されている。 That is, the operator rotates the camera 30 around the yaw axis L1 and the pitch axis L2 so that the center C of the captured image P0 overlaps the marker M as shown in FIG. 8(A). Up to this point, it can be easily performed manually. However, in the illustrated example, the outline of the specific area SA is shifted from the baseline BL even at this stage. Therefore, if the camera 30 can be rotated around the optical axis L, the deviation can be eliminated. However, in this embodiment, the rotation of the camera 30 around the optical axis L is restricted as described above.
 そこで、図8(B)に示すように、補正部158が特定エリアSAの輪郭をベースラインBLに合わせるよう、撮像画像P0を中心Cの周りに(つまりカメラ30の光軸Lの回転方向に)回転させる。それにより、画面が整合状態となるようにし、クーラントの噴射制御における目標の設定誤差を補正する。ただし、このとき参照画面170に対して撮像画像P0が傾く結果、画面表示に違和感を生じさせる可能性がある。 Therefore, as shown in FIG. 8(B), the correction unit 158 rotates the captured image P0 around the center C (that is, in the direction of rotation about the optical axis L of the camera 30) so that the contour of the specific area SA is aligned with the baseline BL. This brings the screen into the aligned state and corrects the target setting error in the coolant injection control. However, the captured image P0 is then tilted with respect to the reference screen 170, which may make the screen display look unnatural.
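As an illustration of rotating the captured image P0 about its center C by the correction angle, a minimal sketch using OpenCV is shown below; the sign convention of the angle and the interpolation settings are assumptions.

```python
import cv2
import numpy as np

def rotate_about_center(image: np.ndarray, theta_deg: float) -> np.ndarray:
    """Rotate the captured image by theta_deg about its center, keeping the original canvas size."""
    h, w = image.shape[:2]
    center = (w / 2.0, h / 2.0)
    rot = cv2.getRotationMatrix2D(center, theta_deg, 1.0)   # rotation only, no scaling
    return cv2.warpAffine(image, rot, (w, h), flags=cv2.INTER_LINEAR)
```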
 図9(A)に示すように、グリッド領域GAは、カメラ30の取り付け誤差がない理想的な状態を基準に設定されている。この理想的な状態において、撮像画像P0(点線参照)の縦横比(画面比率)は4:3、グリッド数は28×21とされている。より詳細には、縦方向に平行に配列された複数のグリッド線と、横方向に平行に配列された複数のグリッド線との交点により囲まれるそれぞれの領域がグリッドの一区画となる。各グリッド線は、参照画面170(表示画面)の枠と平行に表示される。また、参照画面170(表示画面)の枠とグリッド線とが重なるように区画してもよいが、図9(A)のように、参照画面170の枠の内側で枠に隣接する位置にグリッド線が表示される形態でもよい。参照画面170の枠の縦横比も4:3であるため、この理想的な状態においては、撮像画像P0の外枠が参照画面170の枠に接することとなる。 As shown in FIG. 9(A), the grid area GA is set with reference to an ideal state in which there is no installation error of the camera 30. In this ideal state, the aspect ratio (screen ratio) of the captured image P0 (see the dotted line) is 4:3 and the number of grids is 28×21. More specifically, each area surrounded by the intersections of a plurality of grid lines arranged in parallel in the vertical direction and a plurality of grid lines arranged in parallel in the horizontal direction forms one grid cell. Each grid line is displayed parallel to the frame of the reference screen 170 (display screen). The grid may be laid out so that the grid lines coincide with the frame of the reference screen 170; alternatively, as shown in FIG. 9(A), the grid lines may be displayed inside the frame of the reference screen 170 at positions adjacent to the frame. Since the aspect ratio of the frame of the reference screen 170 is also 4:3, the outer frame of the captured image P0 contacts the frame of the reference screen 170 in this ideal state.
 本実施形態の撮像画像P0は、3584×2688の画素数を有する画像である。それを視覚的に理解しやすい正方形のグリッドに区画を行う。そうすると、本実施形態では、28×21個のグリッドに区画することができ、1つのグリッドあたり128×128の画素が存在することになる。
 また、参照画面170における各グリッドには、撮像画像P0を構成する複数の画素が含まれる。上記理想的な状態においては、グリッド領域GAの大きさと撮像画像P0の大きさとが同じである。そのため、グリッド領域GAの画素数と撮像画像P0の画素数とが一致し、グリッド領域GAには3584×2688の画素が含まれる。
The captured image P0 of this embodiment has 3584×2688 pixels. It is partitioned into square grids that are easy to grasp visually; in this embodiment, it can be divided into 28×21 grids, each containing 128×128 pixels.
Each grid on the reference screen 170 includes a plurality of pixels forming the captured image P0. In the above ideal state, the size of the grid area GA and the size of the captured image P0 are the same. Therefore, the number of pixels of the grid area GA and the number of pixels of the captured image P0 match, and the grid area GA includes 3584×2688 pixels.
 本実施形態において、参照画面170と撮像画像P0はともに矩形状(長方形状)をなし、その縦横比は4:3とされている。グリッド領域GAは、参照画面170の枠の内側にグリッドの端部が接するように位置する。なお、1つのグリッドあたりの画素数については、例えば50×50~250×250の範囲で適宜設定できる。1つのグリッドは、堆積する切りくずが画像解析で判別できる程度の大きさに設定することが好ましい。 In this embodiment, both the reference screen 170 and the captured image P0 are rectangular (rectangular), and the aspect ratio is 4:3. The grid area GA is positioned so that the edges of the grid are in contact with the inside of the frame of the reference screen 170 . Note that the number of pixels per grid can be appropriately set within a range of, for example, 50×50 to 250×250. One grid is preferably set to a size that allows accumulated chips to be identified by image analysis.
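To illustrate the partition described above (a 3584×2688-pixel image, 128×128 pixels per grid, hence 28×21 grids), the following is a short numpy sketch that slices a captured image into grid cells; the variable names and the placeholder image are illustrative.

```python
import numpy as np

GRID_PX = 128                                        # pixels per grid side (adjustable, e.g. 50x50 to 250x250)
image = np.zeros((2688, 3584, 3), dtype=np.uint8)    # placeholder captured image P0 (rows, cols, channels)

rows, cols = image.shape[0] // GRID_PX, image.shape[1] // GRID_PX    # 21 x 28 grids
cells = (image[:rows * GRID_PX, :cols * GRID_PX]
         .reshape(rows, GRID_PX, cols, GRID_PX, 3)
         .swapaxes(1, 2))                             # one 128x128 crop per grid cell
print(rows, cols, cells.shape)                        # 21 28 (21, 28, 128, 128, 3)
```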
 ところで、このような構成において仮に撮像画像P0を光軸周りに回転した場合、撮像画像P0の周縁部においてグリッドに過不足が生じる場合がある。グリッドの一つひとつに画像が対応しなければ、切屑の堆積度合いを正確に判定することはできない。 By the way, in such a configuration, if the captured image P0 is rotated around the optical axis, the excess or deficiency of the grid may occur in the peripheral portion of the captured image P0. If the image does not correspond to each grid, it is not possible to accurately determine the degree of chip accumulation.
 そこで、グリッド領域GAにおいて、画像が確実に含まれる領域(二点鎖線参照)を判定領域JAとし、その判定領域JAのみを切屑判定に利用することとする。すなわち、図9(B)に示すように、表示制御部160は、画角を回転した後の撮像画像P0の外周部を切り取る形で特定形状の領域を抽出する。以下、このとき抽出された画像の領域を「抽出領域P1」という。抽出領域P1は、判定領域JAに対応する画像であり、撮像画像P0の一部である。本実施形態では、「特定形状」を撮像画像P0と同じ縦横比(4:3)を有する長方形状とする。 Therefore, in the grid area GA, the area (see the two-dot chain line) in which the image is definitely included is set as the determination area JA, and only the determination area JA is used for chip determination. That is, as shown in FIG. 9B, the display control unit 160 extracts a region of a specific shape by cutting out the peripheral portion of the captured image P0 after rotating the angle of view. Hereinafter, the region of the image extracted at this time will be referred to as "extracted region P1". The extraction area P1 is an image corresponding to the determination area JA and is part of the captured image P0. In this embodiment, the "specific shape" is a rectangular shape having the same aspect ratio (4:3) as the captured image P0.
 具体的には、撮像画像P0の右上領域におけるグリッド領域GAの外側では、右から1番目から4番目のグリッドが占有面積50%以下となり、5番目から10番目のグリッドが占有面積50%より大きく80%以下となっている。このようなグリッドごとの占有面積を考慮して抽出領域P1を設定する。撮像画像P0において切り取られる外周部は、グリッドの区画において撮像画像P0が占有している部分が50%以下であるグリッドを含むグリッドの行またはグリッドの列である。 Specifically, outside the grid area GA in the upper right region of the captured image P0, the first to fourth grids from the right have an occupied area of 50% or less, and the fifth to tenth grids have an occupied area of more than 50% and not more than 80%. The extraction area P1 is set in consideration of the area occupied in each such grid. The outer peripheral portion cut off from the captured image P0 consists of the grid rows or grid columns that include a grid whose cell is occupied by the captured image P0 over 50% or less of its area.
 なお、ここでいう「切り取り」は、撮像画像P0の外周部を消去して判定領域JAの画像部分を抽出領域P1のデータとして残すものでもよいし、外周部の消去はせずに判定領域JAの画像部分を抽出して抽出領域P1のデータとするものでもよい。 Note that the "cutting" referred to here may either erase the outer peripheral portion of the captured image P0 and keep the image portion of the determination area JA as the data of the extraction region P1, or extract the image portion of the determination area JA as the data of the extraction region P1 without erasing the outer peripheral portion.
 切屑判定の画像処理がグリッド単位で行われるため、表示制御部160は、撮像画像P0からの外周部の切り取りをグリッドの行および列の少なくとも一方を単位として実行する。図示の例では、撮像画像P0の回転角θが3度とされており、外周一列分(つまり上下端の2行と左右端の2列分)のグリッド画像と、撮像画像P0においてその外周一列分のグリッド画像が重なる部分が切り取られている。その結果、グリッド領域GA(抽出領域P1)のグリッド数は26×19となっている。1つのグリッドあたりの画素数は128×128で変化しないため、グリッド領域GAの画素数は3328×2432となる。 Since the image processing for chip determination is performed on a grid-by-grid basis, the display control unit 160 cuts off the outer peripheral portion of the captured image P0 in units of at least one of grid rows and grid columns. In the illustrated example, the rotation angle θ of the captured image P0 is 3 degrees, and one line of grids along the outer periphery (that is, the top and bottom rows and the left and right columns) is removed together with the portions of the captured image P0 that overlap those grids. As a result, the number of grids in the grid area GA (extraction area P1) is 26×19. Since the number of pixels per grid remains 128×128, the number of pixels in the grid area GA is 3328×2432.
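One way to compute which grid rows and columns survive the cut under the 50% occupancy rule described above is sketched below: rotate an all-white mask of the image footprint by θ, measure the per-grid coverage, and keep only the rows and columns in which every grid is covered by more than half. This is an illustrative sketch with assumed names, not the patent's implementation, and the exact counts it yields depend on the threshold and on how the grids align with the rotated frame.

```python
import cv2
import numpy as np

def surviving_grid_span(width: int, height: int, grid_px: int, theta_deg: float, min_cover: float = 0.5):
    """Return ((row_start, row_end), (col_start, col_end)) of grid rows/columns kept after rotation."""
    mask = np.full((height, width), 255, dtype=np.uint8)               # footprint of the captured image
    rot = cv2.getRotationMatrix2D((width / 2, height / 2), theta_deg, 1.0)
    rotated = cv2.warpAffine(mask, rot, (width, height))
    rows, cols = height // grid_px, width // grid_px
    cover = (rotated[:rows * grid_px, :cols * grid_px]
             .reshape(rows, grid_px, cols, grid_px)
             .mean(axis=(1, 3)) / 255.0)                               # per-grid occupied fraction
    ok = cover > min_cover
    keep_rows = np.where(ok.all(axis=1))[0]                            # rows whose every grid is covered enough
    keep_cols = np.where(ok.all(axis=0))[0]
    return (keep_rows[0], keep_rows[-1] + 1), (keep_cols[0], keep_cols[-1] + 1)

# Example: at theta = 3 degrees the text reports that 26 x 19 grids remain out of 28 x 21.
(r0, r1), (c0, c1) = surviving_grid_span(3584, 2688, 128, 3.0)
print(c1 - c0, "columns,", r1 - r0, "rows kept")
```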
 ただし、撮像画像P0の一部である抽出領域P1をそのまま表示すると、参照画面170よりも小さくなり、周囲に余白が形成されて違和感を生じさせる可能性がある。また、抽出領域P1を含む形で撮像画像P0を表示すると、グリッドの区画と参照画面170とが対応せず、作業者に違和感を生じさせる可能性がある。そこで、図10(A)に示すように、抽出領域P1を予め設定した表示サイズに拡大して参照画面170に表示させる。ここでは、元の撮像画像P0と同じ縮尺で拡大する。すなわち、撮像画像P0から外周部を切り取ることで参照画面170の枠よりも小さくなった抽出領域P1を、その外枠が参照画面170の枠に近接するように拡大して表示させる。このとき、抽出領域P1をその画素数を維持したまま拡大する。このため、参照画面170の枠内に表示される画像の画素数は、撮像画像P0の3584×2688から抽出領域P1の3328×2432に減少することとなる。 However, if the extraction area P1, which is a part of the captured image P0, is displayed as it is, it becomes smaller than the reference screen 170 and a blank margin forms around it, which may look unnatural. Also, if the whole captured image P0 containing the extraction area P1 is displayed, the grid cells no longer correspond to the reference screen 170, which may give the operator a sense of incongruity. Therefore, as shown in FIG. 10(A), the extraction area P1 is enlarged to a preset display size and displayed on the reference screen 170. Here, it is enlarged to the same scale as the original captured image P0. That is, the extraction region P1, which has become smaller than the frame of the reference screen 170 by cutting the outer periphery from the captured image P0, is enlarged and displayed so that its outer frame approaches the frame of the reference screen 170. At this time, the extraction area P1 is enlarged while maintaining its number of pixels. Therefore, the number of pixels of the image displayed within the frame of the reference screen 170 decreases from 3584×2688 of the captured image P0 to 3328×2432 of the extraction region P1.
 その結果、図10(B)に示すように、抽出領域P1は参照画面170と同じサイズとなり、その長方形の辺が元の撮像画像P0の長方形の辺と同じ位置となるように表示される。すなわち、グリッド領域GA(つまり外周部の切り取り後のグリッド領域)におけるグリッドの端部が、参照画面170の枠に近接するように抽出領域P1が拡大される。 As a result, as shown in FIG. 10B, the extraction area P1 has the same size as the reference screen 170, and is displayed so that the sides of the rectangle are at the same positions as the sides of the rectangle of the original captured image P0. That is, the extraction area P<b>1 is enlarged so that the end of the grid in the grid area GA (that is, the grid area after the outer periphery is cut) approaches the frame of the reference screen 170 .
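To make the scale relationship concrete, a small sketch of the magnification calculation follows; it assumes the display simply stretches the 3328×2432 extraction region toward the frame of the reference screen while the stored pixel data are left untouched, and the helper name is illustrative. Because 26×19 grids do not keep exactly the 4:3 ratio, the horizontal and vertical stretch factors differ slightly; whether to fit the frame exactly, as in this embodiment, or to use a single uniform factor, as in the modification described later, is a design choice.

```python
def fit_scale(screen_wh, extracted_wh):
    """Per-axis factors by which the extraction region must be stretched to reach the screen frame."""
    (sw, sh), (ew, eh) = screen_wh, extracted_wh
    return sw / ew, sh / eh

# Embodiment values: a 3328x2432 extraction region drawn into a 3584x2688-pixel frame.
sx, sy = fit_scale((3584, 2688), (3328, 2432))
print(round(sx, 3), round(sy, 3))   # ~1.077 horizontally, ~1.105 vertically; the stored pixels stay 3328x2432
```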
 なお、認識部156による物質(切屑)の認識および判定は、当初に設定したグリッド領域(グリッド数:26×19)で行われる。撮像画像P0の外周が切り取られることで、表示制御部160による画面表示は、その一部となることがある。 Note that the recognition and determination of the substance (chips) by the recognition unit 156 are performed on the previously set grid area (number of grids: 26×19). Because the outer periphery of the captured image P0 is cut off, the screen displayed by the display control unit 160 may show only a part of the captured image P0.
 以上の補正処理により、参照画面170に表示される撮像エリアの座標と、ソフトウェアが認識する撮像エリアの座標とを整合させることができ、クーラントを目標位置に向けて正確に噴射させることができる。グリッド設定部152が設定するグリッドの位置と、オペレータが画像から認識するグリッドの位置とが一致するようになる。オペレータがタッチパネルを介して指示する位置はいずれかのグリッドに対応し、そのグリッドが重なる画像の領域がクーラントの噴射対象(目標位置)として設定される。 Through the correction processing described above, the coordinates of the imaging area displayed on the reference screen 170 can be made consistent with the coordinates of the imaging area recognized by the software, so that the coolant can be injected accurately toward the target position. The position of a grid set by the grid setting unit 152 then matches the position of the grid that the operator recognizes on the image. The position the operator designates via the touch panel corresponds to one of the grids, and the image region covered by that grid is set as the coolant injection target (target position).
 上述のようにグリッド領域の外周が切り取られて表示されたとしても、表示中のグリッドの位置と撮像エリアの実際の位置とが正確に対応する。このため、クーラントの噴射制御を高精度に保つことができる。 Even if the outer circumference of the grid area is cut off and displayed as described above, the position of the grid being displayed and the actual position of the imaging area correspond accurately. Therefore, coolant injection control can be maintained with high accuracy.
 図11は、オペレータが操作する画面の一例を表す図である。
 操作盤4のモニタには、図11(A)に示す画面200が表示される。画面200は、上述した参照画面170と操作画面202とを横並びに含む。操作画面202には、自動洗浄ボタン210、手動洗浄ボタン212、洗浄経路調整ボタン214、詳細設定ボタン216が設けられる。
FIG. 11 is a diagram showing an example of a screen operated by an operator.
A screen 200 shown in FIG. 11A is displayed on the monitor of the operation panel 4 . The screen 200 includes the above-described reference screen 170 and operation screen 202 side by side. The operation screen 202 is provided with an automatic cleaning button 210 , a manual cleaning button 212 , a cleaning path adjustment button 214 and a detail setting button 216 .
 自動洗浄ボタン210は、自動洗浄モードを実行する際に選択される。手動洗浄ボタン212は、手動洗浄モードを実行する際に選択される。洗浄経路調整ボタン214は、クーラントによる洗浄経路を調整する際に選択される。各ボタンを選択すると、それぞれプルダウンメニューが表示され、各洗浄モードにおける操作項目を選択できるが、その詳細については説明を省略する。 The automatic cleaning button 210 is selected when executing the automatic cleaning mode. A manual wash button 212 is selected when executing the manual wash mode. The cleaning path adjustment button 214 is selected when adjusting the cleaning path with coolant. When each button is selected, a pull-down menu is displayed, and an operation item in each cleaning mode can be selected, but detailed description thereof will be omitted.
 詳細設定ボタン216を選択すると、図示のようにメンテナンスボタン220、補助線表示ボタン222および画像回転操作部224が表示される。オペレータによりメンテナンスボタン220が選択されることでメンテナンスモードへ移行され、撮像画像P0に重ねてベースラインBLが表示される。また、補助線表示ボタン222がオンにされると、補助線ALが表示される。 When the detail setting button 216 is selected, a maintenance button 220, an auxiliary line display button 222, and an image rotation operation section 224 are displayed as shown. When the maintenance button 220 is selected by the operator, the maintenance mode is entered, and the baseline BL is displayed superimposed on the captured image P0. Also, when the auxiliary line display button 222 is turned on, the auxiliary line AL is displayed.
 オペレータは、参照画面170を見ながら、上述のように撮像画像P0の中心CがマーカMに重なるようにカメラ30のヨー角およびピッチ角を調整する。その後、画像回転操作部224の+ボタン又は-ボタンをタッチすることで撮像画像P0の回転角θを調整できる。補正部158は、オペレータにより+ボタンがタッチされるごとに撮像画像P0を時計回りに0.1度ずつ回転させ、-ボタンがタッチされるごとに撮像画像P0を反時計回りに0.1度ずつ回転させる。図11(B)に示すように、補助線表示ボタン222がオフにされると、補助線ALが非表示とされる。 While looking at the reference screen 170, the operator adjusts the yaw angle and pitch angle of the camera 30 so that the center C of the captured image P0 overlaps the marker M as described above. After that, the rotation angle θ of the captured image P0 can be adjusted by touching the + button or - button of the image rotation operation section 224. The correction unit 158 rotates the captured image P0 clockwise by 0.1 degrees each time the + button is touched, and counterclockwise by 0.1 degrees each time the - button is touched. As shown in FIG. 11(B), when the auxiliary line display button 222 is turned off, the auxiliary line AL is hidden.
 図12は、補正処理の流れを表すフローチャートである。
 本処理は、オペレータによるメンテナンスボタン220の選択を契機に実行される。
 表示制御部160は、メンテナンス画面として参照画面170に撮像画像P0を表示させ(S10)、ベースラインBLを重ねるように表示させる(S12)。
FIG. 12 is a flowchart showing the flow of correction processing.
This process is executed when the operator selects the maintenance button 220 .
The display control unit 160 displays the captured image P0 on the reference screen 170 as a maintenance screen (S10), and displays the baseline BL so as to overlap it (S12).
 このとき、オペレータにより補助線表示ボタン222がオンにされると(S14のY)、表示制御部160は、補助線ALを表示させる(S16)。補助線表示ボタン222がオフであれば(S14のN)、補助線ALを非表示とする(S18)。画像回転操作部224の操作により画像回転指示がなされると(S20のY)、表示制御部160は、撮像画像P0を回転させる(S22)。 At this time, when the operator turns on the auxiliary line display button 222 (Y in S14), the display control unit 160 displays the auxiliary line AL (S16). If the auxiliary line display button 222 is off (N of S14), the auxiliary line AL is hidden (S18). When an image rotation instruction is issued by operating the image rotation operation unit 224 (Y of S20), the display control unit 160 rotates the captured image P0 (S22).
 このようにして特定エリアSAの輪郭がベースラインBLに重なり、オペレータが図示略の確定ボタンを選択すると(S24のY)、補正部158は、判定領域JAを算出する(S26)。すなわち、撮像画像P0の傾きに応じてグリッド領域GAにおいて画像が確実に含まれる領域を判定領域JAとして設定する。この判定領域JAは、グリッド単位で切屑判定が実行される「AI推論領域」として機能する。 When the contour of the specific area SA thus overlaps the baseline BL and the operator selects an unillustrated confirm button (Y in S24), the correction unit 158 calculates the determination area JA (S26). That is, an area in which the image is surely included in the grid area GA is set as the judgment area JA according to the inclination of the captured image P0. This determination area JA functions as an "AI inference area" in which chip determination is performed on a grid-by-grid basis.
 表示制御部160は、その判定領域JAに基づき、撮像画像P0から外周部を切り取る形で抽出領域P1を抽出し(S28)、その抽出領域P1を調整(拡大)して参照画面170に表示させる(S30)。補正部158は、この一連の補正情報をキャリブレーションデータとして補正情報格納部146に格納する(S32)。この補正情報として、撮像画像P0の設定角度(回転角θ)、判定領域JAの設定、抽出領域P1の拡大率(設定倍率)等が含まれる。確定ボタンを選択されなければ(S24のN)、S26~S32の処理をスキップする。 Based on the determination area JA, the display control unit 160 extracts the extraction area P1 by cutting off the outer peripheral portion of the captured image P0 (S28), adjusts (enlarges) the extraction area P1, and displays it on the reference screen 170 (S30). The correction unit 158 stores this series of correction information as calibration data in the correction information storage unit 146 (S32). This correction information includes the set angle (rotation angle θ) of the captured image P0, the setting of the determination area JA, the enlargement ratio (set magnification) of the extraction area P1, and the like. If the confirm button is not selected (N in S24), the processing of S26 to S32 is skipped.
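The calibration data stored in S32 can be pictured as a small record of the three settings listed above. The following is an illustrative sketch only; the field names, the grid-span representation of the determination area JA, the example values, and the JSON file name are assumptions, not the patent's actual data format.

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class CalibrationData:
    rotation_deg: float        # set angle of the captured image P0 (rotation angle theta)
    judgment_area: tuple       # determination area JA as grid spans: ((row0, row1), (col0, col1))
    magnification: float       # enlargement ratio (set magnification) of the extraction area P1

# Hypothetical example record for one camera 30.
cal = CalibrationData(rotation_deg=3.0, judgment_area=((1, 20), (1, 27)), magnification=1.077)
with open("camera0_calibration.json", "w") as f:
    json.dump(asdict(cal), f)  # one calibration record per camera
```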
 そして、オペレータにより他のボタンが選択されるなど、予め定めるメンテナンス終了条件が成立すると(S34のY)、表示制御部160は、メンテナンス画面の表示を終了する(S36)。メンテナンス終了条件が成立していなければ(S34のN)、S14に戻る。 Then, when a predetermined maintenance termination condition is met, such as when the operator selects another button (Y in S34), the display control unit 160 terminates the display of the maintenance screen (S36). If the maintenance end condition is not satisfied (N of S34), the process returns to S14.
 図13は、清掃制御の流れを概略的に表すフローチャートである。
 自動洗浄モードおよび手動洗浄モードのいずれかにおいて清掃制御が開始されると、取得部150が、撮像画像P0を取得する(S40)。続いて、補正部158が、補正情報格納部146に格納された補正情報(キャリブレーションデータ)を読み出し(S42)、撮像画像P0に対して内部処理的に上述した補正処理を反映させる。
FIG. 13 is a flow chart schematically showing the flow of cleaning control.
When cleaning control is started in either the automatic cleaning mode or the manual cleaning mode, the acquisition unit 150 acquires the captured image P0 (S40). Subsequently, the correction unit 158 reads the correction information (calibration data) stored in the correction information storage unit 146 (S42), and internally reflects the correction process described above on the captured image P0.
 すなわち、補正部158は、撮像画像P0を設定角度で回転させ(S44)、判定領域を設定し(S46)、抽出画像を抽出する(S48)。続いて、設定倍率で抽出画像を拡大し(S50)、それを撮像画像として画面に表示させる(S52)。そして、オペレータの操作入力により切屑堆積状況の表示指示がなされれば(S54のY)、表示制御部160は、撮像画像に重ねてグリッド画像を表示し(S56)、さらに切屑の堆積状況を表示する(S58)。この切屑堆積状況は、認識部156が判定したクラスに応じて色分けなどの手段により表示される。 That is, the correction unit 158 rotates the captured image P0 at a set angle (S44), sets a determination region (S46), and extracts an extraction image (S48). Subsequently, the extracted image is enlarged by the set magnification (S50) and displayed on the screen as a captured image (S52). Then, when the operator gives an instruction to display the chip accumulation state (Y in S54), the display control unit 160 displays the grid image superimposed on the captured image (S56), and further displays the chip accumulation state. (S58). This chip accumulation state is displayed by means of color coding or the like according to the class determined by the recognition unit 156 .
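Putting S40 to S52 together, the following is a rough sketch of how the stored calibration might be replayed on each newly captured image before display. It assumes OpenCV, a calibration record like the hypothetical CalibrationData above (rotation angle, determination area as grid row/column spans, magnification), and a 128-pixel grid size; it is not the patent's implementation.

```python
import cv2

def prepare_display_image(captured, cal, grid_px=128):
    """Apply the stored calibration to a new captured image: rotate (S44), crop to the
    determination area in whole grid rows/columns (S46-S48), and return the extraction
    region together with the magnification used for display (S50)."""
    h, w = captured.shape[:2]
    rot = cv2.getRotationMatrix2D((w / 2, h / 2), cal.rotation_deg, 1.0)
    rotated = cv2.warpAffine(captured, rot, (w, h))                                # S44
    (r0, r1), (c0, c1) = cal.judgment_area                                          # S46: grid row/column spans
    extracted = rotated[r0 * grid_px:r1 * grid_px, c0 * grid_px:c1 * grid_px]       # S48
    return extracted, cal.magnification                                             # S50: drawn enlarged, pixels unchanged
```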
 表示された画像に基づいてオペレータがクーラントの噴射位置(目標)を指示すると(S60のY)、検知部154がこれを検知する。噴射制御部162は、その噴射位置に基づいてクーラントの噴射経路を設定し(S62)、その噴射経路に基づいてクーラントを噴射するよう噴射指令を加工制御装置102へ出力する(S64)。 When the operator instructs the coolant injection position (target) based on the displayed image (Y in S60), the detection unit 154 detects this. The injection control unit 162 sets a coolant injection route based on the injection position (S62), and outputs an injection command to the machining control device 102 to inject the coolant based on the injection route (S64).
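The manual designation in S60 amounts to mapping a touched point on the displayed (magnified) image back to a grid cell, and the grid containing that point becomes the coolant target. A minimal sketch under the same assumptions follows; the conversion from a grid cell to the machine coordinates used in the injection command is preset in the software and is not shown here.

```python
def touched_grid(touch_xy, magnification, grid_px=128):
    """Convert a touch position on the displayed image to (grid_row, grid_col) of the extraction region."""
    x, y = touch_xy
    gx = int(x / magnification) // grid_px   # undo the display magnification, then index the grid
    gy = int(y / magnification) // grid_px
    return gy, gx

# Example: a touch at (900, 400) on a display magnified by ~1.077 falls in grid row 2, column 6.
print(touched_grid((900, 400), 1.077))
```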
 そして、オペレータにより他のボタンが選択されるなど、予め定める洗浄モード終了条件が成立すると(S66のY)、表示制御部160は、洗浄操作画面の表示を終了する(S68)。洗浄モード終了条件が成立していなければ(S66のN)、S40に戻る。 Then, when a predetermined condition for ending the cleaning mode is met, such as when the operator selects another button (Y in S66), the display control unit 160 terminates the display of the cleaning operation screen (S68). If the cleaning mode end condition is not satisfied (N of S66), the process returns to S40.
 以上、実施形態に基づいて工作機械について説明した。
 本実施形態では、加工室2におけるカメラ30の設置に取り付け誤差があった場合、作業者(オペレータ)が撮像画像P0を確認しつつ、カメラ30の角度(ヨー角、ピッチ角)を微調整できる。カメラ30は構造上、光軸周りの回転(ロール角)を調整できないが、撮像画像P0を回転させることでその補正を行うことができる。すなわち、本実施形態によれば、工作機械1におけるカメラ30の角度調整に制約があったとしても、画面表示される撮像エリアとソフトウェアが認識する撮像エリアとの整合を図ることができる。このため、オペレータが画面に表示された撮像画像に基づいてクーラントによる切屑の洗浄指示を行う場合であっても、その洗浄指示位置とクーラントの噴射位置とが整合する。つまり、洗浄制御の精度を高く維持することができる。
The machine tool has been described above based on the embodiment.
In this embodiment, when there is a mounting error in the installation of the camera 30 in the processing chamber 2, the operator can finely adjust the angle (yaw angle and pitch angle) of the camera 30 while checking the captured image P0. Although the rotation of the camera 30 around its optical axis (roll angle) cannot be adjusted because of its structure, this can be compensated for by rotating the captured image P0. That is, according to this embodiment, even if there are restrictions on the angle adjustment of the camera 30 in the machine tool 1, the imaging area displayed on the screen and the imaging area recognized by the software can be made consistent. Therefore, even when the operator instructs cleaning of chips with coolant based on the captured image displayed on the screen, the instructed cleaning position and the coolant injection position match. In other words, the accuracy of the cleaning control can be kept high.
 また、本実施形態では、撮像画像P0を補正する際に、回転により傾斜した撮像画像P0から外周部を切り取る形で抽出領域P1を抽出し、予め設定した表示サイズに拡大して表示させるようにした。このため、画面に表示される撮像画像の枠形状を長方形に維持したまま大きさを画面に整合させることができる。このため、補正の有無により(つまり設置するカメラの個体差により)撮像画像の見え方が大きく変化することがなく、補正後の画像によりオペレータに違和感を与えることもない。 In addition, in this embodiment, when the captured image P0 is corrected, the extraction region P1 is extracted by cutting off the outer peripheral portion of the captured image P0 tilted by the rotation, and is enlarged to a preset display size for display. The size of the captured image shown on the screen can therefore be matched to the screen while its frame remains rectangular. As a result, the appearance of the captured image does not change greatly depending on whether the correction is applied (that is, on individual differences among the installed cameras), and the corrected image does not give the operator a sense of incongruity.
[変形例]
 上記実施形態では、工作機械1を複合加工機として説明したが、ターニングセンタであってもよいし、マシニングセンタであってもよい。また、材料(例えば金属粉末)をレーザで溶かしながら加工する付加加工の機械であってもよい。その場合、加工時に飛散する材料が、物質認識部により認識される「物理量」となる。
[Modification]
In the above embodiment, the machine tool 1 was described as a multitasking machine, but it may be a turning center or a machining center. It may also be an additive manufacturing machine that processes a material (for example, metal powder) while melting it with a laser. In that case, the material scattered during processing is the "physical quantity" recognized by the substance recognition unit.
 上記実施形態では、切屑除去のために噴射される流体としてクーラントを例示した。変形例においては、クーラント以外の液体(洗浄液)であってもよいし、空気のような気体であってもよい。その場合、液体噴射部に代えて気体を噴射する気体噴射部を設ける。 In the above embodiment, the coolant is exemplified as the fluid that is injected for chip removal. In a modification, liquid (cleaning liquid) other than coolant or gas such as air may be used. In that case, a gas injection section for injecting gas is provided instead of the liquid injection section.
 上記実施形態では、補正処理において撮像画像P0の回転角θが3度となる例を示したが(図8(B))、回転角θはカメラ30の設置状況に応じて変化することは言うまでもない。抽出領域P1の縦横比を4:3とした場合、例えばθ=1°の場合にグリッド数を27×20、θ=2°の場合にグリッド数を26×20、θ=4°の場合にグリッド数を25×19、θ=5°の場合にグリッド数を25×18とすることができる。 In the above embodiment, an example was shown in which the rotation angle θ of the captured image P0 is 3 degrees in the correction processing (FIG. 8(B)); it goes without saying that the rotation angle θ varies depending on how the camera 30 is installed. When the aspect ratio of the extraction region P1 is 4:3, the number of grids can be set, for example, to 27×20 for θ=1°, 26×20 for θ=2°, 25×19 for θ=4°, and 25×18 for θ=5°.
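As a rough cross-check of the grid counts listed above, the occupancy check sketched earlier (the hypothetical surviving_grid_span helper) can simply be re-run for each angle; the counts it returns depend on the 50% threshold and on how the grids align with the rotated frame, so the values stated in the text are the authoritative ones.

```python
# Rough estimate of how many grid rows/columns remain for several rotation angles.
for theta in (1.0, 2.0, 3.0, 4.0, 5.0):
    (r0, r1), (c0, c1) = surviving_grid_span(3584, 2688, 128, theta)
    print(f"theta={theta}: about {c1 - c0} x {r1 - r0} grids kept")
```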
 図14は、撮像画像P0の回転角θを3.3度としたときの補正処理を表す図である。
 この例では、撮像画像P0は、回転角θ=0°であれば参照画面170の枠よりやや大きな形状を有し、その縦横比は参照画面170と同じである(この例では4:3)。撮像画像P0の画素数は、3584×2688となる。参照画面170(表示画面)の画素数は3337×2500となり、撮像画像P0の画素数よりも少ない。図14に示すように、参照画面170に対する撮像画像P0の回転角θ=3.3°の場合、撮像画像P0の全体を参照画面170に収めることができない。このため、撮像画像P0の外周部を切り取り、参照画面170を最大矩形として収まる撮像画像P0の一部の領域を抽出領域P1として抽出する。言い換えれば、抽出領域P1は、回転後の撮像画像P0の外周部を切り取った残余の部分画像である。
FIG. 14 is a diagram showing correction processing when the rotation angle θ of the captured image P0 is 3.3 degrees.
In this example, when the rotation angle θ=0°, the captured image P0 has a shape slightly larger than the frame of the reference screen 170, and its aspect ratio is the same as that of the reference screen 170 (4:3 in this example). The number of pixels of the captured image P0 is 3584×2688. The number of pixels of the reference screen 170 (display screen) is 3337×2500, which is smaller than the number of pixels of the captured image P0. As shown in FIG. 14, when the rotation angle θ of the captured image P0 with respect to the reference screen 170 is 3.3°, the entire captured image P0 cannot fit within the reference screen 170. Therefore, the outer peripheral portion of the captured image P0 is cut off, and the partial region of the captured image P0 that fits within the reference screen 170 as the largest rectangle is extracted as the extraction region P1. In other words, the extraction region P1 is the partial image remaining after the outer peripheral portion of the rotated captured image P0 is cut off.
 抽出領域P1の輪郭は、グリッド領域GAの輪郭と重なる。すなわち、参照画面170に表示され撮像画像P0に重畳するグリッドのうち、グリッドの端部が含まれる領域、つまり1区画領域の全体が含まれるグリッドを内方に有する領域をグリッド領域GAとする。言い換えれば、グリッド並びの行および列を単位としてグリッド領域GAが残される。この場合、グリッド数は26×19となる。このグリッド領域GAは、グリッド単位で所定の判定処理(学習モデルを用いた切屑判定等の処理)がなされる領域である。本変形例では、このグリッド領域GAを抽出領域P1とする。抽出領域P1は参照画面170(参照画面)の内側にあるため、抽出領域P1の画素数は参照画面170の画素数よりも少なく、3328×2432となる。 The contour of the extraction area P1 coincides with the contour of the grid area GA. That is, of the grid displayed on the reference screen 170 and superimposed on the captured image P0, the grid area GA is the area made up only of grids whose entire cell is included; in other words, the grid area GA is left in units of whole rows and columns of the grid arrangement. In this case, the number of grids is 26×19. The grid area GA is the area in which the predetermined determination processing (such as chip determination using the learning model) is performed on a grid-by-grid basis. In this modification, this grid area GA is used as the extraction area P1. Since the extraction area P1 lies inside the reference screen 170, the number of pixels of the extraction area P1 is smaller than that of the reference screen 170, namely 3328×2432.
The extraction region P1, which has become smaller than the frame of the reference screen 170 by cutting off the outer peripheral portion of the captured image P0 in this way, is displayed so that its outline expands toward the frame of the reference screen 170 (see FIG. 10). The display control unit 160 stops the enlargement of the extraction region P1 when its upper outline has moved to a position where it overlaps or touches the frame of the reference screen 170. In this modification, the extraction region P1 is enlarged so that one of the two pairs of parallel sides forming its outline touches the frame of the reference screen 170 while the other pair remains inside the frame. Visually, as long as the upper edge coincides with the frame, any blank space at the bottom or at the left and right causes little sense of incongruity. Since the extraction region P1 is enlarged while its pixel count is maintained, the number of pixels of the image displayed within the frame of the reference screen 170 is reduced.
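The enlargement can be sketched as a uniform magnification chosen so that one pair of parallel sides just reaches the frame while the other pair stays inside; anchoring the region at the top edge of the frame is one reading of the description above and is an assumption here, as is the horizontal centring.

```python
def fit_scale(region_w, region_h, frame_w, frame_h):
    """Uniform magnification that makes one pair of parallel sides of the
    extraction region touch the reference-screen frame while the other
    pair remains inside (or on) the frame."""
    return min(frame_w / region_w, frame_h / region_h)

def display_rect(region_w, region_h, frame_w, frame_h):
    """Displayed size and top-left offset, anchoring the region at the
    top edge of the frame and centring it horizontally (assumptions)."""
    s = fit_scale(region_w, region_h, frame_w, frame_h)
    w, h = region_w * s, region_h * s
    x = (frame_w - w) / 2      # horizontal centring
    y = 0                      # upper outline coincides with the frame
    return x, y, w, h

print(display_rect(3328, 2432, 3337, 2500))
```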
Note that "extracting and enlarging the extraction region P1 by cutting off the outer peripheral portion of the captured image P0" as used here may take the form of hiding the outer peripheral portion and then enlarging the region, or the form in which the outer peripheral portion is simply no longer displayed on the reference screen 170 as a result of enlarging the extraction region P1.
In the above embodiment, the information processing device 100 is the internal computer of the machine tool 1 and is configured integrally with the operation panel 4, but it may instead be configured independently of the operation panel 4. In that case, the information processing device may use the monitor of the operation panel as a remote desktop and let it function as the display unit. Alternatively, the information processing device may be an external computer connected to the machine tool. In that case, the information processing device may be a general laptop PC (personal computer) or a tablet computer.
FIG. 15 is a diagram showing the configuration of an information processing system according to a modification. Components similar to those of the above embodiment are given the same reference numerals, and their detailed description is omitted.
In this modification, the information processing device is installed outside the machine tool. That is, the information processing system includes a machine tool 301, a data processing device 310, and a data storage device 312. The data processing device 310 functions as the "information processing device".
The machine tool 301 includes the machining control device 102, the machining device 104, the tool changing unit 106, the tool storage unit 108, a data processing unit 330, an operation panel 304, and the imaging unit 110. The data processing unit 330 includes a communication unit 332, the grid setting unit 152, the detection unit 154, the recognition unit 156, and the injection control unit 162. The communication unit 332 has a receiving unit and a transmitting unit and handles communication with external devices including the data processing device 310 and the data storage device 312.
The operation panel 304 includes the user interface processing unit 130 and a data processing unit 334. The data processing unit 334 outputs control commands to the machining control device 102 based on the operator's operation inputs. It also controls the screen displayed on the display unit 144 in accordance with the operator's operation inputs.
The data processing device 310 includes a communication unit 320, the correction unit 158, and the display control unit 160. The communication unit 320 has a receiving unit and a transmitting unit and handles communication with the machine tool 301. The data storage device 312 includes the correction information storage unit 146 and the reference information storage unit 148. In this modification the data storage device 312 is connected to the data processing device 310 by wire, but it may be connected wirelessly. In another modification, the data storage device 312 may be incorporated as part of the data processing device 310.
In this modification, the functions of the information processing device 100 of the above embodiment are thus divided between the inside and the outside of the machine tool. Such a configuration also provides the same effects as the above embodiment.
Although not described in the above embodiment, the size and shape of the grid may be made changeable as needed. In that case as well, the extracted image is preferably obtained by cutting off the outer peripheral portion of the captured image in units of grid cells.
In the above embodiment, an example was shown in which a worker (operator) manually adjusts the angles of the camera 30 about the yaw axis and the pitch axis in order to eliminate the mounting error of the camera 30. In a modification, a drive control unit may be provided that controls a first rotation about the yaw axis and a second rotation about the pitch axis of the camera. Prior to the correction by the correction unit, the drive control unit controls the first rotation and the second rotation so that the optical axis of the camera points at a center set position (the marker M) preset in the imaging area.
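A purely hypothetical sketch of such a drive control loop: the marker M is detected in the captured image and the yaw and pitch axes are stepped until the marker sits at the image centre, i.e. on the optical axis. The camera API (capture, rotate_yaw, rotate_pitch), the marker-detection routine, the gain, and the tolerance are all illustrative assumptions, not part of the original description.

```python
def center_on_marker(camera, detect_marker, gain=0.01, tol_px=2, max_iter=50):
    """Drive the yaw (first rotation) and pitch (second rotation) axes
    until the marker M lies at the image centre (illustrative only)."""
    for _ in range(max_iter):
        img = camera.capture()
        mx, my = detect_marker(img)                   # marker position [px]
        cx, cy = img.shape[1] / 2, img.shape[0] / 2   # image centre [px]
        ex, ey = mx - cx, my - cy                     # pixel error
        if abs(ex) <= tol_px and abs(ey) <= tol_px:
            return True
        camera.rotate_yaw(-gain * ex)                 # first rotation
        camera.rotate_pitch(-gain * ey)               # second rotation
    return False
```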
In the above embodiment, a configuration in which the camera 30 cannot rotate about its optical axis was given as an example of the state in which rotation about the optical axis is restricted. In a modification, the camera 30 may be able to rotate about the optical axis by a certain amount while its rotation angle is limited. Alternatively, the camera 30 itself may have a rotatable structure whose rotation is restricted in order to avoid interference with surrounding structures.
In the above embodiment, as shown in FIGS. 7 and 8, the center C of the captured image P0 coincides with the optical axis L of the camera 30, and the captured image P0 is rotated about the optical axis L. In a modification, the captured image may be rotated independently of the optical axis. In that case as well, the correction unit rotates the captured image so that the outline of the specific area is aligned with the baseline. Specifically, it is preferable to rotate the image about an axis orthogonal to the plane of the captured image and passing through the center of the captured image.
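One way to realize a rotation about an axis orthogonal to the image plane and passing through the image centre is a plain affine rotation, for example with OpenCV. This is only a sketch; the angle below is simply the θ=3.3° of the earlier example, and its sign depends on the direction in which θ is measured.

```python
import cv2

def rotate_about_center(image, theta_deg):
    """Rotate the captured image about its own centre (not the optical
    axis) in the image plane, keeping the original canvas size."""
    h, w = image.shape[:2]
    m = cv2.getRotationMatrix2D((w / 2.0, h / 2.0), theta_deg, 1.0)
    return cv2.warpAffine(image, m, (w, h))

# corrected = rotate_about_center(captured, -3.3)  # sign depends on convention
```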
FIG. 16 is a diagram showing an image processing method according to a modification.
In this modification, the captured image shown in the upper part of FIG. 16 can be corrected as shown in the lower part of FIG. 16. That is, the grid setting unit 152 performs a process of superimposing grid lines at preset positions on the captured image received from the imaging unit 110 (grid line division process). After this process, the grid lines and the captured image P10 may overlap as shown in the upper part of FIG. 16: in the data obtained by superimposing the grid lines on the captured image P10, the outermost cells overlapping the captured image P10 include cells for which not all four sides appear. In such a case, the display control unit 160 can ignore the cells whose four sides do not all overlap the captured image P10 and display only the cells whose four sides all overlap the captured image P10, fitted to the size of the reference screen (lower part of FIG. 16). Alternatively, the grid setting unit 152 may detect that there are cells whose four sides do not all overlap the captured image P10 (or whose four sides do not all appear) and adjust the whole mesh of grid lines by shifting it up, down, left, or right so that the four sides of those cells overlap the captured image P10.
In the above embodiment, chips present in the imaging area were given as an example of the object imaged by the imaging unit, but the object is not particularly limited as long as it is a physical quantity such as material or contaminants generated in the course of machining.
In the above embodiment, after the captured image is rotated, part of it is extracted by cutting off its outer peripheral portion, and the extraction region is displayed so that its outline expands toward the frame of the reference screen. In a modification, part of the captured image may be extracted by cutting off its outer peripheral portion without rotating the image, and the extraction region may likewise be displayed so that its outline expands toward the frame of the reference screen. For example, when the physical quantity to be imaged lies near the center of the captured image, that is, when it is present in the outer peripheral portion only to a negligible extent, the outer peripheral portion may be cut off. The extraction region may then be enlarged and displayed in the same way so that the cropping does not give the operator a sense of incongruity.
The above embodiment and this modification are premised on a process that extracts part of the captured image by cutting off its outer peripheral portion; in that case, they prevent the sense of incongruity that the operator would feel if the outline of the extraction region were smaller than the screen frame. In other words, they solve the problem of avoiding that sense of incongruity, and for this purpose the process of "displaying the extraction region so that its outline expands toward the frame of the reference screen" is performed. Since the cause of the incongruity lies in cutting off the outer peripheral portion of the captured image, it is not essential that the captured image be acquired for chip removal; chip removal is merely one application example of the present invention.
In addition, by not displaying incomplete regions, in the sense that "the outer peripheral portion includes cells in which the portion occupied by the captured image is 50% or less of the cell", the size of the grid cells remains constant and the target region remains the same, so the operator can work while viewing the screen without any sense of incongruity. Moreover, if the grid cells keep the same size, the internal structure of the machine tool is also seen at the same size, which likewise avoids giving the operator a sense of incongruity. When the device is used to recognize the accumulation state of chips, this effect becomes even more pronounced.
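The 50% criterion can be expressed as an overlap-fraction test per cell. Coordinates and cell size in this sketch are again illustrative assumptions.

```python
def occupancy(cell_x, cell_y, img_x, img_y, img_w, img_h, cell=128):
    """Fraction of the grid cell with top-left corner (cell_x, cell_y)
    that is covered by the captured image."""
    ox = max(0, min(cell_x + cell, img_x + img_w) - max(cell_x, img_x))
    oy = max(0, min(cell_y + cell, img_y + img_h) - max(cell_y, img_y))
    return (ox * oy) / (cell * cell)

def is_peripheral(cell_x, cell_y, img_x, img_y, img_w, img_h, cell=128):
    """A cell occupied 50% or less belongs to the outer peripheral portion."""
    return occupancy(cell_x, cell_y, img_x, img_y, img_w, img_h, cell) <= 0.5
```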
Even when the object of recognition based on the captured image is color or temperature, displaying color or temperature in an incomplete region would presumably feel odd to the operator. In other words, the above problem can be solved even when color, temperature, or the like is displayed. Furthermore, although where chips accumulate depends on the internal structure of the machine tool, even for a captured image of that internal structure the operator would feel a sense of incongruity if the size at which it is displayed on the screen changed. The above embodiment and modifications therefore solve the above problem.
The present invention is not limited to the above embodiments and modifications, and can be embodied with its components modified without departing from the gist of the invention. Various inventions may be formed by appropriately combining the components disclosed in the above embodiments and modifications. Some components may also be omitted from the full set of components shown in the above embodiments and modifications.
This patent application claims priority from Japanese Patent Application No. 2021-144425 (filed September 6, 2021) and Japanese Patent Application No. 2022-072847 (filed April 27, 2022), the entire contents of which are incorporated herein by reference.

Claims (9)

  1.  An information processing device comprising:
     an acquisition unit that acquires a captured image of an imaging area captured by an imaging unit installed in a machine tool;
     a grid setting unit that divides the captured image into a grid of cells each sized to contain a plurality of pixels; and
     a display control unit that causes an extraction region, which is a part of the captured image extracted by cutting off an outer peripheral portion of the captured image in units of rows and columns of the grid arrangement, to be displayed on a reference screen of a display unit,
     wherein the display control unit causes the display unit to display the extraction region, which has become smaller than the frame of the reference screen by the cutting off of the outer peripheral portion from the captured image, such that the outline of the extraction region expands toward the frame of the reference screen.
  2.  The information processing device according to claim 1, further comprising a recognition unit that recognizes, in units of the grid cells, a predetermined physical quantity in the imaging area based on the extraction region.
  3.  The information processing device according to claim 2, wherein the display control unit causes a display indicating the physical quantity recognized within the imaging area to be superimposed on the extraction region.
  4.  The information processing device according to claim 1, wherein the frame of the reference screen is rectangular and the extraction region is also rectangular, and
     wherein, when displaying the enlarged extraction region, the display control unit displays it such that a side of the rectangle of the enlarged extraction region lies at the same position as a side of the rectangle of the frame of the reference screen.
  5.  The information processing device according to claim 1, further comprising:
     a reference information storage unit that stores in advance a line along the contour of a specific area of the machine tool in a display image; and
     a correction unit that rotates the captured image in a rotational direction about the optical axis of the imaging unit so that the contour of the specific area shown in the captured image is aligned with the line,
     wherein the display control unit enlarges and displays the extraction region extracted by cutting off the outer peripheral portion of the captured image corrected by the correction unit.
  6.  The information processing device according to claim 1, wherein the outer peripheral portion includes a grid cell in which the portion occupied by the captured image is 50% or less of the cell.
  7.  The information processing device according to claim 1, wherein the outer peripheral portion is a grid row or a grid column that includes a grid cell in which the portion occupied by the captured image is 50% or less of the cell.
  8.  The information processing device according to claim 1, wherein the display control unit stops the enlargement of the extraction region when the upper outline of the extraction region has moved to a position where it overlaps or touches the frame of the reference screen.
  9.  A machine tool comprising:
     an imaging unit that captures an image of an imaging area set in a processing chamber;
     a display unit that displays the image captured by the imaging unit;
     a grid setting unit that divides the captured image into a grid of cells each sized to contain a plurality of pixels; and
     a display control unit that causes an extraction region, which is a part of the captured image extracted by cutting off an outer peripheral portion of the captured image in units of rows and columns of the grid arrangement, to be displayed on a reference screen of the display unit,
     wherein the display control unit causes the display unit to display the extraction region, which has become smaller than the frame of the reference screen by the cutting off of the outer peripheral portion from the captured image, such that the outline of the extraction region expands toward the frame of the reference screen.
PCT/JP2022/032317 2021-09-06 2022-08-29 Information processing device, and machine tool WO2023032876A1 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2021144425 2021-09-06
JP2021-144425 2021-09-06
JP2022072847A JP7405899B2 (en) 2021-09-06 2022-04-27 Information processing equipment and machine tools
JP2022-072847 2022-04-27

Publications (1)

Publication Number Publication Date
WO2023032876A1 true WO2023032876A1 (en) 2023-03-09

Family

ID=85412754

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/032317 WO2023032876A1 (en) 2021-09-06 2022-08-29 Information processing device, and machine tool

Country Status (1)

Country Link
WO (1) WO2023032876A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6453080B1 (en) * 1999-10-11 2002-09-17 Mustek Systems Inc. Method for real-time auto-cropping a scanned image
JP2007329548A (en) * 2006-06-06 2007-12-20 Sony Corp Image processing system, image processing method, and program
JP2014099832A (en) * 2012-11-16 2014-05-29 Xacti Corp Camera
JP2015142281A (en) * 2014-01-29 2015-08-03 キヤノン株式会社 Subject searching device, control method and control program of the same, and imaging device
JP6921354B1 (en) * 2021-05-24 2021-08-18 Dmg森精機株式会社 Information processing device


Similar Documents

Publication Publication Date Title
JP5832083B2 (en) Tool dimension measuring method and measuring device
JP5725796B2 (en) Tool measuring method and measuring device, and machine tool
JP6351745B2 (en) Machine tool control method and machine tool control apparatus
EP3416009B1 (en) Beam tool pathing for 3d compound contours using machining path surfaces to maintain a single solid representation of objects
JP2009175954A (en) Generating device of processing robot program
JP5404450B2 (en) Processing status monitoring device
WO2023032662A1 (en) Iinformation processing device and machine tool
US9902070B2 (en) Robot system and robot control method for adjusting position of coolant nozzle
JP6921354B1 (en) Information processing device
JP6570592B2 (en) On-machine measuring method and control device of machine tool
US10191460B2 (en) Control device for machine tool
JP6656387B2 (en) Machine tool with display device
WO2023032876A1 (en) Information processing device, and machine tool
JP2023038151A (en) Information processing device, and machine tool
US20230019148A1 (en) Display device, machine tool, and liquid ejection method
JP6827579B1 (en) Machine tools, machine tool control methods, and machine tool control programs
JP2009214289A (en) Copy grinding method and apparatus for the same
CN116847958A (en) Method and device for adjusting a robot path for processing a workpiece
WO2022250052A1 (en) Information processing device, and program
JP2022117544A (en) Machine tool
JP4380580B2 (en) Component library data creation method, component library data creation device, and electronic component mounting device
JP2008139260A (en) Image display unit and method, appearance inspection device, cream solder printer
JP2011138872A (en) Electronic component mounting apparatus and electronic component mounting method
WO2022163459A1 (en) Image processing device and machine tool
JP6621639B2 (en) Image processing apparatus for substrates

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22864459

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE