WO2020065854A1 - Workpiece position detecting method and machine tool - Google Patents

Workpiece position detecting method and machine tool

Info

Publication number
WO2020065854A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
work
shape data
camera
machine tool
Prior art date
Application number
PCT/JP2018/036052
Other languages
French (fr)
Japanese (ja)
Inventor
忠 笠原
恵 古田
Original Assignee
Makino Milling Machine Co., Ltd. (株式会社牧野フライス製作所)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Makino Milling Machine Co., Ltd. (株式会社牧野フライス製作所)
Priority to PCT/JP2018/036052
Publication of WO2020065854A1

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B23MACHINE TOOLS; METAL-WORKING NOT OTHERWISE PROVIDED FOR
    • B23QDETAILS, COMPONENTS, OR ACCESSORIES FOR MACHINE TOOLS, e.g. ARRANGEMENTS FOR COPYING OR CONTROLLING; MACHINE TOOLS IN GENERAL CHARACTERISED BY THE CONSTRUCTION OF PARTICULAR DETAILS OR COMPONENTS; COMBINATIONS OR ASSOCIATIONS OF METAL-WORKING MACHINES, NOT DIRECTED TO A PARTICULAR RESULT
    • B23Q17/00Arrangements for observing, indicating or measuring on machine tools
    • B23Q17/24Arrangements for observing, indicating or measuring on machine tools using optics or electromagnetic waves

Definitions

  • The present invention relates to a workpiece position detection method and a machine tool.
  • In a machine tool, a feed axis is numerically controlled so that a tool moves along a preset tool path relative to a workpiece, machining the workpiece into a desired shape. Therefore, in order to accurately set the tool path for a work actually installed on the machine tool, it is necessary to know, before machining, at which position the work is actually mounted on the machine tool.
  • For example, Patent Document 1 (JP 2010-058264 A) discloses a device for checking a machining state in a machine tool. The apparatus includes: actual imaging means that images a workpiece from a preset viewpoint to generate two-dimensional actual image data of the workpiece; model data storage means that stores data including a three-dimensional model of the workpiece; virtual image generating means that generates two-dimensional virtual image data of the work from the same viewpoint as the actual imaging means, based on the three-dimensional model; and comparing means that compares the two-dimensional actual image data with the two-dimensional virtual image data. With such a configuration, the positional deviation of the work is detected.
  • The environment in which a machine tool is installed can have various lighting conditions. Therefore, when the position of the work is detected by imaging means, an image of part of the work may not be obtained due to various factors such as the light environment and the material of the work (so-called overexposure or underexposure). If a feature shape required to specify the position and orientation of the work lies in an area where no image can be obtained, the position and orientation of the work may not be detected accurately.
  • The object of the present invention is to improve the accuracy of position detection of a work in view of the above problems.
  • One aspect of the present disclosure is a work position detection method for a machine tool, comprising: a step of reading shape data of the work; a step of imaging the work from a first direction to obtain a first image of the work; a step of dividing the first image into a first area having a luminance value within a predetermined range and a second area having a luminance value outside the predetermined range; a step of comparing the shape data with the first image in the first area, whereby the position and orientation of the work in the machine coordinate system of the machine tool are determined; a step of determining, based on the shape data arranged at the determined position and orientation, whether the shape data includes a characteristic shape in the second area; a step of imaging the work from a second direction different from the first direction to obtain a second image of the work; and a step of collating the shape data with the second image in the second area when it is determined that the shape data includes the characteristic shape in the second area.
  • In the work position detection method according to this aspect, the first image is divided into a first area having a luminance value within a predetermined range (that is, an area where the camera can acquire an image) and a second area having a luminance value outside the predetermined range (that is, an area where the camera cannot acquire an image). Then, the position and orientation of the work are obtained using the shape data and the first image in the first area. Furthermore, when the shape data includes the characteristic shape in the second area, a second image captured from a different direction (that is, a direction from which the camera can recognize the second area as an image) is used to collate the shape data with the image in the second area. With such a configuration, even when there is a second area for which no image can be obtained from the first direction, the characteristic shape in that area can be confirmed, and the accuracy of position detection of the work can be improved.
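To make the region-splitting step concrete, here is a minimal sketch in Python, assuming an 8-bit grayscale image; the patent does not prescribe specific luminance bounds, so the function name and thresholds below are hypothetical placeholders.

```python
import numpy as np

def split_by_luminance(image: np.ndarray, lo: int = 5, hi: int = 250):
    """Return boolean masks for the first area (luminance within the
    predetermined range) and the second area (outside it, i.e. over-
    or under-exposed pixels)."""
    first = (image >= lo) & (image <= hi)
    return first, ~first

# Tiny demo: a synthetic image with an overexposed stripe.
img = np.full((4, 6), 128, dtype=np.uint8)
img[:, 4:] = 255                     # simulated glare (overexposure)
ar1, ar2 = split_by_luminance(img)
print(ar2.sum())                     # -> 8 pixels fall in the second area
```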
  • The first image may be captured by a first camera, and the second image may be captured by a second camera different from the first camera. Alternatively, the first image and the second image may be captured by the same movable camera.
  • Another aspect of the present disclosure is a machine tool comprising: a storage device for storing shape data of a work; an imaging device configured to capture a first image, in which the work is imaged from a first direction, and a second image, in which the work is imaged from a second direction different from the first direction; and a control device. The control device is configured to: read the shape data of the work from the storage device; receive the first image from the imaging device; divide the first image into a first area having a luminance value within a predetermined range and a second area having a luminance value outside the predetermined range; compare the shape data with the first image in the first area, whereby the position and orientation of the work in the machine coordinate system of the machine tool are determined; determine, based on the shape data arranged at the determined position and orientation, whether the shape data includes a characteristic shape in the second area; receive the second image from the imaging device; and collate the shape data with the second image in the second area when it is determined that the shape data includes the characteristic shape in the second area. Such a machine tool can likewise improve the accuracy of work position detection.
  • The imaging device may include a first camera configured to capture the first image and a second camera configured to capture the second image. Alternatively, the imaging device may include a movable camera configured to capture both the first image and the second image.
  • According to the present invention, the accuracy of work position detection can be improved.
  • FIG. 1 is a schematic perspective view of a machine tool according to an embodiment.
  • FIG. 2 is a schematic perspective view of the machine tool of FIG. 1 when a door is opened.
  • FIG. 3 is a block diagram of the machine tool according to the embodiment.
  • FIG. 4 is a schematic diagram for explaining a work.
  • FIG. 5 is a schematic diagram showing an example of an image obtained by a first camera.
  • FIG. 6 is a schematic diagram showing an example of an image obtained by a second camera.
  • FIG. 7 is a flowchart showing a work position detection method according to the embodiment.
  • FIG. 8 is a flowchart continuing from FIG. 7.
  • FIG. 9 is a diagram showing a screen for probe measurement.
  • FIG. 10 is a block diagram of a machine tool according to another embodiment.
  • Hereinafter, a work position detection method and a machine tool according to embodiments will be described with reference to the accompanying drawings. Similar or corresponding elements are denoted by the same reference numerals, and redundant description is omitted. The scale of the figures may be altered for ease of understanding.
  • FIG. 1 is a schematic perspective view of the machine tool according to the embodiment.
  • The machine tool 10 includes a processing machine 1 housed in a cover 100. For example, the machine tool 10 can be a vertical machining center, and the processing machine 1 has an X axis set in the horizontal left-right direction, a Y axis in the horizontal front-rear direction, and a Z axis in the vertical direction. The machine tool 10 may be another type of machine tool.
  • The processing machine 1 includes a bed 11 as a base, and a column (not shown) provided upright on the upper surface of the bed 11. The processing machine 1 also includes a spindle head 14 that supports a spindle 13 rotatably about a vertical axis, and a saddle (not shown) that supports the spindle head 14 in front of the column. The spindle head 14 supports the spindle 13 facing downward so that the tip of the spindle 13 faces a table 16, and a tool T is mounted on the tip of the spindle 13. The processing machine 1 further includes the table 16, on which the work W is mounted; the table 16 is provided on the bed 11. The work W may be attached to the table 16 via a pallet, or may be attached directly to the table 16.
  • The processing machine 1 includes a driving device that moves the tool T relative to the work W along the respective feed axes. The driving device includes a servo motor 75 (see FIG. 3) for each feed axis. Referring to FIG. 1, for example, the driving device can move the column in the X-axis direction with respect to the bed 11, move the saddle in the Y-axis direction with respect to the column, and move the spindle head 14 in the Z-axis direction with respect to the saddle. Thus, the processing machine 1 of the present embodiment has three linear motion axes, the X axis, the Y axis and the Z axis, which are orthogonal to each other.
  • the cover 100 has at least a flat upper wall 102 and a front wall 104 and surrounds the entire processing machine 1.
  • the cover 100 has an opening 106 for an operator to access a processing space in which the processing machine 1, in particular, the table 16 and the tool T at the tip of the spindle 13 are arranged.
  • the opening 106 is formed from the upper wall 102 to the front wall 104.
  • the cover 100 has a door 108 for opening and closing the opening 106.
  • An operation panel 110 for operating the processing machine 1 is attached to the front wall 104.
  • the operation panel 110 is provided with a plurality of buttons or input keys 112 for operating the processing machine 1, and a touch panel 114 included in the display / input unit 26 (described later).
  • the door 108 is formed substantially in an L shape, and has an upper door portion 108a and a front door portion 108b.
  • the door 108 is slidable in the X-axis direction.
  • the door 108 may be provided with an interlock, and the processing machine 1 may be configured to operate only when the door 108 is at a position that closes the opening 106.
  • FIG. 2 is a schematic perspective view of the machine tool of FIG. 1 when the door is opened. It should be noted that some components such as columns and saddles are omitted in FIG. 2 for clarity.
  • The machine tool 10 includes an imaging device 120 capable of capturing a first image, in which the work W is imaged from a first direction, and a second image, in which the work W is imaged from a second direction different from the first direction.
  • the imaging device 120 includes a first camera 121 and a second camera 122.
  • the first camera 121 and the second camera 122 may be, for example, a CMOS camera or a CCD camera.
  • the first camera 121 and the second camera 122 may be, for example, a color camera in which each of RGB has 256 gradations, or a black and white camera having 256 gradations.
  • Each of the first camera 121 and the second camera 122 is configured to image a predetermined position in the machine coordinate system of the machine tool 10.
  • the positions in the machine coordinate system captured by the first camera 121 and the second camera 122 can be stored in, for example, the storage device 28 (see FIG. 3).
  • Referring to FIG. 2, the first camera 121 is arranged at a position and orientation for imaging the work W from the first direction, and the second camera 122 is arranged at a position and orientation for imaging the work W from a second direction different from the first direction.
  • each of the first camera 121 and the second camera 122 is mounted above the table 16 so that the table 16 and the work W can be imaged.
  • the first camera 121 and the second camera 122 can be fixed to an arbitrary component (for example, an inner surface of the cover 100 or a splash guard or the like).
  • For example, the first camera 121 and the second camera 122 may be arranged in the cover 100 such that, when viewed in the Z-axis direction, a straight line connecting the first camera 121 and the second camera 122 passes through the center point of the upper surface of the table 16, and such that the optical axes of the two cameras are aligned with this straight line. The angles of the first camera 121 and the second camera 122 when viewed in the horizontal direction may be set, for example, such that their optical axes pass through the center point of the upper surface of the table 16. With such a configuration, all outer surfaces of the work W except its bottom surface are imaged without blind spots.
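As a rough illustration of this camera layout, the sketch below checks, with made-up coordinates (the patent gives none), that two cameras placed on opposite sides of the table have their optical axes aimed at the table-top center and are collinear with it in the XY plane.

```python
import numpy as np

table_center = np.array([0.0, 0.0, 0.0])   # table-top center (assumed origin)
cam1 = np.array([ 400.0, -400.0, 500.0])   # first camera position (illustrative, mm)
cam2 = np.array([-400.0,  400.0, 500.0])   # second camera, opposite side

# Optical axes aimed at the table-top center point.
axis1 = (table_center - cam1) / np.linalg.norm(table_center - cam1)
axis2 = (table_center - cam2) / np.linalg.norm(table_center - cam2)

# Viewed along Z, the line joining the cameras passes through the center
# when the XY components of the two positions are collinear (cross == 0).
xy_collinear = cam1[0] * cam2[1] - cam1[1] * cam2[0] == 0.0
print(axis1, axis2, xy_collinear)          # -> ... True
```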
  • FIG. 3 is a block diagram of the machine tool according to the embodiment.
  • the machine tool 10 includes a control device 70 configured to control a feed device of each axis, set processing information, and measure a workpiece.
  • The control device 70 can include, for example, a CPU (Central Processing Unit), a RAM (Random Access Memory), a ROM (Read Only Memory), and an electronic data storage medium such as a hard disk or a solid state drive (SSD), connected to one another via a bus.
  • the control device 70 includes an input unit 71, a read interpretation unit 72, an interpolation calculation unit 73, and a servo motor control unit 74.
  • When a workpiece is machined based on a machining program 76, the operator creates the machining program 76. For example, the machining program 76 can be generated by a CAM (Computer Aided Manufacturing) device or the like based on workpiece shape data including the shape of the workpiece before machining and the target shape.
  • the workpiece shape data can be created by, for example, a CAD (Computer Aided Design) device.
  • Processing information related to processing is input to the input unit 71.
  • the processing information may be manually input to the input unit 71 via the touch panel 114, or may be input to the input unit 71 as electronic data including various information via a communication network.
  • the machine tool 10 processes the workpiece W based on various processing information.
  • the processing information can include, for example, a processing program 76, tool information 77, and work information 78.
  • the processing information may include other information.
  • the machining program 76 includes information on the relative movement of the tool with respect to the workpiece.
  • the machining program 76 includes a command for the machine tool in, for example, a G code and an M code.
  • the tool information 77 includes, for example, information about the tool such as the type of tool such as a drill and an end mill, the diameter of the tool, and the length of the tool.
  • the work information 78 includes information on the work W to be processed.
  • the information on the work W includes, for example, shape data including a shape of the work W before processing, a target shape, and the like.
  • the reading / interpreting unit 72 reads information including the processing program 76 from the input unit 71.
  • the reading interpretation unit 72 outputs a movement command to the interpolation calculation unit 73.
  • the interpolation calculation unit 73 calculates a position command value for each interpolation cycle. For example, the interpolation calculation unit 73 calculates a movement amount for each time interval set based on the movement command.
  • the interpolation calculator 73 outputs the position command value to the servo motor controller 74.
  • the servo motor control unit 74 outputs a current value to the servo motor 75 of each of the X-axis, Y-axis, and Z-axis feed axes based on the position command.
  • A linear position measuring device (not shown), such as a digital scale, is provided on each of the X-axis, Y-axis and Z-axis feed axes, and the positions of the feed axes are fed back to the servo motor control unit 74 so that the servo motor control unit 74 can servo-control the servo motor 75 of each axis.
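The interpolation step in this pipeline can be pictured with a minimal, single-axis sketch that turns a movement command into per-cycle position values. The cycle time and feed rate are illustrative only; a real control interpolates several axes at once.

```python
def interpolate_linear(start, end, feed_mm_per_min, cycle_ms=1.0):
    """Yield per-cycle X positions for a single-axis linear move."""
    step = feed_mm_per_min / 60000.0 * cycle_ms   # mm per interpolation cycle
    n = max(1, round((end - start) / step))       # number of whole cycles
    for i in range(1, n + 1):
        yield start + (end - start) * i / n       # lands exactly on the endpoint

cmds = list(interpolate_linear(0.0, 1.0, feed_mm_per_min=600.0))
print(len(cmds), cmds[0], cmds[-1])               # 100 cycles of 0.01 mm each
```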
  • the machining information can include coordinate information in addition to the machining program 76, the tool information 77, and the work information 78.
  • the coordinate information includes information on coordinates used in the machining program 76.
  • the coordinate system used in the machine tool 10 can include, for example, a machine coordinate system whose origin is a predetermined point of the machine tool, and a work coordinate system whose origin is an arbitrary point of the work.
  • the work coordinate system moves with the movement of the work.
  • the work coordinate system can be determined by setting a relative position with respect to the machine coordinate system.
  • the coordinate system used in the machine tool 10 may include another coordinate system. When a plurality of coordinate systems are used in the machining program 76, information on the plurality of coordinate systems is included in the coordinate information.
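A small sketch of how a work coordinate system relates to the machine coordinate system: a point given in work coordinates is mapped into machine coordinates by the stored relative offset (and, in this sketch, a rotation about the Z axis). The numbers and the helper name are illustrative, not from the patent.

```python
import math

def work_to_machine(p_work, offset, angle_deg=0.0):
    """Map a point from work coordinates into machine coordinates,
    given the work-origin offset and a rotation about Z."""
    a = math.radians(angle_deg)
    x, y, z = p_work
    xr = x * math.cos(a) - y * math.sin(a)
    yr = x * math.sin(a) + y * math.cos(a)
    return (xr + offset[0], yr + offset[1], z + offset[2])

# A work point 10 mm along the work X axis, with an assumed offset/tilt.
print(work_to_machine((10.0, 0.0, 0.0), offset=(250.0, 120.0, 0.0), angle_deg=2.0))
```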
  • the processing information can further include a parameter of the control device.
  • the processing information can include information on processing conditions specified on the operation panel 110.
  • processing conditions can include, for example, an override value relating to the moving speed and information relating to the coolant.
  • the control device 70 further includes a command unit 20, a display / input unit 26, a storage device 28 for storing processing information, a measurement command unit 29, a measurement information acquisition unit 30, and a matching processing unit 31.
  • the command unit 20 generates processing information and outputs it to the input unit 71.
  • For example, the command unit 20 can edit the processing information input to the input unit 71 based on the operator's operation of the touch panel 114, and output the edited information to the input unit 71 as new processing information. Further, the command unit 20 receives processing information such as an override value input on the operation panel 110. In this way, the command unit 20 can newly create and/or edit the processing information.
  • the command unit 20 includes a display command unit 22, a storage command unit 25, and an imaging command unit 27.
  • the display command unit 22 can output a command to the display / input unit 26 to display the processing information, for example.
  • various commands and information are input to the command unit 20 from the display / input unit 26.
  • the storage command unit 25 can output a command to the storage device 28 to store the newly generated processing information and / or the edited processing information.
  • the imaging instruction unit 27 can output an instruction to the imaging device 120 so as to image the work W and the table 16.
  • the imaging instruction unit 27 may be configured to output an instruction to the imaging device 120 based on a signal from the interlock only when the door 108 is closed.
  • the display / input unit 26 is configured to display processing information and to enable an operator to input data and instructions.
  • The display / input unit 26 can include, for example, the touch panel 114 (see FIG. 1), and may include other components.
  • the operator can input data and a command by tapping or clicking an icon on the touch panel 114.
  • the storage device 28 can include, for example, the above-described ROM and / or storage medium. Although the storage device 28 of the present embodiment is arranged inside the control device 70, the storage device 28 may be arranged outside the control device 70. For example, the storage device 28 may include a storage medium such as a memory card and / or a hard disk connected to the control device 70 via a communication network.
  • the measurement command unit 29 receives a command for measuring the workpiece W from the command unit 20.
  • For example, the measurement command unit 29 can receive from the command unit 20 a command for moving the measurement probe 32 attached to the tip of the spindle 13 in order to execute the second position detection (described later) of the work W.
  • the measurement command unit 29 outputs a command for measurement to the reading / interpreting unit 72, and the work W is measured based on the command.
  • the measurement information acquisition unit 30 acquires the measurement value of the work.
  • the measurement information acquisition unit 30 receives a signal from the measurement probe 32. Further, the measurement information acquisition unit 30 receives the coordinate values of the machine coordinate system from the servo motor control unit 74.
  • The measurement information acquisition unit 30 outputs these pieces of information to the command unit 20. For example, when the measurement probe 32 contacts the work W, the measurement information acquisition unit 30 can output to the command unit 20 the skip signal output from the measurement probe 32 and the coordinate value of each axis in the machine coordinate system at that moment.
  • the command unit 20 can calculate the processing information from the measured values as necessary, and can display the calculated processing information on the display / input unit 26.
  • The matching processing unit 31 receives the first image and the second image from the imaging device 120, and receives the shape data of the work W from the storage device 28. The matching processing unit 31 is configured to analyze the first image and divide it into a first area having a luminance value within a predetermined range and a second area having a luminance value outside the predetermined range.
  • For example, when the first camera 121 is a color camera, the matching processing unit 31 can classify an area as the first area when all of the R, G and B luminance values in that area are within the predetermined range (for example, below 250), and can classify an area as the second area when at least one of its R, G and B luminance values is outside the predetermined range. Similarly, when the first camera 121 is a black and white camera, an area whose luminance value is within the predetermined range can be classified as the first area, and an area whose luminance value is outside the predetermined range can be classified as the second area.
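The color-camera rule above ("first area only when all of R, G and B are in range") might look like this in NumPy; the bounds are placeholders, since the patent cites 250 only as an example upper limit.

```python
import numpy as np

def classify_rgb(image: np.ndarray, lo: int = 5, hi: int = 250):
    """image: H x W x 3 uint8. Returns (first_area, second_area) masks."""
    in_range = (image >= lo) & (image <= hi)   # per-channel test
    first = in_range.all(axis=-1)              # all of R, G, B in range
    return first, ~first                       # any channel out -> second area

rgb = np.zeros((2, 2, 3), dtype=np.uint8) + 128
rgb[0, 0] = (255, 130, 128)                    # one saturated channel
first, second = classify_rgb(rgb)
print(second)                                  # [[ True False] [False False]]
```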
  • the matching processing unit 31 compares the shape data and the first image in the first area, and additionally compares the shape data and the second image in the second area if necessary. Thus, the position and orientation of the workpiece W in the machine coordinate system of the machine tool 10 are determined.
  • the matching processing unit 31 can obtain the shape of the work W by, for example, binarizing the image data and extracting pixels serving as edges.
  • the matching processing unit 31 can obtain the dimensions of the work W by counting the number of pixels in the X-axis, Y-axis, and Z-axis directions, for example.
  • In the comparison, the matching processing unit 31 can calculate, for example, the position and orientation of the work W at which the read shape data and the shape of the work W obtained from the image data best match.
  • the matching processing unit 31 may determine that the calculated position and orientation of the workpiece W are correct when the degree of coincidence exceeds a predetermined threshold. Since the degree of coincidence can be obtained by various known methods, detailed description is omitted in this specification.
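As a toy illustration of the binarize / edge-extraction / degree-of-coincidence flow (the patent leaves the exact measure to known methods), the sketch below scores a candidate pose by the fraction of model edge pixels that coincide with image edge pixels and compares it against a threshold.

```python
import numpy as np

def edges(binary: np.ndarray) -> np.ndarray:
    """Mark pixels whose 4-neighbourhood changes value (crude edge map)."""
    e = np.zeros_like(binary, dtype=bool)
    e[:-1, :] |= binary[:-1, :] != binary[1:, :]
    e[:, :-1] |= binary[:, :-1] != binary[:, 1:]
    return e

def coincidence(model_edges: np.ndarray, image_edges: np.ndarray) -> float:
    """Fraction of model edge pixels that land on image edge pixels."""
    hits = (model_edges & image_edges).sum()
    return hits / max(model_edges.sum(), 1)

img = np.zeros((8, 8), dtype=bool)
img[2:6, 2:6] = True                           # binarized work silhouette
score = coincidence(edges(img), edges(img))    # perfect overlap -> 1.0
print(score > 0.9)                             # exceeds an example threshold
```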
  • When the work W is attached to the table 16, it may be attached at an arbitrary position on the table 16. Therefore, in order to accurately set the tool path for the work W actually installed on the machine tool 10, it is necessary to know, before machining, at which position and in which orientation the work W is actually mounted in the machine coordinate system of the machine tool 10.
  • the machine tool 10 is configured to detect the position and the orientation of the workpiece W in the machine coordinate system of the machine tool 10 before machining.
  • Specifically, the first position detection of the work W is performed by the first camera 121 and the second camera 122, and subsequently the second position detection, which is more accurate than the first position detection, is performed by the measurement probe 32.
  • the machine tool 10 is configured to rapidly feed the measurement probe 32 to the vicinity of the work W in the second position detection based on the position and the orientation of the work W obtained by the first position detection. With such a configuration, the position and orientation of the work W can be detected quickly and accurately.
  • By this position detection, an offset value of the work coordinate system with respect to the machine coordinate system and an inclination of the work coordinate system with respect to the machine coordinate system can be obtained.
  • FIG. 4 is a schematic diagram for explaining a work.
  • the work W has an arc-shaped surface SF.
  • One half of the surface SF (the right half in FIG. 4) includes a characteristic shape CS.
  • The characteristic shape CS is a shape that can serve as a mark when specifying the orientation of the work W, and can be, for example, a shape including a ridge line and/or a valley line (for example, a protrusion or a groove).
  • FIG. 5 is a schematic diagram showing an example of an image obtained by the first camera 121, namely an image of the work W of FIG. 4. In FIG. 5, an area AR1 indicates the first area having a luminance value within the predetermined range (that is, an area where the first camera 121 can acquire an image), and an area AR2 indicates the second area having a luminance value outside the predetermined range (that is, an area where the first camera 121 cannot acquire an image).
  • a line that cannot be recognized as an image by the first camera 121 is indicated by a broken line.
  • For example, when the light L1 reflected from the right half of the surface SF toward the first camera 121 is intense, the area AR2 where the light L1 is reflected may have such a high luminance value that the first camera 121 cannot recognize it as an image (so-called overexposure). In this case, the characteristic shape CS included in the surface SF may not be recognized as an image by the first camera 121.
  • Note that, except for the characteristic shape CS, the work W is symmetric in the left-right direction and the front-back direction in FIG. 4. Therefore, when the characteristic shape CS is not detected, the position and orientation of the work W may not be correctly detected.
  • FIG. 6 is a schematic diagram showing an example of an image obtained by the second camera 122. Since the light L2 incident on the second camera 122 from the right half of the surface SF is weak, the characteristic shape CS included in the surface SF is recognized as an image by the second camera 122, as shown in FIG. 6. Therefore, the machine tool 10 is configured to perform the first position detection using both the first image as shown in FIG. 5 and the second image as shown in FIG. 6. FIGS. 4 to 6 illustrate a case where an image cannot be partially obtained due to overexposure; however, the present disclosure is also applicable to a case where an image cannot be partially obtained due to a low luminance value (so-called underexposure).
  • The machine tool 10 of the present embodiment is configured so that the operator can select manual or automatic measurement of the work.
  • the screen and the measurement program that support the automatic measurement can be stored in the storage device 28.
  • the screen supporting the manual measurement can be stored in the storage device 28.
  • FIG. 9 is a diagram showing a screen for probe measurement.
  • a screen 80 as shown in FIG. 9 is displayed on the display / input unit 26.
  • The screen 80 has a measurement point / measurement direction selection area 82, a measurement item selection area 84, and a plurality of call buttons including a model call button 86.
  • When the operator taps or clicks the model call button 86, the shape of the work to be measured is output from the storage device 28 to the display command unit 22 of the command unit 20 and displayed in the measurement point / measurement direction selection area 82.
  • The operator can set measurement points and measurement directions on the work by tapping or clicking desired points P1 to P3 on the work displayed in the measurement point / measurement direction selection area 82 on the touch panel 114.
  • In the measurement item selection area 84, a plurality of measurement method selection icons 84a to 84d indicating work measurement methods are displayed.
  • Icon 84a can be used to measure the angle of the two sides of the workpiece with respect to the machine coordinate system.
  • the icon 84b can be used to measure the length of two opposing sides of a rectangular recess formed in a work.
  • Icon 84c can be used to measure the inner diameter of a circular recess formed in the work.
  • the icon 84d can be used to measure the length of two opposing sides of the rectangular parallelepiped work.
  • the operator can select a desired measurement by tapping or clicking one of the measurement method selection icons 84a to 84d on the touch panel 114. It should be noted that the icons 84a to 84d are merely examples, and the measurement item selection area 84 may include icons used for other measurements.
  • the measurement of the workpiece W is started according to the selected measurement method.
  • the first position detection of the workpiece W is performed by the imaging device 120, and then, the second position detection is performed by the measurement probe 32.
  • FIG. 7 is a flowchart illustrating a work position detection method according to the embodiment.
  • In the first position detection, the work W is fixed at an arbitrary position on the table 16.
  • the matching processing unit 31 reads out the shape data (the shape data before processing) of the work W from the storage device 28 (Step S100). Subsequently, the imaging instruction unit 27 outputs an instruction to the first camera 121 to image the workpiece W, and acquires a first image (Step S102). The acquired first image is transmitted from the first camera 121 to the matching processing unit 31.
  • Next, the matching processing unit 31 analyzes the first image and divides it into a first area AR1 having a luminance value within the predetermined range and a second area AR2 having a luminance value outside the predetermined range (see FIG. 5) (step S104).
  • matching processing section 31 compares the shape data with the first image data in first area AR1 (step S106). As a result, the position and orientation of the workpiece W are calculated, and the degree of coincidence between the shape data and the image data at this time is calculated.
  • Next, the matching processing unit 31 determines whether the calculated degree of coincidence exceeds a predetermined threshold (step S108). If it is determined in step S108 that the degree of coincidence exceeds the predetermined threshold, the matching processing unit 31 provisionally determines the position and orientation of the work W at this time as the position and orientation in the machine coordinate system (step S110). If it is determined in step S108 that the degree of coincidence does not exceed the predetermined threshold, the matching processing unit 31 repeats steps S106 to S108 until a degree of coincidence exceeding the predetermined threshold is obtained.
  • the matching processing unit 31 can recalculate the degree of coincidence by, for example, changing the position and the direction of the work W.
  • the matching processing unit 31 determines whether or not the second area AR2 has been extracted in step S104 (step S112). If it is determined in step S112 that the second area AR2 has been extracted, subsequently, the matching processing unit 31 determines whether the shape data includes the characteristic shape CS in the second area AR2 ( Step S114). If it is determined in step S114 that the shape data includes the characteristic shape CS in the second area AR2, the control device 70 proceeds to step S118 (FIG. 8).
  • On the other hand, when it is determined in step S112 that the second area AR2 has not been extracted, or when it is determined in step S114 that the shape data does not include the characteristic shape CS in the second area AR2, the matching processing unit 31 determines the position and orientation provisionally determined in step S110 as the position and orientation of the work W in the machine coordinate system (step S116), and ends the series of operations for the first position detection.
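A control-flow sketch of FIG. 7 (steps S100 to S116): every object and method here (storage, camera1, matcher and its methods) is a hypothetical stand-in for the processing described in the text, wired up only to show the ordering of the steps and the loop on the coincidence threshold.

```python
THRESHOLD = 0.9  # illustrative degree-of-coincidence threshold

def first_position_detection(storage, camera1, matcher):
    shape = storage.read_shape_data()                       # S100
    img1 = camera1.capture()                                # S102
    ar1, ar2 = matcher.split_by_luminance(img1)             # S104
    while True:
        pose, score = matcher.compare(shape, img1, ar1)     # S106
        if score > THRESHOLD:                               # S108
            break
    provisional = pose                                      # S110
    if ar2.any() and matcher.has_feature(shape, provisional, ar2):  # S112-S114
        return provisional, "needs_second_image"            # continue at S118
    return provisional, "determined"                        # S116
```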
  • FIG. 8 is a flowchart following FIG. Subsequently, the imaging instruction unit 27 outputs an instruction to the second camera 122 to image the work W, and acquires a second image (Step S118). The acquired second image is transmitted from the second camera 122 to the matching processing unit 31.
  • Next, the matching processing unit 31 arranges the shape data in the second image at the position and orientation provisionally determined in step S110 (step S120). Subsequently, the matching processing unit 31 collates the shape data with the second image in the second area AR2 extracted in step S104 (step S122). Specifically, the matching processing unit 31 calculates the degree of coincidence between the shape data and the image data in the second area AR2. As shown in FIG. 6, the second area AR2 includes the characteristic shape CS; therefore, by calculating the degree of coincidence in consideration of the characteristic shape CS, it is possible to double-check whether the position and orientation provisionally determined in step S110 are correct.
  • matching processing section 31 determines whether or not the degree of coincidence calculated in step S122 exceeds a predetermined threshold (step S124). If it is determined in step S124 that the degree of coincidence exceeds the predetermined threshold, the matching processing unit 31 determines the provisionally determined position and orientation as the position and orientation of the workpiece W in the machine coordinate system (step S126). , The series of operations for the first position detection is terminated.
  • If it is determined in step S124 that the degree of coincidence does not exceed the predetermined threshold, the matching processing unit 31 repeats steps S106 to S124 until a degree of coincidence exceeding the predetermined threshold is obtained.
  • the matching processing unit 31 can recalculate the degree of coincidence by, for example, changing the position and the direction of the work W.
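And the continuation of FIG. 8 (steps S118 to S126), in the same hypothetical style as the previous sketch; on a failed check the patent loops back to the matching of steps S106 to S124, which the sketch abbreviates as a retry with an adjusted pose.

```python
def second_image_check(shape, provisional, ar2, camera2, matcher,
                       threshold=0.9):
    img2 = camera2.capture()                       # S118
    matcher.place(shape, provisional, img2)        # S120
    while True:
        score = matcher.collate(shape, img2, ar2)  # S122
        if score > threshold:                      # S124
            return provisional                     # S126: pose determined
        provisional = matcher.adjust(provisional)  # retry with a new pose
```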
  • the control device 70 executes the second position detection using the measurement probe 32 according to the method selected on the screen 80 (see FIG. 9).
  • In the second position detection, the command unit 20 and the measurement command unit 29 output a command to rapid-feed the measurement probe 32 to the vicinity of the work W based on the position and orientation of the work W obtained by the first position detection. Therefore, the position and orientation of the work W can be detected more quickly and easily than when the operator manually sets the end point of the rapid feed.
  • As described above, in the work position detection method according to the present embodiment, the first image is divided into the first area AR1 having a luminance value within the predetermined range (that is, an area where the first camera 121 can acquire an image) and the second area AR2 having a luminance value outside the predetermined range (that is, an area where the first camera 121 cannot acquire an image). The position and orientation of the work W are then obtained using the shape data and the first image in the first area AR1. Furthermore, when the shape data includes the characteristic shape CS in the second area AR2, the second image, captured from a different direction (that is, a direction from which the second camera 122 can recognize the second area as an image), is used to collate the shape data with the image in the second area AR2. With such a configuration, even when there is a second area AR2 for which no image can be obtained from the first direction, the characteristic shape CS in the area AR2 can be confirmed using the second image captured from the different second direction. Therefore, even when there is an area where an image cannot be obtained from the first direction, the accuracy of position detection of the work W can be improved.
  • Further, in the machine tool 10, the work W is imaged from both the first direction and the second direction. Therefore, for example, by displaying the first image and the second image on the touch panel 114, the operator can visually check the work W without blind spots.
  • In the above embodiment, the second image is acquired after the first image (that is, step S118 is executed after step S102). However, the first image and the second image may be acquired at the same timing (that is, steps S102 and S118 may be executed simultaneously).
  • In the above embodiment, the position and orientation of the work W are detected using the shape data and the image of the work W. However, the position and orientation of the work W may be detected using, in addition to the shape data and the image of the work W, the shape data and the image of the table 16 and/or a jig on the table 16.
  • FIG. 10 is a block diagram of a machine tool according to another embodiment.
  • the machine tool 10A differs from the machine tool 10 in that the imaging device 120 has one movable camera 123.
  • For example, the camera 123 can be configured to move between a first position at which the work W can be imaged from the first direction and a second position at which the work W can be imaged from the second direction. The camera 123 may be configured to move, for example, along a rail laid between the first position and the second position.
  • the camera 123 can have other configurations.
  • the camera 123 may be, for example, a CMOS camera or a CCD camera. Further, the camera 123 may be a color camera or a black and white camera.
  • the machine tool 10A and the work position detection method implemented using the machine tool 10A can provide the same effects as described above.
  • Reference signs: 70 Control device; 120 Imaging device; 121 First camera; 122 Second camera; 123 Movable camera; AR1 First area; AR2 Second area; CS Characteristic shape; W Work

Landscapes

  • Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Machine Tool Sensing Apparatuses (AREA)

Abstract

Provided is a workpiece position detecting method comprising the steps of: reading shape data of a workpiece (W); obtaining a first image by imaging the workpiece (W) from a first direction; dividing the first image into a first region having a luminance value within a predetermined range and a second region having a luminance value outside the predetermined range; comparing the shape data with the first image in the first region, whereby the comparison provides a position and an orientation of the workpiece (W) in a machine coordinate system of a machine tool; determining, on the basis of the shape data arranged at the provided position and orientation, whether or not the shape data contains a characteristic shape (CS) in the second region; obtaining a second image by imaging the workpiece (W) from a second direction; and collating the shape data with the second image in the second region when it is determined that the shape data contains the characteristic shape (CS) in the second region.

Description

ワーク位置検出方法及び工作機械Work position detection method and machine tool
 本発明は、ワーク位置検出方法及び工作機械に関する。 The present invention relates to a work position detection method and a machine tool.
 工作機械では、ワークに対して予め設定された工具経路に沿って工具を移動して、ワークを所望の形状に加工するように送り軸が数値制御される。したがって、工作機械に実際に設置されたワークに対して工具経路を正確に設定するために、工作機械においてどの位置にワークが実際に取り付けられたかを、加工前に知る必要がある。 In a machine tool, a feed axis is numerically controlled so that a tool is moved along a tool path set in advance with respect to a workpiece and the workpiece is machined into a desired shape. Therefore, in order to accurately set a tool path for a work actually installed on the machine tool, it is necessary to know at which position the work is actually mounted on the machine tool before machining.
 例えば、特許文献1は、工作機械における加工状態確認装置を開示している。この装置は、予め設定された視点からワークを撮像してワークの2次元実画像データを生成する実撮像手段と、ワークの3次元モデルを含むデータを記憶するモデルデータ記憶手段と、ワークの3次元モデルに基づいて実撮像手段と同じ視点からワークの2次元仮想画像データを生成する仮想画像生成手段と、2次元実画像データと2次元仮想画像データとを比較する比較手段と、を備えている。このような構成によって、ワークの位置ずれが検出される。 For example, Patent Document 1 discloses a device for checking a machining state in a machine tool. The apparatus includes: an actual imaging unit that captures a workpiece from a preset viewpoint to generate two-dimensional actual image data of the workpiece; a model data storage unit that stores data including a three-dimensional model of the workpiece; Virtual image generating means for generating two-dimensional virtual image data of the work from the same viewpoint as the real imaging means based on the two-dimensional model; and comparing means for comparing the two-dimensional real image data with the two-dimensional virtual image data. I have. With such a configuration, the positional deviation of the work is detected.
特開2010-058264号公報JP 2010-058264 A
 工作機械が設置される環境は、様々な光環境を有し得る。したがって、撮像手段によってワークの位置を検出する場合には、光環境及びワークの材質等の様々な要因によって、ワークの画像が部分的に得られない可能性がある(いわゆる、白とび又は黒つぶれ等)。ワークの位置及び向きを特定するために必要な特徴形状が、画像が得られない領域に含まれている場合、ワークの位置及び向きを正確に検出できない可能性がある。 環境 The environment in which the machine tool is installed can have various lighting environments. Therefore, when the position of the work is detected by the imaging means, there is a possibility that an image of the work may not be partially obtained due to various factors such as a light environment and a material of the work (so-called overexposure or underexposure). etc). If the feature shape required to specify the position and orientation of the work is included in an area where an image cannot be obtained, the position and orientation of the work may not be accurately detected.
 本発明は、上記のような問題を考慮しつつ、ワークの位置検出の精度を向上させることを目的とする。 The object of the present invention is to improve the accuracy of position detection of a work while taking the above problems into consideration.
 本開示の一態様は、工作機械におけるワーク位置検出方法において、ワークの形状データを読み出す工程と、ワークを第1の方向から撮像してワークの第1の画像を得る工程と、第1の画像を、所定の範囲内の輝度値を有する第1の領域と、所定の範囲外の輝度値を有する第2の領域と、に分割する工程と、第1の領域において、形状データと第1の画像とを比較する工程であって、当該比較によって工作機械の機械座標系におけるワークの位置及び向きが求められる、工程と、求められた位置及び向きに配置された形状データに基づいて、形状データが第2の領域に特徴形状を含んでいるか否かを判定する工程と、ワークを第1の方向とは異なる第2の方向から撮像して、ワークの第2の画像を得る工程と、形状データが第2の領域において特徴形状を含んでいると判定された場合に、第2の領域において、形状データと第2の画像とを照合する工程と、を備えるワーク位置検出方法である。 According to one embodiment of the present disclosure, in a work position detection method for a machine tool, a step of reading shape data of the work, a step of imaging the work from a first direction to obtain a first image of the work, and a first image Is divided into a first area having a luminance value within a predetermined range and a second area having a luminance value outside the predetermined range. In the first area, the shape data and the first A step of comparing the image with an image, wherein the position and orientation of the workpiece in the machine coordinate system of the machine tool are determined by the comparison; and the shape data based on the shape data arranged at the determined position and orientation. Determining whether the second region includes a characteristic shape in the second region, imaging the work from a second direction different from the first direction, and obtaining a second image of the work, Data is in the second area If it is determined that comprises the features shape Te in the second region, a step of collating the shape data and the second image, a work position detecting method comprising a.
 本開示の一態様に係るワーク位置検出方法では、第1の画像が、所定の範囲内の輝度値を有する第1の領域(すなわち、カメラが画像を取得できる領域)と、所定の範囲外の輝度値を有する第2の領域(すなわち、カメラが画像を取得できない領域)と、に分割される。そして、第1の領域における形状データと第1の画像とを用いて、ワークの位置及び向きが求められる。さらに、形状データが第2の領域に特徴形状を含んでいる場合には、異なる方向(すなわち、カメラが第2の領域を画像として認識できるような方向)から撮像された第2の画像を用いて、第2の領域における形状データと画像とが照合される。このような構成によって、第1の方向からでは画像が得られない第2の領域がある場合であっても、異なる第2の方向から撮像された第2の画像を用いて、第2の領域における特徴形状を確認することができる。したがって、第1の方向からでは画像が得られない領域がある場合にも、ワークの位置検出の精度を向上させることができる。 In the work position detection method according to an aspect of the present disclosure, the first image includes a first area having a luminance value within a predetermined range (that is, an area where the camera can acquire an image) and a first area having a luminance value outside the predetermined range. And a second area having a luminance value (that is, an area where the camera cannot acquire an image). Then, using the shape data in the first area and the first image, the position and orientation of the work are obtained. Furthermore, when the shape data includes the characteristic shape in the second region, the second image captured from a different direction (that is, a direction in which the camera can recognize the second region as an image) is used. Then, the shape data and the image in the second area are collated. With such a configuration, even when there is a second region in which an image cannot be obtained from the first direction, the second region is captured using the second image captured from a different second direction. Can be confirmed. Therefore, even when there is an area where an image cannot be obtained from the first direction, it is possible to improve the accuracy of work position detection.
 第1の画像は、第1カメラによって撮像されてもよく、第2の画像は、第1カメラとは異なる第2カメラによって撮像されてもよい。代替的に、第1の画像及び第2の画像は、移動可能な同一のカメラによって撮像されてもよい。 The first image may be captured by the first camera, and the second image may be captured by a second camera different from the first camera. Alternatively, the first image and the second image may be captured by the same mobile camera.
 本開示の他の態様は、工作機械において、ワークの形状データを記憶するための記憶装置と、第1の画像及び第2の画像を撮像するように構成された撮像装置であって、第1の画像では、ワークが第1の方向から撮像され、第2の画像では、ワークが第1の方向とは異なる第2の方向から撮像される、撮像装置と、制御装置と、を備え、制御装置は、記憶装置からワークの形状データを読み出すことと、撮像装置から第1の画像を受け取ることと、第1の画像を、所定の範囲内の輝度値を有する第1の領域と、所定の範囲外の輝度値を有する第2の領域と、に分割することと、第1の領域において、形状データと第1の画像とを比較することであって、当該比較によって工作機械の機械座標系におけるワークの位置及び向きが求められ、求められた位置及び向きに配置された形状データに基づいて、形状データが第2の領域に特徴形状を含んでいるか否かを判定することと、撮像装置から第2の画像を受け取ることと、形状データが第2の領域において特徴形状を含んでいると判定された場合に、第2の領域において、形状データと第2の画像とを照合することと、を実行するように構成されている工作機械である。このような工作機械も同様に、ワークの位置検出の精度を向上させることができる。 Another aspect of the present disclosure is a storage device for storing shape data of a work and an imaging device configured to capture a first image and a second image in a machine tool, In the image of (1), the work is imaged in a first direction, and in the second image, the work is imaged in a second direction different from the first direction. The apparatus reads the shape data of the work from the storage device, receives the first image from the imaging device, and converts the first image into a first area having a brightness value within a predetermined range, A second region having a luminance value outside the range, and comparing the shape data with the first image in the first region. The comparison is performed by the machine coordinate system of the machine tool. The position and orientation of the workpiece at Determining whether or not the shape data includes a characteristic shape in the second region based on the shape data arranged at the set position and orientation; receiving a second image from the imaging device; When the data is determined to include the characteristic shape in the second region, the shape data is compared with the second image in the second region. It is a machine. Similarly, such a machine tool can improve the accuracy of work position detection.
 撮像装置は、第1の画像を撮像するように構成された第1カメラと、第2の画像を撮像するように構成された第2カメラと、を有してもよい。代替的に、撮像装置は、第1の画像及び第2の画像を撮像するように構成された移動可能なカメラを有してもよい。 The imaging device may include a first camera configured to capture a first image and a second camera configured to capture a second image. Alternatively, the imaging device may include a movable camera configured to capture the first image and the second image.
 本発明によれば、ワークの位置検出の精度を向上させることができる。 According to the present invention, the accuracy of work position detection can be improved.
実施形態に係る工作機械の概略斜視図である。1 is a schematic perspective view of a machine tool according to an embodiment. ドアが開けられているときの図1の工作機械の概略斜視図である。FIG. 2 is a schematic perspective view of the machine tool of FIG. 1 when a door is opened. 実施形態に係る工作機械のブロック図である。It is a block diagram of a machine tool concerning an embodiment. ワークを説明するための概略図である。It is a schematic diagram for explaining a work. 第1カメラによって得られる画像の例を示す概略図である。FIG. 4 is a schematic diagram illustrating an example of an image obtained by a first camera. 第2カメラによって得られる画像の例を示す概略図である。FIG. 9 is a schematic diagram illustrating an example of an image obtained by a second camera. 実施形態に係るワーク位置検出方法を示すフローチャートである。5 is a flowchart illustrating a work position detection method according to the embodiment. 図7に続くフローチャートである。It is a flowchart following FIG. プローブ測定のための画面を示す図である。It is a figure showing a screen for probe measurement. 他の実施形態に係る工作機械のブロック図である。It is a block diagram of a machine tool concerning other embodiments.
 以下、添付図面を参照して、実施形態に係るワーク位置検出方法及び工作機械を説明する。同様な又は対応する要素には同一の符号を付し、重複する説明は省略する。理解を容易にするために、図の縮尺は変更されている場合がある。 Hereinafter, a work position detection method and a machine tool according to the embodiment will be described with reference to the accompanying drawings. Similar or corresponding elements are denoted by the same reference numerals, and redundant description will be omitted. The figures may be scaled differently for ease of understanding.
 図1は、実施形態に係る工作機械の概略斜視図である。工作機械10は、カバー100内に収容された加工機1を具備している。例えば、工作機械10は、立形マシニングセンタであることができ、加工機1には、水平左右方向にX軸、水平前後方向にY軸、及び、上下方向にZ軸が設定されている。工作機械10は、他の種類の工作機械であってもよい。 FIG. 1 is a schematic perspective view of the machine tool according to the embodiment. The machine tool 10 includes the processing machine 1 housed in the cover 100. For example, the machine tool 10 can be a vertical machining center, and the processing machine 1 has an X axis in the horizontal direction, a Y axis in the horizontal direction, and a Z axis in the vertical direction. The machine tool 10 may be another type of machine tool.
 加工機1は、基台であるベッド11と、ベッド11の上面に立設されるコラム(不図示)とを備える。加工機1は、主軸13を鉛直軸線周りに回転可能に支持する主軸頭14と、主軸頭14をコラムの前方に支持するサドル(不図示)とを備える。主軸頭14は、主軸13の先端がテーブル16に対向するように主軸13を下向きに支持している。主軸13の先端には工具Tが装着される。加工機1は、ワークWを取り付けるテーブル16を備える。テーブル16は、ベッド11上に設けられている。ワークWは、パレットを介してテーブル16に取り付けられてもよく、又は、直接的にテーブル16に取り付けられてもよい。 The processing machine 1 includes a bed 11 as a base, and a column (not shown) provided upright on the upper surface of the bed 11. The processing machine 1 includes a spindle head 14 that supports the spindle 13 rotatably about a vertical axis, and a saddle (not shown) that supports the spindle head 14 in front of the column. The spindle head 14 supports the spindle 13 downward so that the tip of the spindle 13 faces the table 16. A tool T is mounted on the tip of the main shaft 13. The processing machine 1 includes a table 16 on which the work W is mounted. The table 16 is provided on the bed 11. The work W may be attached to the table 16 via a pallet, or may be attached directly to the table 16.
 加工機1は、それぞれの送り軸に基づいてワークWに対して工具Tを相対的に移動させる駆動装置を備える。駆動装置は、それぞれの送り軸に沿って駆動する各軸のサーボモータ75(図3参照)を含む。図1を参照して、例えば、駆動装置は、ベッド11に対してコラムをX軸方向に移動させることができ、コラムに対してサドルをY軸方向に移動させることができ、サドルに対して主軸頭14をZ軸方向に移動させることができる。このように、本実施形態の加工機1は、互いに直交するX軸、Y軸及びZ軸の3つの直動軸を有する。 The processing machine 1 includes a driving device that moves the tool T relative to the workpiece W based on each feed axis. The drive includes a servomotor 75 (see FIG. 3) for each axis that drives along the respective feed axis. With reference to FIG. 1, for example, the driving device can move the column in the X-axis direction with respect to the bed 11, move the saddle with respect to the column in the Y-axis direction, The spindle head 14 can be moved in the Z-axis direction. As described above, the processing machine 1 of the present embodiment has three linear motion axes of the X axis, the Y axis, and the Z axis that are orthogonal to each other.
 カバー100は、少なくとも平板状の上壁102と前壁104とを有し、加工機1の全体を包囲する。カバー100は、オペレータが加工機1、特にテーブル16及び主軸13の先端の工具Tが配置される加工空間にアクセスするための開口部106を有している。開口部106は上壁102から前壁104へかけて形成されている。カバー100は、開口部106を開閉するドア108を有している。前壁104には、加工機1を操作する操作盤110が取り付けられている。例えば、操作盤110には、加工機1を操作する複数の釦又は入力キー112と、表示・入力部26(後述)に含まれるタッチパネル114と、が配設されている。 The cover 100 has at least a flat upper wall 102 and a front wall 104 and surrounds the entire processing machine 1. The cover 100 has an opening 106 for an operator to access a processing space in which the processing machine 1, in particular, the table 16 and the tool T at the tip of the spindle 13 are arranged. The opening 106 is formed from the upper wall 102 to the front wall 104. The cover 100 has a door 108 for opening and closing the opening 106. An operation panel 110 for operating the processing machine 1 is attached to the front wall 104. For example, the operation panel 110 is provided with a plurality of buttons or input keys 112 for operating the processing machine 1, and a touch panel 114 included in the display / input unit 26 (described later).
 ドア108は、概ねL字形に形成されており、上ドア部分108aと、前ドア部分108bとを有している。ドア108は、X軸方向にスライド可能である。ドア108には、インターロックが設けられてもよく、加工機1は、ドア108が開口部106を閉鎖する位置にあるときのみ、動作するように構成されていてもよい。 The door 108 is formed substantially in an L shape, and has an upper door portion 108a and a front door portion 108b. The door 108 is slidable in the X-axis direction. The door 108 may be provided with an interlock, and the processing machine 1 may be configured to operate only when the door 108 is at a position that closes the opening 106.
 図2は、ドアが開けられているときの図1の工作機械の概略斜視図である。なお、図2では、明確さのために、コラム及びサドル等のいくつかの構成要素が省略されていることに留意されたい。工作機械10は、ワークWが第1の方向から撮像される第1の画像と、ワークWが第1の方向とは異なる第2の方向から撮像される第2の画像と、を撮像することができる、撮像装置120を備えている。本実施形態では、撮像装置120は、第1カメラ121と、第2カメラ122と、を具備している。第1カメラ121及び第2カメラ122は、例えば、CMOSカメラ又はCCDカメラであってもよい。また、第1カメラ121及び第2カメラ122は、例えば、RGBの各々が256階調を有するカラーカメラであってもよく、又は、256階調を有する白黒カメラであってもよい。 FIG. 2 is a schematic perspective view of the machine tool of FIG. 1 when the door is opened. It should be noted that some components such as columns and saddles are omitted in FIG. 2 for clarity. The machine tool 10 captures a first image in which the workpiece W is captured from a first direction and a second image in which the workpiece W is captured from a second direction different from the first direction. The imaging device 120 is provided. In the present embodiment, the imaging device 120 includes a first camera 121 and a second camera 122. The first camera 121 and the second camera 122 may be, for example, a CMOS camera or a CCD camera. In addition, the first camera 121 and the second camera 122 may be, for example, a color camera in which each of RGB has 256 gradations, or a black and white camera having 256 gradations.
 第1カメラ121及び第2カメラ122の各々は、工作機械10の機械座標系における所定の位置を撮像するように構成されている。第1カメラ121及び第2カメラ122により撮像される機械座標系における位置は、例えば、記憶装置28(図3参照)に保存されることができる。図2を参照して、第1カメラ121は、第1の方向からワークWを撮像するような位置及び向きで配置されており、第2カメラ122は、第1の方向とは異なる第2の方向からワークWを撮像するような位置及び向きで配置されている。 Each of the first camera 121 and the second camera 122 is configured to image a predetermined position in the machine coordinate system of the machine tool 10. The positions in the machine coordinate system captured by the first camera 121 and the second camera 122 can be stored in, for example, the storage device 28 (see FIG. 3). Referring to FIG. 2, first camera 121 is arranged at a position and an orientation that captures image of work W from a first direction, and second camera 122 is arranged at a second position different from the first direction. The work W is arranged in such a position and orientation as to image the work W from the direction.
 具体的には、第1カメラ121及び第2カメラ122の各々は、テーブル16及びワークWを撮像可能なように、テーブル16の上方に取り付けられている。第1カメラ121及び第2カメラ122は、任意の構成要素(例えば、カバー100の内面、又は、スプラッシュガード等)に固定されることができる。例えば、第1カメラ121及び第2カメラ122は、Z軸方向に見た場合に、第1カメラ121及び第2カメラ122を結ぶ直線がテーブル16の上面の中心点を通るように、かつ、第1カメラ121及び第2カメラ122の光軸が当該直線と整列するように、カバー100内に配置されてもよい。水平方向に見た場合の第1カメラ121及び第2カメラ122の角度は、例えば、第1カメラ121及び第2カメラ122の光軸がテーブル16の上面の中心点を通るように設定されてもよい。このような構成によって、ワークWの底面を除く全ての外表面が死角無く撮像される。 Specifically, each of the first camera 121 and the second camera 122 is mounted above the table 16 so that the table 16 and the work W can be imaged. The first camera 121 and the second camera 122 can be fixed to an arbitrary component (for example, an inner surface of the cover 100 or a splash guard or the like). For example, the first camera 121 and the second camera 122 are arranged such that a straight line connecting the first camera 121 and the second camera 122 passes through the center point of the upper surface of the table 16 when viewed in the Z-axis direction, and The first camera 121 and the second camera 122 may be arranged in the cover 100 such that the optical axes of the first camera 121 and the second camera 122 are aligned with the straight line. The angles of the first camera 121 and the second camera 122 when viewed in the horizontal direction may be set, for example, such that the optical axes of the first camera 121 and the second camera 122 pass through the center point on the upper surface of the table 16. Good. With such a configuration, all outer surfaces except the bottom surface of the work W are imaged without blind spots.
FIG. 3 is a block diagram of the machine tool according to the embodiment. The machine tool 10 includes a control device 70 configured to control the feed devices of the respective axes, to set machining information, and to measure the workpiece. The control device 70 can include, for example, a CPU (Central Processing Unit), a RAM (Random Access Memory), a ROM (Read Only Memory), and an electronic data storage medium such as a hard disk or a solid state drive (SSD), connected to one another via a bus.

The control device 70 includes an input unit 71, a reading/interpreting unit 72, an interpolation calculation unit 73, and a servo motor control unit 74. When a workpiece is to be machined based on a machining program 76, the operator creates the machining program 76. The machining program 76 can be generated, for example, by a CAM (Computer Aided Manufacturing) device or the like based on workpiece shape data including the shape of the workpiece before machining and its target shape. The workpiece shape data can be created, for example, by a CAD (Computer Aided Design) device.

Machining information related to machining is input to the input unit 71. The machining information may be input to the input unit 71 manually via the touch panel 114, or may be input as electronic data containing various kinds of information via a communication network. The machine tool 10 machines the workpiece W based on the machining information. The machining information can include, for example, the machining program 76, tool information 77, and workpiece information 78, and may include other information as well.

The machining program 76 includes information such as the relative movement of the tool with respect to the workpiece, and contains commands for the machine tool expressed, for example, in G codes and M codes. The tool information 77 includes information about the tool, for example, the type of tool (such as a drill or an end mill), the tool diameter, and the tool length. The workpiece information 78 includes information on the workpiece W to be machined, for example, shape data including the shape of the workpiece W before machining and its target shape.

The reading/interpreting unit 72 reads information including the machining program 76 from the input unit 71 and outputs movement commands to the interpolation calculation unit 73. The interpolation calculation unit 73 calculates a position command value for each interpolation cycle; for example, it calculates the movement amount for each time interval set based on the movement command. The interpolation calculation unit 73 outputs the position command values to the servo motor control unit 74. The servo motor control unit 74 outputs current values to the servo motors 75 of the X-, Y-, and Z-axis feed axes based on the position commands. So that the servo motor control unit 74 can servo-control the servo motor 75 of each axis, a linear position measuring device (not shown), such as a digital scale, is provided on each of the X-, Y-, and Z-axis feed axes, and the positions of these feed axes are fed back to the servo motor control unit 74.
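For readers unfamiliar with interpolation cycles, the following minimal sketch shows the kind of computation the interpolation calculation unit 73 performs for a straight move at a constant feed rate. The function, units, and cycle time are illustrative assumptions, not the controller's actual implementation.

```python
import numpy as np

def interpolate_linear(start, end, feed_mm_per_min, cycle_ms=1.0):
    """Yield one position command per interpolation cycle for a
    straight move from start to end at a constant feed rate.
    Names and units are illustrative only."""
    start, end = np.asarray(start, float), np.asarray(end, float)
    step = feed_mm_per_min / 60.0 * (cycle_ms / 1000.0)  # mm per cycle
    vec = end - start
    dist = np.linalg.norm(vec)
    if dist == 0.0:
        return [end]
    unit = vec / dist
    n = int(dist // step)
    commands = [start + unit * step * (i + 1) for i in range(n)]
    commands.append(end)  # final command lands exactly on the target
    return commands

# Example: a G01 move to X10 at F600 (600 mm/min) with a 1 ms cycle
# produces position commands spaced 0.01 mm apart.
cmds = interpolate_linear([0, 0, 0], [10, 0, 0], feed_mm_per_min=600)
print(len(cmds), cmds[0], cmds[-1])
```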
In addition to the machining program 76, the tool information 77, and the workpiece information 78, the machining information can include coordinate information, that is, information on the coordinates used in the machining program 76. The coordinate systems used in the machine tool 10 can include, for example, a machine coordinate system whose origin is a predetermined point of the machine tool, and a workpiece coordinate system whose origin is an arbitrary point of the workpiece. The workpiece coordinate system moves together with the workpiece and can be defined by setting its position relative to the machine coordinate system. Other coordinate systems may also be used. When a plurality of coordinate systems are used within the machining program 76, information on each of them is included in the coordinate information.

The machining information can further include parameters of the control device. For example, the machining information can include information on machining conditions specified on the operation panel 110, such as an override value for the movement speed and information relating to the coolant.

The control device 70 further includes a command unit 20, a display/input unit 26, a storage device 28 for storing the machining information, a measurement command unit 29, a measurement information acquisition unit 30, and a matching processing unit 31.

The command unit 20 generates machining information and outputs it to the input unit 71. For example, the command unit 20 can edit the machining information input to the input unit 71 based on the operator's operation of the touch panel 114, and output the edited machining information to the input unit 71 as new machining information. The command unit 20 also receives machining information input on the operation panel 110, such as override values. In this way, the command unit 20 can newly create and/or edit machining information.

The command unit 20 includes a display command unit 22, a storage command unit 25, and an imaging command unit 27. The display command unit 22 can output a command to the display/input unit 26 to display, for example, the machining information. When the operator operates the touch panel 114, various commands and information are input from the display/input unit 26 to the command unit 20.

The storage command unit 25 can output a command to the storage device 28 to store newly generated and/or edited machining information. The imaging command unit 27 can output a command to the imaging device 120 to image the workpiece W and the table 16. The imaging command unit 27 may be configured to output this command only while the door 108 is closed, based on a signal from an interlock.

The display/input unit 26 is configured to display the machining information and to allow the operator to input data and commands. The display/input unit 26 can include, for example, the touch panel 114 described above (see FIG. 1), and, referring to FIG. 3, may include other components as well. The operator can input data and commands by tapping or clicking icons on the touch panel 114.

The storage device 28 can include, for example, the ROM and/or storage medium described above. Although the storage device 28 of the present embodiment is arranged inside the control device 70, it may instead be arranged outside the control device 70. For example, the storage device 28 may include a storage medium such as a memory card and/or a hard disk connected to the control device 70 via a communication network.

The measurement command unit 29 receives a command for measuring the workpiece W from the command unit 20. For example, the measurement command unit 29 can receive from the command unit 20 a command for moving the measurement probe 32 attached to the tip of the spindle 13 in order to execute the second position detection (described later) of the workpiece W. The measurement command unit 29 outputs a measurement command to the reading/interpreting unit 72, and the workpiece W is measured based on that command.

The measurement information acquisition unit 30 acquires measured values of the workpiece. It receives signals from the measurement probe 32, receives the coordinate values of the machine coordinate system from the servo motor control unit 74, and outputs this information to the command unit 20. For example, when the measurement probe 32 contacts the workpiece W, the measurement information acquisition unit 30 can output to the command unit 20 the skip signal output by the measurement probe 32 together with the coordinate values of each axis in the machine coordinate system at that moment. The command unit 20 can calculate machining information from the measured values as necessary and display the calculated machining information on the display/input unit 26.

The matching processing unit 31 receives the first image and the second image from the imaging device 120, and receives the shape data of the workpiece W from the storage device 28. The matching processing unit 31 is configured to analyze the first image and divide it into a first region having luminance values within a predetermined range and a second region having luminance values outside the predetermined range.

Specifically, for example, when the first camera 121 is a color camera, the matching processing unit 31 can classify a region as the first region when all of its R, G, and B luminance values lie within the predetermined range (for example, greater than 5 and less than 250). Conversely, when at least one of the R, G, and B luminance values of a region lies outside the predetermined range, the matching processing unit 31 can classify that region as the second region.

Likewise, for example, when the first camera 121 is a monochrome camera, the matching processing unit 31 can classify a region as the first region when its luminance value lies within the predetermined range (for example, greater than 5 and less than 250), and as the second region when its luminance value lies outside that range. It should be noted that the numerical values above are merely examples, and the predetermined range may take other values.
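A minimal sketch of this region split, assuming an 8-bit image held as a NumPy array; the thresholds 5 and 250 follow the example values above, and the function name is hypothetical.

```python
import numpy as np

def split_regions(image, lo=5, hi=250):
    """Split an 8-bit image into region AR1 (usable luminance) and
    region AR2 (over- or under-exposed). Returns two boolean masks.

    image: HxW array (monochrome) or HxWx3 array (R, G, B channels).
    A color pixel belongs to AR1 only if *all* channels lie inside
    the open interval (lo, hi), mirroring the rule in the text.
    """
    img = np.asarray(image)
    in_range = (img > lo) & (img < hi)
    if img.ndim == 3:            # color: require every channel in range
        ar1 = in_range.all(axis=-1)
    else:                        # monochrome
        ar1 = in_range
    ar2 = ~ar1                   # everything else is AR2
    return ar1, ar2

# Example: a washed-out (overexposed) patch falls into AR2.
frame = np.full((4, 4), 128, dtype=np.uint8)
frame[:2, :2] = 255
ar1, ar2 = split_regions(frame)
print(ar2.sum())  # -> 4 overexposed pixels
```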
The matching processing unit 31 is further configured to compare the shape data with the first image in the first region and, where necessary, additionally collate the shape data with the second image in the second region, thereby determining the position and orientation of the workpiece W in the machine coordinate system of the machine tool 10.

Specifically, the matching processing unit 31 can obtain the shape of the workpiece W, for example, by binarizing the image data and extracting the pixels that form edges. It can also obtain the dimensions of the workpiece W, for example, by counting the number of pixels in the X-, Y-, and Z-axis directions. The matching processing unit 31 can then calculate the position and orientation of the workpiece W at which the shape and dimensions obtained from the image data best match the shape data. The matching processing unit 31 may judge that the calculated position and orientation of the workpiece W are correct when the degree of coincidence exceeds a predetermined threshold. Since the degree of coincidence can be obtained by various known methods, a detailed description is omitted here.
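The specification leaves the degree of coincidence to known methods. As one possible reading, the sketch below binarizes an image, extracts edge pixels, and scores a candidate pose by the fraction of model edge pixels that land on image edge pixels inside a given region; the edge-overlap score and all names are illustrative assumptions, not the claimed method.

```python
import numpy as np

def edges(binary):
    """Edge pixels of a binary mask: foreground pixels with at least
    one background 4-neighbor (image-border pixels count as edges)."""
    b = np.asarray(binary).astype(bool)
    interior = np.zeros_like(b)
    interior[1:-1, 1:-1] = (b[1:-1, 1:-1] & b[:-2, 1:-1] & b[2:, 1:-1]
                            & b[1:-1, :-2] & b[1:-1, 2:])
    return b & ~interior

def coincidence(image, threshold, rendered_model, region_mask):
    """Degree of coincidence inside one region: fraction of model edge
    pixels that coincide with image edge pixels. 'rendered_model' is a
    binary silhouette of the shape data placed at a candidate pose."""
    img_edges = edges(image > threshold)          # binarize, then edges
    mdl_edges = edges(rendered_model) & region_mask
    n = mdl_edges.sum()
    return (img_edges & mdl_edges).sum() / n if n else 0.0
```

A pose search would render the shape data as a silhouette at each candidate position and orientation and keep the pose with the highest score.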
Next, position detection of the workpiece W using the first camera 121 and the second camera 122 will be described.

Referring to FIG. 2, when the workpiece W is attached to the table 16 of the machine tool 10, it may be attached at an arbitrary position on the table 16. Therefore, in order to set the tool path accurately for the workpiece W actually installed on the machine tool 10, it is necessary to know, before machining, at which position in the machine coordinate system of the machine tool 10 the workpiece W is actually mounted.

The machine tool 10 is therefore configured to detect the position and orientation of the workpiece W in its machine coordinate system prior to machining. The machine tool 10 of the present embodiment first performs a first position detection of the workpiece W using the first camera 121 and the second camera 122, and then performs a second, more accurate position detection using the measurement probe 32. Based on the position and orientation of the workpiece W obtained by the first position detection, the machine tool 10 rapid-feeds the measurement probe 32 to the vicinity of the workpiece W in the second position detection. With this configuration, the position and orientation of the workpiece W can be detected quickly and accurately. By detecting the position and orientation of the workpiece W in the machine coordinate system, it is possible to obtain, for example, the offset of the workpiece coordinate system with respect to the machine coordinate system and the inclination of the workpiece coordinate system with respect to the machine coordinate system.
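As a worked example of the last point, the sketch below derives the workpiece-coordinate-system offset and maps work coordinates into machine coordinates, assuming a planar pose (translation plus rotation about the Z axis only); the numerical pose is invented for illustration.

```python
import math

# Detected pose of the workpiece in machine coordinates (illustrative):
# reference corner at (x, y), workpiece rotated by theta about Z.
x, y, theta_deg = 152.4, 87.1, 2.3

# Offset of the work coordinate system = machine coordinates of its origin.
offset = (x, y)

# Inclination = rotation of the work axes relative to the machine axes.
theta = math.radians(theta_deg)

def work_to_machine(px, py):
    """Map a point given in work coordinates into machine coordinates."""
    mx = offset[0] + px * math.cos(theta) - py * math.sin(theta)
    my = offset[1] + px * math.sin(theta) + py * math.cos(theta)
    return mx, my

print(work_to_machine(10.0, 0.0))  # a point 10 mm along the work X axis
```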
FIG. 4 is a schematic diagram for explaining the workpiece. The workpiece W has an arc-shaped surface SF, and one half of the surface SF (the right half in FIG. 4) includes a characteristic shape CS. The characteristic shape CS can be any shape that can serve as a landmark when identifying the orientation of the workpiece W, for example, a shape including a ridge line and/or a valley line (such as a protrusion or a groove).

Part of the light L0 incident from above on the right half of the surface SF is reflected as light L1 toward the first camera 121, and another part is reflected as light L2 toward the second camera 122. Because the right half of the surface SF is inclined so as to face the first camera 121, the light L1 toward the first camera 121 is reflected more strongly. In contrast, the light L2 toward the second camera 122 is reflected more weakly.

FIG. 5 is a schematic diagram showing an example of the image obtained by the first camera 121 in the situation of FIG. 4. In FIG. 5, the region AR1 is the first region having luminance values within the predetermined range (that is, the region in which the first camera 121 can acquire an image), and the region AR2 is the second region having luminance values outside the predetermined range (that is, the region in which the first camera 121 cannot acquire an image). Lines that the first camera 121 cannot recognize as an image are drawn as broken lines in FIG. 5. Depending on the light environment in which the workpiece W is installed and the material of the workpiece W (for example, metal), the light L1 may have such a high luminance that the first camera 121 cannot recognize the region AR2, where the light L1 is reflected, as an image (so-called overexposure). In that case, the characteristic shape CS included in the surface SF may not be recognized as an image by the first camera 121. Since the workpiece W is symmetric in the left-right and front-back directions in FIG. 5, the position and orientation of the workpiece W may not be detected correctly if the characteristic shape CS is not detected.

FIG. 6 is a schematic diagram showing an example of the image obtained by the second camera 122. Since the light L2 incident on the second camera 122 from the right half of the surface SF is weak, the characteristic shape CS included in the surface SF is recognized as an image by the second camera 122, as shown in FIG. 6. The machine tool 10 is therefore configured to perform the first position detection using both the first image as shown in FIG. 5 and the second image as shown in FIG. 6. Although FIGS. 4 to 6 illustrate the case where an image is partially unobtainable due to overexposure, it should be noted that the present disclosure is also applicable to the case where an image is partially unobtainable due to low luminance values (so-called underexposure).

The machine tool 10 of the present embodiment is configured so that either manual or automatic measurement of the workpiece can be selected. Screens and measurement programs that support automatic measurement, as well as screens that support manual measurement, can be stored in the storage device 28.

FIG. 9 is a diagram showing a screen for probe measurement. When the workpiece W is to be measured automatically, a screen 80 as shown in FIG. 9 is displayed on the display/input unit 26, for example by operating the operation panel 110. The screen 80 has a measurement point/measurement direction selection area 82, a measurement item selection area 84, and a plurality of call buttons including a model call button 86. When the operator taps or clicks the model call button 86, the shape of the workpiece to be measured is output from the storage device 28 to the display command unit 22 of the command unit 20 and displayed in the measurement point/measurement direction selection area 82. The operator can then set measurement points and measurement directions on the workpiece, for example by tapping or clicking desired points P1 to P3 on the workpiece displayed in the measurement point/measurement direction selection area 82 on the touch panel 114.

In the measurement item selection area 84, a plurality of measurement method selection icons 84a to 84d indicating workpiece measurement methods are displayed. The icon 84a can be used to measure the angles of two side surfaces of the workpiece with respect to the machine coordinate system. The icon 84b can be used to measure the lengths of two opposing sides of a rectangular recess formed in the workpiece. The icon 84c can be used to measure the inner diameter of a circular recess formed in the workpiece. The icon 84d can be used to measure the lengths of two opposing sides of a rectangular parallelepiped workpiece. The operator can select the desired measurement by tapping or clicking one of the measurement method selection icons 84a to 84d on the touch panel 114. It should be noted that the icons 84a to 84d are merely examples, and the measurement item selection area 84 may include icons used for other measurements.

When the operator presses a measurement start button (not shown) on the operation panel 110, measurement of the workpiece W is started according to the selected measurement method. In the machine tool 10 of the present embodiment, as described above, the first position detection of the workpiece W is performed first by the imaging device 120, followed by the second position detection by the measurement probe 32.
Next, the method of the first position detection of the workpiece W performed by the imaging device 120 will be described. FIG. 7 is a flowchart showing the workpiece position detection method according to the embodiment. During the first position detection, the workpiece W is fixed at an arbitrary position on the table 16.

In the first position detection, the matching processing unit 31 first reads the shape data of the workpiece W (the shape data before machining) from the storage device 28 (step S100). Subsequently, the imaging command unit 27 outputs a command to the first camera 121 to image the workpiece W, and the first image is acquired (step S102). The acquired first image is transmitted from the first camera 121 to the matching processing unit 31.

Subsequently, the matching processing unit 31 analyzes the first image and divides it into the first region AR1, having luminance values within the predetermined range, and the second region AR2, having luminance values outside the predetermined range (see FIG. 5) (step S104). Referring to FIG. 7, the matching processing unit 31 then compares the shape data with the first image data in the first region AR1 (step S106). The position and orientation of the workpiece W are thereby calculated, together with the degree of coincidence between the shape data and the image data at that position and orientation.

Subsequently, the matching processing unit 31 determines whether the calculated degree of coincidence exceeds a predetermined threshold (step S108). If it is determined in step S108 that the degree of coincidence exceeds the threshold, the matching processing unit 31 provisionally determines the position and orientation of the workpiece W at that point as its position and orientation in the machine coordinate system (step S110).

If it is determined in step S108 that the degree of coincidence does not exceed the threshold, the matching processing unit 31 repeats steps S106 to S108 until a degree of coincidence exceeding the threshold is obtained. The matching processing unit 31 can recalculate the degree of coincidence, for example, by changing the position and orientation of the workpiece W.

Subsequently, the matching processing unit 31 determines whether the second region AR2 was extracted in step S104 (step S112). If it is determined in step S112 that the second region AR2 was extracted, the matching processing unit 31 then determines whether the shape data includes the characteristic shape CS in the second region AR2 (step S114). If it is determined in step S114 that the shape data includes the characteristic shape CS in the second region AR2, the control device 70 proceeds to step S118 (FIG. 8).

Referring to FIG. 7, if it is determined in step S112 that the second region AR2 was not extracted, or if it is determined in step S114 that the shape data does not include the characteristic shape CS in the second region AR2, the matching processing unit 31 finalizes the position and orientation provisionally determined in step S110 as the position and orientation of the workpiece W in the machine coordinate system (step S116), and the sequence of the first position detection ends.

FIG. 8 is a flowchart continuing from FIG. 7. The imaging command unit 27 outputs a command to the second camera 122 to image the workpiece W, and the second image is acquired (step S118). The acquired second image is transmitted from the second camera 122 to the matching processing unit 31.

Subsequently, the matching processing unit 31 places the shape data in the second image at the position and orientation provisionally determined in step S110 (step S120). The matching processing unit 31 then collates the shape data with the second image in the second region AR2 extracted in step S104 (step S122). Specifically, the matching processing unit 31 calculates the degree of coincidence between the shape data and the image data in the second region AR2. As shown in FIG. 6, the second region AR2 contains the characteristic shape CS, so calculating a degree of coincidence that takes the characteristic shape CS into account makes it possible to double-check whether the position and orientation provisionally determined in step S110 are correct.

Referring to FIG. 8, the matching processing unit 31 then determines whether the degree of coincidence calculated in step S122 exceeds a predetermined threshold (step S124). If it is determined in step S124 that the degree of coincidence exceeds the threshold, the matching processing unit 31 finalizes the provisionally determined position and orientation as the position and orientation of the workpiece W in the machine coordinate system (step S126), and the sequence of the first position detection ends.

If it is determined in step S124 that the degree of coincidence does not exceed the threshold, the matching processing unit 31 repeats steps S106 to S124 until a degree of coincidence exceeding the threshold is obtained. The matching processing unit 31 can recalculate the degree of coincidence, for example, by changing the position and orientation of the workpiece W.
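Putting the flowcharts of FIGS. 7 and 8 together, the first position detection can be summarized in the following control-flow sketch. Every callable is passed in as a placeholder for the corresponding processing described above, not an API of the actual control device.

```python
def first_position_detection(shape_data, capture1, capture2, split_regions,
                             match_in_region, has_feature_in,
                             collate_in_region, threshold):
    """Control-flow sketch of steps S100-S126 (FIGS. 7 and 8).
    shape_data corresponds to the data read out in step S100."""
    img1 = capture1()                                  # S102: first image
    ar1, ar2 = split_regions(img1)                     # S104: region split
    while True:
        # S106-S108: match_in_region is assumed to propose a new
        # candidate pose on each call and return (pose, score).
        pose, score = match_in_region(shape_data, img1, ar1)
        if score <= threshold:
            continue                                   # keep searching
        # S110: pose is now the provisional position and orientation.
        if not ar2.any() or not has_feature_in(shape_data, pose, ar2):
            return pose                                # S112/S114 -> S116
        img2 = capture2()                              # S118: second image
        score2 = collate_in_region(shape_data, pose, img2, ar2)  # S120-S122
        if score2 > threshold:                         # S124
            return pose                                # S126: finalize
        # otherwise repeat from S106 with another candidate pose
```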
After the first position detection is completed, the control device 70 executes the second position detection using the measurement probe 32 according to the method selected on the screen 80 (see FIG. 9). At this time, the command unit 20 and the measurement command unit 29 output commands to rapid-feed the measurement probe 32 to the vicinity of the workpiece W based on the position and orientation of the workpiece W obtained in the first position detection. With this configuration, the position and orientation of the workpiece W can be detected more quickly and more easily than when the operator manually sets the end point of the rapid feed.

In the machine tool 10 and the workpiece position detection method according to the embodiment described above, the first image is divided into the first region AR1, having luminance values within the predetermined range (that is, the region in which the first camera 121 can acquire an image), and the second region AR2, having luminance values outside the predetermined range (that is, the region in which the first camera 121 cannot acquire an image). The position and orientation of the workpiece W are then obtained using the shape data and the first image in the first region AR1. Furthermore, when the shape data includes the characteristic shape CS in the second region AR2, the shape data is collated, in the second region AR2, with the second image captured from a different direction (that is, a direction from which the second camera 122 can recognize the second region as an image). With this configuration, even when there is a second region AR2 for which no image can be obtained from the first direction, the characteristic shape CS in the region AR2 can be confirmed using the second image captured from the different, second direction. Therefore, the accuracy of position detection of the workpiece W can be improved even when there is a region AR2 for which no image can be obtained from the first direction.

In the machine tool 10 and the workpiece position detection method according to the embodiment, the workpiece W is imaged from the first direction and the second direction. Therefore, by displaying the first image and the second image on the touch panel 114, for example, the operator can visually check the workpiece W without blind spots.

Although embodiments of the workpiece position detection method and the machine tool have been described, the present invention is not limited to the above embodiments. Those skilled in the art will appreciate that various modifications of the above embodiments are possible, and that the steps of the above method may be performed in an order different from that described, as long as no contradiction arises.

For example, in the above embodiment, the second image is acquired after the first image (that is, step S118 is executed after step S102). Alternatively, the first image and the second image may be acquired simultaneously (that is, steps S102 and S118 may be executed at the same time).

Also, for example, in the above embodiment, the position and orientation of the workpiece W are detected using the shape data and images of the workpiece W. However, the position and orientation of the workpiece W may also be detected using the shape data and images of the table 16 and/or a jig on the table 16 in addition to those of the workpiece W.

Also, for example, FIG. 10 is a block diagram of a machine tool according to another embodiment. This machine tool 10A differs from the machine tool 10 described above in that its imaging device 120 has a single movable camera 123. The camera 123 can be configured to move between a first position, from which it can image the workpiece W from the first direction, and a second position, from which it can image the workpiece W from the second direction. The camera 123 may, for example, be configured to move along a rail laid between the first position and the second position, although other configurations are possible. The camera 123 may be, for example, a CMOS camera or a CCD camera, and may be a color camera or a monochrome camera. The machine tool 10A, and a workpiece position detection method implemented using it, can provide the same effects as described above.
10, 10A  Machine tool
28  Storage device
70  Control device
120  Imaging device
121  First camera
122  Second camera
123  Movable camera
AR1  First region
AR2  Second region
CS  Characteristic shape
W  Workpiece

Claims (4)

  1.  A workpiece position detection method in a machine tool, comprising:
     reading shape data of a workpiece;
     imaging the workpiece from a first direction to obtain a first image of the workpiece;
     dividing the first image into a first region having luminance values within a predetermined range and a second region having luminance values outside the predetermined range;
     comparing the shape data with the first image in the first region, the comparison yielding a position and an orientation of the workpiece in a machine coordinate system of the machine tool;
     determining, based on the shape data placed at the obtained position and orientation, whether the shape data includes a characteristic shape in the second region;
     imaging the workpiece from a second direction different from the first direction to obtain a second image of the workpiece; and
     collating the shape data with the second image in the second region when it is determined that the shape data includes the characteristic shape in the second region.
  2.  The workpiece position detection method according to claim 1, wherein the first image is captured by a first camera, and the second image is captured by a second camera different from the first camera.
  3.  The workpiece position detection method according to claim 1, wherein the first image and the second image are captured by the same movable camera.
  4.  A machine tool comprising:
     a storage device for storing shape data of a workpiece;
     an imaging device configured to capture a first image, in which the workpiece is imaged from a first direction, and a second image, in which the workpiece is imaged from a second direction different from the first direction; and
     a control device,
     wherein the control device is configured to:
      read the shape data of the workpiece from the storage device;
      receive the first image from the imaging device;
      divide the first image into a first region having luminance values within a predetermined range and a second region having luminance values outside the predetermined range;
      compare the shape data with the first image in the first region, the comparison yielding a position and an orientation of the workpiece in a machine coordinate system of the machine tool;
      determine, based on the shape data placed at the obtained position and orientation, whether the shape data includes a characteristic shape in the second region;
      receive the second image from the imaging device; and
      collate the shape data with the second image in the second region when it is determined that the shape data includes the characteristic shape in the second region.
PCT/JP2018/036052 2018-09-27 2018-09-27 Workpiece position detecting method and machine tool WO2020065854A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2018/036052 WO2020065854A1 (en) 2018-09-27 2018-09-27 Workpiece position detecting method and machine tool

Publications (1)

Publication Number Publication Date
WO2020065854A1 (en) 2020-04-02

Family

ID=69951261

Country Status (1)

Country Link
WO (1) WO2020065854A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI762182B (en) * 2021-02-05 2022-04-21 德制國際有限公司 Automatic processing method and automatic processing system

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2008026723A1 (en) * 2006-09-01 2008-03-06 Mori Seiki Co., Ltd. Three-dimensional model data confirming method, and three-dimensional model data confirming apparatus
JP2010058264A (en) * 2008-09-05 2010-03-18 Mori Seiki Co Ltd Method and apparatus for checking machining state
JP2010061661A (en) * 2008-09-05 2010-03-18 Mori Seiki Co Ltd Machining status monitoring method and machining status monitoring apparatus
JP2011075311A (en) * 2009-09-29 2011-04-14 Dainippon Screen Mfg Co Ltd Image processing method
US20180018778A1 (en) * 2015-03-30 2018-01-18 Carl Zeiss Industrielle Messtechnik Gmbh Motion-measuring system of a machine and method for operating the motion-measuring system
