WO2020065854A1 - Workpiece position detection method and machine tool

Workpiece position detection method and machine tool

Info

Publication number
WO2020065854A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
work
shape data
camera
machine tool
Prior art date
Application number
PCT/JP2018/036052
Other languages
English (en)
Japanese (ja)
Inventor
忠 笠原
恵 古田
Original Assignee
株式会社牧野フライス製作所
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社牧野フライス製作所 filed Critical 株式会社牧野フライス製作所
Priority to PCT/JP2018/036052 priority Critical patent/WO2020065854A1/fr
Publication of WO2020065854A1 publication Critical patent/WO2020065854A1/fr

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B23 MACHINE TOOLS; METAL-WORKING NOT OTHERWISE PROVIDED FOR
    • B23Q DETAILS, COMPONENTS, OR ACCESSORIES FOR MACHINE TOOLS, e.g. ARRANGEMENTS FOR COPYING OR CONTROLLING; MACHINE TOOLS IN GENERAL CHARACTERISED BY THE CONSTRUCTION OF PARTICULAR DETAILS OR COMPONENTS; COMBINATIONS OR ASSOCIATIONS OF METAL-WORKING MACHINES, NOT DIRECTED TO A PARTICULAR RESULT
    • B23Q17/00Arrangements for observing, indicating or measuring on machine tools
    • B23Q17/24Arrangements for observing, indicating or measuring on machine tools using optics or electromagnetic waves

Definitions

  • The present invention relates to a workpiece position detection method and a machine tool.
  • In a machine tool, the feed axes are numerically controlled so that a tool is moved along a tool path set in advance with respect to a workpiece and the workpiece is machined into a desired shape. Therefore, in order to accurately set a tool path for a workpiece actually installed on the machine tool, it is necessary to know, before machining, at which position the workpiece is actually mounted on the machine tool.
  • Patent Document 1 discloses a device for checking a machining state in a machine tool.
  • The apparatus includes: an actual imaging unit that captures the workpiece from a preset viewpoint to generate two-dimensional actual image data of the workpiece; a model data storage unit that stores data including a three-dimensional model of the workpiece; a virtual image generating unit that generates two-dimensional virtual image data of the workpiece from the same viewpoint as the actual imaging unit based on the three-dimensional model; and a comparing unit that compares the two-dimensional actual image data with the two-dimensional virtual image data.
  • By this comparison, the positional deviation of the workpiece is detected.
  • Machine tools are installed in environments with widely varying lighting. Therefore, when the position of the workpiece is detected by an imaging means, part of the image of the workpiece may not be obtained due to various factors, such as the lighting environment and the material of the workpiece (so-called overexposure or underexposure). If the characteristic shape required to specify the position and orientation of the workpiece lies in an area where an image cannot be obtained, the position and orientation of the workpiece may not be detected accurately.
  • In view of the above problems, an object of the present invention is to improve the accuracy of workpiece position detection.
  • One aspect of the present disclosure is a workpiece position detection method in a machine tool, comprising: a step of reading shape data of the workpiece; a step of imaging the workpiece from a first direction to obtain a first image of the workpiece; a step of dividing the first image into a first area having luminance values within a predetermined range and a second area having luminance values outside the predetermined range; a step of comparing the shape data with the first image in the first area, wherein the position and orientation of the workpiece in the machine coordinate system of the machine tool are determined by the comparison; a step of determining, based on the shape data arranged at the determined position and orientation, whether the shape data includes a characteristic shape in the second area; a step of imaging the workpiece from a second direction different from the first direction to obtain a second image; and a step of collating the shape data with the second image in the second area when it is determined that the shape data includes the characteristic shape in the second area.
  • In this method, the first image is divided into the first area having luminance values within the predetermined range (that is, an area where the camera can acquire an image) and the second area having luminance values outside the predetermined range (that is, an area where the camera cannot acquire an image). The position and orientation of the workpiece are then obtained using the shape data and the first image in the first area. Furthermore, when the shape data includes the characteristic shape in the second area, the shape data is collated with the image in the second area using the second image captured from a different direction (that is, a direction from which the camera can recognize the second area as an image). Therefore, even when there is an area where an image cannot be obtained from the first direction, the accuracy of workpiece position detection can be improved.
  • The first image may be captured by a first camera, and the second image may be captured by a second camera different from the first camera. Alternatively, the first image and the second image may be captured by the same movable camera.
  • Another aspect of the present disclosure is a machine tool comprising: a storage device that stores shape data of a workpiece; an imaging device configured to capture a first image, in which the workpiece is imaged from a first direction, and a second image, in which the workpiece is imaged from a second direction different from the first direction; and a device configured to read the shape data of the workpiece from the storage device, receive the first image from the imaging device, divide the first image into a first area having luminance values within a predetermined range and a second area having luminance values outside the predetermined range, compare the shape data with the first image in the first area, the comparison determining the position and orientation of the workpiece in the machine coordinate system of the machine tool, determine, based on the shape data arranged at the determined position and orientation, whether the shape data includes a characteristic shape in the second area, receive the second image from the imaging device, and collate the shape data with the second image in the second area when the shape data is determined to include the characteristic shape in the second area. Similarly, such a machine tool can improve the accuracy of workpiece position detection.
  • In the machine tool, the imaging device may include a first camera configured to capture the first image and a second camera configured to capture the second image.
  • Alternatively, the imaging device may include a movable camera configured to capture the first image and the second image.
  • According to the present invention, the accuracy of workpiece position detection can be improved.
  • FIG. 1 is a schematic perspective view of a machine tool according to an embodiment.
  • FIG. 2 is a schematic perspective view of the machine tool of FIG. 1 with the door opened.
  • FIG. 3 is a block diagram of the machine tool according to the embodiment.
  • FIG. 4 is a schematic diagram for explaining a workpiece.
  • FIG. 5 is a schematic diagram illustrating an example of an image obtained by a first camera.
  • FIG. 6 is a schematic diagram illustrating an example of an image obtained by a second camera.
  • FIG. 7 is a flowchart illustrating a workpiece position detection method according to the embodiment.
  • FIG. 8 is a flowchart following FIG. 7.
  • FIG. 9 is a diagram showing a screen for probe measurement.
  • FIG. 10 is a block diagram of a machine tool according to another embodiment.
  • FIG. 1 is a schematic perspective view of the machine tool according to the embodiment.
  • The machine tool 10 includes a processing machine 1 housed in a cover 100.
  • The machine tool 10 can be a vertical machining center, and the processing machine 1 has an X-axis and a Y-axis in the horizontal direction and a Z-axis in the vertical direction.
  • However, the machine tool 10 may be another type of machine tool.
  • The processing machine 1 includes a bed 11 as a base, and a column (not shown) provided upright on the upper surface of the bed 11.
  • The processing machine 1 also includes a spindle head 14 that supports a spindle 13 rotatably about a vertical axis, and a saddle (not shown) that supports the spindle head 14 in front of the column.
  • The spindle head 14 supports the spindle 13 facing downward so that the tip of the spindle 13 faces the table 16.
  • A tool T is mounted on the tip of the spindle 13.
  • The processing machine 1 includes a table 16 on which the workpiece W is mounted.
  • The table 16 is provided on the bed 11.
  • The workpiece W may be attached to the table 16 via a pallet, or may be attached directly to the table 16.
  • The processing machine 1 includes a driving device that moves the tool T relative to the workpiece W along the feed axes.
  • The driving device includes a servo motor 75 (see FIG. 3) for each feed axis.
  • For example, the driving device can move the column in the X-axis direction with respect to the bed 11, the saddle in the Y-axis direction with respect to the column, and the spindle head 14 in the Z-axis direction.
  • Thus, the processing machine 1 of the present embodiment has three mutually orthogonal linear feed axes: the X-axis, the Y-axis, and the Z-axis.
  • The cover 100 has at least a flat upper wall 102 and a front wall 104, and surrounds the entire processing machine 1.
  • The cover 100 has an opening 106 through which an operator can access the machining space in which the processing machine 1, in particular the table 16 and the tool T at the tip of the spindle 13, is arranged.
  • The opening 106 extends from the upper wall 102 to the front wall 104.
  • The cover 100 has a door 108 for opening and closing the opening 106.
  • An operation panel 110 for operating the processing machine 1 is attached to the front wall 104.
  • The operation panel 110 is provided with a plurality of buttons or input keys 112 for operating the processing machine 1, and a touch panel 114 included in a display/input unit 26 (described later).
  • The door 108 is formed substantially in an L shape and has an upper door portion 108a and a front door portion 108b.
  • The door 108 is slidable in the X-axis direction.
  • The door 108 may be provided with an interlock, and the processing machine 1 may be configured to operate only when the door 108 is at a position that closes the opening 106.
  • FIG. 2 is a schematic perspective view of the machine tool of FIG. 1 with the door opened. Note that some components, such as the column and the saddle, are omitted in FIG. 2 for clarity.
  • The machine tool 10 is provided with an imaging device 120 that captures a first image, in which the workpiece W is imaged from a first direction, and a second image, in which the workpiece W is imaged from a second direction different from the first direction.
  • The imaging device 120 includes a first camera 121 and a second camera 122.
  • The first camera 121 and the second camera 122 may be, for example, CMOS cameras or CCD cameras.
  • The first camera 121 and the second camera 122 may be, for example, color cameras in which each of R, G, and B has 256 gradations, or black-and-white cameras having 256 gradations.
  • Each of the first camera 121 and the second camera 122 is configured to image a predetermined position in the machine coordinate system of the machine tool 10.
  • The positions in the machine coordinate system imaged by the first camera 121 and the second camera 122 can be stored in, for example, the storage device 28 (see FIG. 3).
  • The first camera 121 is arranged at a position and orientation that images the workpiece W from the first direction, and the second camera 122 is arranged at a position and orientation that images the workpiece W from the second direction different from the first direction.
  • For example, each of the first camera 121 and the second camera 122 can be mounted above the table 16 so that the table 16 and the workpiece W can be imaged.
  • The first camera 121 and the second camera 122 can be fixed to an arbitrary component (for example, an inner surface of the cover 100, a splash guard, or the like).
  • For example, the first camera 121 and the second camera 122 may be arranged in the cover 100 such that, when viewed in the Z-axis direction, a straight line connecting the first camera 121 and the second camera 122 passes through the center point of the upper surface of the table 16, and the optical axes of the two cameras are aligned with this straight line. The angles of the first camera 121 and the second camera 122 when viewed in the horizontal direction may be set, for example, such that their optical axes pass through the center point of the upper surface of the table 16. With such a configuration, all outer surfaces of the workpiece W except the bottom surface are imaged without blind spots.
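  • As a small numeric illustration of this arrangement, the sketch below places two cameras on opposite sides of the table center and aims both optical axes at the center of the table's upper surface. The table-center coordinates, mounting radius, and height used here are hypothetical values, not parameters given by the present disclosure.

```python
import math

def opposed_camera_poses(table_center, radius, height):
    """Return (position, unit optical-axis vector) for two cameras placed
    symmetrically about the table center, so that the line connecting them
    passes through the center when viewed in the Z-axis direction and both
    optical axes pass through the center of the table's upper surface."""
    poses = []
    cx, cy, cz = table_center
    for sign in (+1, -1):                 # first camera, second camera
        pos = (cx + sign * radius, cy, cz + height)
        # optical axis = unit vector from the camera toward the table center
        d = tuple(t - p for t, p in zip(table_center, pos))
        norm = math.sqrt(sum(c * c for c in d))
        poses.append((pos, tuple(c / norm for c in d)))
    return poses

# e.g. cameras 400 mm to either side of, and 300 mm above, the table center:
# (pos1, axis1), (pos2, axis2) = opposed_camera_poses((0.0, 0.0, 0.0), 400.0, 300.0)
```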
  • FIG. 3 is a block diagram of the machine tool according to the embodiment.
  • The machine tool 10 includes a control device 70 configured to control the feed device of each axis, set machining information, and measure the workpiece.
  • The control device 70 includes, for example, a CPU (Central Processing Unit), a RAM (Random Access Memory), a ROM (Read Only Memory), and an electronic data storage medium such as a hard disk or a solid-state drive (SSD), connected to each other via a bus.
  • The control device 70 includes an input unit 71, a reading/interpretation unit 72, an interpolation calculation unit 73, and a servo motor control unit 74.
  • The operator creates a machining program 76.
  • The machining program 76 can be generated by a CAM (Computer-Aided Manufacturing) device or the like based on workpiece shape data including the workpiece shape before machining and the target shape.
  • The workpiece shape data can be created by, for example, a CAD (Computer-Aided Design) device.
  • Machining information related to machining is input to the input unit 71.
  • The machining information may be input manually to the input unit 71 via the touch panel 114, or may be input to the input unit 71 as electronic data including various information via a communication network.
  • The machine tool 10 machines the workpiece W based on the various machining information.
  • The machining information can include, for example, the machining program 76, tool information 77, and workpiece information 78.
  • The machining information may include other information.
  • The machining program 76 includes information on the relative movement of the tool with respect to the workpiece.
  • The machining program 76 includes commands for the machine tool written in, for example, G-code and M-code.
  • The tool information 77 includes, for example, information about the tool, such as the tool type (for example, drill or end mill), the tool diameter, and the tool length.
  • The workpiece information 78 includes information on the workpiece W to be machined.
  • The information on the workpiece W includes, for example, shape data including the shape of the workpiece W before machining, the target shape, and the like.
  • The reading/interpretation unit 72 reads information including the machining program 76 from the input unit 71.
  • The reading/interpretation unit 72 outputs a movement command to the interpolation calculation unit 73.
  • The interpolation calculation unit 73 calculates a position command value for each interpolation cycle. For example, the interpolation calculation unit 73 calculates a movement amount for each time interval set based on the movement command.
  • The interpolation calculation unit 73 outputs the position command values to the servo motor control unit 74.
  • The servo motor control unit 74 outputs a current value to the servo motor 75 of each of the X-, Y-, and Z-axis feed axes based on the position commands.
  • A linear position measuring device, such as a digital scale, is provided on each of the X-, Y-, and Z-axis feed axes, and the positions of the feed axes are fed back to the servo motor control unit 74 so that the servo motor control unit 74 can servo-control the servo motor 75 of each axis.
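  • As an illustration of this interpolation step, the sketch below generates one position command per interpolation cycle for a straight-line move at a commanded feed rate. The cycle time, the function name, and the downstream call are assumptions for explanation, not taken from the present disclosure.

```python
import math

def interpolate_linear(start, end, feed_mm_per_min, cycle_ms=1.0):
    """Yield a position command (x, y, z) for each interpolation cycle of
    a straight move from `start` to `end` at the commanded feed rate."""
    dist = math.dist(start, end)
    step = feed_mm_per_min / 60_000.0 * cycle_ms   # mm advanced per cycle
    n = max(1, math.ceil(dist / step))
    for i in range(1, n + 1):
        t = i / n
        yield tuple(s + (e - s) * t for s, e in zip(start, end))

# e.g. a 100 mm move along X at F600 with a 1 ms interpolation cycle:
# for cmd in interpolate_linear((0, 0, 0), (100, 0, 0), 600.0):
#     servo_control(cmd)  # hypothetical downstream call
```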
  • The machining information can include coordinate information in addition to the machining program 76, the tool information 77, and the workpiece information 78.
  • The coordinate information includes information on the coordinates used in the machining program 76.
  • The coordinate systems used in the machine tool 10 can include, for example, a machine coordinate system whose origin is a predetermined point of the machine tool, and a work coordinate system whose origin is an arbitrary point of the workpiece.
  • The work coordinate system moves with the movement of the workpiece.
  • The work coordinate system can be determined by setting its position relative to the machine coordinate system.
  • The coordinate systems used in the machine tool 10 may include other coordinate systems. When a plurality of coordinate systems are used in the machining program 76, information on the plurality of coordinate systems is included in the coordinate information.
  • The machining information can further include parameters of the control device.
  • The machining information can include information on machining conditions specified on the operation panel 110.
  • The machining conditions can include, for example, an override value for the moving speed and information about the coolant.
  • The control device 70 further includes a command unit 20, the display/input unit 26, the storage device 28 for storing the machining information, a measurement command unit 29, a measurement information acquisition unit 30, and a matching processing unit 31.
  • The command unit 20 generates machining information and outputs it to the input unit 71.
  • The command unit 20 can also edit the machining information input to the input unit 71 based on operation of the touch panel 114 by the operator, and output the edited machining information to the input unit 71 as new machining information.
  • The command unit 20 also receives machining information, such as an override value, input on the operation panel 110. In this way, the command unit 20 can newly create and/or edit the machining information.
  • The command unit 20 includes a display command unit 22, a storage command unit 25, and an imaging command unit 27.
  • The display command unit 22 can output a command to the display/input unit 26 to display, for example, the machining information.
  • Various commands and information are input to the command unit 20 from the display/input unit 26.
  • The storage command unit 25 can output a command to the storage device 28 to store the newly generated machining information and/or the edited machining information.
  • The imaging command unit 27 can output a command to the imaging device 120 to image the workpiece W and the table 16.
  • The imaging command unit 27 may be configured to output a command to the imaging device 120, based on a signal from the interlock, only when the door 108 is closed.
  • The display/input unit 26 is configured to display the machining information and to allow the operator to input data and commands.
  • The display/input unit 26 can include, for example, the touch panel 114 (see FIG. 1). The display/input unit 26 may also include other components.
  • The operator can input data and commands by tapping or clicking icons on the touch panel 114.
  • The storage device 28 can include, for example, the above-described ROM and/or storage medium. Although the storage device 28 of the present embodiment is arranged inside the control device 70, the storage device 28 may be arranged outside the control device 70. For example, the storage device 28 may include a storage medium, such as a memory card and/or a hard disk, connected to the control device 70 via a communication network.
  • The measurement command unit 29 receives a command for measuring the workpiece W from the command unit 20.
  • For example, the measurement command unit 29 can receive, from the command unit 20, a command for moving the measurement probe 32 attached to the tip of the spindle 13 in order to execute the second position detection (described later) of the workpiece W.
  • The measurement command unit 29 outputs a command for the measurement to the reading/interpretation unit 72, and the workpiece W is measured based on the command.
  • The measurement information acquisition unit 30 acquires measured values of the workpiece.
  • Specifically, the measurement information acquisition unit 30 receives a signal from the measurement probe 32. The measurement information acquisition unit 30 also receives the coordinate values of the machine coordinate system from the servo motor control unit 74.
  • The measurement information acquisition unit 30 outputs these pieces of information to the command unit 20. For example, when the measurement probe 32 contacts the workpiece W, the measurement information acquisition unit 30 can output the skip signal from the measurement probe 32 and the coordinate values of each axis in the machine coordinate system to the command unit 20.
  • The command unit 20 can calculate machining information from the measured values as necessary, and can display the calculated machining information on the display/input unit 26.
  • The matching processing unit 31 receives the first image and the second image from the imaging device 120, and receives the shape data of the workpiece W from the storage device 28. The matching processing unit 31 is configured to analyze the first image and divide it into a first area having luminance values within a predetermined range and a second area having luminance values outside the predetermined range.
  • For example, when color cameras are used, the matching processing unit 31 can classify a certain area as the first area when all of the R, G, and B luminance values in that area are within the predetermined range (for example, smaller than 250). Conversely, when at least one of the R, G, and B luminance values in a certain area is outside the predetermined range, the matching processing unit 31 can classify that area as the second area.
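  • A minimal sketch of this classification, assuming the first image is an RGB array with 256 gradations per channel: the upper bound of 250 follows the example above, while the lower bound of 5 is an illustrative assumption.

```python
import numpy as np

def split_by_luminance(image: np.ndarray, low: int = 5, high: int = 250):
    """Return boolean masks (AR1, AR2) for an RGB image: a pixel belongs
    to the first area only when all of its R, G, and B luminance values
    lie within the predetermined range; otherwise it belongs to the
    second area (over- or underexposed)."""
    within = (image > low) & (image < high)   # per-channel range test
    ar1 = within.all(axis=-1)                 # first area: all channels in range
    ar2 = ~ar1                                # second area: any channel out of range
    return ar1, ar2

# first_image: H x W x 3 uint8 array captured by the first camera
# ar1, ar2 = split_by_luminance(first_image)
```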
  • The matching processing unit 31 compares the shape data with the first image in the first area and, if necessary, additionally compares the shape data with the second image in the second area, thereby determining the position and orientation of the workpiece W in the machine coordinate system of the machine tool 10.
  • The matching processing unit 31 can obtain the shape of the workpiece W by, for example, binarizing the image data and extracting pixels serving as edges.
  • The matching processing unit 31 can obtain the dimensions of the workpiece W by, for example, counting the numbers of pixels in the X-axis, Y-axis, and Z-axis directions.
  • The matching processing unit 31 can calculate, for example, the position and orientation of the workpiece W at which the shape data best matches the shape of the workpiece W obtained from the image data.
  • The matching processing unit 31 may determine that the calculated position and orientation of the workpiece W are correct when the degree of coincidence exceeds a predetermined threshold. Since the degree of coincidence can be obtained by various known methods, a detailed description is omitted in this specification.
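  • As one deliberately simplified illustration of such known methods, the sketch below binarizes an image, extracts edge pixels, and scores a candidate pose by the fraction of model edge pixels that coincide with image edge pixels inside the usable first area. This is an assumption for explanation; the present disclosure does not prescribe a particular matching algorithm.

```python
import numpy as np

def edge_mask(gray: np.ndarray, thresh: int = 128) -> np.ndarray:
    """Binarize a grayscale image, then mark pixels whose binary value
    differs from a right or lower neighbour as edge pixels."""
    binary = gray > thresh
    edges = np.zeros_like(binary)
    edges[:-1, :] |= binary[:-1, :] != binary[1:, :]   # vertical transitions
    edges[:, :-1] |= binary[:, :-1] != binary[:, 1:]   # horizontal transitions
    return edges

def degree_of_coincidence(model_edges: np.ndarray,
                          image_edges: np.ndarray,
                          region: np.ndarray) -> float:
    """Fraction of model edge pixels inside `region` (e.g. the first area
    AR1) that coincide with edge pixels of the captured image."""
    usable = model_edges & region
    total = usable.sum()
    return float((usable & image_edges).sum()) / total if total else 0.0

# The candidate position/orientation with the highest degree of coincidence
# is adopted once it exceeds a predetermined threshold (e.g. 0.9, assumed).
```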
  • When attaching the workpiece W to the table 16, the workpiece W may be attached at an arbitrary position on the table 16. Therefore, in order to accurately set the tool path for the workpiece W actually installed on the machine tool 10, it is necessary to know, before machining, at which position in the machine coordinate system of the machine tool 10 the workpiece W is actually mounted.
  • For this purpose, the machine tool 10 is configured to detect the position and orientation of the workpiece W in the machine coordinate system of the machine tool 10 before machining.
  • In the present embodiment, the first position detection of the workpiece W is performed by the first camera 121 and the second camera 122, and subsequently the second position detection, which is more accurate than the first position detection, is performed by the measurement probe 32.
  • The machine tool 10 is configured to rapidly feed the measurement probe 32 to the vicinity of the workpiece W in the second position detection based on the position and orientation of the workpiece W obtained by the first position detection. With such a configuration, the position and orientation of the workpiece W can be detected quickly and accurately.
  • By these position detections, an offset value of the work coordinate system with respect to the machine coordinate system and an inclination of the work coordinate system with respect to the machine coordinate system can be obtained.
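  • For illustration, the sketch below converts a detected position and orientation (simplified here to a position and a single rotation about the Z-axis) into a work-coordinate offset and a work-to-machine transform. The function name and the example values are assumptions for explanation, not part of the present disclosure.

```python
import math

def work_offset_from_detection(x, y, z, angle_deg):
    """Build a work-to-machine transform from a detected workpiece pose:
    (x, y, z) is the work origin in machine coordinates (the offset), and
    angle_deg is the inclination of the work coordinate system about Z."""
    theta = math.radians(angle_deg)

    def work_to_machine(wx, wy, wz):
        # rotate by the inclination, then translate by the offset
        mx = x + wx * math.cos(theta) - wy * math.sin(theta)
        my = y + wx * math.sin(theta) + wy * math.cos(theta)
        return (mx, my, z + wz)

    return work_to_machine

# e.g. work origin detected at machine (100.0, 50.0, -200.0), tilted 2 degrees:
# to_machine = work_offset_from_detection(100.0, 50.0, -200.0, 2.0)
# print(to_machine(10.0, 0.0, 0.0))  # machine coordinates of work point (10, 0, 0)
```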
  • FIG. 4 is a schematic diagram for explaining the workpiece.
  • The workpiece W has an arc-shaped surface SF.
  • One half of the surface SF (the right half in FIG. 4) includes a characteristic shape CS.
  • The characteristic shape CS is a shape that can serve as a mark for specifying the orientation of the workpiece W, and can be, for example, a shape including a ridge line and/or a valley line (for example, a protrusion or a groove).
  • FIG. 5 is a schematic diagram showing an example of an image obtained by the first camera 121, and shows an image of the workpiece W of FIG. 4 captured by the first camera 121.
  • In FIG. 5, an area AR1 indicates the first area having luminance values within the predetermined range (that is, an area where the first camera 121 can acquire an image), and an area AR2 indicates the second area having luminance values outside the predetermined range (that is, an area where the first camera 121 cannot acquire an image).
  • In FIG. 5, lines that cannot be recognized as an image by the first camera 121 are indicated by broken lines.
  • For example, when strong light L1 is reflected from the surface SF toward the first camera 121, the area AR2 where the light L1 is reflected may have luminance values so high that the first camera 121 cannot recognize it as an image (so-called overexposure).
  • In this case, the characteristic shape CS included in the surface SF may not be recognized as an image by the first camera 121.
  • The workpiece W is symmetric in the left-right direction and the front-back direction in FIG. 4 except for the characteristic shape CS. Therefore, when the characteristic shape CS is not detected, the position and orientation of the workpiece W may not be correctly detected.
  • FIG. 6 is a schematic diagram showing an example of an image obtained by the second camera 122. Since the light L2 incident on the second camera 122 from the right half of the surface SF is weak, the characteristic shape CS included in the surface SF is recognized as an image by the second camera 122, as shown in FIG. 6. Therefore, the machine tool 10 is configured to perform the first position detection using both the first image as shown in FIG. 5 and the second image as shown in FIG. 6. Note that FIGS. 4 to 6 illustrate a case where part of an image cannot be obtained due to overexposure; however, the present disclosure is also applicable to a case where part of an image cannot be obtained due to low luminance values (so-called underexposure).
  • The machine tool 10 of the present embodiment is configured so that either manual or automatic workpiece measurement can be selected.
  • A screen and a measurement program that support the automatic measurement can be stored in the storage device 28.
  • Likewise, a screen that supports the manual measurement can be stored in the storage device 28.
  • FIG. 9 is a diagram showing a screen for probe measurement.
  • For probe measurement, a screen 80 as shown in FIG. 9 is displayed on the display/input unit 26.
  • The screen 80 includes a measurement point/measurement direction selection area 82, a measurement item selection area 84, and a plurality of call buttons including a model call button 86.
  • When the operator taps or clicks the model call button 86, the shape of the workpiece to be measured is output from the storage device 28 to the display command unit 22 of the command unit 20 and displayed in the measurement point/measurement direction selection area 82.
  • The operator can set measurement points and measurement directions on the workpiece by tapping or clicking desired points P1 to P3 on the workpiece displayed in the measurement point/measurement direction selection area 82 on the touch panel 114.
  • In the measurement item selection area 84, a plurality of measurement method selection icons 84a to 84d indicating workpiece measurement methods are displayed.
  • The icon 84a can be used to measure the angles of two sides of the workpiece with respect to the machine coordinate system.
  • The icon 84b can be used to measure the lengths of two opposing sides of a rectangular recess formed in a workpiece.
  • The icon 84c can be used to measure the inner diameter of a circular recess formed in a workpiece.
  • The icon 84d can be used to measure the lengths of two opposing sides of a rectangular-parallelepiped workpiece.
  • The operator can select a desired measurement by tapping or clicking one of the measurement method selection icons 84a to 84d on the touch panel 114. Note that the icons 84a to 84d are merely examples, and the measurement item selection area 84 may include icons used for other measurements.
  • The measurement of the workpiece W is then started according to the selected measurement method.
  • As described above, the first position detection of the workpiece W is performed by the imaging device 120, and then the second position detection is performed by the measurement probe 32.
  • FIG. 7 is a flowchart illustrating a workpiece position detection method according to the embodiment.
  • During the first position detection, the workpiece W is fixed at an arbitrary position on the table 16.
  • First, the matching processing unit 31 reads the shape data of the workpiece W (the shape data before machining) from the storage device 28 (step S100). Subsequently, the imaging command unit 27 outputs a command to the first camera 121 to image the workpiece W, and the first image is acquired (step S102). The acquired first image is transmitted from the first camera 121 to the matching processing unit 31.
  • Next, the matching processing unit 31 analyzes the first image and divides it into the first area AR1 having luminance values within the predetermined range and the second area AR2 having luminance values outside the predetermined range (see FIG. 5) (step S104).
  • Subsequently, the matching processing unit 31 compares the shape data with the first image in the first area AR1 (step S106). As a result, the position and orientation of the workpiece W are calculated, together with the degree of coincidence between the shape data and the image data at that time.
  • Subsequently, the matching processing unit 31 determines whether the calculated degree of coincidence exceeds a predetermined threshold (step S108). If it is determined in step S108 that the degree of coincidence exceeds the predetermined threshold, the matching processing unit 31 provisionally determines the position and orientation of the workpiece W at this time as the position and orientation in the machine coordinate system (step S110).
  • If it is determined in step S108 that the degree of coincidence does not exceed the predetermined threshold, the matching processing unit 31 repeats steps S106 to S108 until a degree of coincidence exceeding the predetermined threshold is obtained.
  • In this case, the matching processing unit 31 can recalculate the degree of coincidence by, for example, changing the position and orientation of the workpiece W.
  • Subsequently, the matching processing unit 31 determines whether the second area AR2 was extracted in step S104 (step S112). If it is determined in step S112 that the second area AR2 was extracted, the matching processing unit 31 then determines whether the shape data includes the characteristic shape CS in the second area AR2 (step S114). If it is determined in step S114 that the shape data includes the characteristic shape CS in the second area AR2, the control device 70 proceeds to step S118 (FIG. 8).
  • When it is determined in step S112 that the second area AR2 was not extracted, or when it is determined in step S114 that the shape data does not include the characteristic shape CS in the second area AR2, the matching processing unit 31 determines the position and orientation provisionally determined in step S110 as the position and orientation of the workpiece W in the machine coordinate system (step S116), and ends the series of operations for the first position detection.
  • FIG. 8 is a flowchart following FIG. 7. Subsequently, the imaging command unit 27 outputs a command to the second camera 122 to image the workpiece W, and the second image is acquired (step S118). The acquired second image is transmitted from the second camera 122 to the matching processing unit 31.
  • Next, the matching processing unit 31 arranges the shape data in the second image at the position and orientation provisionally determined in step S110 (step S120). Subsequently, the matching processing unit 31 compares the shape data with the second image in the second area AR2 extracted in step S104 (step S122). Specifically, the matching processing unit 31 calculates the degree of coincidence between the shape data and the image data in the second area AR2. As shown in FIG. 6, the second area AR2 includes the characteristic shape CS; therefore, by calculating the degree of coincidence in consideration of the characteristic shape CS, it is possible to double-check whether the position and orientation provisionally determined in step S110 are correct.
  • Subsequently, the matching processing unit 31 determines whether the degree of coincidence calculated in step S122 exceeds a predetermined threshold (step S124). If it is determined in step S124 that the degree of coincidence exceeds the predetermined threshold, the matching processing unit 31 determines the provisionally determined position and orientation as the position and orientation of the workpiece W in the machine coordinate system (step S126), and ends the series of operations for the first position detection.
  • If it is determined in step S124 that the degree of coincidence does not exceed the predetermined threshold, the matching processing unit 31 repeats steps S106 to S124 until a degree of coincidence exceeding the predetermined threshold is obtained.
  • In this case as well, the matching processing unit 31 can recalculate the degree of coincidence by, for example, changing the position and orientation of the workpiece W.
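  • The overall flow of steps S100 to S126 can be summarized in the following sketch. The helper objects (storage, cam1, cam2, matcher) and the threshold value are illustrative assumptions, not an API defined in the present disclosure.

```python
def first_position_detection(storage, cam1, cam2, matcher, threshold=0.9):
    """Condensed sketch of the first position detection (FIGS. 7 and 8)."""
    shape = storage.read_shape_data()                    # S100: read shape data
    img1 = cam1.capture()                                # S102: first image
    ar1, ar2 = matcher.split_by_luminance(img1)          # S104: divide into AR1/AR2
    # S106-S110: try candidate poses until the degree of coincidence in AR1
    # exceeds the threshold; adopt that pose provisionally.
    pose = next(p for p in matcher.candidate_poses(shape)
                if matcher.coincidence(shape, p, img1, ar1) > threshold)
    # S112-S114: does the shape data, arranged at the provisional pose,
    # include the characteristic shape CS inside AR2?
    if ar2.any() and matcher.has_feature_in(shape, pose, ar2):
        img2 = cam2.capture()                            # S118: second image
        # S120-S124: arrange the shape data at the provisional pose and
        # collate it with the second image inside AR2.
        while matcher.coincidence(shape, pose, img2, ar2) <= threshold:
            pose = matcher.next_candidate(shape, img1, ar1)  # retry S106-S124
    return pose                                          # S116/S126: final pose
```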
  • After the first position detection, the control device 70 executes the second position detection using the measurement probe 32 according to the method selected on the screen 80 (see FIG. 9).
  • At this time, the command unit 20 and the measurement command unit 29 output commands to rapidly feed the measurement probe 32 to the vicinity of the workpiece W based on the position and orientation of the workpiece W obtained by the first position detection.
  • Therefore, the position and orientation of the workpiece W can be detected more quickly and more easily than when the operator manually sets the end point of the rapid feed.
  • As described above, in the present embodiment, the first image is divided into the first area AR1 having luminance values within the predetermined range (that is, the area where the first camera 121 can acquire an image) and the second area AR2 having luminance values outside the predetermined range (that is, the area where the first camera 121 cannot acquire an image), and the position and orientation of the workpiece W are obtained using the shape data and the first image in the first area AR1. Furthermore, when the shape data includes the characteristic shape CS in the second area AR2, the shape data is collated with the image in the second area AR2 using the second image captured from the different second direction (that is, a direction from which the second camera 122 can recognize the second area AR2 as an image). In this way, the characteristic shape CS in the second area AR2 can be confirmed. Therefore, even when there is an area AR2 where an image cannot be obtained from the first direction, the accuracy of position detection of the workpiece W can be improved.
  • In addition, the workpiece W is imaged from both the first direction and the second direction. Therefore, for example, by displaying the first image and the second image on the touch panel 114, the operator can visually check the workpiece W without blind spots.
  • In the above embodiment, the second image is acquired after the first image is acquired (that is, step S118 is executed after step S102).
  • However, the first image and the second image may be acquired at the same timing (that is, steps S102 and S118 may be executed simultaneously).
  • Also, in the above embodiment, the position and orientation of the workpiece W are detected using the shape data and the images of the workpiece W.
  • However, the position and orientation of the workpiece W may be detected using shape data and images of the table 16 and/or a jig on the table 16 in addition to the shape data and the images of the workpiece W.
  • FIG. 10 is a block diagram of a machine tool according to another embodiment.
  • The machine tool 10A differs from the machine tool 10 in that the imaging device 120 includes a single movable camera 123.
  • The camera 123 can be configured to move between a first position at which the workpiece W can be imaged from the first direction and a second position at which the workpiece W can be imaged from the second direction.
  • For example, the camera 123 may be configured to move along a rail laid between the first position and the second position.
  • Alternatively, the camera 123 can have another configuration.
  • The camera 123 may be, for example, a CMOS camera or a CCD camera. The camera 123 may also be a color camera or a black-and-white camera.
  • Such a machine tool 10A, and the workpiece position detection method implemented using the machine tool 10A, can provide the same effects as described above.
  • Reference signs: 70 Control device; 120 Imaging device; 121 First camera; 122 Second camera; 123 Movable camera; AR1 First area; AR2 Second area; CS Characteristic shape; W Workpiece

Landscapes

  • Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Machine Tool Sensing Apparatuses (AREA)

Abstract

The invention relates to a workpiece position detection method, comprising the steps of: reading shape data of a workpiece (W); obtaining a first image by imaging the workpiece (W) from a first direction; dividing the first image into a first region having a luminance value within a predetermined range and a second region having a luminance value outside the predetermined range; comparing the shape data with the first image in the first region, the comparison providing a position and an orientation of the workpiece (W) in a machine coordinate system of a machine tool; determining, based on the shape data arranged at the provided position and orientation, whether or not the shape data contains a characteristic shape (CS) in the second region; obtaining a second image by imaging the workpiece (W) from a second direction; and collating the shape data with the second image in the second region when it is determined that the shape data contains the characteristic shape (CS) in the second region.
PCT/JP2018/036052 2018-09-27 2018-09-27 Workpiece position detection method and machine tool WO2020065854A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2018/036052 WO2020065854A1 (fr) 2018-09-27 2018-09-27 Workpiece position detection method and machine tool

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2018/036052 WO2020065854A1 (fr) 2018-09-27 2018-09-27 Workpiece position detection method and machine tool

Publications (1)

Publication Number Publication Date
WO2020065854A1 (fr) 2020-04-02

Family

ID=69951261

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/036052 WO2020065854A1 (fr) Workpiece position detection method and machine tool

Country Status (1)

Country Link
WO (1) WO2020065854A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI762182B (zh) * 2021-02-05 2022-04-21 德制國際有限公司 Automatic machining method and automatic machining system

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2008026723A1 (fr) * 2006-09-01 2008-03-06 Mori Seiki Co., Ltd. Procédé de confirmation de données de modèle tridimensionnel, et appareil de confirmation de données de modèle tridimensionnel
JP2010061661A (ja) * 2008-09-05 2010-03-18 Mori Seiki Co Ltd 加工状況監視方法及び加工状況監視装置
JP2010058264A (ja) * 2008-09-05 2010-03-18 Mori Seiki Co Ltd 加工状態確認方法及び加工状態確認装置
JP2011075311A (ja) * 2009-09-29 2011-04-14 Dainippon Screen Mfg Co Ltd 画像処理方法
US20180018778A1 (en) * 2015-03-30 2018-01-18 Carl Zeiss Industrielle Messtechnik Gmbh Motion-measuring system of a machine and method for operating the motion-measuring system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18935804

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18935804

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP