CN103722449A - Machine tool machining locating method and device

Info

Publication number
CN103722449A
Authority
CN
China
Prior art keywords
workpiece
central point
standard
image
coordinate
Prior art date
Legal status
Granted
Application number
CN201210392978.8A
Other languages
Chinese (zh)
Other versions
CN103722449B (en)
Inventor
黄伟尧
王圣林
Current Assignee
Siemens AG
Original Assignee
Siemens AG
Priority date
Filing date
Publication date
Application filed by Siemens AG
Priority to CN201210392978.8A
Publication of CN103722449A
Application granted
Publication of CN103722449B
Legal status: Active
Anticipated expiration

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B23 MACHINE TOOLS; METAL-WORKING NOT OTHERWISE PROVIDED FOR
    • B23Q DETAILS, COMPONENTS, OR ACCESSORIES FOR MACHINE TOOLS, e.g. ARRANGEMENTS FOR COPYING OR CONTROLLING; MACHINE TOOLS IN GENERAL CHARACTERISED BY THE CONSTRUCTION OF PARTICULAR DETAILS OR COMPONENTS; COMBINATIONS OR ASSOCIATIONS OF METAL-WORKING MACHINES, NOT DIRECTED TO A PARTICULAR RESULT
    • B23Q15/00 Automatic control or regulation of feed movement, cutting velocity or position of tool or work
    • B23Q15/20 Automatic control or regulation of feed movement, cutting velocity or position of tool or work before or after the tool acts upon the workpiece
    • B23Q15/22 Control or regulation of position of tool or workpiece
    • B23Q17/00 Arrangements for observing, indicating or measuring on machine tools
    • B23Q17/22 Arrangements for observing, indicating or measuring on machine tools for indicating or measuring existing or desired position of tool or work
    • B23Q17/2233 Arrangements for observing, indicating or measuring on machine tools for indicating or measuring existing or desired position of tool or work for adjusting the tool relative to the workpiece
    • B23Q17/24 Arrangements for observing, indicating or measuring on machine tools using optics or electromagnetic waves
    • B23Q17/2409 Arrangements for indirect observation of the working space using image recording means, e.g. a camera

Abstract

The invention relates to a machine tool machining locating method and device. An image acquisition unit is mounted on a CNC machine tool and used to photograph a standard workpiece and workpieces to be machined, producing a standard image and workpiece images respectively. A feature contour is input on the standard image and matched to the appearance contour in each workpiece image, and the actual coordinate of the center point after matching is calculated. The difference between the actual center-point coordinate of the standard image and that of the workpiece image is then calculated and combined with the already known workpiece spacing between the standard workpiece and the workpiece to be machined, so that the machining coordinate data of the workpiece to be machined can be obtained accurately. In other words, multiple workpieces to be machined can be located quickly and accurately using only photographing and processing steps, without mechanical movement measurement.

Description

Machine tool machining positioning method and device thereof
Technical field
The present invention relates to a machine tool machining positioning method and a device thereof, and in particular to a method for positioning the machining position of a workpiece on a computer numerical control (CNC) machine tool, and to a device for carrying out the method.
Background art
In the current technical field of machine tools, the machining position of a workpiece has always been located in the traditional way: a probe is used to measure the periphery of the workpiece in order to find its center or a specific feature point.
In detail, for a common square workpiece, to obtain the machining point at its center, the surrounding side walls and the top surface of the workpiece must be measured with the probe; that is, the lengths occupied by the workpiece along the X-, Y- and Z-axes are measured, and the center point of the workpiece is then obtained by calculation. With this measurement approach, even the simplest square workpiece requires five measurements. Once the workpiece to be machined has a complicated or irregular shape, or the workpiece is placed at a different angle, even more measurements are needed, which is time-consuming and laborious.
Moreover, every workpiece to be machined must be measured and located again, which is very unfavorable for mass production. Furthermore, for small workpieces the probe measurement becomes very difficult or even impossible. In addition, with probe measurement, improper operation by the user easily causes the probe to collide with the workpiece, damaging the workpiece or the probe. Probe measurement also depends heavily on the operator's experience and skill; an unskilled operator easily spends more time and easily introduces measurement errors.
Summary of the invention
The main object of the present invention is to provide a machine tool machining positioning method and a device thereof, which can automatically measure the center coordinate of each workpiece and then rapidly and automatically locate the machining position of each workpiece, so that the time spent on workpiece location is significantly reduced and the positioning precision is significantly improved.
To achieve the above object, a machine tool machining positioning method of the present invention positions the machining positions of a plurality of workpieces. The plurality of workpieces share an appearance contour and comprise a standard workpiece and at least one workpiece to be machined. The method comprises the following steps. First, the standard workpiece is photographed to obtain a standard image, and a feature contour is input on the standard image; the feature contour corresponds at least partly to the appearance contour of the plurality of workpieces, and there is a workpiece spacing between the standard workpiece and the at least one workpiece to be machined. Next, the actual center-point coordinate of the standard image is calculated. Then, the at least one workpiece to be machined is photographed to obtain a workpiece image, the feature contour is aligned with the appearance contour on the workpiece image, and the actual center-point coordinate of the workpiece image is calculated. Finally, the difference between the actual center-point coordinates of the workpiece image and of the standard image is calculated and added to the workpiece spacing, thereby obtaining the machining coordinate data of the at least one workpiece to be machined.
Accordingly, the present invention uses the feature contour input on the standard image to match the appearance contour on the workpiece image, and calculates the actual center-point coordinate of the matched workpiece image. It then calculates the difference between the actual center-point coordinates of the standard image and of the workpiece image and adds the already known workpiece spacing between the two, so that the machining coordinate data of the workpiece to be machined can be obtained accurately. In other words, the present invention only requires photographing and processing steps, without mechanical movement measurement, to locate a plurality of workpieces to be machined quickly and accurately.
In the automatic positioning method provided by the present invention, in the step of calculating the actual center-point coordinate of the standard image, the center-point pixel coordinate of the standard image may be calculated first and then converted into the actual center-point coordinate. Similarly, in the step of calculating the actual center-point coordinate of the workpiece image, the center-point pixel coordinate of the workpiece image may be calculated first and then converted into the actual center-point coordinate. In this way, the present invention can use pixels in the image as coordinate values and convert them into coordinate values in actual machining dimensions.
The conversion from pixel values in the image to actual machining dimensions can be carried out through a size conversion ratio. The present invention provides the following steps to obtain the size conversion ratio: first, the standard workpiece is photographed to obtain a first workpiece image; then, after the standard workpiece has been moved by a specific distance, the standard workpiece is photographed again to obtain a second workpiece image; finally, the ratio between the difference of the center-point pixel coordinates of the first and second workpiece images and the specific distance is calculated. The specific distance by which the workpiece is moved is preferably a displacement along two axes. In other words, the method provided by the present invention uses the ratio between the actual displacement of the workpiece and the pixel-coordinate displacement in the image to obtain the above size conversion ratio, as sketched below.
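As a minimal illustrative sketch of this pixel-to-dimension conversion (function and variable names are assumptions, not taken from the patent; the two-axis move is treated as a single Euclidean displacement):

```python
import math

def size_conversion_ratio(center_px_1, center_px_2, moved_mm):
    """Pixels-per-millimeter ratio derived from two photographs of the
    standard workpiece taken before and after moving it by a known distance.

    center_px_1 / center_px_2: center-point pixel coordinates (x, y) of the
    first and second workpiece images; moved_mm: the known distance (mm)
    by which the standard workpiece was moved between the two shots.
    """
    dx = center_px_2[0] - center_px_1[0]
    dy = center_px_2[1] - center_px_1[1]
    pixel_shift = math.hypot(dx, dy)   # pixel-coordinate displacement
    return pixel_shift / moved_mm      # Cv: captured pixel size over actual size

def pixel_to_actual(center_px, cv):
    """Convert a center-point pixel coordinate into an actual coordinate (mm)."""
    return (center_px[0] / cv, center_px[1] / cv)
```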
Moreover, in the automatic positioning method provided by the present invention, calculating the actual center-point coordinate of the standard image may comprise the following steps: first, the standard image is divided into a plurality of regions of interest, each region of interest on the standard image containing part of the feature contour; then, the geometric center pixel coordinate of each region of interest is calculated; finally, the center point between the geometric center pixel coordinates of the plurality of regions of interest is calculated, which is the center-point pixel coordinate of the standard image. In this way, the method can precisely and rapidly calculate the center-point pixel coordinate of the input feature contour.
Furthermore, in the automatic positioning method provided by the present invention, the workpiece image may comprise a plurality of comparison regions corresponding to the plurality of regions of interest, and calculating the actual center-point coordinate of the workpiece image may comprise the following steps: first, the feature contour on each of the plurality of regions of interest of the standard image is compared with the appearance contour on the corresponding comparison region of the workpiece image; then, the geometric center pixel coordinate of each comparison region is calculated; finally, the center point between the geometric center pixel coordinates of the plurality of comparison regions is calculated, which is the center-point pixel coordinate of the workpiece image. A sketch of this center-point computation is given below.
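The following minimal sketch illustrates that computation, assuming the matching step already yields the contour pixels of each region of interest or comparison region (names such as geometric_center and rm_2 are illustrative, not from the patent):

```python
def geometric_center(pixel_points):
    """Mean pixel coordinate of the contour pixels inside one region of
    interest or one comparison region."""
    xs = [p[0] for p in pixel_points]
    ys = [p[1] for p in pixel_points]
    return (sum(xs) / len(xs), sum(ys) / len(ys))

def center_between(region_centers):
    """Center point between the geometric centers of all regions, i.e. the
    center-point pixel coordinate Psc / Pwc of the standard or workpiece image."""
    xs = [c[0] for c in region_centers]
    ys = [c[1] for c in region_centers]
    return (sum(xs) / len(xs), sum(ys) / len(ys))

# Hypothetical usage with four matched comparison regions:
# pwc = center_between([geometric_center(rm) for rm in (rm_2, rm_3, rm_4, rm_5)])
```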
In other words, the present invention divides the input feature contour into several partial feature contours and compares each of them with the corresponding appearance contour on the workpiece image; only when all partial feature contours have been matched is the center coordinate calculated. Accordingly, the present invention is applicable to locating workpieces of different sizes that share the same appearance contour; that is, regardless of whether the workpiece is scaled up or down, the location can be carried out as long as the feature contour can be matched. The applicable size range of the present invention is therefore large, and the multi-feature comparison gives excellent precision.
In addition, in the automatic positioning method provided by the present invention, the center point of the standard workpiece may be located at a shooting center point in the initial step. The step of calculating the actual center-point coordinate of the standard image may then include calculating the difference between the actual center-point coordinate of the standard image and the coordinate of the shooting center point as an input offset value. When calculating the machining coordinate data of the workpiece to be machined, the aforementioned input offset value is added in addition to adding the workpiece spacing to the difference between the actual center-point coordinates of the workpiece image and of the standard image.
The main purpose of the above steps is as follows: the feature contour of the present invention can be input manually, and when inputting manually the operator can hardly trace it precisely and completely, so errors may arise. In view of this, the present invention specifically takes the input offset value between the feature contour and the center origin into account in the calculation of the machining coordinate data of the workpiece to be machined. Besides improving the positioning precision, this means the user does not have to trace the feature contour precisely when inputting it, which greatly saves tracing time.
In addition, under normal circumstances the shooting center point is not located at the machining center point. In view of this, the present invention further defines the spacing between the shooting center point and the machining center point as a center distance. When calculating the machining coordinate data of the workpiece to be machined, the center distance is added in addition to adding the workpiece spacing and the input offset value to the difference between the actual center-point coordinates of the workpiece image and of the standard image, as sketched below.
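A hedged sketch of the complete machining-coordinate calculation as summarized above; all vectors are treated as 2D (x, y) actual coordinates, and the function name and defaults are illustrative assumptions:

```python
def machining_coordinates(workpiece_center, standard_center, workpiece_spacing,
                          input_offset=(0.0, 0.0), center_distance=(0.0, 0.0)):
    """Machining coordinate data of one workpiece to be machined.

    workpiece_center / standard_center: actual center-point coordinates of the
    workpiece image and of the standard image; workpiece_spacing: known spacing
    between the standard workpiece and this workpiece; input_offset: deviation
    of the hand-traced feature contour from the shooting center point;
    center_distance: offset between the shooting center point and the machining
    center point of the spindle head.
    """
    return tuple(
        (w - s) + d + o + c
        for w, s, d, o, c in zip(workpiece_center, standard_center,
                                 workpiece_spacing, input_offset, center_distance)
    )
```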
To achieve the object of the present invention, the present invention further provides a machine tool machining positioning device, which can be used to position the machining positions of a plurality of workpieces. The plurality of workpieces share an appearance contour and comprise a standard workpiece and at least one workpiece to be machined, with a workpiece spacing between the standard workpiece and the at least one workpiece to be machined. The device mainly comprises an image acquisition unit, an input unit and a controller. The image acquisition unit is arranged on a spindle head of the machine tool and is used to photograph the standard workpiece to obtain a standard image and to photograph the at least one workpiece to be machined to obtain a workpiece image. The input unit is used to input a feature contour on the standard image, the feature contour corresponding at least partly to the appearance contour of the plurality of workpieces. The controller is electrically connected to the image acquisition unit and the input unit. The controller first calculates the actual center-point coordinate of the standard image; after the image acquisition unit has matched the feature contour to the appearance contour on the workpiece image, the controller calculates the actual center-point coordinate of the workpiece image; the controller then calculates the difference between the actual center-point coordinates of the standard image and of the workpiece image and adds the workpiece spacing to it, thereby obtaining the machining coordinate data of the at least one workpiece to be machined.
Preferably, the controller of the machine tool machining positioning device of the present invention can be a computer numerical controller. That is to say, when the present invention is applied to a computer numerical control machine tool (hereinafter referred to as a CNC machine tool), only the aforementioned image acquisition unit needs to be added; no other extra equipment is required. Therefore, the present invention can be based on an existing CNC machine tool, which significantly reduces the equipment cost. Moreover, the image acquisition unit can be installed on the CNC machine tool easily and quickly. Accordingly, the machining-position automatic positioning device provided by the present invention is cheap, simple to set up, and able to locate a plurality of workpieces quickly and accurately without mechanical movement measurement.
Furthermore, the image acquisition unit of the machine tool machining positioning device of the present invention may include a microprocessor and a camera lens, the microprocessor being electrically connected to the camera lens. The controller may comprise a main processor and a storage module, the main processor being electrically connected to the storage module; the storage module stores a size conversion ratio, which is the ratio between the pixel dimensions captured by the image acquisition unit and the actual machining dimensions. The microprocessor of the image acquisition unit calculates the center-point pixel coordinates of the standard image and of the workpiece image, and the main processor converts these center-point pixel coordinates into actual center-point coordinates according to the size conversion ratio stored in the storage module. In this way, the present invention can convert pixel values in the image into actual machining dimensions precisely and rapidly.
When the image acquisition unit of the machine tool machining positioning device of the present invention photographs the standard image of the standard workpiece, the standard workpiece is first positioned at the center of the camera lens. After the user has input the feature contour on the standard image through the input unit, the controller calculates the input offset value between the actual center-point coordinate of the feature contour and the center of the camera lens. Accordingly, the machining coordinate data of the at least one workpiece to be machined comprise the difference between the actual center-point coordinates of the workpiece image and of the standard image plus the workpiece spacing and the input offset value. By including the input offset value in the calculation, the positioning precision is improved, and the user does not have to trace the feature contour precisely when inputting it, which greatly saves tracing time.
Furthermore, there is a center distance between the center of the camera lens of the machine tool machining positioning device of the present invention and the machining center point of the spindle head of the machine tool, and the machining coordinate data of the at least one workpiece to be machined comprise the difference between the actual center-point coordinates of the workpiece image and of the standard image plus the workpiece spacing, the input offset value and the center distance. By fully taking the distance between the camera lens and the machining center point into account, the present invention further improves the positioning precision.
In addition, the microprocessor of the image acquisition unit of the machine tool machining positioning device of the present invention can rotate the feature contour so that it corresponds to the appearance contour of the at least one workpiece to be machined. In other words, when the workpieces to be machined are placed at different positions and angles, the microprocessor of the image acquisition unit can rotate the feature contour in real time so that it corresponds smoothly to the appearance contour of the at least one workpiece to be machined. The present invention is therefore applicable to workpieces to be machined placed in any irregular manner.
Brief description of the drawings
The present invention is described in further detail below with reference to the accompanying drawings and to a specific embodiment, wherein:
Fig. 1 is a schematic diagram of a preferred embodiment of the present invention.
Fig. 2 is a system architecture diagram of a preferred embodiment of the present invention.
Fig. 3 is a flow chart of obtaining the size conversion ratio in a preferred embodiment of the present invention.
Fig. 4 is a flow chart of a preferred embodiment of the present invention.
Fig. 5A is a flow chart of calculating the center-point pixel coordinate of the standard image in a preferred embodiment of the present invention.
Fig. 5B is a flow chart of calculating the center-point pixel coordinate of the workpiece image in a preferred embodiment of the present invention.
Fig. 6 is a schematic diagram of the standard image and the feature contour Pf in a preferred embodiment of the present invention.
Fig. 7A to Fig. 7E are positioning coordinate diagrams, in a preferred embodiment of the present invention, of workpieces at five different placement positions or angles.
1 CNC machine tool
11 spindle head
2 image acquisition unit
21 microprocessor
22 camera lens
3 controller
31 main processor
32 storage module
4 input unit
Cd center distance
Cv size conversion ratio
D workpiece spacing
Os appearance contour
Pf feature contour
Ps standard image
Psc, Pwc center-point pixel coordinates
Pw workpiece image
ROI_2 ~ ROI_5 regions of interest
RM_2 ~ RM_5 comparison regions
W workpiece
Ws standard workpiece
Wt workpiece to be machined
Detailed description of the preferred embodiment
Please refer to Fig. 1 and Fig. 2. Fig. 1 is a schematic diagram of a preferred embodiment of the present invention, and Fig. 2 is a system architecture diagram of the preferred embodiment. The following embodiment takes a CNC machine tool 1 as the executing body; only an image acquisition unit 2 needs to be added, with no other extra equipment and no extra retrofit of the CNC machine tool 1, and all computation and processing only require the microprocessor 21 in the image acquisition unit 2 and the controller 3 of the CNC machine tool 1 itself.
As shown in the figures, the present embodiment mainly comprises an image acquisition unit 2, an input unit 4 and a controller 3, the controller 3 being electrically connected to the image acquisition unit 2 and the input unit 4. In the present embodiment, the image acquisition unit 2 includes a microprocessor 21 and a camera lens 22, the microprocessor 21 being electrically connected to the camera lens 22. The image acquisition unit 2 is arranged on the spindle head 11 of the machine tool 1, and there is a center distance Cd between the center of the camera lens 22 and the machining center point of the spindle head 11 of the machine tool 1. The focal length of the image acquisition unit 2 of the present embodiment and the height at which it is arranged on the spindle head 11 are fixed, and the image acquisition unit 2 can be a general camera device; its only restriction is that it must communicate with, and be controlled by, the controller 3 of the CNC machine tool 1.
The controller 3 is the computer numerical controller that the CNC machine tool 1 itself carries; it comprises a main processor 31 and a storage module 32, the main processor 31 being electrically connected to the storage module 32. The above center distance Cd and a size conversion ratio Cv are stored in the storage module 32. The input unit 4 can be one carried by the CNC machine tool 1 itself or one installed additionally, such as a mouse, a trackball, a drawing tablet or a touch screen; the present embodiment uses a touch screen as the implementation means.
The present invention mainly positions the machining positions of a plurality of workpieces W. As shown in Fig. 1, the present embodiment describes the location of workpieces W at six different placement positions or angles. As shown in the figure, these six workpieces W have the same appearance contour Os (see Fig. 7); the first one is the standard workpiece Ws, and the others are workpieces to be machined Wt. There is a workpiece spacing D between the standard workpiece Ws and each workpiece to be machined Wt; this workpiece spacing D can be the spacing between each pair of workpieces W, or the spacing between each workpiece to be machined Wt and the standard workpiece Ws.
Refer to Fig. 3, which is the flow chart of obtaining the size conversion ratio in a preferred embodiment of the present invention. Before the workpieces W are actually located, the size conversion ratio Cv is obtained first; the size conversion ratio Cv is the ratio between the pixel dimensions captured by the image acquisition unit 2 and the actual machining dimensions. First, the standard workpiece Ws is photographed to obtain a first workpiece image, which is step S100. Then, after the standard workpiece Ws has been moved by a specific distance, the standard workpiece Ws is photographed again to obtain a second workpiece image, which is step S105. The specific distance by which the standard workpiece Ws is moved is a displacement along two axes, that is, it includes movement along the X-axis and the Y-axis at the same time. Finally, the ratio between the difference of the center-point pixel coordinates of the first and second workpiece images and the specific distance is calculated in step S110; the size conversion ratio Cv is thereby obtained and stored in the storage module 32 of the controller 3. Here, the center-point pixel coordinate means a coordinate value whose unit is a pixel of the image.
In other words, the method provided by the present embodiment uses the ratio between the actual displacement of the workpiece and the pixel displacement in the captured images to obtain the above size conversion ratio Cv. Of course, the present invention is not limited to this; the traditional calibration-card approach can also be used to obtain the size conversion ratio Cv: a calibration card of fixed dimensions is placed on the worktable, the calibration card is photographed, and the ratio between the pixels occupied by the calibration card in the captured image and the actual size of the calibration card is obtained, as sketched below.
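A one-function sketch of the calibration-card alternative just mentioned (names are assumptions; the card's physical width is known beforehand):

```python
def cv_from_calibration_card(card_width_pixels, card_width_mm):
    """Size conversion ratio (pixels per millimeter) obtained by photographing
    a calibration card of known size placed on the worktable."""
    return card_width_pixels / card_width_mm
```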
Refer to Fig. 4, which is the flow chart of a preferred embodiment of the present invention. First, the center of the standard workpiece Ws is located at a shooting center point, as shown in step S200. Then, the image acquisition unit 2 photographs the standard workpiece Ws to obtain a standard image Ps, and a feature contour Pf is input on the standard image Ps; this feature contour Pf is traced along the appearance contour of the standard workpiece Ws. Because the standard workpiece Ws and the workpieces to be machined Wt have the same appearance contour Os, the input feature contour Pf should correspond at least partly to the appearance contour Os of the plurality of workpieces W, as shown in step S205.
Next, the actual center-point coordinate of the standard image Ps is calculated, and the difference between the actual center-point coordinate of the feature contour Pf and the coordinate of the shooting center point is calculated as an input offset value, as shown in step S210. Here, the actual center-point coordinate means a coordinate value whose unit is an actual machining dimension (such as mm or μm). Please refer to Fig. 5A and Fig. 6: Fig. 5A is the flow chart of calculating the center-point pixel coordinate of the standard image in a preferred embodiment of the present invention, and Fig. 6 is a schematic diagram of the standard image and the feature contour Pf in the preferred embodiment.
In the present embodiment, calculating the actual center-point coordinate of the standard image Ps comprises the following steps. First, the standard image Ps is divided into four regions of interest ROI_2 ~ ROI_5, each region of interest ROI_2 ~ ROI_5 on the standard image Ps containing part of the feature contour Pf, which is step S300 shown in Fig. 5A. Then, the geometric center pixel coordinates Pc2 ~ Pc5 of the regions of interest ROI_2 ~ ROI_5 are calculated respectively, which is step S305 shown in Fig. 5A. Next, the center point between the geometric center pixel coordinates Pc2 ~ Pc5 of the regions of interest ROI_2 ~ ROI_5 is calculated; this is the center-point pixel coordinate Psc of the standard image Ps, which is step S310 shown in Fig. 5A. Finally, the main processor 31 of the controller 3 converts the center-point pixel coordinate Psc of the standard image Ps into the actual center-point coordinate according to the size conversion ratio Cv stored in the storage module 32.
As for the input offset value between the actual center-point coordinate of the feature contour Pf and the coordinate of the shooting center point, it exists mainly because the feature contour Pf of the present embodiment is input by hand-tracing, and the operator can hardly trace it precisely and completely, so errors may arise. In view of this, after calculating the actual center-point coordinate of the feature contour Pf, the present embodiment specifically calculates the difference between it and the coordinate of the shooting center point as the input offset value.
In other words, the present embodiment specifically takes the displacement deviation between the feature contour Pf and the actual center origin into account in the calculation of the machining coordinate data of the workpiece to be machined Wt. Accordingly, besides improving the positioning precision, the operator does not have to trace the feature contour Pf precisely when inputting it, which greatly saves tracing time.
Then, as shown in step S215 of Fig. 4, a workpiece to be machined Wt is photographed to obtain a workpiece image Pw. After the feature contour Pf has been aligned with the appearance contour Os on the workpiece image Pw, the actual center-point coordinate of the workpiece image Pw is calculated, which is step S220. The present embodiment actually carries out the following steps: the workpiece image Pw is compared with the feature contour Pf of the standard image Ps, and because the standard image Ps has been divided into the four regions of interest ROI_2 ~ ROI_5, the workpiece image Pw is compared respectively with the feature contour Pf on the four regions of interest ROI_2 ~ ROI_5.
On the other hand, in order to correspond to the feature contour Pf on the regions of interest ROI_2 ~ ROI_5, the appearance contour Os on the workpiece image Pw automatically forms four comparison regions RM_2 ~ RM_5, which is step S400 shown in Fig. 5B. Then, the geometric center pixel coordinates of the comparison regions RM_2 ~ RM_5 are calculated respectively, which is step S405. Next, the center point between the geometric center pixel coordinates of the four comparison regions RM_2 ~ RM_5 is calculated; this is the center-point pixel coordinate Pwc of the workpiece image Pw, which is step S410 shown in Fig. 5B. Finally, the main processor 31 of the controller 3 converts the center-point pixel coordinate Pwc of the workpiece image Pw into the actual center-point coordinate according to the size conversion ratio Cv stored in the storage module 32.
In other words, the present embodiment divides the input feature contour Pf into several parts according to the regions of interest ROI_2 ~ ROI_5 and compares each part of the feature contour Pf one by one with the corresponding appearance contour Os on the workpiece image Pw; only when all parts of the feature contour Pf have been matched is the center coordinate calculated. Accordingly, the present embodiment is applicable to locating workpieces of different sizes that share the same appearance contour Os; that is, regardless of whether the workpiece is scaled up or down, the location can be carried out as long as the feature contour Pf can be matched. The applicable size range of the present embodiment is therefore large, and the multi-part comparison gives excellent precision.
It is worth mentioning that, in the present embodiment, when the microprocessor 21 of the image acquisition unit 2 compares each part of the feature contour Pf one by one with the corresponding appearance contour Os on the workpiece image Pw, the microprocessor 21 rotates the feature contour Pf so that it corresponds completely to the appearance contour Os of the workpiece to be machined Wt. In other words, when the workpieces to be machined Wt are placed at different angles, the microprocessor 21 of the image acquisition unit 2 can rotate the feature contour Pf in real time so that it corresponds smoothly to the appearance contour Os of the workpiece to be machined Wt. The present embodiment is therefore applicable to workpieces to be machined Wt placed in any irregular manner.
Finally, as shown in step S225 of Fig. 4, the difference between the actual center-point coordinates of the workpiece image Pw and of the standard image Ps is calculated, and the workpiece spacing D, the input offset value and the center distance Cd are added to it, giving the machining coordinate data of the workpiece to be machined Wt. For positioning the machining positions of the subsequent workpieces to be machined Wt, steps S215, S220 and S225 of Fig. 4 are repeated one by one.
In the present embodiment, among all the aforementioned computations, only the photographing, the image comparison and the calculation of the center-point pixel coordinates are handled by the microprocessor 21 in the image acquisition unit 2; the remaining computation is handled by the controller 3 of the CNC machine tool 1 itself. According to actual measurement, locating each workpiece to be machined Wt takes less than one second from start to finish, which is fast and accurate and very favorable for mass production of workpieces of the same specification.
Refer to Fig. 7 A to Fig. 7 E, Fig. 7 A to Fig. 7 E is the elements of a fix figure of a preferred embodiment of the present invention to the workpiece of five kinds of different setting positions or angle.In detail, below by with Fig. 7 A to Fig. 7 E, illustrate respectively without the location of translation and workpiece without spin, without translation but be rotated counterclockwise the location of the workpiece of 5 degree, without translation but the location of the workpiece of 5 degree that turn clockwise and have translation and have again the location of the workpiece that is rotated counterclockwise 5 degree.
As shown in Figure 7 A, the geometric center pixel coordinate that has shown each RM_2~5, comparison area in figure is respectively (201.9,160.0), (201.7,-155.0), (201.1,160.9), (201.2 ,-155.2), therefore ask for again above-mentioned four geometric center pixel coordinates, be Fig. 7 A the central point pixel coordinate value Pwc of to be processed Wt of correspondence, in the result of calculation of this example, be (0.55,2.675).In addition, utilizing antitrigonometric function theorem (arctan) can obtain the anglec of rotation is 0.128 degree.Accordingly, this very aobvious routine translation and the anglec of rotation are all quite a little.
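The patent obtains the rotation angle with the inverse tangent but does not spell out which region centers are paired; the following sketch shows one plausible arctan-based computation under that assumption, comparing the orientation of a line between two matched comparison-region centers with the orientation of the corresponding line in the standard image (names are illustrative):

```python
import math

def rotation_angle_deg(std_a, std_b, wrk_a, wrk_b):
    """Signed rotation (degrees) between the line joining two region centers
    in the standard image (std_a -> std_b) and the line joining the
    corresponding comparison-region centers in the workpiece image
    (wrk_a -> wrk_b); counterclockwise is positive."""
    ang_std = math.atan2(std_b[1] - std_a[1], std_b[0] - std_a[0])
    ang_wrk = math.atan2(wrk_b[1] - wrk_a[1], wrk_b[0] - wrk_a[0])
    diff = math.degrees(ang_wrk - ang_std)
    return (diff + 180.0) % 360.0 - 180.0   # wrap into [-180, 180)
```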
As shown in Fig. 7B, the geometric center pixel coordinates of the comparison regions RM_2 ~ RM_5 shown in the figure are (232.6, 128.3), (232.3, -186.5), (170.6, 129.4) and (170.2, -187.5) respectively. Taking the center of these four geometric center pixel coordinates gives the center-point pixel coordinate Pwc of the workpiece to be machined Wt corresponding to Fig. 7B, which is (31.025, -29.075) in the calculation result of this example. In addition, using the inverse trigonometric function (arctan), the rotation angle is found to be 0.156 degrees. Accordingly, this example has obvious translation along the X-axis and the Y-axis, but the rotation angle is very small.
As shown in Fig. 7C, the geometric center pixel coordinates of the comparison regions RM_2 ~ RM_5 shown in the figure are (214.3, 141.0), (187.0, -170.8), (187.7, 176.7) and (214.4, -137.5) respectively. Taking the center of these four geometric center pixel coordinates gives the center-point pixel coordinate Pwc of the workpiece to be machined Wt corresponding to Fig. 7C, which is (0.2, 1.85) in the calculation result of this example. In addition, using the inverse trigonometric function (arctan), the rotation angle is found to be 5.075 degrees. Accordingly, this example has obvious counterclockwise rotation, but the translation is very small.
As shown in Fig. 7D, the geometric center pixel coordinates of the comparison regions RM_2 ~ RM_5 shown in the figure are (186.5, 175.9), (214.2, -139.1), (215.9, 141.3) and (186.6, -173.7) respectively. Taking the center of these four geometric center pixel coordinates gives the center-point pixel coordinate Pwc of the workpiece to be machined Wt corresponding to Fig. 7D, which is (0.45, 1.1) in the calculation result of this example. In addition, using the inverse trigonometric function (arctan), the rotation angle is found to be -4.9144 degrees. Accordingly, this example has obvious clockwise rotation, but the translation is very small.
As shown in Fig. 7E, the geometric center pixel coordinates of the comparison regions RM_2 ~ RM_5 shown in the figure are (246.1, 140.7), (217.6, -173.9), (157.1, 176.7) and (183.2, -137.0) respectively. Taking the center of these four geometric center pixel coordinates gives the center-point pixel coordinate Pwc of the workpiece to be machined Wt corresponding to Fig. 7E, which is (30.85, 1.625) in the calculation result of this example. In addition, using the inverse trigonometric function (arctan), the rotation angle is found to be 5.1022 degrees. Accordingly, this example has obvious translation along the X-axis and also obvious counterclockwise rotation.
The above embodiment is given only for convenience of description; the scope of rights claimed by the present invention shall of course be defined by the claims and is not limited to the above embodiment.

Claims (15)

1. A machine tool machining positioning method for positioning the machining positions of a plurality of workpieces, each of the plurality of workpieces having an appearance contour, the plurality of workpieces comprising a standard workpiece and at least one workpiece to be machined, the method comprising the following steps:
(A) photographing the standard workpiece to obtain a standard image, and inputting a feature contour on the standard image, the feature contour corresponding at least partly to the appearance contour of the plurality of workpieces, there being a workpiece spacing between the standard workpiece and the at least one workpiece to be machined;
(B) calculating an actual center-point coordinate of the standard image;
(C) photographing the at least one workpiece to be machined to obtain a workpiece image;
(D) after aligning the feature contour with the appearance contour on the workpiece image, calculating an actual center-point coordinate of the workpiece image; and
(E) calculating the difference between the actual center-point coordinates of the workpiece image and of the standard image, and adding the workpiece spacing to the difference, thereby obtaining machining coordinate data of the at least one workpiece to be machined.
2. The machine tool machining positioning method according to claim 1, wherein step (B) calculates a center-point pixel coordinate of the standard image and then converts it into the actual center-point coordinate, and step (D) calculates a center-point pixel coordinate of the workpiece image and then converts it into the actual center-point coordinate.
3. The machine tool machining positioning method according to claim 2, wherein step (B) and step (D) convert the center-point pixel coordinate into the actual center-point coordinate by means of a size conversion ratio, the size conversion ratio being obtained through the following steps:
(S1) photographing the standard workpiece to obtain a first workpiece image;
(S2) after moving the standard workpiece by a specific distance, photographing the standard workpiece to obtain a second workpiece image; and
(S3) calculating the ratio between the difference of the center-point pixel coordinates of the first workpiece image and the second workpiece image and the specific distance.
4. The machine tool machining positioning method according to claim 3, wherein the specific distance of step (S2) comprises displacements along two axes.
5. The machine tool machining positioning method according to claim 2, wherein calculating the center-point pixel coordinate of the standard image in step (B) comprises the following steps:
(B1) dividing the standard image into a plurality of regions of interest, each region of interest on the standard image containing the feature contour;
(B2) calculating the geometric center pixel coordinate of each region of interest respectively; and
(B3) calculating the center point between the geometric center pixel coordinates of the plurality of regions of interest.
6. The machine tool machining positioning method according to claim 5, wherein the workpiece image comprises a plurality of comparison regions, the plurality of comparison regions corresponding respectively to the plurality of regions of interest, and calculating the center-point pixel coordinate of the workpiece image in step (D) comprises the following steps:
(D1) comparing the feature contour on each of the plurality of regions of interest of the standard image with the appearance contour on the corresponding comparison region of the workpiece image;
(D2) calculating the geometric center pixel coordinate of each comparison region respectively; and
(D3) calculating the center point between the geometric center pixel coordinates of the plurality of comparison regions.
7. The machine tool machining positioning method according to claim 1, further comprising, before step (A), the step of:
(a1) locating the center point of the standard workpiece at a shooting center point.
8. The machine tool machining positioning method according to claim 7, wherein in step (B) the difference between the actual center-point coordinate of the standard image and the coordinate of the shooting center point is calculated as an input offset value, and step (E) calculates the difference between the actual center-point coordinates of the workpiece image and of the standard image and adds the workpiece spacing and the input offset value to the difference, thereby obtaining the machining coordinate data of the at least one workpiece to be machined.
9. The machine tool machining positioning method according to claim 8, wherein there is a center distance between the shooting center point and a machining center point of the machine tool, and step (E) calculates the difference between the actual center-point coordinates of the workpiece image and of the standard image and adds the workpiece spacing, the input offset value and the center distance to the difference, thereby obtaining the machining coordinate data of the at least one workpiece to be machined.
10. A machine tool machining positioning device for positioning the machining positions of a plurality of workpieces, the plurality of workpieces having an appearance contour, the plurality of workpieces comprising a standard workpiece and at least one workpiece to be machined, there being a workpiece spacing between the standard workpiece and the at least one workpiece to be machined, the device comprising:
an image acquisition unit arranged on a spindle head of the machine tool, the image acquisition unit being used to photograph the standard workpiece to obtain a standard image and to photograph the at least one workpiece to be machined to obtain a workpiece image;
an input unit used to input a feature contour on the standard image, the feature contour corresponding at least partly to the appearance contour of the plurality of workpieces; and
a controller electrically connected to the image acquisition unit and the input unit;
wherein the controller calculates an actual center-point coordinate of the standard image; after the image acquisition unit has matched the feature contour to the appearance contour on the workpiece image, the controller calculates an actual center-point coordinate of the workpiece image; and the controller calculates the difference between the actual center-point coordinates of the standard image and of the workpiece image and adds the workpiece spacing to the difference, thereby obtaining machining coordinate data of the at least one workpiece to be machined.
11. The machine tool machining positioning device according to claim 10, wherein the image acquisition unit includes a microprocessor and a camera lens, the microprocessor being electrically connected to the camera lens;
the controller comprises a main processor and a storage module, the main processor being electrically connected to the storage module, the storage module storing a size conversion ratio, the size conversion ratio being the ratio between the pixel dimensions captured by the image acquisition unit and the actual machining dimensions;
the microprocessor of the image acquisition unit calculates the center-point pixel coordinates of the standard image and of the workpiece image respectively, and the main processor converts the center-point pixel coordinates of the standard image and of the workpiece image into the actual center-point coordinates according to the size conversion ratio stored in the storage module.
12. The machine tool machining positioning device according to claim 11, wherein the image acquisition unit positions the standard workpiece at the center of the camera lens in advance when photographing the standard image of the standard workpiece; after the user has input the feature contour on the standard image through the input unit, the controller calculates an input offset value between the actual center-point coordinate of the feature contour and the center of the camera lens; and the machining coordinate data of the at least one workpiece to be machined are the difference between the actual center-point coordinates of the workpiece image and of the standard image plus the workpiece spacing and the input offset value.
13. The machine tool machining positioning device according to claim 12, wherein there is a center distance between the center of the camera lens and a machining center point of the spindle head of the machine tool, and the machining coordinate data of the at least one workpiece to be machined are the difference between the actual center-point coordinates of the workpiece image and of the standard image plus the workpiece spacing, the input offset value and the center distance.
14. The machine tool machining positioning device according to claim 11, wherein the microprocessor of the image acquisition unit rotates the feature contour so that it corresponds to the appearance contour of the at least one workpiece to be machined.
15. The machine tool machining positioning device according to claim 10, wherein the controller is a computer numerical controller.
CN201210392978.8A 2012-10-16 2012-10-16 Machine tool machining positioning method and device thereof Active CN103722449B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201210392978.8A CN103722449B (en) 2012-10-16 2012-10-16 Machine tool machining positioning method and device thereof

Publications (2)

Publication Number Publication Date
CN103722449A true CN103722449A (en) 2014-04-16
CN103722449B CN103722449B (en) 2016-06-29

Family

ID=50446871

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201210392978.8A Active CN103722449B (en) Machine tool machining positioning method and device thereof

Country Status (1)

Country Link
CN (1) CN103722449B (en)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH06246520A (en) * 1993-02-26 1994-09-06 Japan Steel Works Ltd:The Calibration method of camera type reference hole drilling machine
CN2638920Y (en) * 2003-08-06 2004-09-08 雷特国际股份有限公司 Image detection device for processed article
CN1861317A (en) * 2005-05-13 2006-11-15 西门子公司 Device and method for workpiece calibration
TW200949472A (en) * 2008-05-28 2009-12-01 Univ Chung Yuan Christian On-board two-dimension contour detection method and system
KR101013749B1 (en) * 2009-02-06 2011-02-14 (주)한테크 CNC Machinery tool having vision system
CN102343526A (en) * 2011-06-28 2012-02-08 天津汽车模具股份有限公司 Method for quickly determining machining center of automobile die cast
CN203109713U (en) * 2012-10-16 2013-08-07 西门子股份有限公司 Machine tool processing and locating device

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104181861A (en) * 2014-09-02 2014-12-03 上海维宏电子科技股份有限公司 Biaxial numerical control machine tool correction positioning implementation method and system based on counterpoint platform
CN104551866A (en) * 2015-01-29 2015-04-29 周玉红 Automatically-view-finding camera for assembly line
CN105277137B (en) * 2015-10-08 2018-06-05 莆田市荣兴机械有限公司 Scavenging air valve detects alignment method
CN105277137A (en) * 2015-10-08 2016-01-27 莆田市荣兴机械有限公司 Scavenging valve detection alignment method
CN105252376A (en) * 2015-10-14 2016-01-20 中国人民解放军国防科学技术大学 Workpiece self-locating device for high-precision polishing machine tool and machining method
CN105397568A (en) * 2015-12-24 2016-03-16 湖州以创精工机械有限公司 Calculating method for turning tool center height error of lathe
CN107303644A (en) * 2016-04-19 2017-10-31 大隈株式会社 The position measuring method and position measurement system of object on lathe
CN107303644B (en) * 2016-04-19 2020-06-23 大隈株式会社 Method and system for measuring position of object on machine tool
CN106181585A (en) * 2016-09-19 2016-12-07 深圳大宇精雕科技有限公司 It is applicable to the carving milling Apparatus and method for of intermetallic composite coating
CN107977953A (en) * 2016-10-20 2018-05-01 英业达科技有限公司 Workpiece conductive features inspection method and workpiece conductive features check system
CN108247428A (en) * 2016-12-29 2018-07-06 中村留精密工业株式会社 Work mechanism
CN107356202A (en) * 2017-07-27 2017-11-17 中国科学院光电研究院 A kind of laser scanning measurement system target sights method automatically
CN111823057A (en) * 2019-04-15 2020-10-27 财团法人工业技术研究院 Contour precision measuring system and measuring method
CN111823057B (en) * 2019-04-15 2022-06-14 财团法人工业技术研究院 Contour precision measuring system and measuring method
CN113675121A (en) * 2020-05-13 2021-11-19 苏州能讯高能半导体有限公司 Positioning method and device
CN112415948A (en) * 2020-11-25 2021-02-26 长沙埃福思科技有限公司 Surface type machining method based on numerical control machine tool with coordinate detection function

Also Published As

Publication number Publication date
CN103722449B (en) 2016-06-29

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant