CN113510536B - On-machine detection device and method for machining center - Google Patents

On-machine detection device and method for machining center

Info

Publication number
CN113510536B
CN113510536B (application CN202110474521.0A; also published as CN113510536A)
Authority
CN
China
Prior art keywords
point
axis
camera
image
detection device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110474521.0A
Other languages
Chinese (zh)
Other versions
CN113510536A (en)
Inventor
陈云
侯亮
郭敬
卜祥建
梁小龙
邱树彬
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xiamen University
Original Assignee
Xiamen University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xiamen University
Priority to CN202110474521.0A
Publication of CN113510536A
Application granted
Publication of CN113510536B
Legal status: Active
Anticipated expiration

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B23 MACHINE TOOLS; METAL-WORKING NOT OTHERWISE PROVIDED FOR
    • B23Q DETAILS, COMPONENTS, OR ACCESSORIES FOR MACHINE TOOLS, e.g. ARRANGEMENTS FOR COPYING OR CONTROLLING; MACHINE TOOLS IN GENERAL CHARACTERISED BY THE CONSTRUCTION OF PARTICULAR DETAILS OR COMPONENTS; COMBINATIONS OR ASSOCIATIONS OF METAL-WORKING MACHINES, NOT DIRECTED TO A PARTICULAR RESULT
    • B23Q17/00 Arrangements for observing, indicating or measuring on machine tools
    • B23Q17/20 Arrangements for observing, indicating or measuring on machine tools for indicating or measuring workpiece characteristics, e.g. contour, dimension, hardness
    • B23Q17/24 Arrangements for observing, indicating or measuring on machine tools using optics or electromagnetic waves
    • B23Q17/2409 Arrangements for indirect observation of the working space using image recording means, e.g. a camera

Landscapes

  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention provides an on-machine detection device and method for a machining center, and relates to the technical field of machining center detection. The on-machine detection device comprises a fixing assembly and a shooting assembly. The fixing assembly comprises a connecting piece for connection to the main shaft of the machining center and a mounting seat arranged on the connecting piece. The shooting assembly comprises a camera, a lens, a light source and a wireless module, all arranged on the mounting seat. The light source is used to illuminate the workpiece. The camera and the lens are used to photograph the workpiece. The wireless module is electrically connected with the camera and an image processing device and sends the images captured by the camera to the image processing device. Because the camera, the lens, the light source and the wireless module are arranged on the mounting seat as an integral unit, and this unit is formed into a tool-like device by the connecting piece, the on-machine detection device can be stored in the tool magazine when not in use and, when needed, called out of the tool magazine and mounted on the main shaft of the machining center through the connecting piece, which greatly simplifies the on-machine detection process.

Description

On-machine detection device and method for machining center
Technical Field
The invention relates to the field of machining center detection, in particular to an on-machine detection device and method for a machining center.
Background
With the wide industrial application of machining centers and the increasing precision and complexity of manufactured parts, on-machine detection systems, with their advantages of efficient, precise and highly automated measurement, have been widely adopted in the field of precision machining and have become an important component of precision numerical-control machining systems.
However, because on-machine detection devices are not yet widespread, workpiece inspection is at present mostly performed manually or offline. Manual inspection suffers from insufficient accuracy and low efficiency, while offline inspection introduces secondary positioning errors, making it difficult to guarantee machining accuracy and reducing machining efficiency. To meet the production requirements of precision parts, guarantee machining accuracy and simplify the inspection process so as to improve production efficiency, a device enabling fast and efficient on-machine detection in a machining center is urgently needed.
Disclosure of Invention
First aspect,
The invention provides an on-machine detection device of a machining center, and aims to solve the problem that the machining center cannot detect workpieces quickly.
In order to solve the technical problem, the invention provides an on-machine detection device of a machining center, which comprises a fixing assembly and a shooting assembly.
The fixing assembly comprises a connecting piece used for connecting a main shaft of the machining center and a mounting seat configured on the connecting piece.
The shooting assembly comprises a camera, a lens, a light source and a wireless module arranged on the mounting seat. The light source is used for illuminating a workpiece. The camera and the lens are used for photographing the workpiece. The wireless module is electrically connected with the camera and an image processing device and is used for sending the images captured by the camera to the image processing device.
Optionally, one end of the connecting piece is a tool-shank structure adapted to the main shaft of the machining center, and the other end of the connecting piece is detachably disposed on the mounting seat.
The connecting piece and the mounting seat are detachably connected through a first fastener.
Optionally, the mounting seat is made of PC or ABS material. The first fastener is a bolt or a screw.
Optionally, the camera is a monochrome industrial camera and the lens is a small fixed focus lens. The camera is detachably configured to the mount through a second fastener.
Optionally, a gasket is disposed between the camera and the mount. The second fastener is a hexagon socket flat end set screw.
Optionally, the light source is a ring-shaped structured shadowless light source. The light source is detachably configured on the mounting seat through a third fastener.
Optionally, a gasket is disposed between the light source and the mount. The third fastener is a screw.
Optionally, the shooting assembly further includes an external battery disposed on the mounting seat. The external battery is electrically connected with the camera and the wireless module.
Optionally, the mounting seat is of a columnar structure and is provided with a mounting groove and a shooting hole connected to the mounting groove. The wireless module, the external battery and the camera are all arranged in the mounting groove. The lens is arranged on the camera and passes through the shooting hole. The light source is arranged at the end of the mounting seat where the shooting hole is provided.
Optionally, the mounting seat is of a rectangular parallelepiped structure. The number of the mounting grooves is 3. The wireless module, the external battery and the camera are respectively arranged in 3 mounting grooves.
Second aspect,
The embodiment of the invention provides an on-machine detection method of a machining center, which is used for detecting manufacturing error information of a workpiece in the machining center by using the on-machine detection device according to any section of the first aspect;
the detection method comprises the following steps:
S1, acquiring a first image containing the feature to be detected of the workpiece; wherein the first image is captured by the on-machine detection device;
S2, calculating an included angle between the optical axis of the on-machine detection device and the axis of the feature to be detected according to the first image;
S3, adjusting the position of the workpiece and/or the on-machine detection device according to the included angle so as to make the optical axis coincide with the axis;
S4, acquiring a second image sequence containing the feature to be detected; wherein the second image sequence is captured by the on-machine detection device moving equidistantly along the axis;
S5, generating a full-focus surface topography depth map of the feature to be detected based on focus morphology recovery according to the second image sequence;
and S6, acquiring a theoretical model of the workpiece, and comparing the theoretical model with the full-focus surface topography depth map to generate manufacturing error information of the workpiece.
Optionally, step S2 specifically includes steps S21 to S25:
S21, extracting edge features of the first image to obtain an edge contour map;
S22, acquiring the coordinates of each edge point on the edge contour map and generating an array;
S23, calculating ellipse parameters MPNQ {A(x, y), δ, a, b} according to the array; wherein M and N are the two endpoints of the ellipse major axis, P and Q are the two endpoints of the ellipse minor axis, A(x, y) is the ellipse center coordinate, δ is the major-axis deflection angle, a is the length of the semi-major axis, and b is the length of the semi-minor axis;
S24, acquiring the coordinate o of the on-machine detection device in the machining center, and the coordinates in the machining center of the points m, p, n, q and a on the feature to be detected that correspond to the five points M, P, N, Q and A;
and S25, calculating the included angle according to the coordinate o and the coordinates of the points m, p, n, q and a in the machining center.
Optionally, step S23 specifically includes steps S231 to S233:
S231, selecting two edge points from the array and calculating the basic parameters MN {A(x, y), δ, a}; wherein M and N are the two assumed endpoints of the ellipse major axis, A(x, y) is the ellipse center coordinate, δ is the major-axis deflection angle, and a is the length of the semi-major axis;
S232, calculating, based on the basic parameters, the semi-minor-axis length b corresponding to each of the other edge points in the array;
S233, recording the number of occurrences of the different b values; when the number of occurrences of a certain b value is greater than a preset threshold, judging that the two edge points are the endpoints of the ellipse major axis and obtaining the ellipse parameters MPNQ {A(x, y), δ, a, b}; otherwise, selecting another two edge points from the array and recalculating;
wherein M and N are the two endpoints of the ellipse major axis, P and Q are the two endpoints of the ellipse minor axis, A(x, y) is the ellipse center coordinate, δ is the major-axis deflection angle, a is the length of the semi-major axis, and b is the length of the semi-minor axis.
Optionally, step S25 specifically includes steps S251 to S255:
S251, establishing a plane S_poq through the point p, the point o and the point q; wherein the angle bisector oa of the included angle poq lies in the plane S_poq;
S252, establishing, through the angle bisector oa, a plane S_⊥poq perpendicular to the plane S_poq; wherein the plane S_⊥poq intersects the feature to be detected at point m and point n;
S253, taking the point a as the center and oa as the radius, establishing a circular arc in the plane S_⊥poq;
S254, calculating the optical-center angle mon corresponding to each point on the arc, and taking the arc point with the largest angle mon as the target working point;
and S255, calculating the included angle according to the target working point, the point a and the point o.
Optionally, step S5 specifically includes steps S51 to S52:
S51, calculating the focused pixel region of each image in the second image sequence based on an evaluation function;
S52, splicing the focused pixel regions into a full-focus surface topography depth map.
Optionally, step S6 specifically includes steps S61 to S62:
S61, acquiring a theoretical model of the workpiece;
and S62, extracting point cloud data from the full-focus surface topography depth map, and calculating the manufacturing error information from the difference between the point cloud data and the corresponding points of the theoretical model.
By adopting the technical scheme, the invention can obtain the following technical effects:
the camera, the lens, the light source and the wireless module are arranged on the mounting seat to form a whole; and this whole is formed into a cutter-like device by means of a coupling. The on-machine detection device can be stored in the tool magazine for calling when not in use, and the tool magazine is called out to be installed on a main shaft of a machining center through a connecting piece when in use, so that the on-machine detection process is greatly simplified.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings that are required to be used in the embodiments will be briefly described below, it should be understood that the following drawings only illustrate some embodiments of the present invention and therefore should not be considered as limiting the scope, and for those skilled in the art, other related drawings can be obtained according to the drawings without inventive efforts.
FIG. 1 is an isometric view of an on-machine inspection device.
Fig. 2 is an exploded view of an on-machine inspection device.
FIG. 3 is a schematic flow diagram of an on-machine inspection method.
FIG. 4 is a logic block diagram of an on-machine detection method.
Fig. 5 is an imaging schematic diagram (object vertical) when the camera photographs an object.
Fig. 6 is an imaging principle diagram (object tilt) when the camera photographs an object.
Fig. 7 and 8 are schematic diagrams of ellipse parameters and hough transform bases.
FIG. 9 is a schematic diagram of the calculation of the optimal viewing angle for the on-machine inspection device.
Fig. 10 is a schematic diagram of a second sequence of images being taken.
FIG. 11 is a schematic diagram of focal topographic recovery.
The labels in the figure are: 1-a connector; 2-a first fastener; 3-mounting a base; 4-a gasket; 5-a third fastener; 6-a light source; 7-a lens; 8-a gasket; 9-a second fastener; 10-a camera; 11-external battery; 12-wireless module.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions of the embodiments will be described clearly and completely with reference to the accompanying drawings. It is apparent that the described embodiments are some, but not all, embodiments of the present invention; the following detailed description of the embodiments, presented in the figures, is not intended to limit the scope of the claimed invention but is merely representative of selected embodiments. All other embodiments obtained by a person skilled in the art without inventive effort based on the embodiments of the present invention fall within the scope of the present invention.
In the description of the present invention, it is to be understood that the terms "center", "longitudinal", "lateral", "length", "width", "thickness", "upper", "lower", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", "inner", "outer", "clockwise", "counterclockwise", and the like, indicate orientations and positional relationships based on those shown in the drawings, and are used only for convenience of description and simplicity of description, and do not indicate or imply that the equipment or element being referred to must have a particular orientation, be constructed and operated in a particular orientation, and thus, should not be considered as limiting the present invention.
Furthermore, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the present invention, "a plurality" means two or more unless specifically defined otherwise.
In the present invention, unless otherwise expressly stated or limited, the terms "mounted," "connected," "secured," and the like are to be construed broadly and can, for example, be fixedly connected, detachably connected, or integrally formed; can be mechanically or electrically connected; either directly or indirectly through intervening media, either internally or in any other relationship. The specific meanings of the above terms in the present invention can be understood by those skilled in the art according to specific situations.
In the present invention, unless otherwise expressly stated or limited, "above" or "below" a first feature means that the first and second features are in direct contact, or that the first and second features are not in direct contact but are in contact with each other via another feature therebetween. Also, the first feature being "on," "above" and "over" the second feature includes the first feature being directly on and obliquely above the second feature, or merely indicating that the first feature is at a higher level than the second feature. A first feature being "under," "below," and "beneath" a second feature includes the first feature being directly under and obliquely below the second feature, or simply meaning that the first feature is at a lesser elevation than the second feature.
The invention is described in further detail below with reference to the following detailed description and accompanying drawings:
the first embodiment,
As shown in fig. 1 and fig. 2, the present embodiment provides an on-machine inspection device for a machining center, which includes a fixing component and a shooting component. The fixing component comprises a connecting piece 1 for connecting a main shaft of the machining center and a mounting seat 3 configured on the connecting piece 1. The shooting assembly includes a camera 10, a lens 7, a light source 6, and a wireless module 12 disposed on the mounting base 3. The light source 6 is used to illuminate the workpiece. The camera 10 and the lens 7 are used to photograph a workpiece. The wireless module 12 is electrically connected to the camera 10 and the image processing apparatus to transmit an image captured by the camera 10 to the image processing apparatus.
In the present embodiment, the camera 10, the lens 7, the light source 6, and the wireless module 12 are integrally disposed on the mount 3; and the whole is formed into a cutter-like device through the connecting piece 1, and the cutter-like device can be quickly connected with and detached from a main shaft of a machining center. When the on-machine detection device is not used, the on-machine detection device can be stored in the tool magazine for calling, and when the on-machine detection device is used, the on-machine detection device is called out and is installed on a main shaft of a machining center through the connecting piece 1, so that the on-machine detection process is greatly simplified.
The image processing device receives the image information sent by the wireless module 12 and performs image preprocessing, focus morphology recovery, point cloud information extraction and manufacturing error evaluation. The results are used to produce a workpiece machining error evaluation report and to guide optimization of the machining process for the workpiece under test. Obtaining the size and error information of the workpiece by analyzing images is prior art and is not described in detail here. The on-machine detection device greatly simplifies the installation and measurement process of visual on-machine inspection.
On the basis of the above embodiment, in an optional embodiment of the present invention, one end of the connecting piece 1 is a tool-shank structure adapted to the main shaft of the machining center, and the other end is detachably arranged on the mounting seat 3. The connecting piece 1 is detachably connected with the mounting seat 3 through a first fastener 2. In this embodiment, one end of the connecting piece 1 is formed as a tool-shank structure, such as a Morse taper shank or a side-lock shank, so that the connecting piece 1 can be mounted on the main shaft as quickly as a conventional tool; the main shaft then moves the device into position to take suitable photographs. In other embodiments, other structures such as bolts or snap fittings may be provided.
In particular, since the connecting piece 1 and the mounting seat 3 are detachably connected, the on-machine detection device can be adapted to different machining centers simply by replacing the connecting piece 1, which is of great practical value.
On the basis of the above embodiments, in an alternative embodiment of the present invention, the mounting seat 3 is made of PC or ABS material and the first fastener 2 is a bolt or a screw. PC and ABS have sufficient toughness, strength and heat resistance, and a bolt or screw can reliably mount the connecting piece 1 on the mounting seat 3. In this embodiment, the connecting piece 1 is provided with a positioning protrusion and the mounting seat 3 with a positioning groove; this protrusion-and-groove structure not only guarantees the position of the connecting piece 1 relative to the mounting seat 3 but also prevents rotation between the connecting piece 1 and the mounting seat 3, which is of great practical value.
On the basis of the above embodiments, in an alternative embodiment of the present invention, the camera 10 is a monochrome industrial camera and the lens 7 is a small fixed-focal-length lens. The camera 10 is detachably attached to the mounting seat 3 by a second fastener 9. Specifically, a spacer 8 is provided between the camera 10 and the mounting seat 3, and the second fastener 9 is a hexagon-socket flat-end set screw. The camera 10 and the lens 7 constitute the image acquisition system. In use, the camera 10 is moved by the main shaft to a suitable position above the part so that photographs are taken at the most suitable distance. The spacer 8 and the set screws fix the camera 10 firmly to the mounting seat 3. The fixed-focal-length lens, the industrial camera and the radio communication system together realize image acquisition, transmission and related functions, enabling non-contact on-machine measurement.
On the basis of the above embodiments, in an alternative embodiment of the present invention, the light source 6 is a ring-shaped shadowless light source. The light source 6 is detachably attached to the mounting seat 3 by a third fastener 5. Specifically, a gasket 4 is provided between the light source 6 and the mounting seat 3, and the third fastener 5 is a screw. The ring-shaped shadowless light source provides efficient low-angle illumination that lights the workpiece uniformly, avoiding unclear images caused by locally dark regions and greatly improving the reliability of the on-machine detection device.
On the basis of the above embodiment, in an optional embodiment of the present invention, the shooting assembly further includes an external battery 11 arranged on the mounting seat 3. The external battery 11 is electrically connected to the camera 10 and the wireless module 12. The mounting seat 3 has a columnar structure and is provided with mounting grooves and a shooting hole connected to the mounting grooves. The wireless module 12, the external battery 11 and the camera 10 are arranged in the mounting grooves. The lens 7 is arranged on the camera 10 and passes through the shooting hole. The light source 6 is arranged at the end of the mounting seat 3 where the shooting hole is provided. Specifically, the mounting seat 3 is a rectangular parallelepiped structure with three mounting grooves, and the wireless module 12, the external battery 11 and the camera 10 are arranged in the three mounting grooves respectively. Installing the wireless module 12, the external battery 11 and the camera 10 in the mounting grooves protects these electronic components well and prevents collisions during shooting, which is of great practical value.
Embodiment II,
As shown in fig. 3 to 11, an embodiment of the present invention provides an on-machine detection method for a machining center, which uses the on-machine detection device according to any paragraph of the first aspect in the machining center to detect manufacturing error information of a workpiece.
The on-machine detection method may be performed by on-machine detection equipment. In particular, the method is executed by one or more processors in the on-machine detection equipment to implement the following steps:
S1, acquiring a first image containing the feature to be measured of the workpiece. Wherein the first image is captured by the on-machine detection device.
In this embodiment, the on-machine detection equipment may be the controller of the machining center, or a computer capable of interacting with the machining center and the on-machine detection device. Specifically, in the present embodiment, the on-machine detection equipment is a computer, which exchanges data with the on-machine detection device via wireless signals.
The workpiece is machined in the machining center. The feature to be measured is a rotational structural feature on the workpiece, such as a cylinder, a hole or a cone, machined by the machining center. The on-machine detection method provided by this embodiment is suitable for detecting cylindrical, hole-type or conical parts, or parts containing such features, and is also applicable to viewing-angle optimization, subsequent shape recovery and manufacturing error evaluation of other rotary parts or features in on-machine vision inspection.
The on-machine detection device is mounted on the main shaft of the machining center, and the main shaft moves the on-machine detection device. The first step of detection is to move the on-machine detection device so that the feature to be detected of the workpiece lies within the shooting range of the on-machine detection device. The relative position of the feature to be detected and the on-machine detection device can then be determined from the first image captured by the on-machine detection device, so that the on-machine detection device can be adjusted to a suitable position.
Specifically, the workpiece is placed on the worktable of the machining center, an image is taken at the initial positional relationship between the camera and the part, and the image is transmitted to the image processing system in the on-machine detection equipment by the radio communication system.
And S2, calculating, according to the first image, the included angle between the optical axis of the on-machine detection device and the axis of the feature to be detected.
Specifically, the position of the on-machine detection device is determined preliminarily in step S1. At this point, the optical axis of the camera (the optical axis is the axis direction of the camera) and the axis of the feature to be measured (the feature to be measured is a rotational feature, and the axis is its rotation axis) are usually not parallel, and a certain included angle exists between them. The circular feature therefore images as an ellipse in the picture obtained by the on-machine detection device, which cannot accurately reflect the information of the feature to be measured. In the invention, the included angle between the optical axis of the camera and the axis of the feature to be detected is therefore determined from the first image captured by the on-machine detection device, so that the shooting position of the camera can be adjusted accurately and the on-machine detection accuracy guaranteed.
In this embodiment, the step S2 specifically includes steps S21 to S25:
as shown in fig. 5 and 6, the camera imaging principle and the circular imaging law are explained first: the object point P through the optical system forms a sharp pixel Pf in the focal plane, where u is often >2 f. Wherein u is the object distance, s is the image distance, and f is the focal length. An inverted and reduced image is formed on the focal plane. When a circular object is imaged, the imaging is smaller because the object distance is larger. As shown in fig. 6, the spatial circle that is not perpendicular to the optical axis is an ellipse in the imaging shape. With this feature, the deviation angle of the object plane from the camera plane can be calculated. Specifically, the method comprises the following steps:
and S21, extracting the edge characteristics of the first image to obtain an edge contour map.
And S22, acquiring the coordinates of each edge point on the edge contour map, and generating an array.
Firstly, image preprocessing is performed on the first image, a binary edge contour map is obtained from the elliptical edge extracted from the first image with the Canny operator, and the point coordinates on the edge map are stored in an array. The Canny operator is an edge detection algorithm based on image gradients; it has a low error rate, locates edge centers accurately, does not produce false edges, and is simple to apply.
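As an illustration of steps S21 and S22, the following minimal sketch (an assumption for illustration, not the patent's own implementation; OpenCV is used, and the blur kernel and Canny thresholds are arbitrary choices) extracts the edge contour map and stores the edge-point coordinates in an array:

```python
import cv2
import numpy as np

def edge_points(first_image_gray: np.ndarray) -> np.ndarray:
    """Return an (N, 2) array of (x, y) edge-point coordinates from a grayscale image."""
    blurred = cv2.GaussianBlur(first_image_gray, (5, 5), 0)  # suppress noise before gradient estimation
    edges = cv2.Canny(blurred, 50, 150)                      # binary edge contour map
    ys, xs = np.nonzero(edges)                               # pixel coordinates of all edge points
    return np.stack([xs, ys], axis=1)                        # the array used in the subsequent steps
```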
And S23, calculating to obtain the ellipse parameters MPNQ { A (x, y), delta, a, b } according to the array. Wherein, M and N are two endpoints of the major axis of the ellipse respectively, P and Q are two endpoints of the minor axis of the ellipse respectively, A (x, y) is the central coordinate of the ellipse, delta is the deflection angle of the major axis, a is the length of the major semi-axis, and b is the length of the minor semi-axis.
Then, ellipse detection is performed through Hough (Hough) transformation, an ellipse obtained by shooting in the first image is searched, and ellipse parameters are determined. The Hough transform is a good method for detecting a specific geometric curve, and has the advantages of strong anti-interference capability and insensitivity to defective parts of straight lines, noise and other coexisting nonlinear structures in an image. In the ellipse detection, the geometric characteristics are used for parameter dimension reduction, so that the algorithm performance is improved.
Specifically, step S23 specifically includes steps S231 to S233:
s231, two edge points are selected from the array, and basic parameters MN { A (x, y), delta, a } are calculated. Wherein, M and N are two end points of the ellipse major axis respectively, A (x, y) is the ellipse center coordinate, delta is the major axis deflection angle, and a is the length of the major semi-axis.
And S232, calculating the length b of the minor semi-axis corresponding to other edge points in the array based on the basic parameters.
S233, recording the occurrence times of different b values, and when the occurrence times of a certain b value is larger than a preset threshold value, judging that the two edge points are ellipse major axis end points, and acquiring ellipse parameters MPNQ { A (x, y), delta, a, b }. Otherwise, selecting the other two edge points in the array and recalculating.
Wherein, M and N are two end points of the ellipse major axis respectively, P and Q are two end points of the ellipse minor axis respectively, A (x, y) is the ellipse center coordinate, delta is the major axis deflection angle, a is the length of the major semi-axis, and b is the length of the minor semi-axis.
Specifically, as shown in fig. 7 and 8, in Hough-transform ellipse detection a pair of edge points arbitrarily selected from the array is assumed to be the major-axis endpoints M(x_M, y_M) and N(x_N, y_N). The basic parameters {A(x, y), δ, a} of the ellipse are then calculated by equations (1) to (4):
x = (x_M + x_N)/2    (1)
y = (y_M + y_N)/2    (2)
a = √((x_N − x_M)² + (y_N − y_M)²)/2    (3)
δ = arctan((y_N − y_M)/(x_N − x_M))    (4)
Suppose the foci of the ellipse are W(x_W, y_W) and V(x_V, y_V), and take a further point B(x_B, y_B) on the ellipse periphery (i.e., another edge point). From the properties of the ellipse and its foci, equations (5) to (9), the corresponding semi-minor-axis length b can be calculated.
[Equations (5) to (9) are rendered as images in the original document.]
Combining equations (5) to (9) yields equation (10) for b (also rendered as an image in the original document):
wherein, in equation (10):
ρ² = (y_M − y)² + (x_M − x)²    (11)
μ = |sin δ|(y_M − y) + |cos δ|(x_M − x)    (12)
by determining the end point of the major axis as derived above, parameters other than the minor axis b can be calculated. That is, a pair of edge points is arbitrarily selected from the array, and assumed to be the major axis points, and the corresponding basic parameters (i.e., parameters other than the minor axis b) are calculated.
Then, points B are taken from the array one by one, the semi-minor-axis length b corresponding to each is calculated, and the number of occurrences of the different b values is counted. The b value with the largest accumulated count is used to determine the ellipse parameters.
Based on the characteristics of the ellipse, if the starting points are two endpoints of the major axis of the ellipse, the b value calculated each time is constant. Therefore, when a pair of edge points assumed to be major axis points are selected as true major axis points, the cumulative occurrence number of the same b value is the largest.
A threshold T is set (T is the stopping condition of the algorithm's loop structure and is generally of the order of 10²). The parameter b is accumulated in the parameter space; when a group of parameters produces a peak exceeding the threshold T, an ellipse is found and the assumed major-axis endpoints are considered correct; otherwise, the next point pair in the array is selected as the assumed major-axis endpoints and the iteration continues. When the iteration ends, the ellipse MPNQ {A(x, y), δ, a, b} is determined.
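The voting scheme of steps S231 to S233 can be sketched as follows. This is an illustrative assumption rather than the patent's own code: the exact relation for b in equations (5) to (10) is rendered as images in the original, so the classical chord relation b² = a²d²sin²τ / (a² − d²cos²τ) with cos τ = (a² + d² − f²)/(2ad) is used as a stand-in, where d is the distance from a candidate point B to the assumed center and f its distance to one assumed major-axis endpoint. A practical implementation would randomize the point pairs rather than enumerate them all.

```python
import numpy as np

def fit_ellipse_by_voting(points: np.ndarray, vote_threshold: int = 100, bin_size: float = 1.0):
    """points: (N, 2) edge coordinates. Returns (cx, cy, delta, a, b) or None."""
    n = len(points)
    for i in range(n):
        for j in range(i + 1, n):
            M, N = points[i].astype(float), points[j].astype(float)
            center = (M + N) / 2.0                        # ellipse center A(x, y), eqs. (1)-(2)
            a = np.linalg.norm(N - M) / 2.0               # semi-major axis, eq. (3)
            if a < 5.0:                                   # discard degenerate hypotheses
                continue
            delta = np.arctan2(N[1] - M[1], N[0] - M[0])  # major-axis deflection angle, eq. (4)
            votes = {}
            for B in points.astype(float):
                d = np.linalg.norm(B - center)
                if d == 0.0 or d >= a:                    # B must lie strictly inside the major-axis circle
                    continue
                f = np.linalg.norm(B - N)
                cos2 = min(((a**2 + d**2 - f**2) / (2 * a * d)) ** 2, 1.0)
                denom = a**2 - d**2 * cos2
                if denom <= 0.0:
                    continue
                b = np.sqrt(a**2 * d**2 * (1.0 - cos2) / denom)
                key = round(b / bin_size)                 # accumulate b values in a 1-D histogram
                votes[key] = votes.get(key, 0) + 1
            if votes:
                key, count = max(votes.items(), key=lambda kv: kv[1])
                if count > vote_threshold:                # peak exceeds threshold T: accept this major axis
                    return (center[0], center[1], delta, a, key * bin_size)
    return None
```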
And S24, acquiring the coordinate o of the on-machine detection device in the machining center, and the coordinates in the machining center of the points m, p, n, q and a on the feature to be detected that correspond to the five points M, P, N, Q and A.
And S25, calculating the included angle according to the coordinate o and the coordinates of the points m, p, n, q and a in the machining center.
In this embodiment, step S25 specifically includes steps S251 to S255:
S251, establishing a plane S_poq through point p, point o and point q. Wherein the angle bisector oa of the included angle poq lies in the plane S_poq.
S252, establishing, through the angle bisector oa, a plane S_⊥poq perpendicular to the plane S_poq. Wherein the plane S_⊥poq intersects the feature to be measured at point m and point n.
S253, taking the point a as the center and oa as the radius, establishing a circular arc in the plane S_⊥poq.
S254, calculating the optical-center angle mon corresponding to each point on the arc, and taking the arc point with the largest angle mon as the target working point.
And S255, calculating the included angle according to the target working point, the point a and the point o.
As shown in fig. 9, the camera rotation angle can be determined from the imaging geometry, based on the target circle mpnq (i.e., the feature to be measured on the workpiece) and its image ellipse MPNQ (i.e., the ellipse in the first image).
Specifically, the spatial circle mpnq images as a standard ellipse MPNQ on the camera imaging plane, and two opposing elliptical cones are obtained by connecting the optical center with the contours of the object and of the image. The camera optical center is o; MN is the major axis and PQ the minor axis of the image ellipse, and A is its center, corresponding to the center a of the target circle.
The minor axis PQ corresponds to the diameter pq of the target circle; the included angle poq is the minimum included angle of the elliptical cone and oa is its bisector. The plane passing through the bisector oa and perpendicular to the plane poq is denoted S_⊥poq; this plane intersects the target circle and the imaging ellipse at mn and MN respectively. The angle subtended at the optical center by the diameter mn is the maximum included angle of the elliptical cone, and at this moment om ≠ on.
In the plane S_⊥poq, the camera is moved along the arc (with a as the center and oa as the radius) to change the shooting angle until the included angle mon reaches its maximum; at that point om = on and the elliptical cone o-mpnq becomes a right circular cone. The angle θ through which the camera has rotated is the deviation angle between the axis of the circular feature and the optical axis of the camera.
That is, by calculating for each point on the arc the optical-center angle it forms with point m and point n, the arc point at which this angle is largest lies on the axis of the feature to be measured. The optimal shooting point is therefore determined by calculating the optical-center angle, and the included angle between the optical axis and the feature axis is determined from the optimal shooting point, the point a and the point o.
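Under the assumption that the machine coordinates of the optical center o, the circle center a and the intersection points m and n are already known from step S24 (all as 3-D vectors), the search along the arc in steps S253 to S255 can be sketched as follows; this is a geometric illustration, not the patent's own code, and the sampling resolution is an arbitrary choice:

```python
import numpy as np

def deviation_angle(o, a, m, n, samples: int = 3601) -> float:
    """Return the angle (radians) between the camera optical axis ao and the feature axis."""
    o, a, m, n = (np.asarray(p, dtype=float) for p in (o, a, m, n))
    r = np.linalg.norm(o - a)                         # arc radius |oa|
    u = (o - a) / r                                   # unit vector from a towards the current optical center
    w = (n - m) / np.linalg.norm(n - m)               # direction of the diameter mn (lies in the plane S_⊥poq)
    v = w - np.dot(w, u) * u                          # in-plane direction orthogonal to u
    v /= np.linalg.norm(v)
    best_angle, best_phi = -1.0, 0.0
    for phi in np.linspace(-np.pi / 2, np.pi / 2, samples):
        o_prime = a + r * (np.cos(phi) * u + np.sin(phi) * v)   # candidate point on the arc
        om, on = m - o_prime, n - o_prime
        ang = np.arccos(np.clip(np.dot(om, on) / (np.linalg.norm(om) * np.linalg.norm(on)), -1.0, 1.0))
        if ang > best_angle:                          # the angle m-o'-n is maximal at the target working point
            best_angle, best_phi = ang, phi
    return abs(best_phi)                              # rotation about a from o to the target working point
```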
And S3, adjusting the position of the workpiece and/or the on-machine detection device according to the included angle so that the optical axis coincides with the axis.
Specifically, the calculated included angle is fed back to the machining center, so that the machining center adjusts the relative position of the camera and the measured part to reach the optimal viewing angle. At this point the object plane, the mirror plane and the image plane are parallel to each other.
It should be noted that the optimal viewing angle can be achieved by changing the pose of the machining center or the clamping of the workpiece to be measured. For example, a three-axis machining center needs to adjust the workpiece clamping to achieve the adjustment, whereas a five-axis or other multi-axis machining center can adjust the angle by appropriately rotating the swivel head or the rotary table, depending on its configuration. The specific adjustment method can be configured according to the rotary axes of the machine tool, and the optical-axis adjustment strategy selected accordingly; this is not described further here.
Before formal image acquisition, the deviation angle of the camera optical axis is thus calculated from the geometric relationship between object and image, and the camera angle is adjusted so that it shoots at the optimal viewing angle, giving a better detection result.
And S4, acquiring a second image sequence containing the feature to be measured. Wherein the second image sequence is obtained by moving the on-machine detection device along the axis at equal intervals and shooting.
Specifically, as shown in fig. 10, after the angle is adjusted to the optimal angle, the camera acquires N frames of images of the surface of the feature to be detected along the optical axis direction at equal step length during shooting, each frame of image has a determined depth, and the determined depth is used as input for depth extraction and morphology restoration of subsequent image processing.
In this embodiment, the second image sequence is transmitted to the detection device by wireless transmission.
And S5, according to the second image sequence, generating a full focusing surface topography depth map for acquiring the feature to be detected based on focusing topography recovery.
In this embodiment, the step S5 specifically includes steps S51 to S52:
s51, the focus pixel area of each image in the second image sequence is calculated based on the evaluation function.
And S52, splicing the full focus surface topography depth map according to the focus pixel areas.
It should be noted that, as shown in fig. 5, for a lens with a fixed focal length, the object point P is imaged on the focal plane as Pf, so that when the imaging plane coincides with the focal plane, it is imaged as a clear pixel. When the two planes are not coincident, a circle of confusion with a radius R is formed, and the imaging is more blurred as the distance between the two planes is increased to cause the radius R of the circle of confusion to be larger. By applying the focusing morphology recovery technology, the depth information can be respectively extracted from the collected N frames of non-full focusing images, namely the depth information of the workpiece when the imaging surface is superposed with the focusing plane is extracted from the partial focusing images. And obtaining a full-focus image after serialization.
In particular, the amount of the solvent to be used,
(a) As shown at a in fig. 11, K two-dimensional sequence images are captured; they contain all the depth information of the surface to be measured, and it is assumed that a given element of the surface is imaged sharply in frame i = k.
(b) An evaluation function is selected (commonly used evaluation functions include the Laplacian operator, the Tenengrad focus evaluation function and the gray-variance operator), for example the gray-variance operator, which exploits the gray-level difference between sharp and blurred pixels and takes the variance as the sharpness criterion. For each pixel (x, y) of the image, the sharpness evaluation value U(x, y) is calculated over an evaluation window centered on that pixel, the sequence-image index corresponding to the maximum of the evaluation curve is located, and the depth of that point is recorded as k, as shown at d in fig. 11.
Namely, the depth value of the pixel is calculated through a definition evaluation function, namely the depth information of each pixel point is obtained through focusing information.
(c) And constructing a full-focus surface topography depth map.
Specifically, as shown in b in fig. 11, a depth matrix corresponding to the sequence image is obtained, then the depth of each surface element to be measured is determined in sequence, and a surface topography depth map D (x, y) can be obtained after the serialization processing.
(d) From the depth map, a fully focused image F (x, y) is obtained, as shown in fig. 11 c.
Namely, extracting all focusing pixel areas from the original sequence image after the definition evaluation, and splicing the focusing pixel areas into a full focusing surface topography depth map.
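A minimal shape-from-focus sketch of steps (a) to (d), using the gray-variance operator as the sharpness evaluation function; the window size is an arbitrary choice, the physical depth is the frame index multiplied by the step length along the axis, and SciPy is assumed to be available:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def focus_recovery(stack: np.ndarray, win: int = 9):
    """stack: (K, H, W) grayscale sequence taken at equidistant steps along the axis.
    Returns (depth map in frame indices, all-in-focus image)."""
    stack = stack.astype(np.float64)
    mean = uniform_filter(stack, size=(1, win, win))        # local mean of each frame
    sqmean = uniform_filter(stack ** 2, size=(1, win, win))
    sharpness = sqmean - mean ** 2                          # local gray variance U(x, y) per frame
    depth_map = np.argmax(sharpness, axis=0)                # frame index k with maximum sharpness per pixel
    rows, cols = np.indices(depth_map.shape)
    all_in_focus = stack[depth_map, rows, cols]             # splice the focused pixels into F(x, y)
    return depth_map, all_in_focus
```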
The focusing morphology recovery algorithm has the advantages of simplicity, rapidness, easiness in operation, good instantaneity, high measurement precision and the like, and can meet the requirements of complex surface morphology reconstruction, surface depth measurement and the like.
And S6, acquiring a theoretical model of the workpiece, and comparing the theoretical model with the full-focusing surface topography depth map to generate manufacturing error information of the workpiece.
In this embodiment, the step S6 specifically includes steps S61 to S62:
and S61, acquiring a theoretical model of the workpiece.
And S62, extracting point cloud data in the full-focusing surface topography depth map, and calculating to obtain manufacturing error information by making a difference between the point cloud data and a corresponding point in the theoretical model.
Finally, the full-focus surface topography depth map is compared with the design dimension requirements of the theoretical model, and the manufacturing error is evaluated. That is, the depth map obtained on the basis of the focus morphology recovery principle is converted into point coordinate information, the difference between these coordinates and the corresponding points of the theoretical model gives the error result, and form and position error information is calculated from the measured points and from the lines and surfaces determined by those points.
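As an illustration of the comparison in step S6, the following sketch assumes that the axial step length and the lateral pixel scale are known from calibration and that the theoretical model has been sampled as a depth grid on the same (x, y) raster as the measured depth map; the function name and the error statistics reported are illustrative choices:

```python
import numpy as np

def manufacturing_error(depth_map: np.ndarray, model_depth: np.ndarray,
                        step_mm: float, pixel_mm: float):
    """Return the measured point cloud, the per-point error field and summary statistics (mm)."""
    measured_z = depth_map.astype(np.float64) * step_mm      # frame index -> physical depth
    error = measured_z - model_depth                         # point-wise deviation from the theoretical model
    ys, xs = np.indices(depth_map.shape)
    point_cloud = np.stack([xs * pixel_mm, ys * pixel_mm, measured_z], axis=-1)  # extracted point cloud data
    stats = {"max": float(error.max()), "min": float(error.min()),
             "mean": float(error.mean()), "rms": float(np.sqrt(np.mean(error ** 2)))}
    return point_cloud, error, stats
```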
The above description is only a preferred embodiment of the present invention and is not intended to limit the present invention, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (9)

1. An on-machine inspection device for a machining center, comprising:
the fixing assembly comprises a connecting piece (1) used for connecting a main shaft of a machining center and a mounting seat (3) configured on the connecting piece (1);
a shooting component which comprises a camera (10) arranged on the mounting seat (3), a lens (7), a light source (6) and a wireless module (12); the light source (6) is used for illuminating a workpiece; the camera (10) and the lens (7) are used for shooting a workpiece; the wireless module (12) is electrically connected with the camera (10) and the image processing equipment and is used for sending the image shot by the camera (10) to the image processing equipment;
the on-machine detection device is used for detecting the manufacturing error information of the workpiece in the machining center; wherein the detection method for detecting the manufacturing error information using the on-machine detection device includes:
acquiring a first image containing the feature to be detected of the workpiece; wherein the first image is captured by the on-machine detection device;
calculating an included angle between an optical axis of the on-machine detection device and an axis of the feature to be detected according to the first image;
adjusting the position of the workpiece and/or the on-machine detection device according to the included angle so as to enable the optical axis to coincide with the axis;
Acquiring a second image sequence containing the feature to be detected; wherein the second sequence of images is captured by the on-machine inspection device moving equidistantly along the axis;
according to the second image sequence, based on focus topography recovery, generating a full focus surface topography depth map for obtaining the feature to be detected;
obtaining a theoretical model of the workpiece, and comparing the theoretical model with the full-focusing surface topography depth map to generate manufacturing error information of the workpiece;
calculating an included angle between an optical axis of the on-machine detection device and an axis of the feature to be detected according to the first image, and specifically comprising the following steps:
extracting edge features of the first image to obtain an edge profile graph;
acquiring coordinates of each edge point on the edge contour map, and generating an array;
according to the array, calculating to obtain ellipse parameters MPNQ { A (x, y), delta, a, b }; wherein, M and N are two end points of the major axis of the ellipse respectively, P and Q are two end points of the minor axis of the ellipse respectively, A (x, y) is the central coordinate of the ellipse, delta is the deflection angle of the major axis, a is the length of the major semi-axis, and b is the length of the minor semi-axis;
acquiring the coordinate o of the on-machine detection device in the machining center and the coordinates in the machining center of the points m, p, n, q and a on the feature to be detected that correspond to the five points M, P, N, Q and A;
calculating the included angle according to the coordinate o and the coordinates of the points m, p, n, q and a in the machining center;
calculating the included angle according to the coordinate o and the points m, p, n, q and a, specifically:
establishing a plane S_poq through the point p, the point o and the point q; wherein an angle bisector oa of the angle poq lies in the plane S_poq;
establishing, through the bisector oa, a plane S_⊥poq perpendicular to the plane S_poq; wherein the plane S_⊥poq intersects the feature to be detected at point m and point n;
taking the point a as the center and oa as the radius, establishing a circular arc in the plane S_⊥poq;
calculating the optical-center angle mon corresponding to each point on the arc, and taking the arc point with the largest angle mon as the target working point;
and calculating the included angle according to the target working point, the point a and the point o.
2. The on-machine detection device according to claim 1, wherein one end of the connecting piece (1) is a handle structure matched with a main shaft of a machining center, and the other end of the connecting piece is detachably arranged on the mounting base (3);
the connecting piece (1) is detachably connected with the mounting seat (3) through a first fastening piece (2);
the mounting seat (3) is made of PC or ABS material; the first fastening piece (2) is a bolt or a screw.
3. The on-machine inspection device according to claim 1, wherein the camera (10) is a monochrome industrial camera, and the lens (7) is a small fixed focal length lens; the camera (10) is detachably arranged on the mounting seat (3) through a second fastener (9);
A gasket (8) is arranged between the camera (10) and the mounting base (3); the second fastener (9) is a hexagon socket flat end set screw.
4. The on-machine inspection device according to claim 1, wherein the light source (6) is a ring-shaped shadowless light source; the light source (6) is detachably arranged on the mounting seat (3) through a third fastener (5);
a gasket (4) is arranged between the light source (6) and the mounting seat (3); the third fastening piece (5) is a screw.
5. The on-machine detection device according to any one of claims 1 to 4, wherein the camera assembly further comprises an external battery (11) disposed on the mounting base (3); the external battery (11) is electrically connected with the camera (10) and the wireless module (12);
the mounting seat (3) is of a columnar structure and is provided with a mounting groove and a shooting hole connected to the mounting groove;
the wireless module (12), the external battery (11) and the camera (10) are all configured in the mounting groove; the lens (7) is configured on the camera (10) and passes through the shooting hole; the light source (6) is arranged at one end of the mounting seat (3) where the shooting hole is arranged.
6. An on-machine detection device according to claim 5, characterized in that the mounting seat (3) is of a rectangular parallelepiped structure; the number of the mounting grooves is 3; the wireless module (12), the external battery (11) and the camera (10) are respectively arranged in 3 mounting grooves.
7. An on-machine inspection method for a machining center, which is used for inspecting manufacturing error information of a workpiece in the machining center by using the on-machine inspection device as claimed in any one of claims 1 to 6; the detection method comprises the following steps:
acquiring a first image containing a feature to be measured of the workpiece; wherein the first image is captured by the on-machine detection device;
calculating an included angle between an optical axis of the on-machine detection device and an axis of the feature to be detected according to the first image;
adjusting the position of the workpiece and/or the on-machine detection device according to the included angle so that the optical axis and the axis coincide:
acquiring a second image sequence containing the feature to be detected; wherein the second sequence of images is captured by the on-machine inspection device moving equidistantly along the axis;
generating a full-focus surface topography depth map for acquiring the features to be detected based on focus topography recovery according to the second image sequence;
obtaining a theoretical model of the workpiece, and comparing the theoretical model with the full-focusing surface topography depth map to generate manufacturing error information of the workpiece;
calculating an included angle between an optical axis of the on-machine detection device and an axis of the feature to be detected according to the first image, and specifically comprising the following steps:
Extracting edge features of the first image to obtain an edge contour map;
acquiring coordinates of each edge point on the edge contour map, and generating an array;
according to the array, calculating to obtain ellipse parameters MPNQ { A (x, y), delta, a, b }; wherein, M and N are two end points of the major axis of the ellipse respectively, P and Q are two end points of the minor axis of the ellipse respectively, A (x, y) is the central coordinate of the ellipse, delta is the deflection angle of the major axis, a is the length of the major semi-axis, and b is the length of the minor semi-axis;
acquiring the coordinate o of the on-machine detection device in the machining center and the coordinates in the machining center of the points m, p, n, q and a on the feature to be detected that correspond to the five points M, P, N, Q and A;
calculating the included angle according to the coordinate o and the coordinates of the points m, p, n, q and a in the machining center;
calculating the included angle according to the coordinate o and the points m, p, n, q and a, specifically:
establishing a plane S_poq through the point p, the point o and the point q; wherein an angle bisector oa of the angle poq lies in the plane S_poq;
establishing, through the bisector oa, a plane S_⊥poq perpendicular to the plane S_poq; wherein the plane S_⊥poq intersects the feature to be detected at point m and point n;
taking the point a as the center and oa as the radius, establishing a circular arc in the plane S_⊥poq;
calculating the optical-center angle mon corresponding to each point on the arc, and taking the arc point with the largest angle mon as the target working point;
and calculating the included angle according to the target working point, the point a and the point o.
8. The on-machine detection method according to claim 7, wherein calculating the ellipse parameters MPNQ{A(x, y), δ, a, b} from the array specifically comprises:
selecting two edge points from the array and calculating the basic parameters MN{A(x, y), δ, a}; wherein M and N are the two selected edge points, assumed to be the end points of the major axis of the ellipse, A(x, y) is the ellipse center coordinate, δ is the deflection angle of the major axis, and a is the length of the semi-major axis;
calculating, based on the basic parameters, the semi-minor axis length b corresponding to each of the other edge points in the array;
counting the occurrences of the different b values; when the count of a certain b value exceeds a preset threshold, judging that the two selected edge points are indeed the end points of the major axis of the ellipse and obtaining the parameters MPNQ{A(x, y), δ, a, b}; otherwise, selecting another two edge points from the array and recalculating;
wherein M and N are the two end points of the major axis of the ellipse, P and Q are the two end points of the minor axis of the ellipse, A(x, y) is the ellipse center coordinate, δ is the deflection angle of the major axis, a is the length of the semi-major axis, and b is the length of the semi-minor axis.
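
Claim 8 is, in essence, a voting scheme over candidate major-axis endpoint pairs, closely resembling randomized ellipse detection by parameter accumulation. A minimal sketch follows; the vote threshold, bin width, minimum pair length and pair-sampling limit are assumptions, not values from the patent.

```python
# Assumed sketch of the b-value accumulation in claim 8.
import math
import random
from collections import Counter
from itertools import combinations

def find_ellipse(points, min_votes=50, bin_size=1.0, max_pairs=2000):
    """points: iterable of (x, y) edge coordinates. Returns (cx, cy, delta, a, b) or None."""
    pts = list(points)
    pairs = list(combinations(range(len(pts)), 2))
    random.shuffle(pairs)
    for i, j in pairs[:max_pairs]:
        (x1, y1), (x2, y2) = pts[i], pts[j]
        a = 0.5 * math.hypot(x2 - x1, y2 - y1)        # candidate semi-major axis
        if a < 5:                                     # skip very close point pairs (assumed)
            continue
        cx, cy = 0.5 * (x1 + x2), 0.5 * (y1 + y2)     # candidate center A(x, y)
        delta = math.atan2(y2 - y1, x2 - x1)          # candidate major-axis deflection angle
        votes = Counter()
        for k, (x, y) in enumerate(pts):
            if k in (i, j):
                continue
            d = math.hypot(x - cx, y - cy)            # distance from center to third point
            if d >= a or d < 1e-6:
                continue
            f = math.hypot(x - x2, y - y2)            # distance to one major-axis endpoint
            cos_tau = max(-1.0, min(1.0, (a * a + d * d - f * f) / (2.0 * a * d)))
            denom = a * a - d * d * cos_tau * cos_tau
            if denom <= 1e-9:
                continue
            b2 = a * a * d * d * (1.0 - cos_tau * cos_tau) / denom
            if b2 > 0:
                votes[round(math.sqrt(b2) / bin_size)] += 1   # accumulate the b value
        if votes:
            b_bin, count = votes.most_common(1)[0]
            if count >= min_votes:                    # threshold test of claim 8
                return cx, cy, delta, a, b_bin * bin_size
    return None
```

The function would be called on the edge-point array produced in claim 7; in practice the pair sampling is usually restricted to widely separated points so that the major-axis estimate is stable.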
9. The on-machine detection method according to claim 7, wherein generating the full-focus surface topography depth map of the feature to be detected from the second image sequence based on focus topography recovery specifically comprises:
calculating the focused pixel region of each image in the second image sequence based on a focus evaluation function;
and stitching the focused pixel regions to form the full-focus surface topography depth map;
and wherein obtaining the theoretical model of the workpiece, comparing the theoretical model with the full-focus surface topography depth map, and generating the manufacturing error information of the workpiece specifically comprises:
obtaining the theoretical model of the workpiece;
and extracting point cloud data from the full-focus surface topography depth map, and calculating the manufacturing error information by differencing the point cloud data with the corresponding points in the theoretical model.
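
Below is a minimal shape-from-focus sketch for claim 9. The patent does not name a particular evaluation function; a local average of the squared Laplacian stands in for it here (an assumption), and the theoretical model is assumed to have already been sampled onto the same pixel grid as the measured depth map.

```python
# Assumed sketch: focus-measure depth map (shape from focus) plus a simple
# error comparison. Focus measure, window size and the pre-sampled
# theoretical depth grid are assumptions, not patent details.
import numpy as np
import cv2

def depth_from_focus(image_stack, z_positions, window=9):
    """image_stack: grayscale frames captured at equal steps along the feature axis;
    z_positions: corresponding axial positions. Returns an (H, W) depth map."""
    focus_measures = []
    for img in image_stack:
        lap = cv2.Laplacian(img.astype(np.float32), cv2.CV_32F)
        # Local average of the squared Laplacian as the focus evaluation value.
        focus_measures.append(cv2.boxFilter(lap * lap, -1, (window, window)))
    stack = np.stack(focus_measures, axis=0)      # shape (N, H, W)
    sharpest = np.argmax(stack, axis=0)           # per-pixel index of best-focused frame
    return np.asarray(z_positions, dtype=np.float32)[sharpest]

def manufacturing_error(depth_map, theoretical_depth):
    """Difference the measured depth map against the theoretical model (same grid)."""
    error = depth_map - theoretical_depth
    return error, {"mean": float(error.mean()),
                   "max_abs": float(np.abs(error).max())}
```

The per-pixel argmax realises the "focused pixel region" selection of claim 9: each pixel keeps the axial position of the frame in which it is sharpest, and stitching those selections yields the full-focus depth map that is then differenced against the theoretical model.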
CN202110474521.0A 2021-04-29 2021-04-29 On-machine detection device and method for machining center Active CN113510536B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110474521.0A CN113510536B (en) 2021-04-29 2021-04-29 On-machine detection device and method for machining center

Publications (2)

Publication Number Publication Date
CN113510536A (en) 2021-10-19
CN113510536B (en) 2022-07-29

Family

ID=78063617

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110474521.0A Active CN113510536B (en) 2021-04-29 2021-04-29 On-machine detection device and method for machining center

Country Status (1)

Country Link
CN (1) CN113510536B (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102353346A (en) * 2011-06-23 2012-02-15 东华大学 Method and system for detection of verticality of CCD installation of laser cutting machine with automatically edge searching performance
CN102607467A (en) * 2012-03-07 2012-07-25 上海交通大学 Device and method for detecting elevator guide rail perpendicularity based on visual measurement
CN103323229A (en) * 2013-07-08 2013-09-25 重庆工业职业技术学院 Rotation axis error detection method of five-axis numerical control machine tool based on machine vision
CN103801989A (en) * 2014-03-10 2014-05-21 太原理工大学 Airborne automatic measurement system for determining origin of coordinates of workpiece according to image processing
CN109612390A (en) * 2018-12-17 2019-04-12 江南大学 Large-size workpiece automatic measuring system based on machine vision
CN208780144U (en) * 2018-09-17 2019-04-23 苏州金迈驰航空智能科技有限公司 A kind of online vision detection system of connecting hole
CN110390696A (en) * 2019-07-03 2019-10-29 浙江大学 A kind of circular hole pose visible detection method based on image super-resolution rebuilding
CN110487183A (en) * 2019-08-27 2019-11-22 中国科学技术大学 A kind of multiple target fiber position accurate detection system and application method
CN111531407A (en) * 2020-05-08 2020-08-14 太原理工大学 Workpiece attitude rapid measurement method based on image processing

Also Published As

Publication number Publication date
CN113510536A (en) 2021-10-19

Similar Documents

Publication Publication Date Title
JP5709851B2 (en) Image measuring probe and operation method
CN110084854B (en) System and method for runtime determination of camera calibration errors
CN108562233B (en) Utilize the axis part diameter size On-line Measuring Method of conic section invariant
CN100501312C (en) Gem tri-dimensional cut detection device based on machine vision
CN109540040B (en) Active vision detection system and method based on unconstrained concentric beam family automobile morphology
CN109118529A (en) A vision-based quick positioning method for screw hole images
JP2018522240A (en) Method for measuring artifacts
JP4837538B2 (en) End position measuring method and dimension measuring method
Su et al. Measuring wear of the grinding wheel using machine vision
CN113510536B (en) On-machine detection device and method for machining center
CN112161598B (en) Detection method and detection device of detection equipment
JP2007303994A (en) Visual inspecting device and method
JPH09329418A (en) Calibrating method for camera
CN116393982B (en) Screw locking method and device based on machine vision
CN114963981B (en) Cylindrical part butt joint non-contact measurement method based on monocular vision
CN110657750B (en) Detection system and method for passivation of cutting edge of cutter
CN111475016A (en) Assembly process geometric parameter self-adaptive measurement system and method based on computer vision
CN111429453A (en) Cylinder product appearance on-line measuring system
JP5273563B2 (en) Tool position measuring method and apparatus
Mohamed et al. Non-contact approach to roundness measurement
Percoco et al. 3D image based modelling for inspection of objects with micro-features, using inaccurate calibration patterns: an experimental contribution
CN117146727B (en) Tower tube welding seam monitoring method and system based on machine vision
CN111699377B (en) Detection device and detection method
CN212112569U (en) Cylinder product appearance on-line measuring system
Wischow et al. Calibration and validation of a stereo camera system augmented with a long-wave infrared module to monitor ultrasonic welding of thermoplastics

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant