CN114963981B - Cylindrical part butt joint non-contact measurement method based on monocular vision - Google Patents
- Publication number
- CN114963981B (granted publication of application CN202210527443.0A)
- Authority
- CN
- China
- Prior art keywords
- coordinate system
- image
- camera
- edge
- pixel
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/002—Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02P—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
- Y02P90/00—Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
- Y02P90/30—Computing systems specially adapted for manufacturing
Landscapes
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Length Measuring Devices By Optical Means (AREA)
Abstract
A cylindrical part butt joint non-contact measurement method based on monocular vision comprises the following steps: 1. selecting two cameras of the same manufacturer and model, and calibrating and registering them; 2. photographing the end faces of the fixed part and the butt joint part; 3. denoising the captured pictures; 4. extracting the processing region where the end-face features lie by threshold segmentation, extracting the hole edges with an edge detection algorithm, determining the hole-center positions in the image by screening and ellipse fitting, mapping the hole centers and axes of the two parts into the same coordinate system by coordinate transformation, and calculating the roll angle of the butt joint part relative to the fixed part. The invention is highly automated and fast: it performs the measurement automatically, without placing targets on the part end faces and without manual intervention, improving working efficiency.
Description
Technical Field
The invention relates to the field of intelligent assembly, in particular to docking technology for large cylindrical parts, and specifically to a monocular-vision-based non-contact measurement method for cylindrical part butt joints.
Background
With the continuous development of science and technology, market competition is growing ever fiercer. Rapid, efficient, and reliable production has become a main direction of industrial development. To reach these goals, every industry faces the problems of raising production efficiency, improving product quality, and cutting production cost. In the production of certain large cylindrical parts, high-quality automated docking measurement equipment is particularly important: measuring the spatial attitude, centering, and positioning of the parts quickly and accurately plays a key role in shortening assembly time and improving docking efficiency.
Existing attitude measurement falls into two main types: contact measurement and non-contact measurement. Non-contact measurement mainly uses vision measurement and laser scanning. Contact measurement mainly consists of a three-coordinate mechanism and a probe; the probe must touch the part under test to take a measurement, and to preserve accuracy and avoid damaging the instrument, contact must be made slowly. In addition, because the parts to be measured are spatially large, many points must be probed and the measurement time is long, which seriously limits production efficiency. Laser scanning must sweep the whole surface of the workpiece, generate a point cloud, and process a large amount of model data, so it also takes long. Vision measurement, by contrast, has a simple structure, is easy to move and operate, acquires data quickly, and costs less; it is especially suitable for measuring point positions, dimensions, or contours of large workpieces in three-dimensional space. Vision measurement divides into monocular and binocular vision; compared with a binocular system, a monocular system is simpler in structure, does not require stereo calibration between a camera pair, and is easier to install and use, so a monocular vision system is adopted here for the measurement.
Meanwhile, the non-contact approach avoids damaging the measured object and suits situations where the object cannot be touched, such as high temperature, high pressure, fluids, or hazardous environments. A machine vision system can also measure several dimensions at once, finishing the measurement quickly. Measuring small dimensions is a particular strength of machine vision: a high-magnification lens can enlarge the measured object so that the measurement precision reaches the micrometer level or better.
Disclosure of Invention
The invention aims to solve the problems that measuring existing cylindrical parts takes long and produces a large amount of data to process, which lengthens the assembly period and affects production efficiency, and provides a cylindrical part butt joint non-contact measurement method based on monocular vision.
The technical scheme of the invention is as follows:
a cylindrical part butt joint non-contact measurement method based on monocular vision is characterized by comprising the following steps:
Step 1: Two cameras of the same manufacturer and model are selected and installed as shown in fig. 1: one camera photographs the end face of the fixed part, the other photographs the end face of the butt joint part. The two cameras are then calibrated and registered, with the following specific steps:
Step 1.1: The world coordinate system, camera coordinate system, image coordinate system, and pixel coordinate system are established; the positional relationship of the four coordinate systems is shown in fig. 2. Taking the camera optical axis as the Z axis, the camera coordinate system (X_c, Y_c, Z_c) is established according to the right-hand rule; taking the center of the picture as the origin, the image coordinate system (x, y) shown in fig. 2 is established; taking the top-left pixel of the picture as the origin, the pixel coordinate system (u, v) shown in fig. 2 is established; and the camera coordinate system of the first picture taken during camera calibration is taken as the world coordinate system (X_w, Y_w, Z_w).
Step 1.2: Using a checkerboard of standard size as the calibration object, several pictures are taken from different directions and input into the MATLAB monocular camera calibration model. This yields the transformation matrix from the camera coordinate system to the world coordinate system (the camera extrinsics), the camera distortion model, and the intrinsics, comprising the camera focal length, the width and height of a single pixel, and the coordinates of the picture center point in the image coordinate system; the mathematical models of the two cameras are thereby computed.
Step 1.3: The two cameras are placed symmetrically along the registration frame so that, at the chosen distance between camera and part end face, the angle between the camera axis and the photographed end face is 45 degrees. The fully mated parts are separated again so that the butt joint end face and the fixed end face are equidistant from the midpoint of the fixing frame, and the cameras are registered and calibrated; the system model structure is shown in fig. 1.
Step 2: The end faces of the fixed part and the butt joint part are photographed and measured.
Step 3: The measurement pictures obtained in step 2 are preprocessed in combination with the relevant constraints, with the following specific steps:
Step 3.1: The measurement picture is re-projected: using the camera intrinsics and extrinsics obtained in step 1, the picture taken obliquely to the end face is converted into a picture taken perpendicular to the end face.
Step 3.2: the photographed picture is subjected to logarithmic transformation, a low gray value with a narrow range in a source image is mapped to a gray interval with a wide range, and a high gray value interval with a wide range is mapped to a narrow gray interval, so that the value of a dark pixel is expanded, the value of a high gray is compressed, and the low gray detail in the image is enhanced.
Step 3.3: the value of a point in the photo is replaced by the median value of the values of each point in a neighborhood of the point, so that the problems of salt and pepper noise in the image and the point without gray value existing after re-projection are solved.
Step 4: threshold segmentation and edge detection are carried out on the preprocessed picture, and the positions of the hole center and the axis of the part are extracted, wherein the specific steps are as follows:
Step 4.1: According to the end-face docking features, the gray threshold for segmentation is set to 50; independent connected regions are extracted by threshold segmentation, and from the selected regions the region where the holes lie is further screened by roundness and area features. The region is then eroded and dilated respectively, and combining the two results yields the region where the hole edge lies.
Step 4.2: and (3) obtaining intersection of the region obtained in the previous step and the original image, selecting the image containing the hole edge from the original image, and further reducing the operation region of the image.
Step 4.3: and extracting the hole edge by using an edge detection algorithm, screening according to the shape, performing ellipse fitting on the screening result, determining the positions of the hole center and the part axis in the image, and calculating the positions of the measured hole center and the part axis in a corresponding camera real coordinate system through a camera model.
Step 4.4: and mapping the centers and the axes of the fixed part and the butt joint part to the same world coordinate system, and calculating the deflection angle to be adjusted.
The beneficial effects of the invention are as follows:
The invention avoids damage to the measured object and suits situations where the object cannot be touched, such as high temperature, high pressure, fluids, or hazardous environments. The machine vision system can measure several dimensions at once, finishing the measurement quickly. Measuring small dimensions is a particular strength of machine vision: a high-magnification lens can enlarge the measured object so that the measurement precision reaches the micrometer level or better.
The invention provides a simplified, automated method for measuring part attitude; the measured quantity is the relative rotation angle between the parts. The method is highly automated and fast: it measures automatically, without placing targets on the part end faces and without manual intervention, improving working efficiency.
Drawings
Fig. 1 is a diagram showing the structure of a system model according to the present invention.
Fig. 2 is a schematic diagram of the positional relationship of four coordinate systems employed in the present invention.
Fig. 3 is a schematic view of a non-contact measuring device used in the present invention.
Fig. 4 is a flow chart of a non-contact measurement process of the present invention.
Detailed Description
In order to make the technical scheme and implementation steps of the invention more clear, the invention is further described below with reference to the accompanying drawings and examples.
As shown in fig. 1-4.
A cylindrical part butt joint non-contact measurement method based on monocular vision is used for extracting positioning holes and part axes on a butt joint part and a fixed part, so that relative rotation angle measurement between the two parts is realized.
The invention provides a non-contact measuring method for part butt joint, which adopts a measuring system shown in figure 3 and comprises a non-contact measuring device, a first communication module, a second communication module and an industrial personal computer system.
Fig. 4 is a flowchart of a non-contact measurement process according to the present invention, specifically including the steps of:
Step 1: Two cameras of the same manufacturer and model are selected and installed in the structure shown in fig. 1: camera A photographs the end face of the fixed part and camera B photographs the end face of the butt joint part. The two cameras are then calibrated and registered; the system model structure is shown in fig. 1, and the specific steps are as follows:
Step 1.1: The world coordinate system, camera coordinate system, image coordinate system, and pixel coordinate system are established; the positional relationship of the four coordinate systems is shown in fig. 2. Taking the camera optical axis as the Z axis, the camera coordinate system (X_c, Y_c, Z_c) is established according to the right-hand rule; taking the center of the picture as the origin, the image coordinate system (x, y) shown in fig. 2 is established; taking the top-left pixel of the picture as the origin, the pixel coordinate system (u, v) shown in fig. 2 is established; and the camera coordinate system of the first picture taken during camera calibration is taken as the world coordinate system (X_w, Y_w, Z_w).
Step 1.2: Using a checkerboard of standard size as the calibration object, several pictures of it are taken from different directions and input into the MATLAB monocular camera calibration model. This yields the camera distortion model and the transformation matrix from the camera coordinate system to the world coordinate system, and the intrinsics are computed, comprising the camera focal length, the width and height of a single pixel, and the coordinates of the picture center point in the image coordinate system; the mathematical models of the two cameras are thereby obtained.
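The patent performs this step with the MATLAB calibration toolbox. As an illustrative sketch only (not the patented implementation), the pinhole model that such a calibration produces, mapping a world point through the extrinsics (R, t) and the intrinsics K to pixel coordinates, can be written directly; the numeric values of K, R, and t below are made-up examples:

```python
import numpy as np

def project(K, R, t, Xw):
    """Pinhole camera model: world point -> pixel coordinates."""
    Xc = R @ Xw + t                      # extrinsics: world -> camera frame
    x, y = Xc[0] / Xc[2], Xc[1] / Xc[2]  # perspective division -> image plane
    u = K[0, 0] * x + K[0, 2]            # fx * x + cx
    v = K[1, 1] * y + K[1, 2]            # fy * y + cy
    return np.array([u, v])

# Made-up intrinsics and extrinsics, for illustration only.
K = np.array([[1000.0, 0.0, 640.0],
              [0.0, 1000.0, 480.0],
              [0.0, 0.0, 1.0]])
R, t = np.eye(3), np.zeros(3)
uv = project(K, R, t, np.array([0.1, -0.2, 2.0]))  # a point 2 m ahead of the camera
```

With these example numbers the point lands at pixel (690, 380): offset from the principal point (640, 480) by f·X/Z and f·Y/Z.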
Step 1.3: The two cameras are placed symmetrically along the fixing frame so that, at the chosen distance between camera and part end face, the angle between the camera axis and the photographed end face is 45 degrees. The fully mated parts are separated again so that the butt joint end face and the fixed end face are equidistant from the midpoint of the fixing frame. The positioning holes on the end faces are photographed, and the hole-center coordinates extracted from the fixed and butt joint end faces are P_ak = (x_k, y_k) and P_bk = (x_k, y_k), k = 1, 2, 3, ..., N, giving two sets of registration point pairs. Taking the camera of the fixed part as the reference camera, the two point sets are assumed to be related by a transformation matrix, namely: the transfer matrix between the world coordinate systems of the fixed part and the butt joint part in physical space is calculated.
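The transfer matrix above is recovered from the matched hole-center pairs. A standard way to fit a rigid transform to such point correspondences is the SVD-based least-squares (Kabsch) solution; this is a generic sketch, not necessarily the solver used by the patent:

```python
import numpy as np

def rigid_registration(P, Q):
    """Least-squares rigid transform (R, t) with Q ~ R @ P + t, via SVD (Kabsch).

    P, Q: (N, 2) arrays of matched points in the two coordinate systems.
    """
    cp, cq = P.mean(axis=0), Q.mean(axis=0)      # centroids
    H = (P - cp).T @ (Q - cq)                    # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                     # guard against a reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = cq - R @ cp
    return R, t
```

Applying the returned (R, t) to the fixed-part points maps them into the butt joint part's frame, which is exactly the role of the transfer matrix in this step.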
Step 2: The end faces of the fixed part and the butt joint part are photographed and measured.
Step 3: The measurement pictures obtained in step 2 are preprocessed in combination with the relevant constraints, with the following specific steps:
Step 3.1: The measurement picture is re-projected: using the camera intrinsics and extrinsics obtained in step 1, the picture taken obliquely to the end face is converted into a picture taken perpendicular to the end face.
Step 3.2: the logarithmic transformation s=clog (1+r) is carried out on the photographed picture, c is a constant, a low gray value with a narrow range in a source image is mapped to a gray interval with a wide range, and a high gray value interval with a wide range is mapped to a narrow gray interval, so that the value of a dark pixel is expanded, the value of a high gray is compressed, and the low gray detail in the image is enhanced.
Step 3.3: the value of a point in the photo is replaced by the median value of the values of each point in a neighborhood of the point, so that the problems of salt and pepper noise in the image and the point without gray value existing after re-projection are solved.
Step 4: threshold segmentation and edge detection are carried out on the preprocessed picture, and the positions of the hole center and the axis of the part are extracted, wherein the specific steps are as follows:
Step 4.1: A gray threshold suitable for the current conditions is set; independent connected regions are extracted by threshold segmentation, and from the selected regions the region where the holes lie is further screened by roundness and area features. The region is eroded and dilated respectively, and combining the two results yields the region where the hole edge lies. That region is intersected with the original image to select the part containing the hole edge, further shrinking the region to be processed.
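This step can be sketched with elementary 3x3 binary morphology; the threshold value and the exact combination of erosion and dilation (here: dilated region minus eroded region) are assumptions, since the patent only states that the eroded and dilated regions are combined to isolate the edge band:

```python
import numpy as np

def dilate(m):
    # 3x3 binary dilation: OR of the mask shifted over its 8-neighborhood.
    p = np.pad(m, 1, constant_values=False)
    out = np.zeros_like(m)
    for di in (0, 1, 2):
        for dj in (0, 1, 2):
            out |= p[di:di + m.shape[0], dj:dj + m.shape[1]]
    return out

def erode(m):
    # 3x3 binary erosion: AND over the 8-neighborhood (border treated as inside).
    p = np.pad(m, 1, constant_values=True)
    out = np.ones_like(m)
    for di in (0, 1, 2):
        for dj in (0, 1, 2):
            out &= p[di:di + m.shape[0], dj:dj + m.shape[1]]
    return out

def hole_edge_band(img, thresh=50):
    # Threshold, then keep the dilated region minus the eroded region:
    # the thin band that remains is where the hole edge lies.
    mask = img > thresh
    return dilate(mask) & ~erode(mask)
```

On a bright square patch this returns only the one-to-two-pixel ring around the patch boundary, which is the reduced search region for the edge detector.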
Step 4.2: the result of the previous step is gaussian filtered to smooth the image, and for a pixel at one position (m, n),the gray value (only binary image is considered here) is f (m, n). The gaussian filtered gray value will become:and calculating the gradient value and gradient direction of the filtered edge, wherein the edge is the set of pixel points with larger gray value change. In an image, the degree and direction of change of a gradation value are expressed by gradients, and the gradient value and gradient direction are calculated by the following formula:
During Gaussian filtering the edges may be widened, so points that are not edges are filtered out by a rule that thins the edges to a width of one pixel where possible: if a pixel belongs to an edge, its gradient value is the maximum along the gradient direction; otherwise it is not an edge and its gray value is set to 0. Edges are then detected with an upper and a lower threshold: every pixel above the upper threshold is an edge, and every pixel below the lower threshold is a non-edge. A pixel between the two thresholds is an edge if it is adjacent to a pixel already determined to be an edge; otherwise it is a non-edge.
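The gradient magnitude and direction formulas can be realized, for instance, with 3x3 Sobel kernels (an assumption, since the patent does not name the derivative operator); the thresholding described above then classifies pixels by these magnitudes:

```python
import numpy as np

def sobel_gradients(img):
    """Gradient magnitude and direction via 3x3 Sobel kernels (correlation form)."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], float)  # horizontal derivative
    ky = kx.T                                                    # vertical derivative
    p = np.pad(img.astype(float), 1, mode='edge')
    h, w = img.shape
    gx = np.zeros((h, w))
    gy = np.zeros((h, w))
    for i in range(3):
        for j in range(3):
            win = p[i:i + h, j:j + w]   # shifted window for kernel tap (i, j)
            gx += kx[i, j] * win
            gy += ky[i, j] * win
    return np.hypot(gx, gy), np.arctan2(gy, gx)
```

On a vertical step edge the magnitude is zero in the flat regions and large at the step, with the gradient direction pointing horizontally across the edge.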
Step 4.3: and screening the result by the shape, carrying out ellipse fitting by using a least square method, determining the positions of the center of the hole and the axis of the part in the image, and calculating the positions of the center of the measured hole and the axis of the part in a corresponding camera real coordinate system by using a camera model.
Step 4.4: mapping the centers and axes of the fixed part and the butt joint part to the same world coordinate system, wherein the axis coordinate is H ab0 =(x 0 ,y 0 ) The hole center coordinate set is H a =(x m ,y m ),H b =(x m ,y m ),m=1,2,3……NAccording to the straight line determined by the axes of the two parts and the hole center under the same coordinate system, the three-dimensional function is adopted:
the minimum deflection angle α to be adjusted is calculated.
Parts of the invention not described in detail are the same as the prior art or can be implemented using the prior art.
Claims (2)
1. A cylindrical part butt joint non-contact measurement method based on monocular vision is characterized by comprising the following steps: selecting two cameras of the same manufacturer and the same model, and calibrating and registering the cameras, wherein the steps of calibrating and registering the cameras are as follows:
step 1.1: four coordinate systems are established: the world coordinate system, the camera coordinate system, the image coordinate system, and the pixel coordinate system; taking the camera optical axis as the Z axis, the camera coordinate system (X_c, Y_c, Z_c) is established according to the right-hand rule; taking the center of the photograph as the origin, the image coordinate system (x, y) is established; taking the top-left pixel of the photograph as the origin, the pixel coordinate system (u, v) is established; and the camera coordinate system of the first photograph taken during camera calibration is taken as the world coordinate system (X_w, Y_w, Z_w);
Step 1.2: taking a plurality of pictures from different directions for a calibration object by using a standard-sized chess checkerboard as the calibration object, inputting the pictures into a MATLAB monocular camera calibration model, obtaining a conversion matrix from a camera coordinate system to a world coordinate system, namely an external reference of a camera, a distortion model of the camera, an internal reference comprising a camera focal length, the width and height of a single pixel and an image, and the coordinates of a picture center point under the image coordinate system, and calculating mathematical models of two groups of cameras;
step 1.3: the two cameras are placed symmetrically along the registration frame so that, at the chosen distance between camera and part end face, the angle between the camera axis and the photographed end face is 45 degrees; the fully mated parts are separated again so that the butt joint end face and the fixed end face are equidistant from the midpoint of the frame, and the cameras are registered and calibrated;
secondly, photographing and measuring the end surfaces of the fixed part and the butt joint part, and preprocessing the obtained measurement picture, wherein the specific steps are as follows:
step 2.1: re-projecting the measured picture, and converting the picture obliquely photographed on the end face into a picture perpendicularly photographed on the end face by applying the camera internal parameters and external parameters obtained in the step 1;
step 2.2: carrying out logarithmic transformation on the photographed picture, mapping a low gray value with a narrower range in a source image to a gray interval with a wider range, and mapping a high gray value interval with a wider range to a narrower gray interval at the same time, so as to expand the value of a dark pixel, compress the value of a high gray and strengthen the details of the low gray in the image;
step 2.3: replacing the value of a point in the photo with the median value of the values of each point in a neighborhood of the point, and solving the problems of salt and pepper noise in the image and the point without gray value after re-projection;
step three, denoising the shot picture;
extracting a processing area where the end face features are located through threshold segmentation, extracting hole edges through an edge detection algorithm, determining the position of a hole center in an image through screening and ellipse fitting, mapping hole centers and axes on two parts to the same coordinate system through coordinate transformation, and calculating the rolling angle of the butt joint part relative to the fixed part;
threshold segmentation and edge detection are carried out on the preprocessed picture, and the positions of the hole center and the axis of the part are extracted, wherein the specific steps are as follows:
step 3.1: according to the end-face docking features, the gray threshold for segmentation is set to 50; independent connected regions are extracted by threshold segmentation, and from the selected regions the region where the holes lie is further screened by roundness and area features; the region is eroded and dilated respectively, and combining the two results yields the region where the hole edge lies;
step 3.2: intersection of the region obtained in the previous step and the original image is obtained, the image containing the hole edge is selected from the original image, and the operation region of the image is further reduced;
step 3.3: extracting hole edges by using an edge detection algorithm, screening according to the shape, carrying out ellipse fitting on screening results, determining positions of a hole center and a part axis in an image, and calculating through a camera model to obtain positions of a measured hole center and the part axis in a corresponding camera real coordinate system;
step 3.4: mapping the center of holes and the axes of the fixed part and the butt joint part to the same world coordinate system, and calculating a deflection angle to be adjusted;
smoothing the image by Gaussian filtering, wherein the gray value of the pixel point at position (m, n) is f(m, n); the Gaussian-filtered gray value becomes g(m, n) = Σ_(i,j) w(i, j) · f(m + i, n + j), where w(i, j) is the Gaussian kernel; the gradient value and gradient direction of the filtered image are calculated, wherein an edge is the set of pixel points whose gray value changes sharply; in an image, the degree and direction of change of the gray value are expressed by the gradient, computed as G = sqrt(G_x^2 + G_y^2) and θ = arctan(G_y / G_x), where G_x and G_y are the gray-value derivatives in the horizontal and vertical directions;
in the Gaussian filtering process, points that are not edges are filtered out by a rule that thins the edges to a width of one pixel where possible: if a pixel belongs to an edge, its gradient value is the maximum along the gradient direction; otherwise it is not an edge and its gray value is set to 0; edges are detected using an upper and a lower threshold, wherein every pixel above the upper threshold is detected as an edge and every pixel below the lower threshold is detected as a non-edge; a pixel between the thresholds is an edge if it is adjacent to a pixel determined to be an edge; otherwise it is a non-edge.
2. The method according to claim 1, characterized in that: in step 3.4, the hole centers and axes of the fixed part and the butt joint part are mapped into the same world coordinate system; the axis coordinate is H_ab0 = (x_0, y_0) and the hole-center coordinate sets are H_a = (x_m, y_m) and H_b = (x_m, y_m), m = 1, 2, 3, ..., N; from the straight lines determined by the part axis and each hole center in the common coordinate system, the trigonometric function θ_m = arctan((y_m − y_0)/(x_m − x_0)) gives the angle of each line, and the minimum deflection angle α to be adjusted is calculated.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210527443.0A CN114963981B (en) | 2022-05-16 | 2022-05-16 | Cylindrical part butt joint non-contact measurement method based on monocular vision |
Publications (2)
Publication Number | Publication Date |
---|---|
CN114963981A CN114963981A (en) | 2022-08-30 |
CN114963981B true CN114963981B (en) | 2023-08-15 |
Family
ID=82970874
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210527443.0A Active CN114963981B (en) | 2022-05-16 | 2022-05-16 | Cylindrical part butt joint non-contact measurement method based on monocular vision |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114963981B (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116140987A (en) * | 2023-04-17 | 2023-05-23 | 广东施泰德测控与自动化设备有限公司 | Visual quick docking device and docking method for axle test board |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108562274A (en) * | 2018-04-20 | 2018-09-21 | 南京邮电大学 | A kind of noncooperative target pose measuring method based on marker |
CN109190628A (en) * | 2018-08-15 | 2019-01-11 | 东北大学 | A kind of plate camber detection method based on machine vision |
CN110146038A (en) * | 2019-06-08 | 2019-08-20 | 西安电子科技大学 | The distributed monocular camera laser measuring device for measuring and method of cylindrical member assembly corner |
CN112362034A (en) * | 2020-11-11 | 2021-02-12 | 上海电器科学研究所(集团)有限公司 | Solid engine multi-cylinder section butt joint guiding measurement algorithm based on binocular vision |
CN112686920A (en) * | 2020-12-31 | 2021-04-20 | 天津理工大学 | Visual measurement method and system for geometric dimension parameters of circular part |
CN113295171A (en) * | 2021-05-19 | 2021-08-24 | 北京航空航天大学 | Monocular vision-based attitude estimation method for rotating rigid body spacecraft |
WO2021208231A1 (en) * | 2020-04-15 | 2021-10-21 | 上海工程技术大学 | Gap measuring system and measuring method |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111243032B (en) * | 2020-01-10 | 2023-05-12 | 大连理工大学 | Full-automatic detection method for checkerboard corner points |
Non-Patent Citations (1)
Title |
---|
Design of a high-precision measurement and assembly system based on machine vision; Jiao Liang et al.; Computer Measurement & Control; 2016-07-25 (No. 07); full text *
Also Published As
Publication number | Publication date |
---|---|
CN114963981A (en) | 2022-08-30 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110689579B (en) | Rapid monocular vision pose measurement method and measurement system based on cooperative target | |
CN110276808B (en) | Method for measuring unevenness of glass plate by combining single camera with two-dimensional code | |
CN109612390B (en) | Large-size workpiece automatic measuring system based on machine vision | |
CN110146038B (en) | Distributed monocular camera laser measuring device and method for assembly corner of cylindrical part | |
CN109190628A (en) | A kind of plate camber detection method based on machine vision | |
CN108007388A (en) | A kind of turntable angle high precision online measuring method based on machine vision | |
CN109579695B (en) | Part measuring method based on heterogeneous stereoscopic vision | |
CN110298853B (en) | Visual inspection method for surface difference | |
CN113324478A (en) | Center extraction method of line structured light and three-dimensional measurement method of forge piece | |
CN110260818B (en) | Electronic connector robust detection method based on binocular vision | |
CN115096206B (en) | High-precision part size measurement method based on machine vision | |
CN111402330B (en) | Laser line key point extraction method based on planar target | |
CN112381847A (en) | Pipeline end head space pose measuring method and system | |
CN112729112B (en) | Engine cylinder bore diameter and hole site detection method based on robot vision | |
CN114963981B (en) | Cylindrical part butt joint non-contact measurement method based on monocular vision | |
CN108694713B (en) | Stereo vision based satellite-rocket docking ring local ring segment identification and measurement method | |
CN113607058B (en) | Straight blade size detection method and system based on machine vision | |
Guo et al. | A V-shaped weld seam measuring system for large workpieces based on image recognition | |
CN113418927A (en) | Automobile mold visual detection system and detection method based on line structured light | |
CN116596987A (en) | Workpiece three-dimensional size high-precision measurement method based on binocular vision | |
CN114549659A (en) | Camera calibration method based on quasi-three-dimensional target | |
CN109308706A (en) | A method of three-dimension curved surface area is obtained by image procossing | |
CN114693801B (en) | Calibration plate, calibration method and calibration system | |
CN113510536B (en) | On-machine detection device and method for machining center | |
CN118470099B (en) | Object space pose measurement method and device based on monocular camera |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||