CN114963981A - Monocular vision-based cylindrical part butt joint non-contact measurement method - Google Patents

Monocular vision-based cylindrical part butt joint non-contact measurement method

Info

Publication number
CN114963981A
Authority
CN
China
Prior art keywords
coordinate system
camera
image
hole
butt joint
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202210527443.0A
Other languages
Chinese (zh)
Other versions
CN114963981B (en)
Inventor
薛善良
郑祖闯
岳松
张明
陈琪玮
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing University of Aeronautics and Astronautics
Original Assignee
Nanjing University of Aeronautics and Astronautics
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing University of Aeronautics and Astronautics filed Critical Nanjing University of Aeronautics and Astronautics
Priority to CN202210527443.0A priority Critical patent/CN114963981B/en
Publication of CN114963981A publication Critical patent/CN114963981A/en
Application granted granted Critical
Publication of CN114963981B publication Critical patent/CN114963981B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01B - MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 - Measuring arrangements characterised by the use of optical techniques
    • G01B11/002 - Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P - CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00 - Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/30 - Computing systems specially adapted for manufacturing

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

A monocular vision-based non-contact measurement method for cylindrical part butt joints comprises the following steps: 1. selecting two cameras of the same manufacturer and model, then calibrating and registering them; 2. photographing the end faces of the fixed part and the butt joint part; 3. denoising the captured pictures; 4. extracting the processing region containing the end-face features by threshold segmentation, extracting the hole edges with an edge detection algorithm, locating the hole centers in the image by screening and ellipse fitting, mapping the hole centers and axes of the two parts into a common coordinate system by coordinate transformation, and calculating the roll angle of the butt joint part relative to the fixed part. The method offers a high degree of automation and a high measuring speed: automatic measurement is realized without arranging a target on the part end face and without manual participation, improving working efficiency.

Description

Monocular vision-based cylindrical part butt joint non-contact measurement method
Technical Field
The invention relates to the field of intelligent assembly, and in particular to the butt joint of large-size cylindrical parts. It specifically provides a monocular vision-based non-contact measurement method for cylindrical part butt joints, which measures the relative rotation angle between a butt joint part and a fixed part, given that their coaxiality already meets the butt joint requirement, so that the two parts can be butted together.
Background
With the continuous development of scientific technology, the market competition is intensified day by day. Fast, efficient, reliable production has become a major direction and characteristic of all industrial developments today. In order to achieve these goals, various industries face the problems of improving production efficiency, improving product quality and reducing production cost. In the production of certain large-size cylindrical parts, high-quality automatic butt-joint measuring equipment is particularly important. The space attitude, centering and positioning of the parts can be measured quickly and accurately, and the method plays an important role in shortening the assembly time and improving the butt joint efficiency.
Existing attitude measurement falls into contact and non-contact methods; non-contact measurement mainly uses vision measurement or laser scanning. Contact measurement typically relies on a three-coordinate mechanism carrying a probe, and a measurement is obtained only when the probe touches the part to be measured. To guarantee accuracy and prevent collision damage to the instrument, the probe must approach the measured part slowly. Moreover, because large parts have many spatial dimensions, many points must be probed, so measurement takes a long time and production efficiency suffers severely. Laser scanning must sweep the whole surface of the workpiece to generate a point cloud and then process a large amount of model data, so it is also slow. A vision measurement system, by contrast, has a simple structure, is easy to move, acquires data quickly, is convenient to operate and low in cost, and is especially suitable for detecting point positions and dimensions in three-dimensional space or the outlines of large workpieces. Vision measurement divides into monocular and binocular vision; compared with binocular vision, a monocular system is simpler in structure, needs no adjustment and calibration between a pair of cameras, and is easier to install and use, so monocular vision is adopted for the measurement.
Meanwhile, non-contact measurement avoids damage to the measured object and suits situations where the object cannot be touched, such as high temperature, high pressure, fluids or hazardous environments. A machine vision system can also measure several dimensions simultaneously, finishing the measurement quickly, and micro-scale measurement is its strength: with a high-power lens the measured object can be magnified so that accuracy reaches the micrometer level or better.
Disclosure of Invention
The invention aims to provide a monocular vision-based non-contact measurement method for cylindrical part butt joints, addressing the problems that existing measurement of cylindrical parts requires processing large amounts of data, leading to long measurement and assembly cycles and reduced production efficiency.
The technical scheme of the invention is as follows:
a monocular vision-based cylindrical part butt joint non-contact measurement method is characterized by comprising the following steps:
step 1: two cameras of the same manufacturer and the same model are selected and installed according to the scheme shown in figure 1, one camera shoots the end face of the fixed part, the other camera shoots the end face of the butt joint part, and then the two cameras are calibrated and registered, and the method specifically comprises the following steps:
Step 1.1: establish four coordinate systems: a world coordinate system, a camera coordinate system, an image coordinate system and a pixel coordinate system; their positional relations are shown in figure 2. Taking the optical axis of the camera as the Z axis, the camera coordinate system (X_c, Y_c, Z_c) is established according to the right-hand rule; the image coordinate system (x, y) is established with the center of the picture as its origin; the pixel coordinate system (u, v) is established with the first element at the upper left corner of the picture as its origin; and the camera coordinate system of the first picture taken during camera calibration is used as the world coordinate system (X_w, Y_w, Z_w).
Step 1.2: the method comprises the steps of taking a plurality of photos of a calibration object from different directions by using a chess chessboard of standard size as the calibration object, inputting the photos into an MATLAB monocular camera calibration model, obtaining a conversion matrix from a camera coordinate system to a world coordinate system, namely external parameters of the camera, a distortion model of the camera, internal parameters including the focal length of the camera, the width and the height of a single pixel and an image and the coordinate of the central point of the photo in the image coordinate system, and calculating the mathematical models of two groups of cameras.
Step 1.3: two cameras are symmetrically arranged along the registration frame, the included angle between the axis of the camera and the shooting end face is 45 degrees according to the distance between the camera and the end face of the part, the completely butted and attached parts are separated again, the distance between the end faces of the butted and fixed parts and the midpoint of the fixed frame is equal, the cameras are registered and calibrated, and the structure of the system model is shown in figure 1.
Step 2: photograph and measure the end faces of the fixed part and the butt joint part.
Step 3: preprocess the measurement picture obtained in step 2 in combination with the relevant constraint conditions, specifically as follows:
Step 3.1: re-project the measured picture: applying the camera intrinsic and extrinsic parameters obtained in step 1, the picture of the obliquely photographed end face is converted into a picture of the end face photographed perpendicularly.
Step 3.2: and carrying out logarithmic transformation on the photographed picture, mapping the low gray value with a narrow range in the source image to the gray interval with a wide range, and mapping the high gray value interval with a wide range to the narrow gray interval, so that the value of a dark pixel is expanded, the value of high gray is compressed, and low gray level details in the image are enhanced.
Step 3.3: and replacing the value of one point in the picture with the median value of each point value in a neighborhood of the point, so as to solve the salt and pepper noise in the image and the points without gray values existing after the re-projection.
Step 4: carry out threshold segmentation and edge detection on the preprocessed picture, and extract the positions of the hole center and the part axis, the specific steps being as follows:
Step 4.1: set a threshold-segmentation gray value of 50 according to the end-face butt joint characteristics, extract independent connected regions by threshold segmentation, and further screen out the region where the hole is located using roundness and area features of the selected regions. The region is then eroded and dilated respectively, and the intersection of the resulting regions is taken to obtain the region where the hole edge lies.
Step 4.2: and obtaining intersection of the area obtained in the previous step and the original image, selecting the image containing the hole edge from the original image, and further reducing the operation area of the image.
Step 4.3: and extracting the hole edge by using an edge detection algorithm, screening according to the shape, fitting the screening result in an ellipse, determining the positions of the hole center and the part axis in the image, and calculating by using a camera model to obtain the positions of the measured hole center and the part axis in a corresponding camera real coordinate system.
Step 4.4: and mapping the hole centers and the shaft centers of the fixed part and the butt joint part to the same world coordinate system, and calculating the deflection angle required to be adjusted.
The invention has the beneficial effects that:
the invention can avoid the damage to the tested object and is suitable for the condition that the tested object can not be contacted, such as occasions of high temperature, high pressure, fluid, environmental hazard and the like; meanwhile, the machine vision system can simultaneously measure a plurality of sizes together, so that the measurement work is quickly finished; the measurement of micro size is the strong point of machine vision system, it can use high power lens to enlarge the measured object, making the measurement accuracy reach more than micrometer.
The invention provides a simple, automated method for measuring part attitude, the measured quantity being the relative rotation angle between the parts. It offers a high degree of automation and a high measuring speed, realizing automatic measurement without arranging a target on the part end face and without manual participation, thereby improving working efficiency.
Drawings
FIG. 1 is a system model architecture composition diagram of the present invention.
Fig. 2 is a schematic diagram of the position relationship of four coordinate systems adopted by the present invention.
Fig. 3 is a schematic structural diagram of a noncontact measuring device employed in the present invention.
Fig. 4 is a flowchart of the non-contact measurement process of the present invention.
Detailed Description
In order to make the technical scheme and implementation steps of the present invention more clear, the present invention is further described below with reference to the accompanying drawings and examples.
As shown in fig. 1-4.
A monocular vision-based cylindrical part butt joint non-contact measurement method is used for realizing extraction of positioning holes and part axes on a butt joint part and a fixed part, and further realizing measurement of relative rotation angles between the two parts.
The invention provides a non-contact measurement method for part butt joint, which adopts a measurement system shown in figure 3 and comprises a non-contact measurement device, a first communication module, a second communication module and an industrial personal computer system.
Fig. 4 is a flowchart of a non-contact measurement process according to the present invention, which specifically includes the following steps:
step 1: selecting two cameras of the same manufacturer and the same model, installing the cameras according to the figure 1, shooting the end faces of the fixed parts by the camera A, shooting the end faces of the butt joint parts by the camera B, and then calibrating and registering the two cameras, wherein the system model structure is shown in the figure 1, and the specific steps are as follows:
Step 1.1: establish four coordinate systems: a world coordinate system, a camera coordinate system, an image coordinate system and a pixel coordinate system; their positional relations are shown in figure 2. Taking the optical axis of the camera as the Z axis, the camera coordinate system (X_c, Y_c, Z_c) is established according to the right-hand rule; the image coordinate system (x, y) is established with the center of the picture as its origin; the pixel coordinate system (u, v) is established with the first element at the upper left corner of the picture as its origin; and the camera coordinate system of the first picture taken during camera calibration is used as the world coordinate system (X_w, Y_w, Z_w).
Step 1.2: the method comprises the steps of taking a chessboard of the chess with standard size as a calibration object, taking a plurality of pictures of the calibration object from different directions, inputting the pictures into an MATLAB monocular camera calibration model, obtaining a distortion model of a camera, a conversion matrix from a camera coordinate system to a world coordinate system, and internal parameters including the focal length of the camera, the width and the height of a single pixel and an image, and the coordinate of the center point of the picture in the image coordinate system, and calculating the mathematical models of two groups of cameras.
Step 1.3: two cameras are symmetrically arranged along the fixing frame, and the axes of the cameras and the shooting are arranged according to the distance between the cameras and the end face of the partTaking a picture of the included angle of the end face of 45 degrees, separating the completely butted and attached parts again to ensure that the distances between the end faces of the butted and fixed parts and the midpoint of the fixed frame are equal, taking a picture of the positioning hole on the end face, and extracting the coordinates of the hole centers of the fixed end face and the butted end face which are respectively P ak (x k ,y k ) And P bk (x k ,y k ) And k is 1,2,3 … … N, two different sets of registration point pairs are obtained. Taking a fixed part camera as a reference camera, assuming that two point sets can be registered through a transformation matrix, namely:
P_bk = T · P_ak, k = 1, 2, ..., N,
from which the transfer matrix T between the world coordinate systems of the fixed part and the butt joint part in physical space is calculated.
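The patent does not spell out how the transfer matrix between the two world coordinate systems is computed from the registration point pairs. A standard choice, assumed here purely for illustration, is a least-squares rigid fit (the Kabsch/Procrustes method) in 2-D:

```python
import numpy as np

def rigid_transform_2d(P, Q):
    """Least-squares rigid transform (R, t) with Q ≈ R @ p + t for each
    matched pair, estimated by the Kabsch/Procrustes method.
    P, Q: arrays of shape (N, 2) holding matched points row by row."""
    P = np.asarray(P, dtype=float)
    Q = np.asarray(Q, dtype=float)
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)                # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against a reflection
    R = Vt.T @ np.diag([1.0, d]) @ U.T
    t = cq - R @ cp
    return R, t

# Synthetic check: rotate a point set by 30 degrees and shift it.
theta = np.deg2rad(30.0)
R_true = np.array([[np.cos(theta), -np.sin(theta)],
                   [np.sin(theta),  np.cos(theta)]])
P = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
Q = P @ R_true.T + np.array([2.0, -1.0])
R_est, t_est = rigid_transform_2d(P, Q)
```

With noise-free matched hole centers the fit recovers the rotation and translation exactly; with real detections it gives the least-squares optimum.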
Step 2: photograph and measure the end faces of the fixed part and the butt joint part.
Step 3: preprocess the measurement picture obtained in step 2 in combination with the relevant constraint conditions, specifically as follows:
Step 3.1: re-project the measured picture: applying the camera intrinsic and extrinsic parameters obtained in step 1, the picture of the obliquely photographed end face is converted into a picture of the end face photographed perpendicularly.
Step 3.2: and carrying out logarithmic transformation on the photographed picture, wherein s is close (1+ r), c is a constant, the low gray value with a narrow range in the source image is mapped to a gray interval with a wide range, and the high gray value interval with a wide range is mapped to a narrow gray interval, so that the value of a dark pixel is expanded, the value of high gray is compressed, and low gray level details in the image are enhanced.
Step 3.3: and replacing the value of one point in the picture with the median value of each point value in a neighborhood of the point, so as to solve the salt and pepper noise in the image and the points without gray values existing after the re-projection.
Step 4: carry out threshold segmentation and edge detection on the preprocessed picture, and extract the positions of the hole center and the part axis, the specific steps being as follows:
Step 4.1: set a gray value suitable for threshold segmentation under the current conditions, extract independent connected regions by threshold segmentation, and further screen out the region where the hole is located using roundness and area features of the selected regions. Erode and dilate the region respectively and take the intersection of the resulting regions to obtain the region where the hole edge lies. Then take the intersection of this region with the original image, selecting the image containing the hole edge from the original image and further reducing the operation region of the image.
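One plausible reading of the erode/dilate-and-intersect operation in step 4.1, assumed here since the patent does not give the structuring element or the exact set operation, is that the hole-edge band is the dilated region intersected with the complement of the eroded region (i.e. dilation minus erosion):

```python
import numpy as np

def binary_dilate(mask):
    """3x3 binary dilation implemented with array shifts over a padded copy."""
    p = np.pad(mask, 1, mode='constant')
    out = np.zeros_like(mask, dtype=bool)
    for di in range(3):
        for dj in range(3):
            out |= p[di:di + mask.shape[0], dj:dj + mask.shape[1]]
    return out

def binary_erode(mask):
    """3x3 binary erosion as the complement of dilating the complement."""
    return ~binary_dilate(~mask)

def edge_band(mask):
    """Band containing the region boundary: dilation minus erosion."""
    return binary_dilate(mask) & ~binary_erode(mask)

# A filled region obtained by thresholding (a synthetic square here,
# standing in for the segmented hole region).
mask = np.zeros((7, 7), dtype=bool)
mask[2:5, 2:5] = True
band = edge_band(mask)
```

The band is two pixels wide for a one-pixel structuring radius, which is then intersected with the original image to restrict further processing, as the step describes.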
Step 4.2: and performing Gaussian filtering on the result of the previous step to smooth the image, wherein the gray value (only considering a binary image) of the pixel point at one position (m, n) is f (m, n). The gaussian filtered gray value will become:
Figure BDA0003645133570000051
The gradient value and gradient direction of the filtered image are then calculated; an edge is a set of pixels whose gray values change sharply. In an image, the degree and direction of gray-value change are expressed by the gradient, and the gradient value and direction are calculated as
G = sqrt(Gx^2 + Gy^2), theta = arctan(Gy / Gx),
where Gx and Gy are the gray-value gradients in the horizontal and vertical directions.
During Gaussian filtering the edges may be widened. Therefore non-edge points are filtered out by set rules so that the edge width stays as close to one pixel as possible: if a pixel belongs to an edge, its gradient value is a maximum along the gradient direction; otherwise it is not an edge and its gray value is set to 0. Edges are then detected using upper and lower thresholds: pixels above the upper threshold are all detected as edges, and pixels below the lower threshold are all detected as non-edges. A pixel between the two thresholds is judged to be an edge if it is adjacent to a pixel already determined to be an edge; otherwise it is not an edge.
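The gradient computation described above can be sketched with 3x3 Sobel kernels; the specific kernels are an assumption, since the patent only gives the generic formulas G = sqrt(Gx^2 + Gy^2) and theta = arctan(Gy/Gx):

```python
import numpy as np

def sobel_gradients(img):
    """Gradient magnitude and direction via 3x3 Sobel kernels:
    G = sqrt(Gx**2 + Gy**2), theta = arctan2(Gy, Gx)."""
    img = np.asarray(img, dtype=float)
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T                      # vertical-derivative kernel
    p = np.pad(img, 1, mode='edge')
    h, w = img.shape
    gx = np.zeros((h, w))
    gy = np.zeros((h, w))
    for i in range(h):
        for j in range(w):
            win = p[i:i + 3, j:j + 3]
            gx[i, j] = (win * kx).sum()
            gy[i, j] = (win * ky).sum()
    return np.hypot(gx, gy), np.arctan2(gy, gx)

# A vertical step edge: the gradient should point horizontally (theta = 0).
step = np.hstack([np.zeros((5, 3)), np.ones((5, 3)) * 10.0])
mag, theta = sobel_gradients(step)
```

Non-maximum suppression and the two-threshold hysteresis described in the text would then be applied to `mag` and `theta` to thin and link the edges.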
Step 4.3: and screening results through the shape, carrying out ellipse fitting by using a least square method, determining the positions of the hole center and the part axis in the image, and calculating through a camera model to obtain the positions of the measured hole center and the part axis in a corresponding camera real coordinate system.
Step 4.4: the center of a hole and the axis of the fixed part and the butt joint part are mapped to the same world coordinate system, and the axis coordinate is H ab0 =(x 0 ,y 0 ) The hole center coordinate set is H a =(x m ,y m ),H b =(x m ,y m ) And m is 1,2,3 … … N, and is determined by a trigonometric function according to a straight line defined by the axes of the two parts and the hole center under the same coordinate system:
Figure BDA0003645133570000054
the minimum deflection angle alpha to be adjusted is calculated.
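The final roll-angle computation, the signed angle between the axis-to-hole-center directions of the two parts in the common frame, can be sketched as follows; wrapping the result to (-180, 180] degrees to obtain the minimal adjustment is an assumed convention:

```python
import numpy as np

def roll_angle(axis, hole_fixed, hole_butt):
    """Signed angle (degrees) between the lines from the common axis
    point to the hole centre of the fixed part and of the butt joint
    part, wrapped to (-180, 180] so it is the minimal adjustment."""
    axis = np.asarray(axis, dtype=float)
    va = np.asarray(hole_fixed, dtype=float) - axis
    vb = np.asarray(hole_butt, dtype=float) - axis
    a = np.degrees(np.arctan2(vb[1], vb[0]) - np.arctan2(va[1], va[0]))
    return (a + 180.0) % 360.0 - 180.0   # wrap into (-180, 180]

# Hole centres at the same radius, 15 degrees apart about the axis.
alpha = roll_angle(axis=(0.0, 0.0),
                   hole_fixed=(10.0, 0.0),
                   hole_butt=(10.0 * np.cos(np.deg2rad(15.0)),
                              10.0 * np.sin(np.deg2rad(15.0))))
```

The butt joint part would then be rotated by -alpha about the common axis to align its positioning holes with those of the fixed part.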
Parts of the invention not described above are the same as the prior art or can be implemented using the prior art.

Claims (6)

1. A monocular vision-based cylindrical part butt joint non-contact measurement method is characterized by comprising the following steps: step 1, selecting two cameras of the same manufacturer and the same model, and calibrating and registering the cameras; step 2, photographing and measuring the end faces of the fixed part and the butt joint part; step 3, denoising the shot picture; and 4, extracting a processing area where the end face features are located by threshold segmentation, extracting the edge of the hole by an edge detection algorithm, determining the position of the center of the hole in the image by screening and ellipse fitting, mapping the center of the hole and the axis on the two parts to the same coordinate system by coordinate transformation, and calculating the roll angle of the butt joint part relative to the fixed part.
2. The method of claim 1, wherein the steps of calibrating and registering the cameras are as follows:
step 1.1: establishing four coordinate systems: a world coordinate system, a camera coordinate system, an image coordinate system and a pixel coordinate system; taking the optical axis of the camera as the Z axis and establishing the camera coordinate system (X_c, Y_c, Z_c) according to the right-hand rule; establishing the image coordinate system (x, y) with the center of the photo as origin; establishing the pixel coordinate system (u, v) with the first element at the upper left corner of the photo as origin; and taking the camera coordinate system of the first photo taken at camera calibration as the world coordinate system (X_w, Y_w, Z_w);
Step 1.2: using a standard-sized chess checkerboard as a calibration object, taking a plurality of photos of the calibration object from different directions, inputting the photos into an MATLAB monocular camera calibration model, obtaining a conversion matrix from a camera coordinate system to a world coordinate system, namely external parameters of the camera, a distortion model of the camera, internal parameters including the focal length of the camera, the width and height of a single pixel and an image and the coordinate of the central point of the photo in the image coordinate system, and calculating mathematical models of two groups of cameras;
step 1.3: symmetrically placing the two cameras along the registration frame, setting the included angle between each camera axis and the photographed end face to 45 degrees according to the distance between the camera and the part end face, re-separating the completely butted and attached parts so that the end faces of the butt joint part and the fixed part are equidistant from the midpoint of the fixed frame, and registering and calibrating the cameras.
3. The method of claim 1, wherein step 3 preprocesses the measurement picture obtained in step 2, specifically comprising the following steps:
step 3.1: re-projecting the measured picture by applying the camera intrinsic and extrinsic parameters obtained in step 1, converting the picture of the obliquely photographed end face into a picture of the end face photographed perpendicularly;
step 3.2: carrying out logarithmic transformation on the photographed picture, mapping the low gray value with a narrow range in the source image to a gray interval with a wide range, and simultaneously mapping the high gray value interval with a wide range to a narrow gray interval, thereby expanding the value of a dark pixel, compressing the value of high gray, and enhancing low gray details in the image;
step 3.3: and replacing the value of one point in the picture with the median of the point values in one neighborhood of the point to solve the salt and pepper noise in the image and the points without gray values existing after the image is re-projected.
4. The method of claim 3, wherein threshold segmentation and edge detection are carried out on the preprocessed picture and the positions of the hole center and the part axis are extracted, the specific steps being as follows:
step 4.1: setting a threshold-segmentation gray value of 50 according to the end face butt joint characteristics, extracting independent connected regions by threshold segmentation, and further screening out the region where the hole is located using roundness and area features of the selected regions; eroding and dilating the region respectively, and taking the intersection of the resulting regions to obtain the region where the hole edge is located;
step 4.2: solving the intersection of the area obtained in the last step and the original image, selecting the image with the hole edge from the original image, and further reducing the operation area of the image;
step 4.3: extracting hole edges by using an edge detection algorithm, screening according to the shapes, carrying out ellipse fitting on a screening result, determining the positions of the hole center and the part axis in the image, and calculating the positions of the measured hole center and the part axis in a corresponding camera real coordinate system through a camera model;
step 4.4: and mapping the hole centers and the shaft centers of the fixed part and the butt joint part to the same world coordinate system, and calculating the deflection angle required to be adjusted.
5. The method of claim 4, wherein in step 4.2 Gaussian filtering is adopted to smooth the image; the gray value of the pixel at position (m, n) being f(m, n), after filtering with a Gaussian kernel w it becomes
g(m, n) = Σ_(i,j) w(i, j) · f(m − i, n − j);
the gradient value and gradient direction of the filtered image are then calculated, an edge being a set of pixels with large gray-value change; in an image, the degree and direction of gray-value change are expressed by the gradient, the gradient value and direction being calculated as
G = sqrt(Gx^2 + Gy^2), theta = arctan(Gy / Gx),
where Gx and Gy are the gray-value gradients in the horizontal and vertical directions;
in the Gaussian filtering process, points which are not edges are filtered out by set rules so that the edge width is as close to 1 pixel as possible: if a pixel belongs to an edge, its gradient value along the gradient direction is a maximum; otherwise it is not an edge and its gray value is set to 0; edges are detected using upper and lower thresholds, pixels above the upper threshold all being detected as edges and pixels below the lower threshold all being detected as non-edges; a pixel between the two thresholds is judged to be an edge if it is adjacent to a pixel already determined to be an edge; otherwise it is not an edge.
6. The method of claim 4, wherein in step 4.4 the hole centers and axes of the fixed part and the butt joint part are mapped into the same world coordinate system, the axis coordinate being H_ab0 = (x_0, y_0) and the hole-center coordinate sets being H_a = (x_m, y_m) and H_b = (x_m, y_m), m = 1, 2, ..., N; from the straight lines determined by the part axis and the hole centers in this common coordinate system, the minimum deflection angle alpha to be adjusted is calculated by a trigonometric function.
CN202210527443.0A 2022-05-16 2022-05-16 Cylindrical part butt joint non-contact measurement method based on monocular vision Active CN114963981B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210527443.0A CN114963981B (en) 2022-05-16 2022-05-16 Cylindrical part butt joint non-contact measurement method based on monocular vision

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210527443.0A CN114963981B (en) 2022-05-16 2022-05-16 Cylindrical part butt joint non-contact measurement method based on monocular vision

Publications (2)

Publication Number Publication Date
CN114963981A true CN114963981A (en) 2022-08-30
CN114963981B CN114963981B (en) 2023-08-15

Family

ID=82970874

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210527443.0A Active CN114963981B (en) 2022-05-16 2022-05-16 Cylindrical part butt joint non-contact measurement method based on monocular vision

Country Status (1)

Country Link
CN (1) CN114963981B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116140987A (en) * 2023-04-17 2023-05-23 广东施泰德测控与自动化设备有限公司 Visual quick docking device and docking method for axle test board

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108562274A (en) * 2018-04-20 2018-09-21 南京邮电大学 A kind of noncooperative target pose measuring method based on marker
CN109190628A (en) * 2018-08-15 2019-01-11 东北大学 A kind of plate camber detection method based on machine vision
CN110146038A (en) * 2019-06-08 2019-08-20 西安电子科技大学 The distributed monocular camera laser measuring device for measuring and method of cylindrical member assembly corner
CN112362034A (en) * 2020-11-11 2021-02-12 上海电器科学研究所(集团)有限公司 Solid engine multi-cylinder section butt joint guiding measurement algorithm based on binocular vision
CN112686920A (en) * 2020-12-31 2021-04-20 天津理工大学 Visual measurement method and system for geometric dimension parameters of circular part
CN113295171A (en) * 2021-05-19 2021-08-24 北京航空航天大学 Monocular vision-based attitude estimation method for rotating rigid body spacecraft
WO2021208231A1 (en) * 2020-04-15 2021-10-21 上海工程技术大学 Gap measuring system and measuring method
US20220148213A1 (en) * 2020-01-10 2022-05-12 Dalian University Of Technology Method for fully automatically detecting chessboard corner points

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
Jiao Liang et al., "Design of a high-precision measurement and assembly system based on machine vision", Computer Measurement & Control, no. 07, 25 July 2016 (2016-07-25) *
Gu Fengwei et al., "Research on a simple monocular vision pose measurement method", Electro-Optic Technology Application, no. 04, 15 August 2018 (2018-08-15) *

Also Published As

Publication number Publication date
CN114963981B (en) 2023-08-15

Similar Documents

Publication Publication Date Title
CN110689579B (en) Rapid monocular vision pose measurement method and measurement system based on cooperative target
CN110276808B (en) Method for measuring unevenness of glass plate by combining single camera with two-dimensional code
CN104981105B (en) A kind of quickly accurate detection and method for correcting error for obtaining element central and deflection angle
CN111369630A (en) Method for calibrating multi-line laser radar and camera
CN105716527B (en) Laser seam tracking transducer calibration method
CN108007388A (en) A kind of turntable angle high precision online measuring method based on machine vision
CN112161997B (en) Online precise visual measurement method and system for three-dimensional geometric dimension of semiconductor chip pin
CN113240674A (en) Coplanarity detection method based on three-dimensional point cloud and two-dimensional image fusion
CN113324478A (en) Center extraction method of line structured light and three-dimensional measurement method of forge piece
CN106996748A (en) A kind of wheel footpath measuring method based on binocular vision
CN113744351B (en) Underwater structure light measurement calibration method and system based on multi-medium refraction imaging
CN109974618B (en) Global calibration method of multi-sensor vision measurement system
CN110966956A (en) Binocular vision-based three-dimensional detection device and method
CN110260818B (en) Electronic connector robust detection method based on binocular vision
CN107084680A (en) A kind of target depth measuring method based on machine monocular vision
Wang et al. Error analysis and improved calibration algorithm for LED chip localization system based on visual feedback
CN112381847A (en) Pipeline end head space pose measuring method and system
CN111402330A (en) Laser line key point extraction method based on plane target
CN115096206A (en) Part size high-precision measurement method based on machine vision
CN114963981B (en) Cylindrical part butt joint non-contact measurement method based on monocular vision
CN116129037A (en) Visual touch sensor, three-dimensional reconstruction method, system, equipment and storage medium thereof
CN109506629B (en) Method for calibrating rotation center of underwater nuclear fuel assembly detection device
CN113607058B (en) Straight blade size detection method and system based on machine vision
CN111968182B (en) Calibration method for nonlinear model parameters of binocular camera
CN116596987A (en) Workpiece three-dimensional size high-precision measurement method based on binocular vision

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant