CN111145247A - Vision-based position detection method, robot and computer storage medium - Google Patents

Vision-based position detection method, robot and computer storage medium

Info

Publication number
CN111145247A
Authority
CN
China
Prior art keywords
center
feature
detected
visual
deviation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201911311789.1A
Other languages
Chinese (zh)
Other versions
CN111145247B (en)
Inventor
王伦
周瑞松
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Peitian Robot Technology Co Ltd
Original Assignee
Peitian Robot Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Peitian Robot Technology Co Ltd filed Critical Peitian Robot Technology Co Ltd
Priority to CN201911311789.1A priority Critical patent/CN111145247B/en
Publication of CN111145247A publication Critical patent/CN111145247A/en
Application granted granted Critical
Publication of CN111145247B publication Critical patent/CN111145247B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G06T 7/60 - Analysis of geometric attributes
    • G06T 7/66 - Analysis of geometric attributes of image moments or centre of gravity
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01B - MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 11/00 - Measuring arrangements characterised by the use of optical techniques
    • G01B 11/002 - Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G06T 7/70 - Determining position or orientation of objects or cameras
    • G06T 7/73 - Determining position or orientation of objects or cameras using feature-based methods
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 - Image acquisition modality
    • G06T 2207/10004 - Still image; Photographic image
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 - Subject of image; Context of image processing
    • G06T 2207/30108 - Industrial image inspection
    • G06T 2207/30164 - Workpiece; Machine component
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P - CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P 90/00 - Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P 90/02 - Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Geometry (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Image Analysis (AREA)

Abstract

The invention provides a vision-based position degree detection method, a robot, and a computer storage medium. The position degree detection method comprises the following steps: acquiring an image at a reference feature photographing position and an image at a feature-to-be-measured photographing position; acquiring the coordinates in a visual coordinate system of the reference visual center of the image at the reference feature photographing position and the reference deviation of the reference feature center relative to the reference visual center, and acquiring the coordinates in the visual coordinate system of the visual center to be measured of the image at the feature-to-be-measured photographing position and the deviation to be measured of the feature center to be measured relative to the visual center to be measured; and calculating, from the coordinates of the reference feature center and of the feature center to be measured in the visual coordinate system, the position degree of the feature center to be measured relative to the reference feature center. Because the position degree of the feature center to be measured is calculated from collected images, the method places low demands on the detection environment, can be incorporated into an automated production line to realize automatic detection, and avoids wear on the product to be measured.

Description

Vision-based position detection method, robot and computer storage medium
Technical Field
The invention relates to the technical field of machine vision detection, in particular to a vision-based position detection method, a robot and a computer storage medium.
Background
The position degree is an index that limits the deviation of the actual position of a measured element from its ideal position. Position degree detection is routinely performed when assembling parts, and the position degree determines the assembly precision and the product yield of the parts. For example, the assembly of positioning pins and other fastening devices requires a tight position degree, so the position degree of the element to be measured needs to be detected accurately.
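As a concrete illustration (not part of the patent text), the quantity being controlled here, the deviation of an actual center from its ideal or reference position, can be written as a Euclidean distance; all names and numbers below are illustrative:

```python
import math

def positional_deviation(actual, ideal):
    """Euclidean distance between an actual center position and its
    ideal position, both given as (x, y) in the same coordinate system."""
    dx = actual[0] - ideal[0]
    dy = actual[1] - ideal[1]
    return math.hypot(dx, dy)

# A hole located 0.03 mm right of and 0.04 mm above its ideal position
# deviates by about 0.05 mm:
print(positional_deviation((0.03, 0.04), (0.0, 0.0)))
```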
Existing position degree detection is generally performed on the part itself, and common methods include dedicated position degree gauges and three-coordinate measuring instruments. Detection with a position degree gauge requires manual operation and cannot be automated; moreover, a dedicated gauge must be manufactured for each detection object and precision requirement, which is costly, and because gauge detection is a contact measurement, wear occurs. A three-coordinate measuring instrument achieves high detection precision, but it imposes strict environmental requirements, needing constant temperature and humidity, and therefore cannot be incorporated into an automated production line for automatic detection. In addition, a three-coordinate measuring instrument is expensive and demands skilled operation.
Disclosure of Invention
The invention provides a vision-based position detection method, a robot and a computer storage medium, and aims to solve the technical problems that in the prior art, the position detection cannot be automated and the cost is high.
In order to solve the technical problems, the invention adopts a technical scheme that: provided is a vision-based location degree detection method, including: acquiring an image at a reference characteristic photographing position and an image at a to-be-detected characteristic photographing position; acquiring the coordinate of a reference visual center of the image at the reference characteristic photographing position in a visual coordinate system and the reference deviation of the reference characteristic center relative to the reference visual center, and acquiring the coordinate of a to-be-detected visual center of the image at the to-be-detected characteristic photographing position in the visual coordinate system and the to-be-detected deviation of the to-be-detected characteristic center relative to the to-be-detected visual center; and calculating the coordinates of the reference feature center in the visual coordinate system and the coordinates of the to-be-detected feature center in the visual coordinate system to obtain the position degree of the to-be-detected feature center relative to the reference feature center.
In order to solve the technical problem, the invention adopts another technical scheme that: a computer storage medium is provided for storing a computer program for implementing the location degree detection method as described in the foregoing when executed by a processor.
In order to solve the technical problem, the invention adopts another technical scheme that: the robot comprises a clamping system, an image acquisition system and a master control system, wherein the clamping system is used for clamping and positioning a product to be detected, the image acquisition system is used for taking pictures, and the master control system is used for executing the position degree detection method.
The invention has the beneficial effects that: different from the situation of the prior art, the vision-based position degree detection method provided by the invention obtains the image at the reference feature photographing position and the image at the to-be-detected feature photographing position, and obtains the position degree of the to-be-detected feature center relative to the reference feature center based on the image analysis and calculation. On one hand, the requirement on the detection environment is low, and the detection environment can be integrated into an automatic production line to carry out real-time and automatic detection on the position degree; on the other hand, the abrasion to the product to be measured can be avoided.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, it is obvious that the drawings in the following description are only embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to the provided drawings without creative efforts.
FIG. 1 is a schematic diagram of a robot in one embodiment of the present invention;
FIG. 2 is a flow chart illustrating a location degree detection method according to an embodiment of the invention;
FIG. 3 is a schematic diagram of the model of FIG. 2;
FIG. 4 is a partial schematic flow chart of step S102 in FIG. 2;
FIG. 5 is a schematic flow chart of step S202 in FIG. 4;
FIG. 6 is a flow chart illustrating a location degree detection method according to another embodiment of the present invention;
FIG. 7 is a schematic diagram of the model of FIG. 6;
fig. 8 is a flowchart illustrating step S403 in fig. 6;
fig. 9 is a flowchart of step S501 in fig. 8;
FIG. 10 is a schematic structural diagram of an embodiment of a computer storage medium provided in the present application.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the invention. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments.
The present invention provides a robot 100, the robot 100 including: the clamping system 10, the image acquisition system 20 and the main control system 30. As shown in fig. 1, in the present embodiment, the clamping system 10 is configured to clamp and position a product to be tested, the image capturing system 20 is configured to take a picture, and the main control system 30 is configured to execute a position degree detecting method to obtain a position degree of a center of a feature to be tested on the product to be tested relative to a center of a reference feature.
The present invention utilizes the image acquisition system 20 to acquire the image of the product to be tested, and utilizes the main control system 30 to analyze and calculate the image of the product to be tested, thereby detecting the position degree of the element on the product to be tested. On one hand, the clamping system 10, the image acquisition system 20 and the main control system 30 can be installed on an automatic production line to realize real-time detection of the position degree of the element; on the other hand, the robot 100 can be prevented from contacting with the product and causing abrasion to the product.
The present invention further provides a vision-based location degree detection method, which can be used for the above-mentioned main control system 30. As shown in fig. 2 and fig. 3, fig. 2 is a schematic flow chart of a location degree detection method in an embodiment of the present invention, and fig. 3 is a schematic model diagram in fig. 2. The position degree detection method of the embodiment specifically includes:
step S101: and acquiring an image at the reference characteristic photographing position and an image at the to-be-detected characteristic photographing position.
Specifically, the main control system 30 controls the image capture system 20 to take pictures at the reference feature photographing position and the feature-to-be-measured photographing position, respectively, so as to obtain an image at the reference feature photographing position and an image at the feature-to-be-measured photographing position.
The number of reference feature photographing positions may be one or two, and the number of feature-to-be-measured photographing positions may be one, two, or more; the embodiment of the present invention is not particularly limited in this respect. As shown in fig. 3, one reference feature photographing position and one feature-to-be-measured photographing position are taken as an example for detailed description. The reference feature photographing position is denoted M and the feature-to-be-measured photographing position is denoted N.
Step S102: the method comprises the steps of obtaining coordinates of a reference visual center of an image at a reference feature photographing position in a visual coordinate system and reference deviation of the reference feature center relative to the reference visual center, and obtaining coordinates of a to-be-measured visual center of the image at a to-be-measured feature photographing position in the visual coordinate system and to-be-measured deviation of the to-be-measured feature center relative to the to-be-measured visual center.
Specifically, in the present embodiment, the main control system 30 calculates the coordinates (x0, y0) of the reference visual center A of the image at the reference feature photographing position M in the visual coordinate system and the reference deviation of the reference feature center B relative to the reference visual center A, and calculates the coordinates (x1, y1) of the visual center C to be measured of the image at the feature-to-be-measured photographing position N in the visual coordinate system and the deviation to be measured of the feature center D to be measured relative to the visual center C to be measured.
The reference deviation is the deviation of the reference feature center in the visual coordinate system relative to the reference visual center, and the deviation to be measured is the deviation of the feature center to be measured in the visual coordinate system relative to the visual center to be measured.
Further, as shown in fig. 4, fig. 4 is a partial flowchart of step S102 in fig. 2. In step S102, the step of acquiring the reference deviation of the reference feature center B from the reference visual center includes:
step S201: and acquiring the reference characteristic deviation of the reference characteristic center B of the image at the reference characteristic photographing position M relative to the reference vision center A.
The images acquired by the image acquisition system 20 generally contain errors from three sources: (1) poor repeated positioning accuracy; (2) visual calibration error; (3) visual detection error. Owing to these three errors, the coordinates of the reference visual center A and the reference feature center B that the main control system 30 derives from the image at the reference feature photographing position M deviate from their actual coordinates in the visual coordinate system. The reference feature deviation is therefore defined as: the deviation, obtained by the main control system 30 from analyzing the image at the reference feature photographing position M, between the theoretical coordinates of the reference feature center B and the theoretical coordinates of the reference visual center A in the visual coordinate system.
After obtaining the reference feature deviation of the reference feature center B with respect to the reference visual center a, the step S202 is continuously executed: and calibrating the reference characteristic deviation so as to obtain the reference deviation.
In order to reduce the recognition error of the robot 100 caused by the acquisition error of the image acquisition system 20, the reference feature deviation acquired by the main control system 30 needs to be calibrated to obtain the reference deviation between the actual coordinates of the reference feature center B in the visual coordinate system and the actual coordinates of the reference visual center a in the visual coordinate system. In this embodiment, by calibrating the reference characteristic deviation, the influence of the errors in the above three aspects on the position detection result can be reduced, and the detection accuracy of the robot 100 can be further improved.
Alternatively, in the present embodiment, as shown in fig. 4, after the reference deviation is obtained by the above method in step S202, the following steps need to be executed:
step S203: and acquiring the deviation of the feature center to be detected of the image at the position where the feature to be detected is photographed relative to the vision center to be detected.
Step S204: and calibrating the deviation of the characteristic to be measured, thereby obtaining the deviation to be measured.
Steps S203 and S204 differ from steps S201 and S202 in this embodiment only in the image analyzed: steps S201 and S202 analyze the image at the reference feature photographing position, whereas steps S203 and S204 analyze the image at the feature-to-be-measured photographing position. The analysis methods are substantially the same; please refer to the description above, which is not repeated here.
In another embodiment, step S203 and step S204 may be performed first to obtain the deviation to be measured, and then step S201 and step S202 may be performed to obtain the reference deviation. Or, in another embodiment, while the step S201 and the step S202 are executed to obtain the reference deviation, the step S203 and the step S204 are executed to obtain the deviation to be measured, so as to shorten the time calculated by the main control system 30.
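The parallel variant described above can be sketched with Python's standard concurrency tools; `analyze_reference` and `analyze_feature` are hypothetical stand-ins for steps S201 to S202 and S203 to S204, returning canned deviations here:

```python
from concurrent.futures import ThreadPoolExecutor

def analyze_reference(image):
    # Placeholder for steps S201-S202: measure and calibrate
    # the reference feature deviation from the image at position M.
    return (1.2, -0.8)

def analyze_feature(image):
    # Placeholder for steps S203-S204: measure and calibrate
    # the deviation to be measured from the image at position N.
    return (0.5, 0.9)

def deviations_in_parallel(ref_image, feat_image):
    # Run both analyses concurrently to shorten the total computation
    # time of the main control system, as the embodiment suggests.
    with ThreadPoolExecutor(max_workers=2) as pool:
        ref_future = pool.submit(analyze_reference, ref_image)
        feat_future = pool.submit(analyze_feature, feat_image)
        return ref_future.result(), feat_future.result()

ref_dev, feat_dev = deviations_in_parallel("image_at_M", "image_at_N")
```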
Further, referring to fig. 5, fig. 5 is a flowchart illustrating step S202 in fig. 4. In this embodiment, the step of calibrating the reference characteristic deviation includes:
step S301: an image of the visual calibration plate is collected.
A visual calibration plate is used to establish the geometric model of camera imaging required in machine vision, image measurement, photogrammetry, three-dimensional reconstruction and similar applications: it serves to correct lens distortion, to determine the conversion relation between physical size and pixels, and to determine the relation between the three-dimensional position of a point on the surface of an object in space and its corresponding point in the image.
In this embodiment, the main control system 30 first clamps and positions the visual calibration plate on the clamping system 10, and then the main control system 30 controls the image capturing system 20 to capture an image of the visual calibration plate.
Step S302: and analyzing the image, acquiring a calibration model, and calibrating the reference characteristic deviation and the characteristic deviation to be measured by using the calibration model.
Specifically, the main control system 30 analyzes the image of the visual calibration board acquired by the image acquisition system 20 to obtain a calibration model, and then the main control system 30 calibrates the reference characteristic deviation obtained in step S201 by using the calibration model to obtain a reference deviation.
Further, in this embodiment, the deviation of the feature to be measured may also be calibrated by using the methods in step S301 and step S302, which are not described herein again.
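The patent does not detail the calibration model itself; one aspect it names, determining the conversion relation between physical size and pixels, can be sketched in a minimal form from two calibration-plate features whose true spacing is known (a full model, including lens-distortion correction via a library such as OpenCV, involves considerably more). All names and numbers are illustrative:

```python
import math

def mm_per_pixel(p0, p1, physical_distance_mm):
    """Scale factor derived from two calibration-plate features, given in
    pixel coordinates, whose true physical separation is known."""
    pixel_distance = math.hypot(p1[0] - p0[0], p1[1] - p0[1])
    return physical_distance_mm / pixel_distance

def calibrate_deviation(deviation_px, scale):
    """Convert a deviation measured in pixels into physical units."""
    return (deviation_px[0] * scale, deviation_px[1] * scale)

# Two plate dots 10 mm apart appear 200 px apart, giving 0.05 mm/px.
scale = mm_per_pixel((100, 100), (300, 100), 10.0)
print(calibrate_deviation((4.0, -2.0), scale))  # (0.2, -0.1)
```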
In this embodiment, as shown in fig. 3, at the reference feature photographing position M, the main control system 30 may calculate the reference feature deviation of the reference feature center B relative to the reference visual center A and calibrate it, so as to obtain the reference deviation of B relative to A in the visual coordinate system, the vector AB = (xAB, yAB). At the feature-to-be-measured photographing position N, the main control system 30 may calculate the deviation of the feature center D to be measured relative to the visual center C to be measured and calibrate it, so as to obtain the deviation to be measured of D relative to C in the visual coordinate system, the vector CD = (xCD, yCD).
Step S103: and calculating the coordinates of the reference feature center in the visual coordinate system and the coordinates of the to-be-measured feature center in the visual coordinate system to obtain the position degree of the to-be-measured feature center relative to the reference feature center.
The main control system 30 may calculate the coordinates of the reference feature center in the visual coordinate system from the coordinates of the reference visual center in the visual coordinate system and the reference deviation of the reference feature center relative to the reference visual center. Likewise, the main control system 30 may calculate the coordinates of the feature center to be measured in the visual coordinate system from the coordinates of the visual center to be measured in the visual coordinate system and the deviation to be measured of the feature center to be measured relative to the visual center to be measured.
Specifically, as shown in fig. 3, the coordinates of the reference visual center A in the visual coordinate system are (x0, y0), and the reference deviation of the reference feature center B relative to A in the visual coordinate system is the vector AB = (xAB, yAB). According to the plane vector algorithm, the coordinates of the reference feature center B in the visual coordinate system are (x0 + xAB, y0 + yAB). The coordinates of the visual center C to be measured in the visual coordinate system are (x1, y1), and the deviation to be measured of the feature center D to be measured relative to C in the visual coordinate system is the vector CD = (xCD, yCD). According to the plane vector algorithm, the coordinates of the feature center D to be measured in the visual coordinate system are (x1 + xCD, y1 + yCD).
Since the number of the reference feature photographing positions is one in this embodiment, the position degree of the center of the feature to be measured with respect to the center of the reference feature is equal to the distance between the coordinates of the center of the reference feature in the visual coordinate system and the coordinates of the center of the feature to be measured in the visual coordinate system.
Specifically, in this embodiment, the position of the feature center D to be measured with respect to the reference feature center B is the distance between the reference feature center B and the feature center D to be measured in the visual coordinate system.
As shown in fig. 3, in the present embodiment, the coordinates of the reference feature center B in the visual coordinate system are (x0 + xAB, y0 + yAB), and the coordinates of the feature center D to be measured in the visual coordinate system are (x1 + xCD, y1 + yCD). According to the distance formula between two points, the main control system 30 obtains the position degree of the feature center D to be measured relative to the reference feature center B as

L = sqrt((x1 + xCD - x0 - xAB)^2 + (y1 + yCD - y0 - yAB)^2).
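Steps S102 and S103 of this single-reference embodiment can be written directly from the formulas above; the function and variable names are illustrative, not from the patent:

```python
import math

def feature_center(visual_center, deviation):
    """Plane vector addition: feature center = visual center + calibrated
    deviation, as in B = A + AB and D = C + CD."""
    return (visual_center[0] + deviation[0], visual_center[1] + deviation[1])

def position_degree(ref_visual_center, ref_deviation,
                    meas_visual_center, meas_deviation):
    """Distance between the reference feature center B and the feature
    center to be measured D in the visual coordinate system."""
    bx, by = feature_center(ref_visual_center, ref_deviation)
    dx, dy = feature_center(meas_visual_center, meas_deviation)
    return math.hypot(dx - bx, dy - by)

# Illustrative numbers: A = (0, 0) with AB = (1, 2) and C = (4, 6) with
# CD = (0, 0), so B = (1, 2), D = (4, 6), and the position degree is 5.0.
print(position_degree((0.0, 0.0), (1.0, 2.0), (4.0, 6.0), (0.0, 0.0)))  # 5.0
```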
Further, before acquiring the image at the reference feature photographing position and the image at the to-be-detected feature photographing position, the position degree detection method includes: and marking the reference characteristic photographing position and the characteristic photographing position to be detected through teaching.
Specifically, teaching is also required to be performed on the main control system before the position degree detection is performed using the robot 100, so that the main control system 30 acquires the reference feature photographing position M and the feature to be detected photographing position N.
In the present embodiment, before the robot 100 is used to perform position degree detection, the master control system 30 needs to be taught to obtain one reference feature photographing position M and one feature to be measured photographing position N. When the robot 100 is used to perform position degree detection, the main control system 30 controls the image capturing system 20 to take pictures at the reference feature photographing position M and the feature-to-be-measured photographing position N.
Of course, in other embodiments, when there are multiple features to be detected, multiple feature-to-be-measured photographing positions N need to be taught so that the main control system 30 obtains all of them, and during subsequent position degree detection the image acquisition system 20 is controlled to take pictures at these positions N. The number and locations of the feature-to-be-measured photographing positions are determined by the number of features to be detected and their positions on the clamping system 10; the present invention does not specifically limit them.
The teaching of the robot 100 may be implemented in a conventional manner, and the embodiments of the present invention are not limited in particular.
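Putting the taught positions together, the run-time flow for one reference photographing position M and several feature-to-be-measured photographing positions N can be sketched as follows; `capture` and `analyze` are hypothetical hooks standing in for the image acquisition system 20 and the image-analysis steps of the main control system 30:

```python
import math

def detect_position_degrees(capture, analyze, ref_position, feature_positions):
    """Photograph at each taught position and return one position degree
    per feature to be measured, relative to the single reference feature.

    capture(pos) returns an image; analyze(img) returns the visual-center
    coordinates and the calibrated deviation for that image.
    """
    ref_center, ref_dev = analyze(capture(ref_position))
    bx = ref_center[0] + ref_dev[0]   # reference feature center B
    by = ref_center[1] + ref_dev[1]
    results = []
    for pos in feature_positions:
        center, dev = analyze(capture(pos))
        dx = center[0] + dev[0]       # feature center D at this position
        dy = center[1] + dev[1]
        results.append(math.hypot(dx - bx, dy - by))
    return results

# Taught positions M and N1, N2 with canned analysis results:
images = {"M": ((0, 0), (0, 0)), "N1": ((3, 4), (0, 0)), "N2": ((6, 8), (0, 0))}
degrees = detect_position_degrees(lambda p: images[p], lambda d: d, "M", ["N1", "N2"])
print(degrees)  # [5.0, 10.0]
```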
Referring to fig. 6 and 7, fig. 6 is a schematic flow chart of a location degree detection method according to another embodiment of the present invention, and fig. 7 is a schematic model diagram of fig. 6. In this embodiment, the vision-based location degree detection method includes:
step S401: and acquiring an image at the reference characteristic photographing position and an image at the to-be-detected characteristic photographing position.
Step S401 in this embodiment is substantially the same as step S101 in the above embodiment, except that in this embodiment there are two reference feature photographing positions; that is, the main control system 30 controls the image capturing system 20 to photograph at the two reference feature photographing positions respectively. The two reference feature photographing positions are denoted R and S, and the feature-to-be-measured photographing position is denoted T.
Step S402: the method comprises the steps of obtaining coordinates of a reference visual center of an image at a reference feature photographing position in a visual coordinate system and reference deviation of the reference feature center relative to the reference visual center, and obtaining coordinates of a to-be-measured visual center of the image at a to-be-measured feature photographing position in the visual coordinate system and to-be-measured deviation of the to-be-measured feature center relative to the to-be-measured visual center.
In this embodiment, the main control system 30 respectively calculates the coordinates (xR, yR) and (xS, yS) in the visual coordinate system of the reference visual centers R0 and S0 of the images at the reference feature photographing positions R and S, together with the reference deviations of the reference feature center R1 relative to R0 and of the reference feature center S1 relative to S0. It also calculates the coordinates (xT, yT) in the visual coordinate system of the visual center T0 to be measured of the image at the feature-to-be-measured photographing position T, together with the deviation to be measured of the feature center T1 to be measured relative to T0.
The calculation method of each reference deviation and deviation to be measured in step S402 in this embodiment is substantially the same as the calculation method of each reference deviation and deviation to be measured in step S102 in the above embodiment, please refer to the description in the above embodiment, and details are not repeated here.
Specifically, in this embodiment, as shown in fig. 7, at the reference feature photographing positions R and S, the main control system 230 may calculate reference feature deviation I of the reference feature center R1 from the reference vision center R0 and reference feature deviation II of the reference feature center S1 from the reference vision center S0, and calibrate the reference feature deviations I and II, thereby obtaining the reference deviation of the reference feature center R1 relative to the reference vision center R0 in the visual coordinate system, $\vec{d}_R = (\Delta x_R, \Delta y_R)$, and the reference deviation of the reference feature center S1 relative to the reference vision center S0 in the visual coordinate system, $\vec{d}_S = (\Delta x_S, \Delta y_S)$. At the photographing position T of the feature to be measured, the main control system 230 may calculate the feature deviation to be measured of the feature center to be measured T1 relative to the vision center to be measured T0 and calibrate it, thereby obtaining the deviation to be measured of T1 relative to T0 in the visual coordinate system, $\vec{d}_T = (\Delta x_T, \Delta y_T)$.
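The calibration step above maps a pixel-space deviation into the visual coordinate system. The patent does not specify the calibration model, so the sketch below assumes a simple linear model fitted from a calibration plate; the function name, matrix values, and pixel scale are all hypothetical, not taken from the patent.

```python
# Hedged sketch: convert a pixel-space feature deviation into the visual
# coordinate system with a 2x2 linear calibration matrix A (assumed to come
# from a calibration-plate fit; the values below are purely illustrative).

def calibrate_deviation(pixel_deviation, A):
    """Apply the linear calibration model A to a (dx, dy) pixel deviation."""
    dx, dy = pixel_deviation
    return (A[0][0] * dx + A[0][1] * dy,
            A[1][0] * dx + A[1][1] * dy)

# Hypothetical scale: 0.05 mm per pixel on both axes, no rotation.
A = [[0.05, 0.0], [0.0, 0.05]]
deviation_mm = calibrate_deviation((10.0, -4.0), A)
```

A full calibration model would usually also correct lens distortion; the linear matrix here only captures scale and rotation.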
After calculating the reference deviations and the deviation to be measured, the main control system 230 executes the following step S403: calculating the coordinates of the reference feature center in the visual coordinate system and the coordinates of the feature center to be measured in the visual coordinate system, so as to obtain the position degree of the feature center to be measured relative to the reference feature center.
The main control system 30 may calculate the coordinates of the reference feature center in the visual coordinate system from the coordinates of the reference visual center in the visual coordinate system and the reference deviation of the reference feature center relative to the reference visual center. Likewise, the main control system 30 may calculate the coordinates of the feature center to be measured in the visual coordinate system from the coordinates of the vision center to be measured and the deviation to be measured of the feature center to be measured relative to the vision center to be measured.
Specifically, as shown in fig. 7, the coordinates of the reference vision centers R0 and S0 in the visual coordinate system are (x_R, y_R) and (x_S, y_S), and the reference deviations of the reference feature centers R1 and S1 relative to R0 and S0 in the visual coordinate system are $\vec{d}_R = (\Delta x_R, \Delta y_R)$ and $\vec{d}_S = (\Delta x_S, \Delta y_S)$. According to the plane vector algorithm, the coordinates of the reference feature centers R1 and S1 in the visual coordinate system are $(x_R + \Delta x_R,\ y_R + \Delta y_R)$ and $(x_S + \Delta x_S,\ y_S + \Delta y_S)$. The coordinates of the vision center to be measured T0 in the visual coordinate system are (x_T, y_T), and the deviation to be measured of the feature center to be measured T1 relative to T0 in the visual coordinate system is $\vec{d}_T = (\Delta x_T, \Delta y_T)$. According to the plane vector algorithm, the coordinates of the feature center to be measured T1 in the visual coordinate system are $(x_T + \Delta x_T,\ y_T + \Delta y_T)$.
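The "plane vector algorithm" here is simply vector addition: each feature center is the corresponding vision center plus its calibrated deviation. A minimal sketch (the coordinate values are illustrative, not from the patent):

```python
# Minimal sketch of the plane-vector step: a feature center in the visual
# coordinate system is the vision center plus the calibrated deviation.

def feature_center(vision_center, deviation):
    """Return vision_center + deviation as a 2D point."""
    x, y = vision_center
    dx, dy = deviation
    return (x + dx, y + dy)

# Illustrative values for the two reference centers and the center under test.
R1 = feature_center((10.0, 20.0), (0.5, -0.25))   # reference feature center R1
S1 = feature_center((60.0, 20.0), (-0.5, 0.25))   # reference feature center S1
T1 = feature_center((35.0, 45.0), (0.25, 0.5))    # feature center to be measured T1
```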
Since the number of reference feature photographing positions is two, as shown in fig. 8 (a flowchart of step S403 in fig. 6), in this embodiment the step of obtaining the position degree of the center of the feature to be measured relative to the center of the reference feature includes:
step S501: and establishing a reference coordinate system by using the two reference characteristic centers, and calculating a conversion relation between the reference coordinate system and the visual coordinate system.
Specifically, the main control system 30 establishes a reference coordinate system with the two reference feature centers R1 and S1, and obtains the conversion relation between the reference coordinate system and the visual coordinate system from the relation between the coordinates of R1 and S1 in the two coordinate systems.
Further, as shown in fig. 9, fig. 9 is a flowchart of step S501 in fig. 8. In this embodiment, the step of establishing a reference coordinate system with two reference feature centers includes:
step S601: with one of the reference feature centers as the origin.
In the present embodiment, the main control system 30 takes the reference feature center R1 as the origin.
Step S602: the line connecting the two reference features is taken as the X axis.
In this embodiment, the main control system 30 takes the line connecting R1 and S1 as the X axis.
Step S603: and establishing a reference coordinate system by taking the direction vertical to the X axis as the Y axis.
In the present embodiment, the main control system 30 establishes a reference coordinate system with a direction perpendicular to the X axis as the Y axis.
In another embodiment, the main control system 30 may instead take the reference feature center S1 as the origin, the line connecting S1 and R1 as the X axis, and the direction perpendicular to the X axis as the Y axis to establish the reference coordinate system.
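Steps S601 to S603 amount to a rigid transform: translate so that R1 becomes the origin, then rotate so that the R1-to-S1 direction becomes the X axis. A sketch under those assumptions (the function and variable names are illustrative, not the patent's):

```python
import math

# Sketch of steps S601-S603: express a point given in the visual coordinate
# system in the reference frame whose origin is R1 and whose X axis runs
# along the line from R1 to S1 (translate, then rotate by -theta).

def visual_to_reference(point, r1, s1):
    theta = math.atan2(s1[1] - r1[1], s1[0] - r1[0])  # angle of the R1->S1 line
    dx, dy = point[0] - r1[0], point[1] - r1[1]       # translate: R1 -> origin
    c, s = math.cos(theta), math.sin(theta)
    return (c * dx + s * dy, -s * dx + c * dy)        # rotate by -theta

# Sanity check: S1 itself must land on the positive X axis,
# at a distance equal to |R1 S1|.
r1, s1 = (10.0, 20.0), (60.0, 20.0)
print(visual_to_reference(s1, r1, s1))  # approximately (50.0, 0.0)
```

Swapping the roles of R1 and S1, as in the alternative embodiment above, only changes the origin and the sign convention of the axes, not the resulting distances.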
Step S502: and calculating the coordinates of the center of the feature to be measured in the reference coordinate system.
Specifically, the main control system 30 obtains the coordinates (x', y') of the feature center to be measured T1 in the reference coordinate system by using the coordinates of T1 in the visual coordinate system and the conversion relation from the visual coordinate system to the reference coordinate system.
Step S503: the position degree of the center of the feature to be measured relative to the center of the reference feature is equal to the distance between the center of the feature to be measured and the origin of coordinates of the reference coordinate system.
Specifically, in this embodiment, the position degree of the center of the feature to be measured relative to the center of the reference feature is the distance between the feature center to be measured T1 and the origin of the reference coordinate system. Since the coordinates of T1 in the reference coordinate system are (x', y'), the main control system 230 obtains, from the distance formula between two points, the position degree of the feature center to be measured T1 as

$$\sqrt{x'^2 + y'^2}$$
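Putting step S503 into code: the position degree is the Euclidean distance from (x', y') to the reference-frame origin (a sketch; the names are illustrative):

```python
import math

# Step S503 as a one-liner: the position degree of the feature center to be
# measured equals its distance to the origin of the reference coordinate system.

def position_degree(x_prime, y_prime):
    return math.hypot(x_prime, y_prime)  # sqrt(x'^2 + y'^2)

print(position_degree(3.0, 4.0))  # prints 5.0
```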
Referring to fig. 10, fig. 10 is a schematic structural diagram of an embodiment of a computer storage medium provided in the present application. The computer storage medium 40 is used to store a computer program 41, and the computer program 41 is used to implement the position degree detection method in the above-described embodiments when executed by a processor.
The computer storage medium 40 may be a server, a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disk, or any other medium capable of storing program code.
In summary, the present invention uses the image acquisition system to acquire images at the reference feature photographing position and at the photographing position of the feature to be detected, and uses the main control system to analyze those images, thereby calculating the position degree of the feature center to be detected relative to the reference feature center. Because the detection is performed visually, the requirements on the detection environment are low, so the detection device can be incorporated into an automated production line and the position degree of a product can be detected in real time. Contact with the product is also avoided, reducing wear on the product.
The above examples are only intended to illustrate the technical solution of the present invention, but not to limit it; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present invention.

Claims (10)

1. A vision-based location degree detection method is characterized by comprising the following steps:
acquiring an image at a reference characteristic photographing position and an image at a to-be-detected characteristic photographing position;
acquiring the coordinate of a reference visual center of the image at the reference characteristic photographing position in a visual coordinate system and the reference deviation of the reference characteristic center relative to the reference visual center, and acquiring the coordinate of a to-be-detected visual center of the image at the to-be-detected characteristic photographing position in the visual coordinate system and the to-be-detected deviation of the to-be-detected characteristic center relative to the to-be-detected visual center;
and calculating the coordinates of the reference feature center in the visual coordinate system and the coordinates of the to-be-detected feature center in the visual coordinate system to obtain the position degree of the to-be-detected feature center relative to the reference feature center.
2. The position degree detection method according to claim 1, wherein the step of acquiring a reference deviation of the reference feature center from the reference visual center includes:
acquiring the reference characteristic deviation of the reference characteristic center of the image at the reference characteristic photographing position relative to the reference vision center;
and calibrating the reference characteristic deviation so as to obtain the reference deviation.
3. The position degree detection method according to claim 2, wherein the step of obtaining the deviation to be measured of the center of the feature to be measured with respect to the center of the vision to be measured includes:
acquiring the deviation of the feature center to be detected of the image at the position where the feature to be detected is photographed relative to the vision center to be detected;
and calibrating the deviation of the characteristic to be detected, thereby obtaining the deviation to be detected.
4. The position degree detection method according to claim 3, wherein the step of calibrating the reference feature deviation and the to-be-measured feature deviation includes:
collecting an image of a visual calibration plate;
and analyzing the image, acquiring a calibration model, and calibrating the reference characteristic deviation and the characteristic deviation to be measured by using the calibration model.
5. The position degree detection method according to claim 1, wherein the number of the reference feature photographing positions is one, and the position degree of the center of the feature to be measured with respect to the reference feature center is equal to a distance between the coordinates of the reference feature center in the visual coordinate system and the coordinates of the center of the feature to be measured in the visual coordinate system.
6. The position degree detection method according to claim 1, wherein the number of the reference feature photographing positions is two, and the step of obtaining the position degree of the center of the feature to be measured with respect to the center of the reference feature includes:
establishing a reference coordinate system by using the two reference feature centers, and calculating a conversion relation between the reference coordinate system and the visual coordinate system;
calculating the coordinate of the center of the feature to be measured in the reference coordinate system;
and the position degree of the center of the feature to be detected relative to the center of the reference feature is equal to the distance between the center of the feature to be detected and the origin of coordinates of the reference coordinate system.
7. The position degree detection method according to claim 6, wherein the step of establishing a reference coordinate system with two reference feature centers includes:
taking one of the reference feature centers as an origin;
taking a connecting line of the two reference features as an X axis;
and establishing the reference coordinate system by taking the direction vertical to the X axis as a Y axis.
8. The position degree detection method according to claim 1, wherein before acquiring the image at the reference feature photographing position and the image at the feature-to-be-measured photographing position, the position degree detection method includes: and marking the reference characteristic photographing position and the to-be-detected characteristic photographing position through teaching.
9. A computer storage medium storing a computer program, wherein the computer program, when executed by a processor, is configured to implement the position degree detection method according to any one of claims 1 to 8.
10. A robot is characterized by comprising a clamping system, an image acquisition system and a main control system, wherein the clamping system is used for clamping and positioning a product to be detected, the image acquisition system is used for taking pictures, and the main control system is used for executing the position degree detection method according to any one of claims 1 to 8.
CN201911311789.1A 2019-12-18 2019-12-18 Position degree detection method based on vision, robot and computer storage medium Active CN111145247B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911311789.1A CN111145247B (en) 2019-12-18 2019-12-18 Position degree detection method based on vision, robot and computer storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911311789.1A CN111145247B (en) 2019-12-18 2019-12-18 Position degree detection method based on vision, robot and computer storage medium

Publications (2)

Publication Number Publication Date
CN111145247A true CN111145247A (en) 2020-05-12
CN111145247B CN111145247B (en) 2023-07-07

Family

ID=70518749

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911311789.1A Active CN111145247B (en) 2019-12-18 2019-12-18 Position degree detection method based on vision, robot and computer storage medium

Country Status (1)

Country Link
CN (1) CN111145247B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111805158A (en) * 2020-05-25 2020-10-23 福建星云电子股份有限公司 Self-detection method and system for displacement of clamp of robot
CN116840243A (en) * 2023-09-01 2023-10-03 湖南睿图智能科技有限公司 Correction method and system for machine vision object recognition

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5125035A (en) * 1989-12-18 1992-06-23 Chromalloy Gas Turbine Corporation Five axis generated hole inspection system
JPH06785A (en) * 1992-06-23 1994-01-11 Citizen Watch Co Ltd Correcting method for visual sensor coordinate system
US7194387B1 (en) * 2005-09-30 2007-03-20 Intel Corporation Apparatus and method for determining the minimum zone for an array of features with a true position tolerance
CN102183205A (en) * 2011-01-19 2011-09-14 北京航空航天大学 Method for matching optimal assembly poses of large-sized parts
CN102589429A (en) * 2012-01-20 2012-07-18 天津大学 On-line engine cylinder hexahedral hole set position error detecting method
WO2018214147A1 (en) * 2017-05-26 2018-11-29 深圳配天智能技术研究院有限公司 Robot calibration method and system, robot and storage medium
CN109900207A (en) * 2019-03-12 2019-06-18 精诚工科汽车系统有限公司 The tool center point scaling method and system of robot vision tool
CN110017769A (en) * 2019-03-12 2019-07-16 精诚工科汽车系统有限公司 Part detection method and system based on industrial robot
CN110068271A (en) * 2019-04-19 2019-07-30 怡得乐电子(杭州)有限公司 The PIN needle position degree detection method of the large size product of sub-pixel precision
CN110207565A (en) * 2019-05-31 2019-09-06 南京理工大学 A kind of high speed warp knit machine crankshaft rigging position degree measurement tooling and measurement method
CN110281238A (en) * 2019-06-17 2019-09-27 深圳视觉龙智能传感器有限公司 Assembly line multi-robot scaling method, device, computer equipment and storage medium
CN110455225A (en) * 2019-08-22 2019-11-15 吉林大学 Rectangle splined spindle concentricity and key position degree measurement method based on structure light vision

Patent Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5125035A (en) * 1989-12-18 1992-06-23 Chromalloy Gas Turbine Corporation Five axis generated hole inspection system
JPH06785A (en) * 1992-06-23 1994-01-11 Citizen Watch Co Ltd Correcting method for visual sensor coordinate system
US7194387B1 (en) * 2005-09-30 2007-03-20 Intel Corporation Apparatus and method for determining the minimum zone for an array of features with a true position tolerance
CN102183205A (en) * 2011-01-19 2011-09-14 北京航空航天大学 Method for matching optimal assembly poses of large-sized parts
CN102589429A (en) * 2012-01-20 2012-07-18 天津大学 On-line engine cylinder hexahedral hole set position error detecting method
CN109952176A (en) * 2017-05-26 2019-06-28 深圳配天智能技术研究院有限公司 A kind of robot calibration method, system, robot and storage medium
WO2018214147A1 (en) * 2017-05-26 2018-11-29 深圳配天智能技术研究院有限公司 Robot calibration method and system, robot and storage medium
CN109900207A (en) * 2019-03-12 2019-06-18 精诚工科汽车系统有限公司 The tool center point scaling method and system of robot vision tool
CN110017769A (en) * 2019-03-12 2019-07-16 精诚工科汽车系统有限公司 Part detection method and system based on industrial robot
CN110068271A (en) * 2019-04-19 2019-07-30 怡得乐电子(杭州)有限公司 The PIN needle position degree detection method of the large size product of sub-pixel precision
CN110207565A (en) * 2019-05-31 2019-09-06 南京理工大学 A kind of high speed warp knit machine crankshaft rigging position degree measurement tooling and measurement method
CN110281238A (en) * 2019-06-17 2019-09-27 深圳视觉龙智能传感器有限公司 Assembly line multi-robot scaling method, device, computer equipment and storage medium
CN110455225A (en) * 2019-08-22 2019-11-15 吉林大学 Rectangle splined spindle concentricity and key position degree measurement method based on structure light vision

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
R.VISVANATHAN等: "Mobile robot localization system using multiple ceiling mounted cameras", 《2015 IEEE SENSORS》 *
YUN-HUI LIU等: "Dynamic tracking of manipulators using visual feedback from an uncalibrated fixed camera", 《PROCEEDINGS 2006 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION》 *
李兵等: "基于多传感器测量的孔组位置度快速检测方法", 《计算机集成制造系统》 *
陈朝: "孔组位置度视觉测量技术研究", 《中国优秀硕士学位论文全文数据库 信息科技辑》 *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111805158A (en) * 2020-05-25 2020-10-23 福建星云电子股份有限公司 Self-detection method and system for displacement of clamp of robot
CN116840243A (en) * 2023-09-01 2023-10-03 湖南睿图智能科技有限公司 Correction method and system for machine vision object recognition
CN116840243B (en) * 2023-09-01 2023-11-28 湖南睿图智能科技有限公司 Correction method and system for machine vision object recognition

Also Published As

Publication number Publication date
CN111145247B (en) 2023-07-07

Similar Documents

Publication Publication Date Title
CN110689579B (en) Rapid monocular vision pose measurement method and measurement system based on cooperative target
JP6280525B2 (en) System and method for runtime determination of camera miscalibration
CN102376089B (en) Target correction method and system
US9111177B2 (en) Position/orientation measurement apparatus, processing method therefor, and non-transitory computer-readable storage medium
CN105783711B (en) Three-dimensional scanner correction system and correction method thereof
CN106709955B (en) Space coordinate system calibration system and method based on binocular stereo vision
CN105066962B (en) A kind of high-precision photogrammetric apparatus of the big angle of visual field of multiresolution
CN103383238A (en) Image measurement apparatus, image measurement method and image measurement program
US20160065932A1 (en) Device and system for three-dimensional scanning, and method thereof
CN114714356A (en) Method for accurately detecting calibration error of hand eye of industrial robot based on binocular vision
CN112991453A (en) Calibration parameter calibration method and device for binocular camera and electronic equipment
CN106705860B (en) A kind of laser distance measurement method
CN111145247B (en) Position degree detection method based on vision, robot and computer storage medium
CN115187612A (en) Plane area measuring method, device and system based on machine vision
CN115540775A (en) 3D video extensometer of CCD single-phase machine
CN114018212B (en) Spherical camera monocular ranging-oriented pitch angle correction method and system
CN113822920A (en) Method for acquiring depth information by structured light camera, electronic equipment and storage medium
CN113658270A (en) Multi-view visual calibration method, device, medium and system based on workpiece hole center
Ricolfe-Viala et al. Optimal conditions for camera calibration using a planar template
CN112116665A (en) Structured light sensor calibration method
TW201317587A (en) Dimension measuring device and method thereof
CN116051634A (en) Visual positioning method, terminal and storage medium
JP7414850B2 (en) robot system
CN110866951B (en) Method for correcting optical axis inclination of monocular camera
CN113670280A (en) Verticality measuring device and method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant