CN111781894A - Method for carrying out space positioning and attitude navigation on assembly tool by using machine vision - Google Patents

Method for carrying out space positioning and attitude navigation on assembly tool by using machine vision

Info

Publication number
CN111781894A
CN111781894A
Authority
CN
China
Prior art keywords
output end
tool
coordinate system
point device
passive
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010728001.3A
Other languages
Chinese (zh)
Inventor
游四清
黄科
游晓龙
白灵
黄菊芳
孔俊
陈平平
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Feishuo Yiwei Chongqing Technology Co ltd
Original Assignee
Feishuo Yiwei Chongqing Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Feishuo Yiwei Chongqing Technology Co ltd filed Critical Feishuo Yiwei Chongqing Technology Co ltd
Priority to CN202010728001.3A priority Critical patent/CN111781894A/en
Publication of CN111781894A publication Critical patent/CN111781894A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00Programme-control systems
    • G05B19/02Programme-control systems electric
    • G05B19/18Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form
    • G05B19/19Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form characterised by positioning or contouring control systems, e.g. to control position from one programmed point to another or to control movement along a programmed continuous path
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B23MACHINE TOOLS; METAL-WORKING NOT OTHERWISE PROVIDED FOR
    • B23PMETAL-WORKING NOT OTHERWISE PROVIDED FOR; COMBINED OPERATIONS; UNIVERSAL MACHINE TOOLS
    • B23P19/00Machines for simply fitting together or separating metal parts or objects, or metal and non-metal parts, whether or not involving some deformation; Tools or devices therefor so far as not provided for in other classes
    • B23P19/04Machines for simply fitting together or separating metal parts or objects, or metal and non-metal parts, whether or not involving some deformation; Tools or devices therefor so far as not provided for in other classes for assembling or disassembling parts
    • B23P19/06Screw or nut setting or loosening machines
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/002Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C1/00Measuring angles
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/35Nc in input of data, input till input file format
    • G05B2219/35349Display part, programmed locus and tool path, traject, dynamic locus

Abstract

The invention provides a method for spatial positioning and attitude navigation of an assembly tool using machine vision, comprising the following steps: establishing a coordinate relationship between the output end of the assembly tool and a marker-point device; acquiring an image of the marker-point device in the camera coordinate system, and calculating both the spatial attitude of the marker-point device and the position and attitude of the tool's output end in that coordinate system; teaching, namely acquiring the position and attitude data for the whole process from the start to the completion of an operation; and, during operation of the assembly system, acquiring the spatial position and attitude data of the assembly tool in real time and comparing them with the taught position and attitude data. The invention provides a brand-new method for spatial positioning and attitude navigation of assembly tools and achieves high-precision spatial positioning of the assembly tool.

Description

Method for carrying out space positioning and attitude navigation on assembly tool by using machine vision
Technical Field
The invention belongs to the field of vision measurement technology and intelligent assembly, and particularly relates to a method for carrying out space positioning and attitude navigation on an assembly tool by using machine vision.
Background
In intelligent industrial production, the spatial position and attitude of an assembly tool (such as an electric wrench) must be measured for process control. For example, when screwing or riveting a structural component onto an engine cylinder block, it must be ensured, on the one hand, that assembly follows a defined sequence (such as diagonal assembly or a specified order) and, on the other hand, that the tightening torque or tensile force of each connecting piece meets the requirement, with no wrong or missed assemblies; ideally, the operator is also guided to begin assembly at the specified precise position.
In the prior art there are two approaches: navigation by a mechanical arm, and visual navigation by actively light-emitting LED marker points. In the mechanical-arm method, the assembly tool is fixed on a positioning arm equipped with angle encoders, and the spatial position and attitude of the tool are calculated from the encoder angles and the arm (or pull-rope) lengths. This method has many problems: (1) it cannot meet the wireless-positioning requirement of a cordless assembly tool; (2) it requires mechanical clamping and therefore interferes very easily with the workpiece and surrounding objects; (3) it has poor flexibility and convenience; (4) its response time is long and its measurement and positioning accuracy is insufficient. Visual navigation with actively light-emitting LED marker points has the following problems: (1) the LEDs must be powered: an external supply means a cable that interferes with the operation, while a battery supply means that, once the battery runs low, the LED brightness drops and the measurement accuracy suffers; (2) with frequent use the battery's supply time is limited and batteries must be replaced often; the gripping space of an assembly tool is usually limited, so an oversized battery is inconvenient and can obstruct the assembly space, and direct power supply from the tool would force the tool manufacturer to redesign the tool, which on the one hand makes existing tools unusable and on the other hand greatly hinders adoption of the navigation technology; (3) an actively emitting LED marker point is a point light source with a lens and a packaging shell, and cannot provide a consistent, stable center point to the vision system when photographed from different angles.
In addition, the output end of the assembly tool is often fitted with a long socket or other extension device, which generally has a certain fit clearance to suit the operation; the lever effect then amplifies both the positioning error of the marker points and their instability.
Disclosure of Invention
In view of the problems in the background art, the present invention aims to provide a method for spatial positioning and attitude navigation of an assembly tool using machine vision. To achieve this object, the present invention adopts the following technical solutions.
The method for carrying out space positioning and attitude navigation on the assembly tool by utilizing machine vision comprises the following steps:
step 1, calibrating the spatial position relationship between the output end of the assembly tool and a passive reflective marker-point device by means of a vision measurement host and a calibration plate, and establishing the coordinate relationship between the output end of the assembly tool and the passive reflective marker-point device; the passive reflective marker-point device is mounted on the assembly tool and is essentially a target;
step 2, acquiring an image of the passive light reflecting marking point device on a camera coordinate system, calculating the space posture of the passive light reflecting marking point device under the camera coordinate system, and calculating to obtain the position and the posture of the output end of the assembling tool under the camera coordinate system;
the camera coordinate system belongs to knowledge known by technicians in the field and is a Cartesian coordinate system established by taking a camera lens optical center as an origin, taking a lens optical axis as a Z axis, taking a two-dimensional image X axis of a camera as an X axis and taking a two-dimensional image Y axis as a Y axis;
step 3, teaching: moving the assembling tool with the passive light reflecting marking point device to one or more positions and postures needing to work one by one; when the assembling tool is started, the vision measurement host is adopted to obtain corresponding position and posture data of the whole process from the beginning to the completion of the operation;
step 4, in the working process of the assembly system, a vision measurement host is adopted to obtain the spatial position and the attitude data of the assembly tool in real time, and the spatial position and the attitude data are compared with the teaching position and the attitude data; and if the spatial position and attitude data acquired in real time are matched with the teaching position and attitude data, starting the operation of the assembling tool.
As a preferred scheme of the invention, the method for establishing the coordinate relationship between the output end of the assembling tool and the passive light-reflecting marking point device comprises the following steps:
establishing a local coordinate system C according to the known three-dimensional structure { Pi } of the mark points of the passive light-reflecting mark point device;
placing the output end of the assembly tool on the connecting port of the calibration plate, measuring the spatial coordinates {Qi} of the marker points of the passive reflective marker-point device in the camera coordinate system with the vision measurement host, and computing a linear transformation (R, T) such that Pi = R·Qi + T;
measuring the position and attitude of the calibration plate with the vision measurement host (since the relative positions of the calibration plate's marker points and connecting port are known in advance from machining, the position of the connecting port can be calculated), computing the coordinate M of the output end of the assembly tool in the camera coordinate system, and applying the linear transformation (R, T) to obtain M' = R·M + T; the resulting M' is the coordinate of the output end of the assembly tool in the local coordinate system C;
similarly, calculating to obtain a representation N' of the direction vector N of the output end of the assembling tool under the local coordinate system C;
the resulting (M ', N') constitutes a complete description of the six degrees of freedom of the output of the assembly tool.
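The calibration above amounts to fitting a rigid transform (R, T) between the measured camera-frame marker points {Qi} and their known local-frame positions {Pi}. A minimal sketch, assuming the standard SVD-based (Kabsch) least-squares solution; the function name and array layout are illustrative, not from the patent:

```python
import numpy as np

def fit_rigid_transform(Q, P):
    """Find (R, T) minimizing sum ||Pi - (R @ Qi + T)||^2.

    Q, P: (n, 3) arrays of corresponding marker coordinates
    (camera frame and local frame C, respectively).
    """
    q_mean, p_mean = Q.mean(axis=0), P.mean(axis=0)
    H = (Q - q_mean).T @ (P - p_mean)           # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    # Sign correction guarantees a proper rotation (det R = +1)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    T = p_mean - R @ q_mean
    return R, T
```

With at least three non-collinear marker points this recovers the transform exactly in the noise-free case, and gives the least-squares fit when measurements are noisy.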
As a preferred scheme of the invention, the method for calculating the spatial posture of the output end of the assembling tool under the camera coordinate system comprises the following steps:
according to the coordinate relation of the output end of the assembling tool and the passive light-reflecting mark point device, the spatial position and the posture of the output end of the assembling tool are obtained through conversion, and the method specifically comprises the following steps:
measuring the spatial coordinates {Qi} of the passive reflective marker points of the device in the camera coordinate system with the vision measurement host, taking their coordinates in the local coordinate system C as {Pi}, computing another linear transformation (R', T'), and applying it to the coordinate M' and direction vector N' of the output end of the assembly tool in the local coordinate system C to obtain M = R'·M' + T' and N = R'·N', which gives the complete six-degree-of-freedom description (M, N) of the output end of the assembly tool in the camera coordinate system.
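The real-time conversion step simply applies the fitted transform (R', T') to the stored tool-output description (M', N'); note that a direction vector is rotated but not translated. A sketch under the same assumptions as above (names are illustrative):

```python
import numpy as np

def tool_pose_in_camera(R_prime, T_prime, M_local, N_local):
    """Map the calibrated tool-output point M' and direction N'
    from local frame C into the camera frame, per
    M = R'.M' + T' and N = R'.N' (direction rotated only)."""
    M_cam = R_prime @ M_local + T_prime
    N_cam = R_prime @ N_local
    return M_cam, N_cam
```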
As a preferred embodiment of the present invention, in step 4, the spatial position and attitude data may be compared with the taught position and attitude data according to a preset logical sequence requirement and tolerance range, and the comparison result is output to the control system of the assembly tool in real time.
In a preferred embodiment of the invention, the vision measurement host is a monocular or binocular vision three-dimensional measurement host with infrared illumination.
As a preferred scheme of the invention, a plurality of passive reflective marker points are arranged on the passive reflective marker-point device, and one or more such devices are arranged on the assembly tool to eliminate possible blind spots at different viewing angles.
By adopting the above technical solutions, the invention provides a brand-new method for spatial positioning and attitude navigation of assembly tools, using machine vision and a passive reflective marker device for spatial positioning and attitude tracking of the tool:
the positioning and tracking are wireless, so the cordless working habits of most operators need not change;
spatial positioning and attitude navigation are achieved without mechanical interference, so neither the workpiece nor surrounding objects obstruct the assembly tool, and operational flexibility is preserved;
response is fast, with a tested response time of only about 0.1 s;
positioning is highly precise, with a measurement and positioning accuracy of 0.1-1 mm within 2 m of the monocular or binocular vision three-dimensional measurement host;
the marker structure on the tool needs no power supply, is light and small, and does not interfere with gripping the original tool;
the marker structure mounts without damage and can be installed on existing tools, so the tool manufacturer need not redesign or re-produce the tool;
this is of great significance for assembly-process quality control and for error-proofing, omission-proofing, and fool-proofing of the assembly process.
Drawings
FIG. 1 is a schematic diagram illustrating the embodiment of the present invention when a vision measuring host is used to calibrate the spatial position relationship between the output end of an assembling tool and a passive reflective marking point device;
FIG. 2 is a schematic illustration of the spatial positioning and attitude navigation of an assembly tool using machine vision during assembly;
FIG. 3 is a schematic diagram of a calibration plate used when a vision measurement host is used to calibrate the spatial position relationship between the output end of the assembly tool and the passive reflective marker point device;
FIG. 4 is an assembly start position attitude view;
FIG. 5 is an assembly completed position attitude view;
FIG. 6 is a flow chart for assembly tool spatial positioning and pose navigation using machine vision.
Detailed Description
The present invention will be further described below with reference to the accompanying drawings and specific embodiments; the following embodiments serve only to explain the principle and core idea of the invention and do not limit its scope. Modifications that do not depart from the principles of the invention are intended to fall within the scope of the appended claims.
Examples
As shown in fig. 1, 2 and 3, the system for spatial positioning and attitude navigation of an assembly tool using machine vision includes: an assembly tool control system 5 connected to a monocular or binocular vision three-dimensional measurement host 1 by wireless or wired communication 6; and a passive reflective marker-point device 2 mounted on the assembly tool 3 and carrying a plurality of passive reflective marker points, whose three-dimensional structure is obtained by machining or by accurate calibration with the binocular vision three-dimensional measurement host. One or more passive reflective marker-point devices 2 may be arranged on the assembly tool 3, and the measurement host 1 outputs its results to the assembly tool control system 5 in real time over various wireless or wired communication protocols. The coordinates of the output end of the assembly tool 3 are calibrated with the calibration plate 4, which carries a connecting port whose axis coincides with the axis of the tool's output end and whose end face is in close contact with it; the port tightly receives the output end of the assembly tool 3 (namely the lower end of the output end), and a plurality of calibration points are arranged in one plane on the calibration plate 4 around the port.
The method for carrying out space positioning and attitude navigation on the assembly tool by utilizing machine vision comprises the following steps:
step 1, calibrating the space position relation of the output end of the assembling tool and the passive light-reflecting marking point device by adopting a vision measuring host 1 and a calibration plate 4, and establishing the coordinate relation of the output end of the assembling tool and the passive light-reflecting marking point device, specifically:
establishing a local coordinate system C according to the known three-dimensional structure { Pi } of the mark points of the passive light-reflecting mark point device;
placing the output end of the assembly tool 3 in the connecting port of the calibration plate 4, measuring the spatial coordinates {Qi} of the marker points in the camera coordinate system with the vision measurement host, and computing a linear transformation (R, T) such that Pi = R·Qi + T;
measuring the position and attitude of the calibration plate 4 with the vision measurement host 1 (since the relative positions of the marker points and the connecting port of the calibration plate 4 are known from machining, the position of the connecting port can be calculated), finally obtaining the coordinate M of the output end of the assembly tool and transforming it by the linear transformation (R, T) into M' = R·M + T; the resulting M' is the coordinate of the output end of the assembly tool in the local coordinate system C;
because the passive reflective marker-point device is fixed on the assembly tool and forms a rigid body with it, the coordinate of the output end of the assembly tool in the local coordinate system C is constant; the representation N' of the direction vector N of the output end in the local coordinate system C can be calculated in the same way;
the resulting (M ', N') constitutes a complete description of the six degrees of freedom of the output of the assembly tool;
step 2, acquiring an image of the passive light reflecting marking point device on a camera coordinate system, and calculating the space posture of the passive light reflecting marking point device under the camera coordinate system, specifically:
measuring the spatial coordinates {Qi} of the marker points of the passive reflective marker-point device in the camera coordinate system with the vision measurement host, taking their coordinates in the local coordinate system C as {Pi}, and computing another linear transformation (R', T') such that Qi = R'·Pi + T';
applying this linear transformation to the coordinate M' and the direction vector N' of the output end of the assembly tool in the local coordinate system C yields M = R'·M' + T' and N = R'·N', giving the complete six-degree-of-freedom description (M, N) of the output end of the assembly tool in the camera coordinate system;
step 3, teaching: the assembly tool with the passive reflective marker-point device is moved, one by one, to the one or more positions and attitudes at which work is to be performed; the vision measurement host acquires the effective operating range of the position and attitude data corresponding to a standard assembly, from start to qualified completion, and the logical sequence and tolerance range are preset.
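The teaching step amounts to recording a time-stamped pose trajectory per process step. A minimal illustrative data structure; the class and attribute names are assumptions for the sketch, not from the patent:

```python
import time
from dataclasses import dataclass, field

@dataclass
class TaughtStep:
    """Recorded pose trajectory for one taught assembly process step."""
    step_no: int
    poses: list = field(default_factory=list)  # (timestamp, M, N) samples

    def record(self, M, N):
        """Append one measured output-end pose (position M, direction N)."""
        self.poses.append((time.monotonic(), M, N))

    @property
    def start_pose(self):
        """(M, N) at the start of the operation, or None if empty."""
        return self.poses[0][1:] if self.poses else None

    @property
    def end_pose(self):
        """(M, N) at the completion of the operation, or None if empty."""
        return self.poses[-1][1:] if self.poses else None
```

The start and end poses correspond to the confirmed start-of-operation and end-of-operation attitudes of figs. 4 and 5; the intermediate samples capture the taught trajectory.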
Step 4, during operation of the assembly system, the vision measurement host acquires the spatial position and attitude data of the output end of the assembly tool in real time, compares them with the taught position and attitude data, and outputs the comparison result to the control system of the assembly tool in real time; if the data acquired in real time match the effective operating range, the assembly tool is allowed to start.
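The comparison in step 4 can be sketched as a tolerance check on the output end's position and axis direction against the taught pose. The thresholds below are illustrative placeholders, not values from the patent:

```python
import numpy as np

def pose_matches(M, N, M_taught, N_taught, pos_tol=1.0, ang_tol_deg=3.0):
    """Return True if the measured pose (M, N) lies within tolerance of
    the taught pose: position error within pos_tol (same units as M,
    e.g. mm) and axis direction within ang_tol_deg degrees."""
    pos_err = np.linalg.norm(np.asarray(M) - np.asarray(M_taught))
    cosang = np.dot(N, N_taught) / (np.linalg.norm(N) * np.linalg.norm(N_taught))
    ang_err = np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))
    return pos_err <= pos_tol and ang_err <= ang_tol_deg
```

In a full system this check would additionally be gated by the preset logical sequence, so that a pose match only enables the tool for the currently expected process step.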
In one specific operation scheme, as shown in figs. 1 to 6, taking an electric wrench tightening a bolt 7 to assemble a first workpiece 8 and a second workpiece 9 as an example, the process of spatial positioning and attitude navigation of the assembly tool using machine vision is as follows:
starting calibration, firstly adjusting and confirming that a calibration plate 4 is in the visual field of a camera (namely a monocular or binocular vision three-dimensional measurement host 1), then placing the output end of an assembling tool 3 with a passive reflective marking point device 2 in a calibration plate connecting port (namely a connecting port on the calibration plate 4), and starting to calibrate the output end of the assembling tool 3;
establishing a coordinate relation between the output end of the assembling tool and the passive light-reflecting marking point device 2;
start teaching: first move the assembly tool 3 to the start-of-operation coordinate attitude of process step N and confirm (as shown in fig. 4), then move the assembly tool 3 to the end-of-operation coordinate attitude of process step N and confirm (as shown in fig. 5); during this procedure the vision measurement host 1 acquires the corresponding taught position and attitude data of the output end of the assembly tool; the movement attitude and trajectory of the tool from the start to the end of the operation position can also be recorded, thereby teaching the attitude trajectory of the tool from the start to the completion of a qualified tightening;
after teaching is finished, during actual operation of the assembly system, the vision measurement host 1 first acquires the spatial position and attitude data of the assembly tool 3 in real time and simultaneously calculates the position and attitude of the tool's output end; these are compared with the taught position and attitude data, and if the real-time data match the taught data (that is, the position and attitude of the output end measured by the vision measurement host match those taught for process step N), the assembly tool operation is enabled (the output work sequence number notifies the assembly tool to run, and the bolt-tightening operation begins); then, when the position and attitude of the output end measured by the vision host match the end-of-operation attitude taught for process step N, the bolt-tightening operation is finished;
and starting the next process operation until all the processes are finished.
In the specific operation scheme of the invention, the spatial position and trajectory with which the assembly tool 3 completes the operation are preset mainly by teaching, and the relation between the current position and the taught position is compared and displayed. During assembly, when the assembly tool 3 enters the taught spatial position and attitude with the correct posture or trajectory, the vision measurement host sends a qualified in-place signal to the assembly tool control system 5; the operator can then start the tool and work until the assembly is qualified, with coordinate and attitude data provided. Furthermore, an alarm can be raised when the assembly tool 3 leaves the working area.

Claims (6)

1. The method for carrying out space positioning and attitude navigation on the assembly tool by utilizing machine vision is characterized by comprising the following steps of:
step 1, calibrating a spatial position relation between an output end of an assembling tool and a passive light-reflecting marking point device by adopting a vision measurement host and a calibration plate, and establishing a coordinate relation between the output end of the assembling tool and the passive light-reflecting marking point device; the passive light-reflecting marking point device is arranged on the assembling tool;
step 2, acquiring an image of the passive light reflecting marking point device on a camera coordinate system, calculating the space posture of the passive light reflecting marking point device under the camera coordinate system, and calculating to obtain the position and the posture of the output end of the assembling tool under the camera coordinate system;
step 3, teaching: moving the assembling tool with the passive light reflecting marking point device to one or more positions and postures needing to work one by one; when the assembling tool is started, the vision measurement host is adopted to obtain corresponding position and posture data of the whole process from the beginning to the completion of the operation;
step 4, in the working process of the assembly system, a vision measurement host is adopted to obtain the spatial position and the attitude data of the assembly tool in real time, and the spatial position and the attitude data are compared with the teaching position and the attitude data; and if the spatial position and attitude data acquired in real time are matched with the teaching position and attitude data, starting the operation of the assembling tool.
2. The method of claim 1, wherein the coordinate relationship between the output of the assembly tool and the passive retro-reflective marker means is established by:
establishing a local coordinate system C according to the known three-dimensional structure { Pi } of the mark points of the passive light-reflecting mark point device;
placing the output end of the assembly tool on a connecting port of a calibration plate, measuring spatial coordinates {Qi} of the marker points of the passive reflective marker-point device in the camera coordinate system with the vision measurement host, and computing a linear transformation (R, T) such that Pi = R·Qi + T; measuring the position and attitude of the calibration plate with the vision measurement host, computing the coordinate M of the output end of the assembly tool in the camera coordinate system, and transforming M by the linear transformation (R, T) into M' = R·M + T, wherein M' is the coordinate of the output end of the assembly tool in the local coordinate system C;
similarly, calculating to obtain a representation N' of the direction vector N of the output end of the assembling tool under the local coordinate system C;
the resulting (M ', N') constitutes a complete description of the six degrees of freedom of the output of the assembly tool.
3. The method of claim 2, wherein the method of calculating the spatial pose of the output end of the assembly tool in the camera coordinate system comprises:
according to the coordinate relation of the output end of the assembling tool and the passive light-reflecting mark point device, the spatial position and the posture of the output end of the assembling tool are obtained through conversion, and the method specifically comprises the following steps:
measuring spatial coordinates {Qi} of the passive reflective marker points of the device in the camera coordinate system with the vision measurement host, taking their coordinates in the local coordinate system C as {Pi}, computing another linear transformation (R', T'), and applying it to the coordinate M' and direction vector N' of the output end of the assembly tool in the local coordinate system C to obtain M = R'·M' + T' and N = R'·N', yielding the complete six-degree-of-freedom description (M, N) of the output end of the assembly tool in the camera coordinate system.
4. The method of claim 3, wherein the spatial position and attitude data are compared with taught position and attitude data according to a preset logical sequence requirement and tolerance range, and the comparison result is output in real time to the control system of the assembling tool.
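The comparison of claim 4 — checking a measured pose (M, N) against a taught pose within a tolerance range — could be sketched as follows; the tolerance values, units, and function name are assumptions for illustration, not specified by the patent:

```python
import numpy as np

def within_tolerance(M, N, M_taught, N_taught, pos_tol=0.5, ang_tol_deg=1.0):
    """Return True if measured pose (M, N) matches the taught pose within
    an assumed position tolerance (same units as M) and angular tolerance."""
    pos_err = np.linalg.norm(M - M_taught)
    cosang = np.clip(np.dot(N, N_taught)
                     / (np.linalg.norm(N) * np.linalg.norm(N_taught)), -1.0, 1.0)
    ang_err = np.degrees(np.arccos(cosang))  # angle between direction vectors
    return pos_err <= pos_tol and ang_err <= ang_tol_deg
```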
5. The method according to any one of claims 1 to 4, wherein the vision measurement host is a monocular or binocular three-dimensional vision measurement host with infrared illumination.
6. The method of claim 5, wherein the passive retro-reflective marker point device is provided with a plurality of passive retro-reflective marker points, and the assembling tool is provided with one or more passive retro-reflective marker point devices.
CN202010728001.3A 2020-07-23 2020-07-23 Method for carrying out space positioning and attitude navigation on assembly tool by using machine vision Pending CN111781894A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010728001.3A CN111781894A (en) 2020-07-23 2020-07-23 Method for carrying out space positioning and attitude navigation on assembly tool by using machine vision


Publications (1)

Publication Number Publication Date
CN111781894A true CN111781894A (en) 2020-10-16

Family

ID=72764184

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010728001.3A Pending CN111781894A (en) 2020-07-23 2020-07-23 Method for carrying out space positioning and attitude navigation on assembly tool by using machine vision

Country Status (1)

Country Link
CN (1) CN111781894A (en)


Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112558545A (en) * 2020-11-17 2021-03-26 沈机(上海)智能系统研发设计有限公司 Interactive system, method and storage medium based on machine tool machining
CN112558545B (en) * 2020-11-17 2022-07-15 沈机(上海)智能系统研发设计有限公司 Interactive system, method and storage medium based on machine tool machining
CN113093356A (en) * 2021-03-18 2021-07-09 北京空间机电研究所 Large-scale block optical component assembling method based on mechanical arm
CN113093356B (en) * 2021-03-18 2022-08-12 北京空间机电研究所 Large-scale block optical component assembling method based on mechanical arm
CN113211444A (en) * 2021-05-20 2021-08-06 菲烁易维(重庆)科技有限公司 System and method for robot calibration
CN114305683A (en) * 2021-12-03 2022-04-12 哈尔滨工业大学 Surgical instrument registration device and method
CN114305683B (en) * 2021-12-03 2023-01-06 哈尔滨工业大学 Surgical instrument registration device and method
CN114905511A (en) * 2022-05-12 2022-08-16 南京航空航天大学 Industrial robot assembly error detection and precision compensation system calibration method
CN114905511B (en) * 2022-05-12 2023-08-11 南京航空航天大学 Industrial robot assembly error detection and precision compensation system calibration method
CN115570562A (en) * 2022-09-05 2023-01-06 梅卡曼德(北京)机器人科技有限公司 Robot assembly pose determining method and device, robot and storage medium
CN117531948A (en) * 2024-01-10 2024-02-09 南京航空航天大学 Man-machine cooperation riveting system and cooperation riveting method
CN117531948B (en) * 2024-01-10 2024-04-05 南京航空航天大学 Man-machine cooperation riveting system and cooperation riveting method

Similar Documents

Publication Publication Date Title
CN111781894A (en) Method for carrying out space positioning and attitude navigation on assembly tool by using machine vision
CN109974584B (en) Calibration system and calibration method for auxiliary laser osteotomy robot
CN109373898B (en) Complex part pose estimation system and method based on three-dimensional measurement point cloud
CN109794963B (en) Robot rapid positioning method facing curved surface component
US9517560B2 (en) Robot system and calibration method of the robot system
CN110370316B (en) Robot TCP calibration method based on vertical reflection
JP5618770B2 (en) Robot calibration apparatus and calibration method
CN106355614B (en) Mechanical system correcting and monitoring device
CN113601158B (en) Bolt feeding pre-tightening system based on visual positioning and control method
WO2023193362A1 (en) Hybrid robot and three-dimensional vision based large-scale structural part automatic welding system and method
CN114643578B (en) Calibration device and method for improving robot vision guiding precision
CN112082477A (en) Universal tool microscope three-dimensional measuring device and method based on structured light
CN113146613B (en) Three-dimensional self-calibration device and method for D-H parameters of industrial robot
CN109059755B (en) High-precision hand-eye calibration method for robot
CN105737735A (en) Portable self-calibration end performer repetition positioning precision measurement device and method
CN114001653A (en) Calibration method for central point of robot tool
CN113211444B (en) System and method for robot calibration
CN113681559B (en) Line laser scanning robot hand-eye calibration method based on standard cylinder
Hvilshøj et al. Calibration techniques for industrial mobile manipulators: Theoretical configurations and best practices
CN215037637U (en) Camera external parameter calibration device for visual guidance of industrial robot
CN112958960B (en) Robot hand-eye calibration device based on optical target
CN110962127A (en) Auxiliary calibration device for tail end pose of mechanical arm and calibration method thereof
CN105571491A (en) Binocular vision-based automobile chassis data measuring system and method thereof
CN111006706A (en) Rotating shaft calibration method based on line laser vision sensor
CN114643577B (en) Universal robot vision automatic calibration device and method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination