CN112577423A - Method for machine vision position location including in motion and application thereof - Google Patents

Method for machine vision position location including in motion and application thereof

Info

Publication number
CN112577423A
CN112577423A (application CN202011099720.XA)
Authority
CN
China
Prior art keywords
coordinate system
jig
station
processing
workpiece
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202011099720.XA
Other languages
Chinese (zh)
Other versions
CN112577423B (en)
Inventor
彭伟 (Peng Wei)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wuhan Lingyun Photoelectronic System Co., Ltd.
Original Assignee
Wuhan Lingyun Photoelectronic System Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wuhan Lingyun Photoelectronic System Co., Ltd.
Priority to CN202011099720.XA
Publication of CN112577423A
Application granted
Publication of CN112577423B
Legal status: Active
Anticipated expiration

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01B: MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00: Measuring arrangements characterised by the use of optical techniques
    • G01B11/002: Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates

Abstract

The invention discloses a method for machine vision position location during motion and an application thereof. A jig coordinate system is established that is either identical to, or associated through a mapping transformation with, the coordinate system of the processing execution mechanism at the processing station. While the jig moves toward the processing station, when the marked reference point and the workpiece to be processed are both within the field of view of the vision system camera, a sharp image frame is captured, and the position N(xn, yn) of the marked reference point and the position M(xm, ym) of the workpiece to be processed are obtained in the image physical coordinate system. The position M(xm, ym) is then converted into the position coordinates M'(xm', ym') in the jig coordinate system, which gives the machining coordinates of the workpiece to be processed at the processing station. The vision system can thus achieve accurate and reliable machining position correction without requiring the processed object to come to a stable stop during motion, eliminating the start-stop acceleration/deceleration and settling-in-place time of the motion mechanism at the positioning identification point and thereby further improving production and processing efficiency.

Description

Method for machine vision position location including in motion and application thereof
Technical Field
The invention belongs to machine vision positioning technology in the field of industrial processing, and in particular relates to a technique that uses machine vision to locate a workpiece to be processed during machining.
Background
In some industrial processing systems, such as dispensing, soldering, welding, and laser processing systems, a vision-assisted positioning system is commonly used to improve processing accuracy. After the positioning camera in Fig. 1 captures an image, the image is processed to produce position information for the object to be processed, and the back end uses this position information for position correction. The simplified main flow of such a processing system is: loading and fixing, visual identification and positioning, and end processing. In actual production, owing to safety considerations and other objective factors, the processed object usually travels only a short distance between these process points. Stopping the moving object at the visual identification point and starting it again inevitably adds time cost and reduces production efficiency.
In the conventional solution (Fig. 2), there are generally three preset fixed positions: a loading fixing position A, a visual identification position B, and a processing position C. Through pre-calibration, the vision system corrects the image physical coordinate system at position B so that it coincides with, or is associated with, the processing coordinate system at position C. The position information of the object obtained by visual processing at position B then serves as the corrected processing position at C. However, when the motion mechanism has not come to rest stably at position B (not yet in place, or overshooting), the actual coordinate system in the image field of view is offset or even rotated relative to the machining coordinate system at C. In this case the visually recognized position of the processing object is erroneous, which is one of the common causes of errors in visual positioning data. This scheme therefore yields accurate and reliable position-correction information only when the object is stationary at a known position, and time is wasted waiting for the object to reach that state.
Disclosure of Invention
The invention aims to provide a method for machine vision position location during motion, and an application thereof, so as to improve the takt time of workpiece loading, conveying, and processing and thereby further improve processing efficiency.
The first technical scheme of the invention is as follows. The method involves a feeding station, a vision system recognition station, and a processing station, arranged at intervals in sequence, together with a jig. A marked reference point is provided on the jig, a jig coordinate system is established based on the marked reference point, and the jig coordinate system is either identical to, or associated through a mapping transformation with, the coordinate system of the processing execution mechanism at the processing station. The workpiece to be processed is fixed on the jig at the feeding station. While the jig moves toward the processing station, once the marked reference point and the workpiece to be processed are within the field of view of the vision system camera, a sharp image frame is captured, and the position N(xn, yn) of the marked reference point and the position M(xm, ym) of the workpiece to be processed are obtained in the image physical coordinate system. The position M(xm, ym) is then converted into the position coordinates M'(xm', ym') in the jig coordinate system, giving the machining coordinates of the workpiece to be processed at the processing station.
A further preferred technical scheme is as follows: capturing a sharp image frame includes adjusting the shutter speed of the vision system camera according to the movement speed of the jig.
A further preferred technical scheme is as follows: the jig coordinate system constructed from the marked reference point is a rectangular coordinate system with the marked reference point as its origin.
A further preferred technical scheme is as follows: a positioning structure is arranged between the jig and the processing station; after the jig moves to the processing station, it is located by the positioning structure, which in turn locates the marked reference point on the jig at the processing station.
A further preferred technical scheme is as follows: the jig coordinate system is unified with the coordinate system of the processing execution mechanism at the processing station by calibration, or associated with it through a mapping transformation, after the jig has been located at the processing station.
A further preferred technical scheme is as follows: a conveying device is arranged between the feeding station, the vision system recognition station, and the processing station, which are arranged at intervals in sequence, and the jig is carried on the conveying device during machine vision position location.
A further preferred technical scheme is as follows: capturing a sharp image includes capturing it while the jig is moving or while the jig is stationary.
The second technical scheme of the invention is the use of the above method in the laser processing of a workpiece.
The jig coordinate system and the coordinate system of the processing execution mechanism at the processing station are unified by calibration or associated through a mapping transformation. In the invention, the position coordinates of the workpiece in the image coordinate system are converted, through the coordinate relationships of the vision system, into position coordinates in the jig coordinate system. Once the jig is located at the processing station, the marked reference point on the jig is likewise located at the processing station, so the position coordinates of the workpiece on the jig also determine its position coordinates at the processing station. In this way, images of the jig (and workpiece) can be captured during motion, greatly improving image acquisition efficiency; at the same time, workpieces can be placed arbitrarily on the same jig, which improves the jig's compatibility with different workpieces, raises the efficiency of the loading stage, and improves the takt time of workpiece loading, conveying, and processing. The method can be used widely in machine-vision-based processing scenarios and is particularly suitable for laser processing, such as laser welding.
Drawings
FIG. 1 is a schematic view of a visual imaging system.
FIG. 2 is a schematic view of a vision-assisted positioning system.
FIG. 3 is a schematic view of a coordinate system of the jig of the present invention.
Fig. 4 is a schematic diagram of an effective acquisition frame in a motion state.
In the figures: 1, vision system camera; 2, camera lens; 3, vision system light source; 4, jig; 5, workpiece to be processed; 6, indicated normal shooting position; 7, indicated position where the workpiece has not yet arrived; 8, indicated position where the workpiece has overshot; 9, marked reference point on the jig; 10, image physical coordinate system; 11, jig coordinate system; 12, carrying and processing zone; 13, vision system camera field of view (calibration) diagram. A, feeding station; B, vision system identification station; C, processing station.
Detailed Description
The following detailed description explains embodiments of the claimed invention so that those skilled in the art can understand the claims. The scope of the invention is not limited to the specific implementations described below; the scope is defined by the claims, read in light of this detailed description.
The jig is used to fix the workpiece to be processed and can move between the stations. It may adopt, but is not limited to, a carrier platform (stage) with a clamp structure, or a carrier platform (stage) with a hold-down structure or clamp.
The marked reference point arranged on the jig may be, but is not limited to, a raised or recessed point at a given position on the jig, a painted colored point, or an affixed marker.
The jig coordinate system is a rectangular coordinate system constructed with the marked reference point as its origin. It may be a data coordinate system in software, or may comprise both a data coordinate system in software and a physical coordinate system.
the image physical coordinate system is the coordinate system of camera imaging in the vision system, which is a conventional arrangement in the machine vision recognition technology.
The coordinate system of the processing execution mechanism is the software data coordinate system used at the processing position to determine the position of the workpiece to be processed; this is a conventional arrangement for workpiece position determination in machine vision recognition technology.
The conveying device arranged between feeding station A, vision system identification station B, and processing station C, which are arranged at intervals in sequence, may adopt, but is not limited to, a conveyor belt, a conveyor chain, a robot, and the like.
The positioning structure arranged between the jig and the processing station may adopt, but is not limited to, a positioning pin, a positioning buckle, or a positioning protrusion and groove.
After the jig coordinate system is established, the jig is located at the processing station and the coordinate system of the processing execution mechanism at that station is calibrated against the jig coordinate system, so that the two are either calibrated into the same coordinate system (the two coordinate systems coincide) or related by a fixed mapping transformation (comprising a translation and/or a rotation angle).
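For illustration only (this code is not part of the patent), the sketch below applies such a fixed mapping, obtained once during calibration, to carry a point from the jig coordinate system into the processing execution mechanism coordinate system; when the two systems are calibrated to coincide, as in this embodiment, the translation and rotation are zero and the point passes through unchanged. The function and parameter names are assumptions made for the example.

    import math

    def jig_to_actuator(p_jig, t=(0.0, 0.0), phi_rad=0.0):
        """Map a point from the jig coordinate system into the processing
        execution mechanism coordinate system using a fixed calibration
        result: translation t = (tx, ty) and rotation phi_rad."""
        x, y = p_jig
        xa = x * math.cos(phi_rad) - y * math.sin(phi_rad) + t[0]
        ya = x * math.sin(phi_rad) + y * math.cos(phi_rad) + t[1]
        return (xa, ya)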
In this embodiment, the coordinate system of the processing execution mechanism and the jig coordinate system are calibrated to be the same.
This embodiment is used for the laser processing of a workpiece. A galvanometer (vibrating mirror) processing head is arranged at processing station C.
In this embodiment, a raised, colored reference mark point is arranged in a non-carrying, non-processing area on the carrier of the jig (carrier and clamp), as shown in Fig. 3.
the workpiece to be machined is fixedly arranged on the jig at a feeding station, the jig and the workpiece to be machined are moved to the machining station by the transmission device, the visual system identifies the station, after the reference point is marked and the workpiece to be machined is located in the visual field range of the visual system camera, the sensor of the visual system identification station is triggered, the camera starts to shoot and collect a clear frame of image, and the image comprises the workpiece to be machined and the reference point. And in the process of acquiring the image by the vision system, the jig does not stop moving.
To obtain sharp images, the shutter speed (frame rate) of the vision system camera is adjusted according to the jig's movement speed. Either a fixed movement speed set on the conveying device can be used with a correspondingly set shutter speed, or the speed of the conveying device can be measured in real time and the camera's shutter speed adjusted dynamically.
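As a rough sketch of how the shutter (exposure) time could be bounded by the movement speed, assuming the image scale is known from camera calibration and that about half a pixel of motion blur is tolerable; the formula, limit, and names below are illustrative assumptions, not values given in the patent:

    def max_exposure_s(speed_mm_s, mm_per_px, max_blur_px=0.5):
        """Longest exposure time (seconds) that keeps the motion blur of the
        moving jig below max_blur_px pixels in the captured frame.
        speed_mm_s: fixed or real-time measured conveyor speed.
        mm_per_px: image scale from camera calibration."""
        return (max_blur_px * mm_per_px) / speed_mm_s

    # Example: a 100 mm/s conveyor with 0.02 mm/px optics allows about 0.1 ms.
    print(max_exposure_s(100.0, 0.02))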
From the captured image of the workpiece and the marked reference point, the position N(xn, yn) of the marked reference point and the position M(xm, ym) of the workpiece are obtained in the image physical coordinate system. The position coordinates M'(xm', ym') of workpiece point M in the jig coordinate system are then calculated using the transformation principle of the two-dimensional plane coordinate system, as shown in Fig. 4.
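The two-dimensional plane transformation referred to above can be sketched as follows. This is an illustrative implementation under stated assumptions, not code from the patent: it assumes the image scale (millimetres per pixel) and the rotation of the jig axes relative to the image axes are known from camera calibration, and it takes the marked reference point N as the origin of the jig coordinate system.

    import math

    def image_to_jig(m_px, n_px, mm_per_px, theta_rad=0.0):
        """Convert workpiece point M from image pixel coordinates into the
        jig coordinate system whose origin is the marked reference point N.
        m_px, n_px: (x, y) pixel positions of M and N in the same frame.
        mm_per_px: image scale from camera calibration.
        theta_rad: rotation of the jig axes relative to the image axes."""
        dx = (m_px[0] - n_px[0]) * mm_per_px
        dy = (m_px[1] - n_px[1]) * mm_per_px
        # Rotate the offset vector from image axes into jig axes.
        xm = dx * math.cos(theta_rad) + dy * math.sin(theta_rad)
        ym = -dx * math.sin(theta_rad) + dy * math.cos(theta_rad)
        return (xm, ym)  # M'(xm', ym') in the jig coordinate system

Because M and N are measured in the same frame, any displacement of the moving jig within the camera's field of view cancels out of the difference, which is why the capture does not require the jig to stop.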
After the jig moves to processing station C and is located by the positioning structure arranged between the jig and the station, the position coordinates M'(xm', ym') of workpiece point M in the jig coordinate system are the machining coordinates. Once the workpiece has been loaded and clamped, its position relative to the jig remains unchanged, i.e., the relative positions of M and N are fixed, so the machining coordinates obtained from all effective acquisition frames of the current workpiece are consistent. In other words, the actual machining position of the workpiece is calculated accurately while it is in motion.
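Since the relative positions of M and N are fixed after clamping, every effective frame should yield the same machining coordinates. The sketch below, an illustrative consistency check that is not described in the patent, averages the per-frame results and flags any spread beyond a hypothetical tolerance:

    def fuse_frames(per_frame_coords, tol_mm=0.05):
        """Combine machining coordinates computed from several effective
        frames of the same workpiece; they should agree because the workpiece
        does not move relative to the jig."""
        xs = [p[0] for p in per_frame_coords]
        ys = [p[1] for p in per_frame_coords]
        spread = max(max(xs) - min(xs), max(ys) - min(ys))
        if spread > tol_mm:
            raise ValueError("inconsistent frames: spread %.3f mm" % spread)
        return (sum(xs) / len(xs), sum(ys) / len(ys))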
The invention overcomes the shortcomings of the conventional scheme: the vision system achieves accurate and reliable machining position correction without requiring the processed object to come to a stable stop, eliminating the start-stop acceleration/deceleration and settling-in-place time of the motion mechanism at the positioning identification point and thereby further improving production and processing efficiency.

Claims (9)

1. A method for machine vision position location during motion, comprising a feeding station, a vision system identification station, and a processing station arranged at intervals in sequence, and a jig, characterized in that a marked reference point is provided on the jig, a jig coordinate system is established based on the marked reference point, and the jig coordinate system is identical to, or associated through a mapping transformation with, the coordinate system of the processing execution mechanism at the processing station; a workpiece to be processed is fixed on the jig at the feeding station; while the jig moves toward the processing station, when the marked reference point and the workpiece to be processed are within the field of view of the vision system camera, a sharp image frame is captured, the position N(xn, yn) of the marked reference point and the position M(xm, ym) of the workpiece to be processed are obtained in the image physical coordinate system, the position M(xm, ym) is converted into the position coordinates M'(xm', ym') in the jig coordinate system, and the machining coordinates of the workpiece to be processed at the processing station are thereby obtained.
2. The method for machine vision position location during motion according to claim 1, wherein capturing a sharp image frame comprises adjusting the shutter speed of the vision system camera according to the movement speed of the jig.
3. The method for machine vision position location during motion according to claim 1, wherein the jig coordinate system constructed from the marked reference point is a rectangular coordinate system with the marked reference point as its origin.
4. The method for machine vision position location during motion according to claim 1 or 3, wherein a positioning structure is arranged between the jig and the processing station, and after the jig moves to the processing station it is located by the positioning structure, thereby locating the marked reference point on the jig at the processing station.
5. The method for machine vision position location during motion according to claim 1 or 3, wherein the jig coordinate system is unified with the coordinate system of the processing execution mechanism at the processing station by calibration, or associated with it through a mapping transformation, after the jig has been located at the processing station.
6. The method for machine vision position location during motion according to claim 4, wherein the jig coordinate system is unified with the coordinate system of the processing execution mechanism at the processing station by calibration, or associated with it through a mapping transformation, after the jig has been located at the processing station.
7. The method for machine vision position location during motion according to claim 1, wherein a conveying device is arranged between the feeding station, the vision system identification station, and the processing station, the jig being carried on the conveying device during machine vision position location.
8. The method for machine vision position location during motion according to claim 1, wherein capturing a sharp image comprises capturing the image while the jig is in motion or while the jig is stationary.
9. Use of the method for machine vision position location during motion according to any one of claims 1 to 8, characterized in that the method is used in the laser processing of a workpiece.
CN202011099720.XA 2020-10-13 2020-10-13 Method for machine vision position location in motion and application thereof Active CN112577423B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011099720.XA CN112577423B (en) 2020-10-13 2020-10-13 Method for machine vision position location in motion and application thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011099720.XA CN112577423B (en) 2020-10-13 2020-10-13 Method for machine vision position location in motion and application thereof

Publications (2)

Publication Number Publication Date
CN112577423A (en) 2021-03-30
CN112577423B (en) 2022-09-09

Family

ID=75119838

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011099720.XA Active CN112577423B (en) 2020-10-13 2020-10-13 Method for machine vision position location in motion and application thereof

Country Status (1)

Country Link
CN (1) CN112577423B (en)

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001074428A (en) * 1999-09-03 2001-03-23 Sanyo Electric Co Ltd Method and jig for calibrating shape measuring apparatus
US20030211409A1 (en) * 2002-05-10 2003-11-13 Christopher C. Nunes Through-the-lens alignment for photolithography
CN106426161A (en) * 2015-08-06 2017-02-22 康耐视公司 System and method for interlinking machine vision coordinate spaces together in a guide assembly environment
US20180047178A1 (en) * 2016-08-12 2018-02-15 Robert L Kay Determination of relative positions
CN106903426A (en) * 2017-04-01 2017-06-30 广东顺威精密塑料股份有限公司 A kind of laser welding localization method based on machine vision
CN108062550A (en) * 2018-02-08 2018-05-22 唐山英莱科技有限公司 A kind of welding position calibration system and method
CN110559077A (en) * 2018-06-05 2019-12-13 上海联影医疗科技有限公司 Coordinate system registration method, robot control method, device, equipment and medium
CN110328461A (en) * 2019-03-19 2019-10-15 重庆金康动力新能源有限公司 Pad localization method and welding spot positioning device
CN110508930A (en) * 2019-08-22 2019-11-29 湖北工业大学 The localization method of PCB on-line marking
CN111360789A (en) * 2020-03-23 2020-07-03 广东美的白色家电技术创新中心有限公司 Workpiece processing teaching method, control method and robot teaching system

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
WU Xing et al., "Coordinated path tracking of two vision-guided tractors for heavy-duty robotic vehicles", Robotics and Computer-Integrated Manufacturing *
Gou Zhijian et al., "Online calibration of intelligent three-coordinate vision measurement", Journal of Changchun University of Technology *
Zhou Chun, "Design and analysis of an industrial robot positioning system based on structured-light vision guidance", Shandong Industrial Technology *
Huang Yi et al., "Velocity model of naval gun trajectory correction projectiles considering ship motion", Journal of Detection & Control *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113706611A (en) * 2021-10-22 2021-11-26 成都新西旺自动化科技有限公司 High-precision correction control system and correction method based on visual precision movement mechanism
CN113706611B (en) * 2021-10-22 2022-04-12 成都新西旺自动化科技有限公司 High-precision correction control system and correction method based on visual precision movement mechanism
CN114147664A (en) * 2021-12-09 2022-03-08 苏州华星光电技术有限公司 Jig replacing method and electronic equipment manufacturing method

Also Published As

Publication number Publication date
CN112577423B (en) 2022-09-09

Similar Documents

Publication Publication Date Title
US8346392B2 (en) Method and system for the high-precision positioning of at least one object in a final location in space
CN108818536B (en) Online offset correction method and device for robot hand-eye calibration
US10232512B2 (en) Coordinate system setting method, coordinate system setting apparatus, and robot system provided with coordinate system setting apparatus
CN110125926B (en) Automatic workpiece picking and placing method and system
US11964396B2 (en) Device and method for acquiring deviation amount of working position of tool
US11254006B2 (en) Robot device
CN112577423B (en) Method for machine vision position location in motion and application thereof
CN110881748A (en) Robot sole automatic gluing system and method based on 3D scanning
EP1003212A2 (en) Method of and apparatus for bonding light-emitting element
EP2836869A1 (en) Active alignment using continuous motion sweeps and temporal interpolation
JP2012528016A (en) Method and system for accurately positioning at least one object in a final pose in space
CN112334760A (en) Method and device for locating points on complex surfaces in space
US20200189108A1 (en) Work robot and work position correction method
CN112247525A (en) Intelligent assembling system based on visual positioning
CN109732601B (en) Method and device for automatically calibrating pose of robot to be perpendicular to optical axis of camera
CN112536539A (en) Feeding system and method of laser processing equipment
US20170255181A1 (en) Measurement apparatus, system, measurement method, and article manufacturing method
US20230123629A1 (en) 3d computer-vision system with variable spatial resolution
CN111482964A (en) Novel robot hand-eye calibration method
US20170328706A1 (en) Measuring apparatus, robot apparatus, robot system, measuring method, control method, and article manufacturing method
CN213890029U (en) AI visual control automatic switch-over robot system based on degree of depth learning
CN112170124B (en) Visual positioning method and device for vehicle body and vehicle frame
CN112834505B (en) Three-dimensional visual detection positioning device and method for pasted welding line of pipeline workpiece
JPH09222913A (en) Teaching position correcting device for robot
CN110977950B (en) Robot grabbing and positioning method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant