CN108711174B - Approximate parallel vision positioning system for mechanical arm - Google Patents

Approximate parallel vision positioning system for mechanical arm

Info

Publication number
CN108711174B
CN108711174B
Authority
CN
China
Prior art keywords
coordinate system
operation panel
camera
coordinate
needing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810331537.4A
Other languages
Chinese (zh)
Other versions
CN108711174A (en)
Inventor
骆无意
巩庆海
柳嘉润
吴润
刘志钧
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China Academy of Launch Vehicle Technology CALT
Beijing Aerospace Automatic Control Research Institute
Original Assignee
China Academy of Launch Vehicle Technology CALT
Beijing Aerospace Automatic Control Research Institute
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China Academy of Launch Vehicle Technology CALT, Beijing Aerospace Automatic Control Research Institute filed Critical China Academy of Launch Vehicle Technology CALT
Priority to CN201810331537.4A priority Critical patent/CN108711174B/en
Publication of CN108711174A publication Critical patent/CN108711174A/en
Application granted granted Critical
Publication of CN108711174B publication Critical patent/CN108711174B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/70 Determining position or orientation of objects or cameras
    • G06T 7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30108 Industrial image inspection

Abstract

An approximately parallel vision positioning system for a mechanical arm, relating to the technical fields of computer vision and industrial automation. The system comprises a test launch control module and a test operation module; the test launch control module comprises a display screen and an operation panel; the test operation module comprises a first camera, a second camera, a coordinate conversion module, an execution mechanism and a control host. The cameras work in a planar vision mode, so the coordinate position of a target in each frame captured by a camera cannot by itself be related reliably to the coordinate system of the mechanical arm. To relate the camera coordinate system to the mechanical arm coordinates, the invention provides a simple approximately parallel vision positioning method. The method is easy to understand and to program, and repositioning and re-marking can be performed easily after the relative position of the mechanical arm and the camera changes, which makes it well suited to visual positioning of a mechanical arm.

Description

Approximate parallel vision positioning system for mechanical arm
Technical Field
The invention relates to the technical field of computer vision and industrial automation, in particular to an approximately parallel vision positioning system for a mechanical arm.
Background
In recent years, much research has been devoted to automated, unmanned production lines based on visual feedback, but the control methods differ between robots and between scenarios.
Most existing mechanical arms have no visual feedback while executing tasks, adapt poorly to their environment, and cannot complete a task along the originally planned path when the target position moves.
Disclosure of Invention
The invention aims to overcome the above defects of the prior art by providing an approximately parallel vision positioning system for a mechanical arm that supports one-key repositioning after the relative position of the mechanical arm and the camera changes arbitrarily, is simple to operate, easy to implement, and keeps errors within an allowable range.
The above purpose of the invention is realized by the following technical scheme:
The approximately parallel vision positioning system for a mechanical arm is characterized in that it comprises a test launch control module and a test operation module; the test launch control module comprises a display screen and an operation panel; the test operation module comprises a first camera, a second camera, a coordinate conversion module, an execution mechanism and a control host;
First camera: photographs the display screen and judges whether a control instruction window has popped up; if an instruction window has popped up on the display screen, recognizes its content, generates instruction identification information and sends it to the control host; if no control instruction window popping up on the display screen is recognized, no processing is carried out;
Control host: receives the instruction identification information transmitted by the first camera and starts the second camera according to it; receives, from the second camera, the coordinates of the button to be operated as marked in the visual coordinate system; sends those coordinates to the coordinate conversion module; receives the coordinate position in the execution coordinate system returned by the coordinate conversion module; generates a control instruction and sends it to the execution mechanism;
Coordinate conversion module: receives, from the control host, the coordinates of the button to be operated as marked in the visual coordinate system; establishes the execution coordinate system; performs the coordinate conversion, i.e. converts those coordinates from the visual coordinate system into a coordinate position in the execution coordinate system; sends the coordinate position in the execution coordinate system back to the control host;
Second camera: photographs the operation panel to form a picture and establishes a visual coordinate system on the picture; searches the picture for the button to be operated named in the instruction identification information; marks the coordinates of that button in the visual coordinate system; sends the marked coordinates to the control host;
Execution mechanism: receives the control instruction transmitted by the control host and operates the button of the operation panel according to the coordinate position in the execution coordinate system.
In the above approximately parallel vision positioning system for a mechanical arm, the visual coordinate system is established as follows: the upper left corner of the picture of the operation panel is taken as the coordinate origin O; the horizontal rightward direction is the +X direction; the vertically downward direction is the +Y direction.
In the above approximately parallel vision positioning system for a mechanical arm, the execution coordinate system is established as follows: in the picture of the operation panel, the upper left corner of the operation panel is taken as origin O1; the horizontal rightward direction along the panel is the +X1 direction; the vertically downward direction along the panel is the +Y1 direction.
In the above approximately parallel vision positioning system for a mechanical arm, when the second camera photographs the operation panel to form the picture, the sides of the picture are parallel to the sides of the operation panel.
In the above approximately parallel vision positioning system for a mechanical arm, the picture of the operation panel taken by the second camera has a resolution of 640 × 480 pixels.
In the above approximately parallel vision positioning system for a mechanical arm, the coordinates of the button to be operated, marked in the visual coordinate system, are converted into a coordinate position in the execution coordinate system as follows:
let the real coordinates of the button on the operation panel be (x2, y2); the coordinates of the button in the visual coordinate system be (x, y); the coordinates of the button in the execution coordinate system be (x1, y1); and the size scaling coefficient between the operation panel and the operation panel photo be A;
then: x1 = A(y - y2) + x2;
y1 = A(x - x2) + y2.
In the above approximately parallel vision positioning system for a mechanical arm, the size scaling coefficient A is calculated as follows: let the actual length of the left vertical side of the operation panel be L, and the length of the left vertical side of the operation panel in the operation panel photo be L1; then:
A = L/L1.
Compared with the prior art, the invention has the following advantages:
(1) The desired result is obtained without complicated calculation, and the precision meets the reliability requirements;
(2) The invention relies only on the operated object itself and does not need to reference any other external object;
(3) After the camera or the mechanical arm is moved, the coordinate system can be quickly re-established;
(4) A single camera achieves positioning within a planar area, with positioning precision consistent with that of the mechanical arm.
Drawings
FIG. 1 is a schematic view of the approximately parallel vision positioning system according to the present invention.
Detailed Description
The invention is described in further detail below with reference to the following figures and specific examples:
In the invention, a camera is mounted at the top end of the mechanical arm, and the visual servo principle is introduced through secondary development of the mechanical arm, so that the mechanical arm can judge and execute certain operations intelligently through camera vision. The camera works in a planar vision mode, so the coordinate position of a target in each frame captured by the camera cannot by itself be related reliably to the coordinate system of the mechanical arm. To relate the camera coordinate system to the mechanical arm coordinates, a simple approximately parallel vision positioning method is provided. The method is easy to understand and to program, and repositioning and re-marking can be performed easily after the relative position of the mechanical arm and the camera changes, which makes it well suited to visual positioning of a mechanical arm.
As shown in Fig. 1, a schematic view of the approximately parallel vision positioning system, the system for a mechanical arm comprises a test launch control module and a test operation module; the test launch control module comprises a display screen and an operation panel; the test operation module comprises a first camera, a second camera, a coordinate conversion module, an execution mechanism and a control host;
First camera: photographs the display screen and judges whether a control instruction window has popped up; if an instruction window has popped up on the display screen, recognizes its content, generates instruction identification information and sends it to the control host; if no control instruction window popping up on the display screen is recognized, no processing is carried out;
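As an illustration of this step, the sketch below detects a pop-up instruction window by template matching and reads its content with OCR. OpenCV, Tesseract, the template image and the match threshold are assumptions made for the sketch only; the patent does not prescribe a particular recognition method.

```python
import cv2
import pytesseract

def read_instruction_window(frame_bgr, window_template_bgr, threshold=0.8):
    """Detect a pop-up instruction window in a frame of the display screen
    and, if one is found, return its recognized text; otherwise return None."""
    result = cv2.matchTemplate(frame_bgr, window_template_bgr, cv2.TM_CCOEFF_NORMED)
    _, max_val, _, max_loc = cv2.minMaxLoc(result)
    if max_val < threshold:
        return None  # no instruction window recognized: no processing is carried out
    h, w = window_template_bgr.shape[:2]
    x, y = max_loc
    window_roi = frame_bgr[y:y + h, x:x + w]
    gray = cv2.cvtColor(window_roi, cv2.COLOR_BGR2GRAY)
    return pytesseract.image_to_string(gray)  # instruction identification information
```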
Control host: receives the instruction identification information transmitted by the first camera and starts the second camera according to it; receives, from the second camera, the coordinates of the button to be operated as marked in the visual coordinate system; sends those coordinates to the coordinate conversion module; receives the coordinate position in the execution coordinate system returned by the coordinate conversion module; generates a control instruction and sends it to the execution mechanism;
Coordinate conversion module: receives, from the control host, the coordinates of the button to be operated as marked in the visual coordinate system, and establishes the execution coordinate system. The execution coordinate system is established as follows: in the picture of the operation panel, the upper left corner of the operation panel is taken as origin O1; the horizontal rightward direction along the panel is the +X1 direction; the vertically downward direction along the panel is the +Y1 direction. The module then performs the coordinate conversion, i.e. converts the coordinates of the button to be operated from the visual coordinate system into a coordinate position in the execution coordinate system, as follows:
let the real coordinates of the button on the operation panel be (x2, y2); the coordinates of the button in the visual coordinate system be (x, y); the coordinates of the button in the execution coordinate system be (x1, y1); and the size scaling coefficient between the operation panel and the operation panel photo be A;
then: x1 = A(y - y2) + x2;
y1 = A(x - x2) + y2.
The size scaling coefficient A is calculated as follows: let the actual length of the left vertical side of the operation panel be L, and the length of the left vertical side of the operation panel in the operation panel photo be L1;
then:
A = L/L1.
Finally, the coordinate conversion module sends the coordinate position in the execution coordinate system to the control host. A worked example of the conversion is sketched below.
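For concreteness, the following sketch runs the conversion on invented numbers; the side lengths, pixel coordinates and button position below are hypothetical and are not taken from the patent.

```python
# Hypothetical values for illustration only.
L, L1 = 400.0, 200.0        # actual and photographed left-side lengths of the panel
A = L / L1                  # size scaling coefficient: 2.0 panel units per pixel

x, y = 320.0, 240.0         # button marked in the visual coordinate system (pixels)
x2, y2 = 150.0, 100.0       # real coordinates of the button on the operation panel

x1 = A * (y - y2) + x2      # = 2.0 * 140 + 150 = 430.0
y1 = A * (x - x2) + y2      # = 2.0 * 170 + 100 = 440.0
print(x1, y1)               # coordinate position in the execution coordinate system
```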
Second camera: photographs the operation panel to form a picture; when the picture is taken, the sides of the picture are parallel to the sides of the operation panel, and the picture has a resolution of 640 × 480 pixels. A visual coordinate system is established on the picture as follows: the upper left corner of the picture of the operation panel is taken as the coordinate origin O; the horizontal rightward direction is the +X direction; the vertically downward direction is the +Y direction. The camera then searches the picture for the button to be operated named in the instruction identification information, marks the coordinates of that button in the visual coordinate system, and sends the marked coordinates to the control host. One possible search routine is sketched below.
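As one possible realization of the button search, the sketch below uses OpenCV template matching on the panel picture; the patent does not specify a search algorithm, and the function name and template image are illustrative assumptions.

```python
import cv2

def find_button(panel_picture_bgr, button_template_bgr):
    """Locate a button in the picture of the operation panel and return the
    coordinates of its centre in the visual coordinate system (origin at the
    top-left corner of the picture, +X to the right, +Y downward)."""
    result = cv2.matchTemplate(panel_picture_bgr, button_template_bgr, cv2.TM_CCOEFF_NORMED)
    _, _, _, max_loc = cv2.minMaxLoc(result)
    h, w = button_template_bgr.shape[:2]
    x = max_loc[0] + w / 2.0
    y = max_loc[1] + h / 2.0
    return x, y
```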
Execution mechanism: receives the control instruction transmitted by the control host and operates the button of the operation panel according to the coordinate position in the execution coordinate system.
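Tying the modules together, the control host's processing cycle might look like the sketch below, which reuses the helper functions from the sketches above; the camera and arm interfaces (capture, press) and the template dictionary are hypothetical placeholders, not part of the patent.

```python
def control_host_cycle(display_frame, window_template, panel_camera,
                       button_templates, x2, y2, A, arm):
    """One illustrative processing cycle of the control host."""
    instruction = read_instruction_window(display_frame, window_template)
    if instruction is None:
        return                                     # no instruction window: no processing
    panel_picture = panel_camera.capture()         # second camera takes the 640x480 picture
    x, y = find_button(panel_picture, button_templates[instruction.strip()])
    x1, y1 = visual_to_execution(x, y, x2, y2, A)  # coordinate conversion module
    arm.press(x1, y1)                              # execution mechanism operates the button
```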
Advantageous effects:
The invention provides a simple approximately parallel vision positioning method. The method is easy to understand and to program, and repositioning and re-marking can be performed easily after the relative position of the mechanical arm and the camera changes, which makes it well suited to visual positioning of a mechanical arm. The approach supports one-key repositioning after the relative position of the mechanical arm and the camera changes arbitrarily, is simple to operate, easy to implement, and keeps errors within an allowable range.
Those skilled in the art will appreciate that those matters not described in detail in the present specification are well known in the art.

Claims (5)

1. An approximately parallel vision positioning system for a mechanical arm, characterized in that it comprises a test launch control module and a test operation module; the test launch control module comprises a display screen and an operation panel; the test operation module comprises a first camera, a second camera, a coordinate conversion module, an execution mechanism and a control host;
First camera: photographs the display screen and judges whether a control instruction window has popped up; if an instruction window has popped up on the display screen, recognizes its content, generates instruction identification information and sends it to the control host; if no control instruction window popping up on the display screen is recognized, no processing is carried out;
Control host: receives the instruction identification information transmitted by the first camera and starts the second camera according to it; receives, from the second camera, the coordinates of the button to be operated as marked in the visual coordinate system; sends those coordinates to the coordinate conversion module; receives the coordinate position in the execution coordinate system returned by the coordinate conversion module; generates a control instruction and sends it to the execution mechanism;
Coordinate conversion module: receives, from the control host, the coordinates of the button to be operated as marked in the visual coordinate system; establishes the execution coordinate system; performs the coordinate conversion, i.e. converts those coordinates from the visual coordinate system into a coordinate position in the execution coordinate system; sends the coordinate position in the execution coordinate system back to the control host;
Second camera: photographs the operation panel to form a picture and establishes a visual coordinate system on the picture; searches the picture for the button to be operated named in the instruction identification information; marks the coordinates of that button in the visual coordinate system; sends the marked coordinates to the control host;
Execution mechanism: receives the control instruction transmitted by the control host and operates the button of the operation panel according to the coordinate position in the execution coordinate system;
the visual coordinate system is established as follows: the upper left corner of the picture of the operation panel is taken as the coordinate origin O; the horizontal rightward direction is the +X direction; the vertically downward direction is the +Y direction;
the execution coordinate system is established as follows: in the picture of the operation panel, the upper left corner of the operation panel is taken as origin O1; the horizontal rightward direction along the panel is the +X1 direction; the vertically downward direction along the panel is the +Y1 direction.
2. The approximately parallel vision positioning system for a mechanical arm according to claim 1, characterized in that: when the second camera photographs the operation panel to form the picture, the sides of the picture are parallel to the sides of the operation panel.
3. The approximately parallel vision positioning system for a mechanical arm according to claim 2, characterized in that: the picture of the operation panel taken by the second camera has a resolution of 640 × 480 pixels.
4. The approximately parallel vision positioning system for a mechanical arm according to claim 3, characterized in that: the coordinates of the button to be operated, marked in the visual coordinate system, are converted into a coordinate position in the execution coordinate system as follows:
let the actual coordinates of the button on the operation panel be (x2, y2); the coordinates of the button in the visual coordinate system be (x, y); the coordinates of the button in the execution coordinate system be (x1, y1); and the size scaling coefficient between the operation panel and the operation panel photo be A;
then: x1 = A(y - y2) + x2;
y1 = A(x - x2) + y2.
5. The approximately parallel vision positioning system for a mechanical arm according to claim 4, characterized in that: the size scaling coefficient A is calculated as follows: let the actual length of the left vertical side of the operation panel be L, and the length of the left vertical side of the operation panel in the operation panel photo be L1; then:
A = L/L1.
CN201810331537.4A 2018-04-13 2018-04-13 Approximate parallel vision positioning system for mechanical arm Active CN108711174B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810331537.4A CN108711174B (en) 2018-04-13 2018-04-13 Approximate parallel vision positioning system for mechanical arm

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810331537.4A CN108711174B (en) 2018-04-13 2018-04-13 Approximate parallel vision positioning system for mechanical arm

Publications (2)

Publication Number Publication Date
CN108711174A CN108711174A (en) 2018-10-26
CN108711174B true CN108711174B (en) 2021-12-07

Family

ID=63866687

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810331537.4A Active CN108711174B (en) 2018-04-13 2018-04-13 Approximate parallel vision positioning system for mechanical arm

Country Status (1)

Country Link
CN (1) CN108711174B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109754421A (en) * 2018-12-31 2019-05-14 深圳市越疆科技有限公司 A kind of vision calibration method, device and robot controller
CN109839557A (en) * 2019-01-14 2019-06-04 普联技术有限公司 Automatization test system, method and test platform

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9336629B2 (en) * 2013-01-30 2016-05-10 F3 & Associates, Inc. Coordinate geometry augmented reality process
US9264702B2 (en) * 2013-08-19 2016-02-16 Qualcomm Incorporated Automatic calibration of scene camera for optical see-through head mounted display

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106354316A (en) * 2016-08-31 2017-01-25 广东格兰仕集团有限公司 Operation panel based on AR technology and image recognition technology
CN106986272A * 2017-02-24 2017-07-28 北京航天自动控制研究所 Machine-vision-tracking-based anti-hoisting method and system for container trucks

Also Published As

Publication number Publication date
CN108711174A (en) 2018-10-26

Similar Documents

Publication Publication Date Title
CN110497386B (en) Automatic calibration method for hand-eye relationship of cooperative robot
CN111452040B (en) System and method for associating machine vision coordinate space in a pilot assembly environment
CN107256567B (en) Automatic calibration device and calibration method for hand-eye camera of industrial robot
WO2020090809A1 (en) External input device, robot system, control method for robot system, control program, and recording medium
TWI670153B (en) Robot and robot system
US20200101613A1 (en) Information processing apparatus, information processing method, and system
US10059005B2 (en) Method for teaching a robotic arm to pick or place an object
CN110102490B (en) Assembly line parcel sorting device based on vision technology and electronic equipment
US8977378B2 (en) Systems and methods of using a hieroglyphic machine interface language for communication with auxiliary robotics in rapid fabrication environments
CN108711174B (en) Approximate parallel vision positioning system for mechanical arm
CN113379849A (en) Robot autonomous recognition intelligent grabbing method and system based on depth camera
US20190030722A1 (en) Control device, robot system, and control method
JP2014188617A (en) Robot control system, robot, robot control method, and program
CN111390910A (en) Manipulator target grabbing and positioning method, computer readable storage medium and manipulator
US10471592B2 (en) Programming method of a robot arm
CN113504063B (en) Three-dimensional space touch screen equipment visualization test method based on multi-axis mechanical arm
US20210197391A1 (en) Robot control device, robot control method, and robot control non-transitory computer readable medium
CN112958974A (en) Interactive automatic welding system based on three-dimensional vision
US20230123629A1 (en) 3d computer-vision system with variable spatial resolution
WO2023102647A1 (en) Method for automated 3d part localization and adjustment of robot end-effectors
WO2022105575A1 (en) Image processing method and related device
WO2022107684A1 (en) Device for adjusting parameter, robot system, method, and computer program
CN114193440A (en) Robot automatic grabbing system and method based on 3D vision
CN113858214A (en) Positioning method and control system for robot operation
CN107379015B (en) Robot arm correction device and control method thereof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant