CN114227674A - Mechanical arm navigation method based on visual identification and positioning - Google Patents
Mechanical arm navigation method based on visual identification and positioning
- Publication number
- CN114227674A (application CN202111492837.9A)
- Authority
- CN
- China
- Prior art keywords
- mechanical arm
- target position
- coordinate
- axis
- distance
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Images
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1656—Programme controls characterised by programming, planning systems for manipulators
- B25J9/1664—Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1694—Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1694—Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
- B25J9/1697—Vision controlled systems
Landscapes
- Engineering & Computer Science (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Manipulator (AREA)
Abstract
The invention provides a mechanical arm navigation method based on visual identification and positioning, which comprises the following steps: adjusting the mechanical arm until the end of the mechanical arm is parallel to the outermost vertical plane of the target position; recognizing, by the visual identification module, the two-dimensional coordinates of the center point of the target position; establishing a plane coordinate system with the initial position of the end of the mechanical arm as the origin and the center point of the camera's field of view as the reference point; obtaining the two-dimensional coordinates of the target object from the offset between the coordinates of the target position center point and the origin; adjusting the mechanical arm until the reference point coordinates equal the two-dimensional coordinates of the target position center point; acquiring the distance between the end of the mechanical arm and the target position through the ranging module; and driving, by the control module, the mechanical arm to advance by the acquired distance, so that it reaches the target position. The invention enables the mechanical arm of a construction robot to identify, position and navigate in three-dimensional coordinates, so that the mechanical arm can move to a specific position.
Description
Technical Field
The invention relates to the field of construction, and in particular to a mechanical arm navigation method based on visual identification and positioning.
Background
In recent years, the construction industry has developed rapidly and robots have exerted a growing influence on it. Driven by the market demand of the existing engineering machinery and construction sectors, construction robots are increasingly being integrated into the transformation and upgrading of the industry, continuously advancing the automation and intelligentization of the construction field.
When a construction robot carries out construction and faces different construction processes, its mechanical arm needs to reach certain specific positions. Visual recognition technology in the prior art can recognize a required object, but it only provides a two-dimensional coordinate, so control of the mechanical arm can be realized in only two dimensions.
Disclosure of Invention
The invention provides a mechanical arm navigation method based on visual identification and positioning.
In order to achieve this purpose, the technical scheme of the invention is as follows: a mechanical arm navigation method based on visual identification and positioning comprises the following steps:
and S1, adjusting the mechanical arm until the end of the mechanical arm is parallel to the outermost vertical plane of the target position.
And S2, recognizing the two-dimensional coordinates of the center point of the target position by the vision recognition module.
The initial position of the end part of the mechanical arm is used as an original point, a plane coordinate axis is established, and a central point of a camera sight distance range is used as a reference point; obtaining two-dimensional coordinates (X1, Y1) of the target object according to the coordinate offset of the coordinates of the center point of the target position and the coordinates (0, 0) of the origin; the reference point coordinates are (X2, Y2).
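For illustration only, the sketch below shows how the two-dimensional coordinates (X1, Y1) might be computed in software from the pixel position of the detected target center; the function name, the pixel position of the origin and the pixels-per-coordinate-unit scale k are assumptions of this sketch, not details given in the patent.

```python
# Minimal sketch, assuming the visual recognition module reports the target
# center as a pixel position and that the plane-coordinate origin projects to
# a known pixel; k is an assumed pixels-per-coordinate-unit scale.
def target_plane_coordinates(target_px, origin_px, k):
    """Return (X1, Y1): the offset of the detected target center from the
    origin, converted from pixels to plane-coordinate units."""
    X1 = (target_px[0] - origin_px[0]) / k
    Y1 = (target_px[1] - origin_px[1]) / k
    return X1, Y1

# Hypothetical values: target detected at pixel (640, 420), origin at (320, 240),
# 100 pixels per coordinate unit.
X1, Y1 = target_plane_coordinates((640, 420), (320, 240), k=100.0)
print(X1, Y1)  # 3.2 1.8
```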
S3, adjusting the mechanical arm until the reference point coordinates equal the two-dimensional coordinates of the target position center point.
S4, acquiring the distance Z2 between the end of the mechanical arm and the target position through the ranging module.
S5, driving, by the control module, the mechanical arm to advance the distance Z2, so that the mechanical arm reaches the target position.
In the above method, the mechanical arm is first adjusted so that its end is parallel to the outermost vertical plane in which the target position lies; this corrects the posture of the arm end and prevents its error from affecting the accuracy of subsequent actions. The two-dimensional coordinates of the target position center point are then acquired through the visual recognition module, a plane coordinate system is established with the initial position of the arm end as the origin, and the center point of the camera's field of view is taken as the reference point. The mechanical arm is adjusted until the reference point coordinates equal the two-dimensional coordinates of the target center point, so the adjustment is based on a plane coordinate system and, after adjustment, the arm end is accurately aligned on the same straight line as the target center point. The distance Z2 between the arm end and the target position is then acquired through the ranging module, and the control module drives the mechanical arm to keep its current two-dimensional coordinates and advance the distance Z2 towards the target center point, so that the end of the mechanical arm reaches the target position center point.
Further, in step S1, the tilt sensor measures the angle relative to the direction of gravity to obtain a horizontal reference; taking this reference as the baseline, the control module judges whether the angle fed back by the tilt sensor is positive or negative and adjusts the inclination of the end of the mechanical arm accordingly, until the plane fed back by the tilt sensor is parallel to the direction of gravity, i.e. the end of the mechanical arm lies in a vertical plane. At this point the vertical plane of the arm end is parallel to the outermost vertical plane in which the target position lies. The posture of the arm end is thus corrected, preventing its error from affecting the accuracy of the subsequent actions of the mechanical arm.
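A minimal control sketch of this levelling step is shown below; the callables read_tilt_angle and rotate_end, and the tolerance value, are assumptions of the sketch, since the patent does not prescribe a software interface for the tilt sensor or the arm.

```python
def level_arm_end(read_tilt_angle, rotate_end, tolerance_deg=0.1):
    """Adjust the arm end until the tilt sensor feedback is (approximately) zero,
    i.e. the end lies in a vertical plane parallel to the target's outer plane.
    read_tilt_angle() -- signed angle fed back by the tilt sensor, 0 when level
    rotate_end(delta) -- command a corrective rotation of the arm end"""
    angle = read_tilt_angle()
    while abs(angle) > tolerance_deg:
        # The sign of the feedback decides the direction of the correction (step S1).
        rotate_end(-angle)
        angle = read_tilt_angle()
```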
Further, step S3 specifically includes: the control module obtains the amount of movement of the mechanical arm on the x and y axes from the offset between the coordinates (X1, Y1) of the target position center point and the coordinates (X2, Y2) of the reference point, together with the coordinate-to-actual-distance ratio d; the control module drives the mechanical arm to move according to the fed-back movement amount and movement direction, so that the reference point coordinates (X2, Y2) become equal to the two-dimensional coordinates (X1, Y1) of the target object. In this way the control module acquires, through the visual recognition module, the offset between the end of the mechanical arm and the target position center point and converts it into the actual movement of the mechanical arm on the x and y axes, so that the adjustment is accurate.
Further, step S3 specifically includes: S3.1, when the mechanical arm moves to the target position center point, the moving distance on the x axis is L1 and the moving distance on the y axis is L2; the ratio of coordinate values to actual distances is d, where L1 = d·X1 and L2 = d·Y1. S3.2, the horizontal moving speed of the mechanical arm is V1 and the vertical moving speed is V2; when the mechanical arm moves to the target position center point, the movement time required on the x axis is L1/V1 and the movement time required on the y axis is L2/V2. S3.3, the mechanical arm moves for L1/V1 seconds in the x-axis direction fed back by the control module and then stops; it then moves for L2/V2 seconds in the y-axis direction fed back by the control module and stops.
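As an illustration of steps S3.1 to S3.3, the sketch below converts the coordinate offset into actual move distances and motor run times; it is a reading of the patent's formulas under the assumption that the reference point starts at the origin (so that L1 = d·X1 and L2 = d·Y1), and the function name is hypothetical.

```python
def plan_xy_motion(X1, Y1, X2, Y2, d, V1, V2):
    """Steps S3.1-S3.3: convert the offset between the target center (X1, Y1)
    and the reference point (X2, Y2) into actual distances and run times.
    d  -- coordinate-to-actual-distance ratio (actual units per coordinate unit)
    V1 -- horizontal (x-axis) speed;  V2 -- vertical (y-axis) speed"""
    L1 = d * (X1 - X2)      # distance to travel on the x axis (= d*X1 when X2 = 0)
    L2 = d * (Y1 - Y2)      # distance to travel on the y axis (= d*Y1 when Y2 = 0)
    t_x = abs(L1) / V1      # run time of the horizontal motor (step S3.3)
    t_y = abs(L2) / V2      # run time of the vertical motor (step S3.3)
    return (L1, t_x), (L2, t_y)
```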
Further, step S4 specifically includes: when the two-dimensional coordinates of the mechanical arm are equal to the two-dimensional coordinates of the target position, the actual distance Z2 between the end of the mechanical arm and the target position is acquired through the ranging module; a three-dimensional coordinate system is established, taking the origin of the two-dimensional coordinate system of step S3 as the origin of the x and y axes and the center of the current position of the mechanical arm as the origin of the z axis; the coordinate Z1 of the target position center point on the z axis is obtained as the product of the distance Z2 and the coordinate-to-actual-distance ratio, giving the three-dimensional coordinates (X1, Y1, Z1) of the target position. With this arrangement, based on the plane coordinate system of step S3 and the actual distance Z2 from the arm end to the target position obtained through the ranging module, a three-dimensional coordinate system is established, and the offset between the end of the mechanical arm and the target position center point in this coordinate system can be obtained in real time.
Further, step S5 specifically includes: the forward moving speed of the mechanical arm is V3 and the forward moving time is Z1/V3; the mechanical arm stops after moving forward for Z1/V3 seconds.
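A sketch of steps S4 and S5 is given below; it follows the patent's wording literally (Z1 as the product of Z2 and the coordinate ratio, run time Z1/V3), the unit convention of the ratio is not fixed by the patent text and is left to the caller, and the function name is hypothetical.

```python
def plan_forward_motion(Z2, ratio, V3):
    """Steps S4-S5 as stated in the patent: Z1 is the product of the measured
    distance Z2 and the coordinate/actual-distance ratio, and the arm advances
    for Z1/V3 seconds at forward speed V3 before stopping."""
    Z1 = Z2 * ratio         # z-axis coordinate of the target center point
    t_forward = Z1 / V3     # forward run time stated in the patent
    return Z1, t_forward
```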
Further, in step S4, if the target position center point is hollow, a reflector is disposed on one side of the target position center point; this overcomes the problem that, when the target position is a through hole or a hollow, the ranging module cannot accurately identify the actual distance between the end of the mechanical arm and the target position.
Furthermore, a distance measuring module, a visual identification module, a control module and an inclination angle sensor are arranged on the mechanical arm; the distance measurement module, the visual identification module and the inclination angle sensor are respectively arranged at the end part of the mechanical arm; the control module is arranged on the mechanical arm, and the distance measuring module, the vision recognition module and the tilt sensor are respectively and electrically connected with the control module.
Drawings
FIG. 1 is a schematic structural diagram of the present invention.
FIG. 2 is a block diagram of the process flow of the present invention.
Reference numerals: 1. a mechanical arm; 11. the end part of the mechanical arm; 2. a distance measurement module; 3. a visual recognition module; 4. an inclination angle sensor.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings and specific embodiments.
As shown in Figs. 1-2, a mechanical arm navigation method based on visual identification and positioning comprises the following steps:
S1, the mechanical arm 1 is adjusted so that the end 11 of the mechanical arm is parallel to the outermost vertical plane in which the target position lies. The tilt sensor 4 measures the angle relative to the direction of gravity to obtain a horizontal reference; taking this reference as the baseline, the control module judges whether the angle fed back by the tilt sensor 4 is positive or negative and adjusts the inclination of the end 11 of the mechanical arm accordingly, until the plane fed back by the tilt sensor 4 is parallel to the direction of gravity, i.e. the end of the mechanical arm 1 lies in a vertical plane; at this point the vertical plane of the end 11 of the mechanical arm is parallel to the outermost vertical plane in which the target position lies.
S2, the visual recognition module 3 recognizes the two-dimensional coordinates of the center point of the target position.
A plane coordinate system is established with the initial position of the end 11 of the mechanical arm as the origin, and the center point of the camera's field of view is taken as the reference point; the two-dimensional coordinates (X1, Y1) of the target object are obtained from the offset between the coordinates of the target position center point and the origin (0, 0); the reference point coordinates are (X2, Y2).
S3, the mechanical arm is adjusted until the reference point coordinates equal the two-dimensional coordinates of the target position center point. From the offset between the coordinates (X1, Y1) of the target position center point and the coordinates (X2, Y2) of the reference point, together with the coordinate-to-actual-distance ratio d, the control module obtains the amount of movement of the mechanical arm 1 on the x and y axes, and moves the mechanical arm 1 according to the fed-back movement amount and movement direction so that the reference point coordinates (X2, Y2) become equal to the two-dimensional coordinates (X1, Y1) of the target object.
S3.1, when the mechanical arm 1 moves to the target position center point, the moving distance on the x axis is L1 and the moving distance on the y axis is L2; the ratio of coordinate values to actual distances is d, where L1 = d·X1 and L2 = d·Y1. In the present embodiment the coordinate-to-actual-distance ratio is 1:10, i.e. L1 = 10·X1 and L2 = 10·Y1.
S3.2, the horizontal moving speed of the mechanical arm 1 is V1 and the vertical moving speed of the mechanical arm 1 is V2; when the mechanical arm 1 moves to the target position center point, the movement time required on the x axis is L1/V1 and the movement time required on the y axis is L2/V2.
S3.3, the mechanical arm 1 moves for L1/V1 seconds in the x-axis moving direction fed back by the control module and then stops; it then moves for L2/V2 seconds in the y-axis moving direction fed back by the control module and stops.
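As a worked example of this embodiment's 1:10 ratio (one coordinate unit corresponding to ten actual distance units), using hypothetical values for X1, Y1, V1 and V2 that do not appear in the patent:

```python
d = 10.0            # 1:10 coordinate-to-actual-distance ratio of this embodiment
X1, Y1 = 3.0, 2.0   # hypothetical plane coordinates of the target center point
V1, V2 = 5.0, 4.0   # hypothetical horizontal / vertical speeds (distance units per second)

L1 = d * X1         # 30.0 -> distance to move on the x axis
L2 = d * Y1         # 20.0 -> distance to move on the y axis
print(L1 / V1)      # 6.0 seconds of horizontal motion before stopping (S3.3)
print(L2 / V2)      # 5.0 seconds of vertical motion before stopping (S3.3)
```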
S4, the distance Z2 between the end 11 of the mechanical arm and the target position is acquired through the ranging module 2.
When the two-dimensional coordinates of the mechanical arm 1 are equal to the two-dimensional coordinates of the target position, the actual distance Z2 between the end 11 of the mechanical arm and the target position center point is acquired by the ranging module 2. If the target position center point is hollow, a reflector is disposed on one side of it; the reflector can be made of any flexible or rigid material capable of reflecting the laser emitted by a laser ranging sensor or the sound waves emitted by an ultrasonic ranging sensor. Thus, in a construction process such as cast-in-place concreting, the target position can be a pouring opening, i.e. a through hole; with the reflector in place, the ranging module 2 can accurately feed back the actual distance between the end 11 of the mechanical arm and the pouring opening. The mechanical arm 1 can then carry the pouring pipe used for pouring slurry and insert it into the pouring opening identified as the target position, overcoming the problem that, when the target position is a through hole or a hollow, the ranging module 2 cannot accurately identify the actual distance between the end 11 of the mechanical arm and the target position.
A three-dimensional coordinate system is established, taking the origin of the two-dimensional coordinate system of the mechanical arm 1 in step S3 as the origin of the x and y axes, and the center of the current position of the mechanical arm 1 as the origin of the z axis; the coordinate Z1 of the target position center point on the z axis is obtained as the product of the distance Z2 and the coordinate-to-actual-distance ratio, giving the three-dimensional coordinates (X1, Y1, Z1) of the target position.
S5, the control module drives the mechanical arm 1 to advance the distance Z2, so that the mechanical arm 1 reaches the target position; the forward moving speed of the mechanical arm 1 is V3 and the forward moving time is Z1/V3; the mechanical arm 1 stops after moving forward for Z1/V3 seconds.
The mechanical arm 1 is provided with a ranging module 2, a visual recognition module 3, a control module and a tilt sensor 4; the ranging module 2, the visual recognition module 3 and the tilt sensor 4 are respectively arranged at the end of the mechanical arm 1; the control module is arranged on the mechanical arm 1, and the ranging module 2, the visual recognition module 3 and the tilt sensor 4 are each electrically connected with the control module. A base for rotating the mechanical arm 1 is arranged at the end of the mechanical arm 1 away from the ranging module 2; the ranging module 2 is a laser ranging sensor; the visual sensor is a camera; and the control module can be a CPU, a PLC, a single-chip microcomputer, or the like.
The mechanical arm 1 further comprises a driving mechanism for adjusting the position of the mechanical arm 1, the driving mechanism comprising a horizontal motor for driving the end 11 of the mechanical arm in the x-axis direction, a vertical motor for driving the end 11 of the mechanical arm in the y-axis direction, and a forward motor for driving the end 11 of the mechanical arm in the z-axis direction.
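To tie the steps together, the sketch below outlines one possible control flow for the whole method; every name in it (the driver objects arm, camera and rangefinder, their methods, origin_px and the pixel scale k) is a placeholder assumption, since the patent defines the procedure but not a software interface.

```python
import time

def navigate_to_target(arm, camera, rangefinder, d, V1, V2, V3, origin_px, k):
    """S1-S5: level the arm end, locate the target in the camera image, align
    the end in the x/y plane, range the remaining depth, then advance."""
    # S1: level the end until the tilt feedback is ~0 (end lies in a vertical plane).
    while abs(arm.read_tilt()) > 0.1:
        arm.rotate_end(-arm.read_tilt())

    # S2: plane coordinates of the target center from its pixel offset to the origin.
    tx_px, ty_px = camera.detect_target_px()
    X1 = (tx_px - origin_px[0]) / k
    Y1 = (ty_px - origin_px[1]) / k

    # S3: drive the horizontal and vertical motors for the computed run times
    # (reference point assumed to start at the origin, so L1 = d*X1, L2 = d*Y1).
    arm.run_x_motor(direction=1 if X1 >= 0 else -1)
    time.sleep(abs(d * X1) / V1)
    arm.stop_x()
    arm.run_y_motor(direction=1 if Y1 >= 0 else -1)
    time.sleep(abs(d * Y1) / V2)
    arm.stop_y()

    # S4-S5: measure the remaining distance Z2 and advance at speed V3 until the
    # end reaches the target (the patent states this run time via Z1 derived from Z2).
    Z2 = rangefinder.read_distance()
    arm.run_z_motor(direction=1)
    time.sleep(Z2 / V3)
    arm.stop_z()
```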
Claims (8)
1. A mechanical arm navigation method based on visual identification and positioning is characterized in that: the method comprises the following steps:
S1, adjusting the mechanical arm until the end of the mechanical arm is parallel to the outermost vertical plane of the target position;
S2, recognizing, by the visual recognition module, the two-dimensional coordinates of the center point of the target position;
establishing a plane coordinate system with the initial position of the end of the mechanical arm as the origin, and taking the center point of the camera's field of view as the reference point; obtaining the two-dimensional coordinates (X1, Y1) of the target object from the offset between the coordinates of the target position center point and the origin (0, 0); the reference point coordinates being (X2, Y2);
S3, adjusting the mechanical arm until the reference point coordinates equal the two-dimensional coordinates of the target position center point;
S4, acquiring the distance Z2 between the end of the mechanical arm and the target position through the ranging module;
S5, driving, by the control module, the mechanical arm to advance the distance Z2, so that the mechanical arm reaches the target position.
2. The mechanical arm navigation method based on visual identification and positioning as claimed in claim 1, wherein: in step S1, the tilt sensor measures the angle relative to the direction of gravity to obtain a horizontal reference; taking this reference as the baseline, the control module judges whether the angle fed back by the tilt sensor is positive or negative and adjusts the inclination of the end of the mechanical arm accordingly, until the plane fed back by the tilt sensor is parallel to the direction of gravity, i.e. the end of the mechanical arm lies in a vertical plane, and the vertical plane in which the end of the mechanical arm lies is parallel to the outermost vertical plane in which the target position lies.
3. The mechanical arm navigation method based on visual identification and positioning as claimed in claim 1, wherein: step S3 specifically includes: the control module obtains the amount of movement of the mechanical arm on the x and y axes from the offset between the coordinates (X1, Y1) of the target position center point and the coordinates (X2, Y2) of the reference point, together with the coordinate-to-actual-distance ratio d; the control module drives the mechanical arm to move according to the fed-back movement amount and movement direction, so that the reference point coordinates (X2, Y2) become equal to the two-dimensional coordinates (X1, Y1) of the target object.
4. The mechanical arm navigation method based on visual identification positioning as claimed in claim 1, wherein: in step S3, the method specifically further includes:
S3.1, when the mechanical arm moves to the center point of the target position, the moving distance on the x axis is L1 and the moving distance on the y axis is L2; the ratio of the coordinate value to the actual distance of the coordinate is d; wherein L1 = d·X1 and L2 = d·Y1;
S3.2, the horizontal moving speed of the mechanical arm is V1 and the vertical moving speed of the mechanical arm is V2; when the mechanical arm moves to the center point of the target position, the movement time required on the x axis is L1/V1 and the movement time required on the y axis is L2/V2;
S3.3, the mechanical arm moves for L1/V1 seconds in the x-axis moving direction fed back by the control module and then stops; the mechanical arm moves for L2/V2 seconds in the y-axis moving direction fed back by the control module and then stops.
5. The mechanical arm navigation method based on visual identification positioning as claimed in claim 1, wherein: in step S4, the method specifically includes: when the two-dimensional coordinate of the mechanical arm is equal to the two-dimensional coordinate of the target position, acquiring an actual distance Z2 between the end part of the mechanical arm and the central point of the target position through a ranging module;
establishing a three-dimensional coordinate system, taking the two-dimensional coordinate origin of the mechanical arm in the step S3 as the origin of the x axis and the y axis in the three-dimensional coordinate system, and taking the center of the current position of the mechanical arm as the origin of the z axis in the three-dimensional coordinate system;
and obtaining a coordinate Z1 of the central point of the target position on the Z axis according to the product of the distance Z2 and the actual distance ratio of the coordinate, and further obtaining three-dimensional coordinates (X1, Y1 and Z1) of the target position.
6. The mechanical arm navigation method based on visual identification positioning as claimed in claim 1, wherein: in step S5, the method specifically includes: the forward moving speed of the mechanical arm is V3; the forward moving time is Z1/V3; the mechanical arm stops after moving forward for Z1/V3 seconds.
7. The mechanical arm navigation method based on visual identification positioning as claimed in claim 1, wherein: in step S4, if the target position center point is hollow, a reflector is disposed on one side of the target position center point.
8. The mechanical arm navigation method based on visual identification positioning as claimed in claim 1, wherein: the mechanical arm is provided with a distance measuring module, a visual identification module, a control module and an inclination angle sensor; the distance measurement module, the visual identification module and the inclination angle sensor are respectively arranged at the end part of the mechanical arm; the control module is arranged on the mechanical arm, and the distance measuring module, the vision recognition module and the tilt sensor are respectively and electrically connected with the control module.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111492837.9A CN114227674A (en) | 2021-12-08 | 2021-12-08 | Mechanical arm navigation method based on visual identification and positioning |
Publications (1)
Publication Number | Publication Date |
---|---|
CN114227674A true CN114227674A (en) | 2022-03-25 |
Family
ID=80754067
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111492837.9A Pending CN114227674A (en) | 2021-12-08 | 2021-12-08 | Mechanical arm navigation method based on visual identification and positioning |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114227674A (en) |
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS635408A (en) * | 1986-06-26 | 1988-01-11 | Fujitsu Ltd | Method of adaptive control for robot |
WO2015070696A1 (en) * | 2013-11-13 | 2015-05-21 | 三一汽车制造有限公司 | Boom control method and apparatus and concrete pump truck and spreader |
US20200238518A1 (en) * | 2019-01-24 | 2020-07-30 | Fanuc Corporation | Following robot and work robot system |
KR20210022195A (en) * | 2019-08-19 | 2021-03-03 | 하이윈 테크놀로지스 코포레이션 | Calibration method for robot using vision technology |
CN110842928A (en) * | 2019-12-04 | 2020-02-28 | 中科新松有限公司 | Visual guiding and positioning device and method for compound robot |
CN111191625A (en) * | 2020-01-03 | 2020-05-22 | 浙江大学 | Object identification and positioning method based on laser-monocular vision fusion |
CN112518748A (en) * | 2020-11-30 | 2021-03-19 | 广东工业大学 | Automatic grabbing method and system of vision mechanical arm for moving object |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114856182A (en) * | 2022-04-02 | 2022-08-05 | 广东天凛高新科技有限公司 | Precise positioning device and method based on mold robot |
CN114856182B (en) * | 2022-04-02 | 2024-04-19 | 广东天凛高新科技有限公司 | Accurate positioning device and method based on mold robot |
Legal Events
Date | Code | Title | Description
---|---|---|---
 | PB01 | Publication | 
 | SE01 | Entry into force of request for substantive examination | 