CN113664834A - Assembly line material position identification and robot movement alignment method and system - Google Patents

Assembly line material position identification and robot movement alignment method and system

Info

Publication number
CN113664834A
CN113664834A (application CN202111021321.6A)
Authority
CN
China
Prior art keywords
scanning module
encoder
point
sensor
conveyor belt
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202111021321.6A
Other languages
Chinese (zh)
Other versions
CN113664834B (en)
Inventor
万伟鑫
余栋栋
杨铿华
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Qichuang Intelligent Technology Co ltd
Original Assignee
Guangdong Qichuang Intelligent Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Qichuang Intelligent Technology Co ltd filed Critical Guangdong Qichuang Intelligent Technology Co ltd
Priority to CN202111021321.6A
Publication of CN113664834A
Application granted
Publication of CN113664834B
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00: Programme-controlled manipulators
    • B25J9/16: Programme controls
    • B25J9/1656: Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1661: Programme controls characterised by programming, planning systems for manipulators characterised by task planning, object-oriented languages
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J19/00: Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J19/02: Sensing devices
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00: Programme-controlled manipulators
    • B25J9/16: Programme controls
    • B25J9/1602: Programme controls characterised by the control system, structure, architecture
    • B25J9/161: Hardware, e.g. neural networks, fuzzy logic, interfaces, processor

Abstract

The method for identifying the position of a material on a production line comprises: a material sensor on a scanning module on the upper side of the assembly line conveyor belt is triggered by the front edge of the moving material to obtain a first position point; a servo driver drives the scanning module so that the material sensor reciprocates along the direction of one side edge of the material to obtain at least a second position point and a third position point on that side edge, and the encoder position of each position point is recorded by an encoder following the assembly line conveyor belt; the controller then calculates the position and angle of the material in the scanning module coordinate system from the scanning module motor position and the encoder position corresponding to each position point. This patent also provides an assembly line material position identification system, a robot movement alignment method, and a material-following placement system. Compared with the prior art, the material position identification scheme of this patent is simple and economical.

Description

Assembly line material position identification and robot movement alignment method and system
Technical Field
The invention relates to the technical field of industrial automated production, and in particular to a method and system for identifying the position of a material on a production line and for robot movement alignment.
Background
In automated assembly line production, a conveyor belt is usually used to convey materials continuously. A material can deviate from its nominal position when it is placed, and further position errors arise while it is conveyed toward the target station during feeding. The position and angle of the material therefore need to be detected when (or before) it reaches the target station, so that the next process step can follow on automatically.
This patent is directed to a sheet-like material having a set of opposing linear edges a and side edges b at an angle (not parallel) to the linear edges a.
A common example arises in the production of packaging boxes for electronic products, such as mobile phone boxes: the boxes must be precisely picked up by a robot and placed onto the facial tissue (i.e., the sheet-like material) on the conveyor belt. Because the facial tissue deviates during feeding and the conveyor belt keeps moving, the position and angle of the facial tissue must be precisely identified and tracked. This is just one example of an application; other materials may include products such as ceramic tiles and mobile phone glass.
Chinese patent document CN109732604A discloses a method of robot movement alignment using electric eyes, used to locate the position and angle of the paper sheet on an automatic mobile phone box packaging line. The method needs three electric eyes (material sensors): one longitudinal electric eye and two lateral electric eyes. The longitudinal electric eye identifies and locates the rear edge of the article to be packaged, and the two lateral electric eyes identify and locate its side edges; the position and angle of the paper sheet are then identified from the data acquired by the three electric eyes. Since three electric eyes are needed for detection, the cost is high and the scheme is complex.
Disclosure of Invention
The invention aims to solve the prior-art problems of high cost and scheme complexity when the position and angle of a sheet-like material are detected by multiple material sensors. It does so by using a single material sensor to detect at least one position point on an edge crossing the material conveying direction plus two position points on one side edge.
A method for identifying the position of a material on a production line: a first position point is obtained when a material sensor on a scanning module on the upper side of the assembly line conveyor belt is triggered by the front edge of the moving material; a servo driver drives the scanning module so that the material sensor reciprocates along the direction of one side edge of the material to obtain a second position point, a third position point, and a fourth position point on the rear edge of the material; the encoder positions of the four position points are recorded by an encoder following the assembly line conveyor belt; and the controller calculates the pose Mp of the material's fixed reference point in the scanning module coordinate system from the scanning module motor positions corresponding to the four position points and the corresponding encoder position data.
Compared with the prior art, this material position identification method uses a single material sensor on the scanning module. Through the coupling of the assembly line conveyor belt motion with the scanning module's reciprocating motion along the direction of one side edge of the material, it obtains the first position point on the front edge, the second and third position points on the side edge, and the fourth position point on the rear edge. The controller can then calculate the position and angle of the material in the scanning module coordinate system from the scanning module motor positions corresponding to the four position points and the corresponding encoder position data. The scheme is simple and economical.
Preferably, the pose Mp of the material's fixed reference point in the scanning module coordinate system M is calculated through geometric relationships.
Alternatively, the position of the material can also be calculated from three position points, as follows:
A first position point is obtained when a material sensor on a scanning module on the upper side of the assembly line conveyor belt is triggered by the front edge of the moving material; a servo driver drives the scanning module so that the material sensor reciprocates along the direction of one side edge of the material to obtain a second position point and a third position point; the encoder positions of the three position points are recorded by an encoder following the assembly line conveyor belt; and the controller calculates the pose Mp of the material in the scanning module coordinate system from the scanning module motor positions corresponding to the three position points and the corresponding encoder position data.
The controller calculates the pose Mp of the material's fixed reference point in the scanning module coordinate system M through geometric relationships, from the four position points of the material and the corresponding scanning module motor positions.
Assembly line material position identification system, comprising: a controller; a scanning module including a servo driver, a motor and a material sensor, the material sensor being mounted on the upper side of the assembly line conveyor belt, a first position point being obtained when the material sensor is triggered by the front edge of a material moving on the assembly line conveyor belt, and the servo driver driving the material sensor to reciprocate along the direction of one side edge of the material to obtain a second position point, a third position point, and a fourth position point on the rear edge of the material; and an encoder following the assembly line conveyor belt, which records the encoder positions of the four position points. The controller calculates the position and angle of the material in the scanning module coordinate system from the four position points and the corresponding encoder position data. Because the system uses a single material sensor on the scanning module, and obtains the first position point on the front edge, the second and third position points on one side edge and the fourth position point on the rear edge through the coupling of the assembly line conveyor belt with the scanning module reciprocating along the direction of one side edge of the material, the controller calculates the pose Mp of the material's fixed reference point in the scanning module coordinate system through geometric relationships from the scanning module motor positions corresponding to the four position points and the corresponding encoder positions. The scheme is simple and economical.
Alternatively, another version of the assembly line material position identification system calculates the position of the material from three position points, as follows:
The system comprises: a controller; a scanning module including a servo driver, a motor and a material sensor, the material sensor being mounted on the upper side of the assembly line conveyor belt, a first position point being obtained when the material sensor is triggered by the front edge of a material moving on the assembly line conveyor belt, and the servo driver driving the material sensor to reciprocate along the direction of one side edge of the material to obtain a second position point and a third position point; and an encoder following the assembly line conveyor belt, which records the encoder positions of the three position points. The controller calculates the position and angle of the material in the scanning module coordinate system from the three position points and the corresponding encoder position data. Because the system uses a single material sensor on the scanning module, and obtains the first position point on the front edge and the second and third position points on one side edge through the coupling of the assembly line conveyor belt with the scanning module reciprocating along the direction of one side edge of the material, the controller calculates the pose Mp of the material's fixed reference point in the scanning module coordinate system through geometric relationships from the scanning module motor positions corresponding to the three position points and the corresponding encoder positions.
Preferably, the encoder following the assembly line conveyor belt is connected to the encoder input acquisition interface of the servo driver, and the material sensor is connected to a high-speed input signal point of the servo driver. With this configuration, when the material sensor is triggered, the servo driver's high-speed probe latching function latches and records the position values of the encoder and the servo motor with microsecond-level response, and each recorded group of positions is transmitted to the controller for the final computation, so the material pose can be located accurately.
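Each probe-latch event pairs a scanning-axis motor position with a belt encoder position captured at the same instant. A minimal sketch of such a record, with hypothetical names (the actual latching happens inside the servo driver, not in controller software):

```python
from dataclasses import dataclass
from typing import List

@dataclass
class LatchedSample:
    """One probe-latch event: the scanning-axis motor position M_i and the
    conveyor encoder position E_i captured at the same trigger instant."""
    motor_pos: float
    encoder_pos: float

def collect(samples: List[LatchedSample], motor_pos: float,
            encoder_pos: float) -> List[LatchedSample]:
    """Append one latched (M_i, E_i) pair; the controller reads the list
    once all position points have been captured."""
    samples.append(LatchedSample(motor_pos, encoder_pos))
    return samples
```

The controller would accumulate three or four such pairs per material before running the geometric solution.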
Robot movement alignment method: the rotation-translation matrix between the scanning module coordinate system M and the robot base coordinate system R is
[matrix expression given as an image in the original document]
The material pose Mp is transformed by this rotation-translation matrix to obtain the coordinates of the material in the robot base coordinate system R:
[coordinate expression given as an image in the original document]
Material-following placement system, comprising: an assembly line material position identification system and a robot for placing workpieces onto the material.
The rotation-translation matrix between the scanning module coordinate system M and the robot base coordinate system R is
[matrix expression given as an image in the original document]
The material pose Mp is transformed by this rotation-translation matrix to obtain the coordinates of the material in the robot base coordinate system R:
[coordinate expression given as an image in the original document]
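The transform described above can be sketched for the planar case. This is a minimal illustration, assuming a 3×3 homogeneous rotation-translation matrix with offsets tx, ty and rotation theta obtained from calibration (the patent gives the matrix only as an image):

```python
import math

def make_transform(tx: float, ty: float, theta: float):
    """Homogeneous 2D rotation-translation matrix taking scanning-module
    coordinates M to robot base coordinates R (tx, ty, theta are assumed
    calibration results)."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, tx],
            [s,  c, ty],
            [0.0, 0.0, 1.0]]

def to_robot_base(T_RM, Mp):
    """Apply the rotation-translation matrix to a material point Mp = (x, y)
    expressed in the scanning module coordinate system M."""
    x, y = Mp
    return (T_RM[0][0] * x + T_RM[0][1] * y + T_RM[0][2],
            T_RM[1][0] * x + T_RM[1][1] * y + T_RM[1][2])
```

For instance, a module frame rotated 90 degrees and offset by (100, 50) in the robot base maps the module point (10, 0) to (100, 60) in robot base coordinates.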
Drawings
FIG. 1 is a flowchart of the method for identifying the position of a material on a production line according to embodiment 1
FIG. 2 is a schematic diagram of the scanning track of the material sensor relative to the material in embodiment 1; the arrow indicates the moving direction of the assembly line belt
FIG. 3 is a schematic diagram of the material position and angle algorithm of this patent; the arrow indicates the moving direction of the assembly line belt
FIG. 4 is a flowchart of the method for identifying the position of a material on a production line according to embodiment 2
FIG. 5 is a schematic diagram of the scanning track of the material sensor relative to the material in embodiment 2; the arrow indicates the moving direction of the assembly line belt
FIG. 6 is a schematic diagram of the assembly line material position identification system of this patent
Detailed Description
Example 1
Referring to fig. 1 and 6, take electronic product packaging boxes as an example: the boxes must be precisely picked up by a robot and placed onto the facial tissue on the conveyor belt. Because the facial tissue deviates during feeding and the assembly line conveyor belt 1 keeps moving, the position and angle of the facial tissue must be precisely identified and tracked.
The hardware for carrying out the assembly line material position identification method comprises: the assembly line conveyor belt 1, which conveys the facial tissue; the encoder 3, fitted with an encoder wheel pressed against the assembly line conveyor belt 1 so that the encoder 3 rotates synchronously when the belt moves, which makes the position of the assembly line conveyor belt 1 known at every moment; and the scanning module 2, comprising a servo driver (not shown) and a motor 2.2, the servo driver controlling the motor 2.2 to drive the scanning module 2.3, on whose end the material sensor 2.1 is mounted. The material sensor 2.1 can be any sensor able to distinguish the conveyor belt 1 from the material, with an illumination spot as small as possible; it is usually a color mark sensor, an optical fiber sensor, a photoelectric sensor, or the like.
A signal line of the material sensor 2.1 is connected to the servo driver, so that when the signal triggers, the servo driver records the position of the motor 2.2 and the position of the encoder 3. The motion controller 4 controls the servo driver and executes the algorithm. The material S is the product to be located, in this embodiment the facial tissue; the final goal is to determine the pose of a fixed reference point on the facial tissue.
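The encoder wheel's role is a simple counts-to-distance conversion. A sketch with assumed parameter names (the patent does not specify counts per revolution or wheel circumference):

```python
def belt_displacement(counts: int, counts_per_rev: int,
                      wheel_circumference_mm: float) -> float:
    """Convert encoder counts from the friction wheel pressed on the
    conveyor belt into belt travel in millimetres."""
    return counts / counts_per_rev * wheel_circumference_mm
```

For example, with a 200 mm circumference wheel at 10000 counts per revolution, 5000 counts correspond to 100 mm of belt travel; encoder differences such as E3 minus E1 in the solving algorithm are distances of this kind.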
The material sensor 2.1 on the scanning module 2 on the upper side of the assembly line conveyor belt 1 is triggered by the front edge of the moving material S to obtain the first position point P1, and the encoder 3 records encoder position E1.
The servo driver (not shown) of the scanning module 2 drives the material sensor 2.1 on the scanning module 2.3 through a first reciprocating motion along the direction of one side edge of the material S to obtain the second position point P2, and the encoder 3 records encoder position E2; the controller 4 then drives the scanning module 2 through a second reciprocating motion along the same direction to obtain the third position point P3, and the encoder 3 records encoder position E3.
Acquisition of the second position point P2 and the third position point P3 may be triggered either when the material sensor 2.1 crosses the side edge from inside the material S, or when it reaches the side edge of the material S from the assembly line conveyor belt 1. Referring to fig. 2, this embodiment triggers when the material sensor 2.1 reaches the side edge of the material S from the assembly line conveyor belt 1.
The material sensor 2.1 of the scanning module 2 is then triggered by the rear edge of the material S to obtain the fourth position point P4, and the encoder 3 records encoder position E4.
The controller 4 calculates the pose Mp of the material S in the scanning module coordinate system M from the motor positions (M1, M2, M3, M4) of the scanning module 2 corresponding to the four position points (P1, P2, P3, P4) and the corresponding position data of the encoder 3.
In one embodiment, the pose Mp of the fixed reference point of the material S in the scanning module coordinate system M is calculated through the geometric relationships between the motor positions (M1, M2, M3, M4) of the scanning module 2 corresponding to the four position points (P1, P2, P3, P4) and the corresponding position data of the encoder 3.
Compared with the prior art, this material position identification method uses a single material sensor 2.1 on the scanning module. Through the coupling of the assembly line conveyor belt 1 with the reciprocating motion of the scanning module 2 along the direction of one side edge of the material S, it obtains the first position point P1 on the front edge, the second and third position points P2 and P3 on the side edge, and the fourth position point P4 on the rear edge. The controller 4 can then calculate the pose Mp of the material S in the coordinate system M of the scanning module 2 from the scanning module motor positions corresponding to the four position points and the corresponding position data of the encoder 3. The scheme is simple and economical.
In a preferred embodiment, the belt-following encoder 3 is connected to the encoder input acquisition interface of the servo driver of the scanning module 2, and the material sensor 2.1 is connected to a high-speed input signal point of that servo driver. With this configuration, when the material sensor 2.1 is triggered, the servo driver's high-speed probe latching function latches and records the position values of the encoder 3 and the servo motor 2.2 with microsecond-level response, and each recorded group of positions is transmitted to the controller 4 for the final computation, so the material pose can be located accurately.
Referring to fig. 3, the principle of the facial tissue position solving algorithm is as follows. Known are the motor positions M1, M2, M3 and M4 corresponding to the four position points P1, P2, P3 and P4 of the material, the encoder positions E1, E2, E3 and E4, and the product angle ∠AOC. Taking point D as the reference, the task is to find the coordinates of the product corner point O (the fixed reference point) in the motor coordinate system M at the moment the product has flowed to point D.
From the motor and encoder positions corresponding to the four position points one can compute:
AD = E4 − E1,
AE = E3 − E1,
BG = E3 − E2,
CG = M3 − M2,
CE = M3 − M1.
The angle of the assembly line conveyor belt 1 in the motor coordinate system M is ∠AEC = ∠BGC = ∠DAM. By the law of cosines, BC² = BG² + CG² − 2·BG·CG·cos(∠BGC).
The angle between the side edge of the facial tissue and the direction of the assembly line conveyor belt 1 is ∠CBG = acos((BC² + BG² − CG²)/(2·BC·BG)); the product angle in the motor coordinate system M is then ∠AFC = ∠CBG + ∠CGB.
Since triangles AEF and BGC are similar: EF = AE·CG/BG.
Then: CF = CE − EF; ∠CFH = 180° − ∠CBG − ∠BCG.
Then: AJ = CH = CF·sin(∠CFH); AO = AJ/sin(∠AOC).
Since ∠OAK = ∠AFC − ∠AOC:
AL = AO·cos(∠OAK), OL = AO·sin(∠OAK).
In addition, one obtains:
AM = AD·cos(∠DAM), DM = AD·sin(∠DAM).
The position of point D in the scanning module coordinate system M is:
[coordinate expression given as an image in the original document]
From this, the coordinates of point O are:
[coordinate expression given as an image in the original document]
In FIG. 3, AJ ⊥ BC, OL ⊥ Xm, AH ∥ BC, HC ⊥ BC.
The above calculation process is the algorithm logic of this patent; specific code, instructions, programs and the like can be implemented from this algorithm logic using ordinary technical knowledge in the field.
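The first steps of the derivation above can be sketched in code. This is a minimal illustration (not the patent's own program), assuming angles are in radians and that BG, CG and the belt angle ∠BGC have already been obtained from E2, E3, M2 and M3 as described:

```python
import math

def side_edge_angles(BG: float, CG: float, angle_BGC: float):
    """From the encoder span BG = E3 - E2, the motor span CG = M3 - M2 and
    the known belt angle ∠BGC, recover the chord BC by the law of cosines,
    the angle ∠CBG between the material's side edge and the belt direction,
    and the product angle ∠AFC in the motor coordinate system."""
    BC = math.sqrt(BG**2 + CG**2 - 2.0 * BG * CG * math.cos(angle_BGC))
    angle_CBG = math.acos((BC**2 + BG**2 - CG**2) / (2.0 * BC * BG))
    angle_AFC = angle_CBG + angle_BGC  # ∠CGB is the same angle as ∠BGC
    return BC, angle_CBG, angle_AFC
```

For example, with BG = 3, CG = 4 and a belt angle of 90 degrees, BC = 5 and ∠CBG = acos(0.6). The remaining steps (EF, CF, AO and the projections AL, OL, AM, DM) follow the same pattern of triangle relations.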
Example 2
Referring to fig. 4, fig. 5 and fig. 6, this embodiment differs from embodiment 1 in that only three position points, P1 to P3, need to be acquired, as follows:
The first position point P1 is obtained when the material sensor 2.1 on the scanning module 2 on the upper side of the assembly line conveyor belt 1 is triggered by the front edge of the moving material S, and the encoder 3 records encoder position E1.
The servo driver (not shown) of the scanning module 2 drives the material sensor 2.1 on the scanning module 2.3 through a first reciprocating motion along the direction of one side edge of the material S to obtain the second position point P2, and the encoder 3 records encoder position E2; the controller 4 then drives the scanning module 2 through a second reciprocating motion along the same direction to obtain the third position point P3, and the encoder 3 records encoder position E3.
The controller 4 calculates the pose Mp of the material S in the scanning module coordinate system M from the motor positions (M1, M2, M3) of the scanning module 2 corresponding to the three position points (P1, P2, P3) and the corresponding position data of the encoder 3.
In one embodiment, the pose Mp of the fixed reference point of the material S in the scanning module coordinate system M is calculated through the geometric relationships between the motor positions (M1, M2, M3) of the scanning module 2 corresponding to the three position points (P1, P2, P3) and the corresponding position data of the encoder 3.
Referring to fig. 3 and 5, the principle of the facial tissue position solving algorithm is as follows. Known are the motor positions M1, M2 and M3 corresponding to the three position points P1, P2 and P3 of the material, the encoder positions E1, E2 and E3, and the product angle ∠AOC. Taking point A as the reference, the task is to find the coordinates of the product corner point O (the fixed reference point) in the motor coordinate system M at the moment the product has flowed to point A.
From the motor and encoder positions corresponding to the three position points one can compute:
AE=E3-E1,
BG=E3-E2;
CG=M3-M2;
CE=M3-M1;
the angle of the assembly line conveyor belt 1 under the motor coordinate system M is equal to < AEC ═ BGC; BC equals BG2+CG2-2*BG*CG*cos(∠BGC);
The included angle between the edge of the facial tissue and the direction of the assembly line conveyor belt 1 is as follows: symbol CBG ═ ACOS (BC)2+BG2-CG2) V (2 BC BG) then angle the product under the motor coordinate system M: the angle AFC is equal to the angle CBG + CGB;
since AEF and BGC are similarly triangular, then: EF ═ AE/BG × CG
Then: CF-CE-EF; the angle CFH is 180-angle CBG-angle BCG;
then: AJ ═ CH ═ CF ═ sin (zurncfh) AO ═ AJ/sin (zurnaoc);
since < OAK is < AFC-AOC, then
AL=AO*cos(∠OAK),OL=AO*sin(∠OAK);
The position of point A in the scanning module coordinate system M is:
[coordinate expression given as an image in the original document]
From this, the coordinates of point O are:
[coordinate expression given as an image in the original document]
see embodiment 1 for hardware configuration and other features to implement the functions.
Example 3
Referring to fig. 2 and 6, this embodiment is a system for identifying the position of a material on a production line using the method of embodiment 1, and comprises a controller 4, a scanning module 2 and an encoder 3. The scanning module 2 includes a servo driver (not shown) and a material sensor 2.1; the material sensor 2.1 is mounted on the upper side of the assembly line conveyor belt 1, specifically on the scanning module 2.3 driven by the motor 2.2. The material sensor 2.1 is a color mark sensor or an optical fiber sensor.
In one embodiment, the scanning module 2 is mounted on a support 5 on the upper side of the assembly line conveyor belt 1, and the scanning module 2.3 is driven by the motor 2.2 to reciprocate along the support 5, a structure widely used in the prior art.
The material sensor 2.1 obtains the first position point P1 when triggered by the front edge of the material S moving on the assembly line conveyor belt 1; the controller 4 then drives the scanning module 2 to reciprocate along the direction of one side edge of the material S to obtain the second position point P2 and the third position point P3, and the fourth position point P4 on the rear edge of the material S. The encoder 3, following the assembly line conveyor belt 1, records the encoder positions E1, E2, E3 and E4 of the four position points P1, P2, P3 and P4. The controller 4 calculates the position and angle of the material S in the scanning module coordinate system M from the motor positions (M1, M2, M3, M4) of the scanning module 2 corresponding to the four position points and the corresponding position data of the encoder 3. See embodiment 1 for the specific acquisition method.
In one embodiment, the controller 4 calculates the pose Mp of the material's fixed reference point in the scanning module coordinate system M through geometric relationships, from the four position points P1, P2, P3 and P4 of the material, the corresponding motor positions M1, M2, M3 and M4 of the scanning module 2, and the corresponding encoder positions E1, E2, E3 and E4. See embodiment 1 for the specific calculation process.
Because the system uses a single material sensor 2.1 on the scanning module, and obtains the first position point P1 on the front edge, the second and third position points P2 and P3 on the side edge and the fourth position point P4 on the rear edge through the coupling of the assembly line conveyor belt 1 with the reciprocating motion of the scanning module 2 along the direction of one side edge of the material S, the controller 4 can calculate the pose Mp of the material in the scanning module coordinate system M from the motor positions (M1, M2, M3, M4) of the scanning module 2 corresponding to the four position points and the corresponding encoder positions E1, E2, E3 and E4.
In a preferred embodiment, the belt-following encoder 3 is connected to the encoder input acquisition interface of the servo driver of the scanning module 2, and the material sensor 2.1 is connected to a high-speed input signal point of that servo driver. With this configuration, when the material sensor 2.1 is triggered, the servo driver's high-speed probe latching function latches and records the position values of the encoder 3 and the servo motor 2.2 with microsecond-level response, and each recorded group of positions is transmitted to the controller 4 for the final computation, so the material pose can be located accurately.
Example 4
Referring to fig. 5 and 6, the present embodiment is a system for identifying the position of a material in a production line using the method of embodiment 2, and includes a controller 4, a scanning module 2, and an encoder 3; scanning module 2 includes servo driver (not shown) and material sensor 2.1, and material sensor 2.1 installs at assembly line conveyer belt (1) upside, and is concrete, and material sensor 2.1 installs on the scanning module 2.3 by motor 2.2 driven. The material sensor 2.1 is a color code sensor or an optical fiber sensor.
In one embodiment, the scanning module 2 is mounted on a support 5 on the upper side of the conveyor belt 1, and the scanning module 2.3 is driven by a motor 2.2 to reciprocate along the support 5, which is widely used in the prior art.
The material sensor 2.1 obtains the first position point P1 when triggered by the front edge of the material S moving on the line conveyor 1, and then the controller 4 drives the scanning module 2 to reciprocate along the direction of one side edge of the material S to obtain the second position point P2 and the third position point P3. An encoder 3 following the line conveyor (1) and recording encoder positions E1, E2 and E3 of the three position points P1, P2 and P3; the controller 4 calculates the pose Mp of the material S in the scanning module coordinate system M according to the motor positions (M1, M1, M3) of the scanning module 2 corresponding to the three position points P1, P2, P3 and the position data of the corresponding encoder 3. Please refer to embodiment one for a specific obtaining method.
In one embodiment, the controller 4 calculates the pose Mp of the fixed reference point of the material in the scanning module coordinate system M through geometric relations, based on the three position points P1, P2 and P3 of the material, the corresponding motor positions M1, M2 and M3 of the scanning module 2 and the corresponding encoder positions E1, E2 and E3. For the specific calculation process, please refer to embodiment 1.
Because the system uses the material sensor 2.1 in the scanning module, the first position point P1 on the front edge and the second and third position points P2 and P3 on the side edge are obtained by coupling the motion of the line conveyor belt 1 with the reciprocating motion of the scanning module 2 along the direction of the side edge of the material S. The controller 4 can then calculate the position and angle of the material in the scanning module coordinate system M from the motor positions M1, M2 and M3 of the scanning module 2 corresponding to the three position points P1, P2 and P3 and the corresponding encoder positions E1, E2 and E3.
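The geometric calculation described above can be sketched in code. The following is a minimal illustration, not the patent's actual embodiment-1 derivation: it assumes a rectangular material whose front edge is perpendicular to the side edge, a motor axis perpendicular to the belt travel, and hypothetical scale factors k_motor and k_enc (mm per motor count and mm per encoder count).

```python
import math

def material_pose(m, e, k_motor=1.0, k_enc=1.0):
    # m: motor positions (M1, M2, M3), e: encoder positions (E1, E2, E3)
    # x axis: motor travel across the belt, y axis: belt travel direction.
    # Each point is expressed in the scanning-module frame frozen at E1:
    # points latched later (larger encoder count) lie further upstream.
    p = [(mi * k_motor, (e[0] - ei) * k_enc) for mi, ei in zip(m, e)]
    (x1, y1), (x2, y2), (x3, y3) = p
    # side-edge direction from P2 to P3
    dx, dy = x3 - x2, y3 - y2
    length = math.hypot(dx, dy)
    ux, uy = dx / length, dy / length
    theta = math.degrees(math.atan2(dy, dx))  # material angle in frame M
    # corner O = projection of the front-edge point P1 onto the side-edge
    # line (valid because the front edge is assumed perpendicular to it)
    t = (x1 - x2) * ux + (y1 - y2) * uy
    return (x2 + t * ux, y2 + t * uy), theta
```

For an axis-aligned material whose side edge lies at motor position 10 and whose front edge triggered at motor position 25, the sketch recovers the corner at (10, 0) and a -90 degree side-edge angle.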
In a preferred embodiment, the follow-up encoder 3 of the conveyor belt is connected to the encoder input acquisition interface of the servo driver of the scanning module 2, and the material sensor 2.1 is connected to the high-speed input signal point of the same servo driver. With this configuration, when the material sensor 2.1 is triggered, the high-speed probe latch function of the servo driver latches and records the position values of the encoder 3 and the servo motor 2.2 with microsecond-level response, and each group of recorded positions is transmitted to the controller 4 for the final calculation, so that accurate positioning of the material pose is achieved.
Example 5
Referring to fig. 1 to 3 and 6, this embodiment implements an automatic robot tracking function based on the line material position identification data of embodiment 1.
The robot moving alignment method is as follows: the rotation-translation matrix between the scanning module coordinate system M and the robot base coordinate system R is T (equation image not reproduced); the material coordinates Mp are transformed through this rotation-translation matrix to obtain the coordinates RP of the material in the robot base coordinate system R, i.e. RP = T · Mp (equation image not reproduced).
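The transform RP = T · Mp can be illustrated with a short sketch. The planar (2D) homogeneous-matrix simplification and the function names below are assumptions for illustration, not part of the patent; the real calibration values for the rotation and translation come from the R-M calibration procedure.

```python
import math

def make_rt(theta_deg, tx, ty):
    # 2D homogeneous rotation-translation matrix T (module frame -> robot
    # frame), built from a rotation angle and a translation offset
    c = math.cos(math.radians(theta_deg))
    s = math.sin(math.radians(theta_deg))
    return [[c, -s, tx], [s, c, ty], [0.0, 0.0, 1.0]]

def apply_rt(T, mp):
    # RP = T * Mp for a planar point Mp = (x, y)
    x, y = mp
    return (T[0][0] * x + T[0][1] * y + T[0][2],
            T[1][0] * x + T[1][1] * y + T[1][2])
```

For example, with a 90-degree rotation and a 100 mm offset along x, the module-frame point (1, 0) maps to (100, 1) in the robot frame.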
With the line conveyor belt 1 set in the robot coordinate system R, the coordinate value of the product corner point O in the robot coordinate system R is solved for the moment when the product flows to point D. As can be seen from embodiment 1, the position of point D in the scanning module coordinate system M is given by the corresponding equation (equation image not reproduced); the coordinates of point O follow from it (equation image not reproduced); the position of point O in the robot coordinate system R is then obtained through the rotation-translation matrix T (equation image not reproduced).
the rotational translation matrix is
Figure BDA0003242067710000091
The calibration method is obtained by calibrating the robot coordinate system R and the motor coordinate system M, and the specific calibration method is not an improvement point of the patent and can be implemented according to the prior art in the field, such as the calibration method disclosed in CN 201811532603.0.
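The cited calibration method is not reproduced here, but a generic least-squares estimate of a planar rotation-translation from matched point pairs (points measured both in the module frame M and in the robot frame R) can be sketched as follows; calibrate_rt and the 2D simplification are illustrative assumptions, not the procedure of CN 201811532603.0.

```python
import math

def calibrate_rt(module_pts, robot_pts):
    # Least-squares rigid transform from frame M to frame R given matched
    # point pairs (2D Kabsch / Procrustes): theta, tx, ty such that
    # robot_pt ~= R(theta) * module_pt + (tx, ty)
    n = len(module_pts)
    mx = sum(p[0] for p in module_pts) / n
    my = sum(p[1] for p in module_pts) / n
    rx = sum(p[0] for p in robot_pts) / n
    ry = sum(p[1] for p in robot_pts) / n
    # accumulate cross-covariance terms of the centered point sets
    sxx = sxy = syx = syy = 0.0
    for (ax, ay), (bx, by) in zip(module_pts, robot_pts):
        ax, ay, bx, by = ax - mx, ay - my, bx - rx, by - ry
        sxx += ax * bx; sxy += ax * by
        syx += ay * bx; syy += ay * by
    theta = math.atan2(sxy - syx, sxx + syy)  # optimal rotation angle
    c, s = math.cos(theta), math.sin(theta)
    tx = rx - (c * mx - s * my)
    ty = ry - (s * mx + c * my)
    return theta, tx, ty
```

In practice the pairs would be collected by jogging the robot tool tip to features whose module-frame coordinates are known; three or more non-collinear pairs determine the planar transform.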
Based on the coordinate RP, the robot automatically tracks the face paper on the line conveyor belt 1.
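Tracking a moving target from a latched position can be sketched as follows: a robot-frame coordinate captured at one encoder count is advanced along the belt direction by the belt travel since capture. belt_dir and mm_per_count are assumed calibration values for illustration, not quantities named in the patent.

```python
def tracked_target(rp_captured, e_captured, e_now, belt_dir, mm_per_count):
    # Predict the current robot-frame position of a point identified at
    # encoder count e_captured: the material travels with the belt, so the
    # target moves along the (unit) belt direction in the robot frame.
    d = (e_now - e_captured) * mm_per_count
    return (rp_captured[0] + belt_dir[0] * d,
            rp_captured[1] + belt_dir[1] * d)
```

The controller would re-evaluate this prediction each cycle so the robot converges on the moving material rather than on its stale captured position.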
Example 6
Referring to figs. 3, 5 and 6, this embodiment implements an automatic robot tracking function based on the line material position identification data of embodiment 2.
The robot moving alignment method is as follows: the rotation-translation matrix between the scanning module coordinate system M and the robot base coordinate system R is T (equation image not reproduced); the material coordinates Mp are transformed through this rotation-translation matrix to obtain the coordinates RP of the material in the robot base coordinate system R (equation image not reproduced).
With the line conveyor belt 1 set in the robot coordinate system R, the coordinate value of the product corner point O in the robot coordinate system R is solved for the moment when the product flows to point A. As can be seen from embodiment 1, the position of point A in the scanning module coordinate system M is given by the corresponding equation (equation image not reproduced); the coordinates of point O follow from it (equation image not reproduced); the position of point O in the robot coordinate system R is then obtained through the rotation-translation matrix T (equation image not reproduced).
The rotation-translation matrix T (equation image not reproduced) is obtained by calibrating the robot coordinate system R against the motor coordinate system M. The specific calibration method is not an improvement point of this patent and can be implemented according to the prior art in the field, such as the calibration method disclosed in CN 201811532603.0.
Based on the coordinate RP, the robot automatically tracks the face paper on the line conveyor belt 1.
Example 7
A material follow-up laminating system for the robot automatic tracking function of embodiments 5 and 6, comprising: a production line material position identification system, and a robot for placing workpieces onto the material. The rotation-translation matrix between the scanning module coordinate system M and the robot base coordinate system R is T (equation image not reproduced); the material coordinates Mp are transformed through this rotation-translation matrix to obtain the coordinates RP of the material in the robot base coordinate system R (equation image not reproduced).
Variations and modifications of the above-described embodiments may occur to those skilled in the art in light of the above disclosure and teachings. Therefore, the present invention is not limited to the specific embodiments disclosed and described above; modifications and variations falling within the scope of the appended claims are also intended to be covered.

Claims (10)

1. A method for identifying the position of a material in a production line, characterized in that:
a first position point (P1) is acquired when a material sensor (2.1) on a scanning module (2) on the upper side of a line conveyor belt (1) is triggered by the front edge of the moving material (S);
a servo driver of the scanning module (2) drives the material sensor (2.1) to reciprocate along the direction of one side edge of the material (S) to obtain a second position point (P2) and a third position point (P3);
an encoder (3) following the line conveyor belt (1) records the encoder positions (E1, E2, E3) at the three position points (P1, P2, P3);
a controller (4) calculates the pose Mp of a fixed reference point of the material (S) in a scanning module coordinate system M from the motor positions (M1, M2, M3) of the scanning module (2) corresponding to the three position points (P1, P2, P3) and the position data of the corresponding encoder (3).
2. The production line material position identification method according to claim 1, wherein the pose Mp of the fixed reference point of the material in the scanning module coordinate system M is calculated through geometric relations.
3. A method for identifying the position of a material in a production line, characterized in that:
a first position point (P1) is acquired when a material sensor (2.1) on a scanning module (2) on the upper side of a line conveyor belt (1) is triggered by the front edge of the moving material (S);
a servo driver of the scanning module (2) drives the material sensor (2.1) to reciprocate along the direction of one side edge of the material (S) to obtain a second position point (P2), a third position point (P3) and a fourth position point (P4) on the rear edge of the material (S);
an encoder (3) following the line conveyor belt (1) records the encoder positions (E1, E2, E3, E4) at the four position points (P1, P2, P3, P4);
a controller (4) calculates the pose Mp of the material (S) in a scanning module coordinate system M from the motor positions (M1, M2, M3, M4) of the scanning module (2) corresponding to the four position points (P1, P2, P3, P4) and the position data of the corresponding encoder (3).
4. The production line material position identification method according to claim 3, wherein the pose Mp of the fixed reference point of the material in the scanning module coordinate system M is calculated through geometric relations.
5. The method according to claim 1 or 3, characterized in that the material sensor (2.1) is a color code sensor or an optical fiber sensor.
6. A production line material position identification system, characterized by comprising:
a controller (4);
a scanning module (2) comprising a servo driver, a motor (2.2) and a material sensor (2.1), the material sensor (2.1) being arranged on the upper side of a line conveyor belt (1), a first position point (P1) being obtained when the material sensor (2.1) is triggered by the front edge of a material (S) moving on the line conveyor belt (1);
the servo driver driving the material sensor (2.1) to reciprocate along the direction of one side edge of the material (S) to obtain a second position point (P2) and a third position point (P3); and
an encoder (3) following the conveyor belt (1) for recording the encoder positions (E1, E2, E3) at the three position points (P1, P2, P3);
wherein the controller (4) calculates the pose Mp of a fixed reference point of the material in a scanning module coordinate system M through geometric relations, from the motor positions (M1, M2, M3) of the scanning module (2) corresponding to the three position points (P1, P2, P3) and the corresponding encoder positions (E1, E2, E3).
7. A production line material position identification system, characterized by comprising:
a controller (4);
a scanning module (2) comprising a servo driver, a motor (2.2) and a material sensor (2.1), the material sensor (2.1) being arranged on the upper side of a line conveyor belt (1), a first position point (P1) being obtained when the material sensor (2.1) is triggered by the front edge of a material (S) moving on the line conveyor belt (1);
the servo driver driving the material sensor (2.1) to reciprocate along the direction of one side edge of the material (S) to obtain a second position point (P2), a third position point (P3) and a fourth position point (P4) on the rear edge of the material (S); and
an encoder (3) following the line conveyor belt (1) for recording the encoder positions (E1, E2, E3, E4) at the four position points (P1, P2, P3, P4);
wherein the controller (4) calculates the pose Mp of a fixed reference point of the material in a scanning module coordinate system M through geometric relations, from the motor positions (M1, M2, M3, M4) of the scanning module (2) corresponding to the four position points (P1, P2, P3, P4) and the corresponding encoder positions (E1, E2, E3, E4).
8. The production line material position identification system according to claim 6 or 7, characterized in that the follow-up encoder (3) of the line conveyor belt (1) is connected to an encoder input acquisition interface of the servo driver, and the material sensor is connected to a high-speed input signal point of the servo driver.
9. A robot moving alignment method, wherein the rotation-translation matrix between the scanning module coordinate system M and the robot base coordinate system R is T (equation image not reproduced); characterized in that the material coordinates Mp obtained by the method of any one of claims 1 to 5 are transformed through the rotation-translation matrix (equation image not reproduced) to obtain the coordinates of the material in the robot base coordinate system R (equation image not reproduced).
10. A material follow-up laminating system, characterized by comprising: the production line material position identification system of claim 6 or 7, and a robot for placing workpieces onto the material (S); the rotation-translation matrix between the scanning module coordinate system M and the robot base coordinate system R is T (equation image not reproduced); the material coordinates Mp obtained by the method of claim 2 are transformed through the rotation-translation matrix (equation image not reproduced) to obtain the coordinates of the material in the robot base coordinate system R (equation image not reproduced).
CN202111021321.6A 2021-09-01 2021-09-01 Assembly line material position identification and robot movement alignment method and system Active CN113664834B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111021321.6A CN113664834B (en) 2021-09-01 2021-09-01 Assembly line material position identification and robot movement alignment method and system


Publications (2)

Publication Number Publication Date
CN113664834A true CN113664834A (en) 2021-11-19
CN113664834B CN113664834B (en) 2022-09-30

Family

ID=78547984

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111021321.6A Active CN113664834B (en) 2021-09-01 2021-09-01 Assembly line material position identification and robot movement alignment method and system

Country Status (1)

Country Link
CN (1) CN113664834B (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160129594A1 (en) * 2014-11-10 2016-05-12 Faro Technologies, Inc. Human-centric robot with noncontact measurement device
CN109732604A (en) * 2019-01-21 2019-05-10 成都宇俊盛科技有限公司 A method of the mobile contraposition of manipulator is carried out by electric eye
CN110253582A (en) * 2019-06-28 2019-09-20 成都宇俊盛科技有限公司 A kind of method of the mobile contraposition of manipulator on assembly line
CN110281238A (en) * 2019-06-17 2019-09-27 深圳视觉龙智能传感器有限公司 Assembly line multi-robot scaling method, device, computer equipment and storage medium
CN110385889A (en) * 2018-04-20 2019-10-29 常州数控技术研究所 A kind of device and method of dynamic attachment vision-based detection and correction
CN212923677U (en) * 2020-07-27 2021-04-09 深圳市恒科通机器人有限公司 Positioning device
CN113319833A (en) * 2021-05-19 2021-08-31 三一建筑机器人(西安)研究院有限公司 Cartesian coordinate robot calibration method and assembly system


Also Published As

Publication number Publication date
CN113664834B (en) 2022-09-30


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant