CN112720471A - Automatic spraying method for realizing online tracking of furniture industry based on robot vision - Google Patents

Automatic spraying method for realizing online tracking of furniture industry based on robot vision

Info

Publication number
CN112720471A
CN112720471A
Authority
CN
China
Prior art keywords
robot
data
track
acquired
program
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011510259.2A
Other languages
Chinese (zh)
Inventor
赵闯
徐强
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Cma Wuhu Robotics Co ltd
Original Assignee
Cma Wuhu Robotics Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Cma Wuhu Robotics Co ltd filed Critical Cma Wuhu Robotics Co ltd
Priority to CN202011510259.2A
Publication of CN112720471A
Legal status: Pending

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1656 Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1664 Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J11/00 Manipulators not otherwise provided for
    • B25J11/0075 Manipulators for painting or coating
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1694 Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697 Vision controlled systems

Abstract

The invention relates to the field of online offset calculation for industrial robots, and in particular to an automatic spraying method for realizing online tracking in the furniture industry based on robot vision. The method comprises the following specific steps: S1, the robot completes its preparation work; S2, judging whether image data need to be acquired; S3, processing the image data acquired in step S2; S4, judging whether a robot track needs to be generated; S5, determining the generated robot track; S6, the robot finishes spraying. Compared with the prior art, the online tracking automatic spraying technique adds the offset to the track planning through interpolation during the spraying process, so that even if the object keeps moving, the robot can spray along the actual track of the object.

Description

Automatic spraying method for realizing online tracking of furniture industry based on robot vision
Technical Field
The invention relates to the field of online offset calculation for industrial robots, and in particular to an automatic spraying method for realizing online tracking in the furniture industry based on robot vision.
Background
Spraying robots can carry out automatic spraying or painting. China has developed several models of spraying robot and put them into use, all with good economic benefits and results. A spraying robot mainly consists of a robot body, a computer and the corresponding control system; a hydraulically driven spraying robot also includes a hydraulic oil source, such as an oil pump, an oil tank and a motor. Spraying robots are widely used in the automotive, instrument, electrical appliance and furniture fields. In the application of robotic spraying of small panel furniture, the fixed-point spraying mode of the robot increasingly fails to meet efficiency requirements, and the online tracking function is increasingly sought.
Disclosure of Invention
In order to solve the problems, the invention provides an automatic spraying method for realizing online tracking in the furniture industry based on robot vision.
The automatic spraying method for realizing the online tracking of the furniture industry based on the robot vision comprises the following specific steps:
s1, the robot mainly completes preparation work;
s2, judging whether the image data needs to be acquired:
a: if image data needs to be acquired, the robot proceeds to step S3;
b: if image data do not need to be acquired, the cycle restarts;
S3, processing the image data acquired in step S2: the acquired pixel data are converted into distance data, unnecessary data are filtered out, and the data of the workpiece to be sprayed are generated;
s4, judging whether a robot track is generated:
a: if a robot track needs to be generated, the track is generated from the image data for trajectory planning;
b: if no robot track needs to be generated, the cycle restarts;
S5, after the robot track is confirmed, the track is generated from the image data and the resulting trajectory planning program is sent to the robot, which then performs the spraying work; otherwise, the cycle restarts;
S6, the robot finishes spraying.
In step S2, a light curtain is used to acquire the data required for the object plate, as follows:
a: first, a connection between the robot and the light curtain is established over RS485 communication, and the transmitting end of the light curtain sends optical beams to the receiving end;
b: when an optical axis is blocked by an object, the stored value for that axis is 1;
c: when an optical axis is not blocked, the stored value for that axis is 0;
d: the object data are obtained in this way; the corresponding data are transmitted to the robot over RS485, and the robot converts them into a grayscale image.
The acquired image data are processed in step S3 as follows:
a: the image is processed into a single object, which contains all the attributes of the workpiece;
b: unnecessary interference points are filtered out, and the processed data are stored in the corresponding structure in preparation for path planning.
The attributes recorded in the object include the length, width and height of the workpiece.
The trajectory planning in step S4 is to plan a path according to the data acquired in step S3, and includes the following steps:
a: generating Line and Plane instructions by path planning, and storing the generated instructions into Line and Plane structural bodies at first;
b: after the path planning acquires corresponding data, generating a program according to the format of the target program;
c: taking the CMA robot object program as an example, the program mainly includes a program header, a subprogram header, subprogram data, a command header, and command data, and the data acquired from the light curtain is generated according to the program format described above.
In step S5, when the robot is in automatic mode, it responds with the following behaviours:
a: the robot starts working when the detected encoder value exceeds the sum of the Conveyor Start distance and the Active mm value;
b: when the robot track is detected to exceed the working range, the robot stops working.
In step S5, when the robot is within the spraying interval, the actual position of the workpiece is compared with the original position and the Cartesian offset is calculated; during the automatic cycle this offset is added to the Cartesian set point before the inverse kinematics, and the offset value is computed in real time once the robot detects that the object has entered the spraying area.
The calculation steps of the offset value are as follows:
the offset interpolation which needs to be added in the tracking motion track of the robot is equal to the initial value of the head file which is self-defined as the position of conveying and moving, the program segment starting position and the self-defined position;
the distance of the robot X direction deviation is equal to the position of the conveying movement and the angle of the conveying operation of the step Sin;
the distance of the robot in the Y direction is equal to the position of the conveying movement and the angle of Cos conveying operation;
the robot has no offset in the Z direction.
The invention has the following beneficial effects: compared with the prior art, the online tracking automatic spraying technique adds the offset to the track planning through interpolation during the spraying process, so that even if the object keeps moving, the robot can spray along the actual track of the object.
Drawings
The invention is further illustrated with reference to the following figures and examples.
FIG. 1 is a block flow diagram of the method of the present invention;
FIG. 2 is a schematic diagram of the robot tracking of the present invention;
fig. 3 is an interpolation schematic diagram of a robot according to the present invention.
Detailed Description
In order to make the technical means, creative features, objectives and effects of the invention easy to understand, the invention is further explained below.
As shown in fig. 1, the automatic spraying method for realizing online tracking of furniture industry based on robot vision comprises the following specific steps:
s1, the robot mainly completes preparation work;
s2, judging whether the image data needs to be acquired:
a: if image data needs to be acquired, the robot proceeds to step S3;
b: if image data do not need to be acquired, the cycle restarts;
S3, processing the image data acquired in step S2: the acquired pixel data are converted into distance data, unnecessary data are filtered out, and the data of the workpiece to be sprayed are generated;
s4, judging whether a robot track is generated:
a: if a robot track needs to be generated, the track is generated from the image data for trajectory planning;
b: if no robot track needs to be generated, the cycle restarts;
S5, after the robot track is confirmed, the track is generated from the image data and the resulting trajectory planning program is sent to the robot, which then performs the spraying work; otherwise, the cycle restarts;
S6, the robot finishes spraying.
Compared with the prior art, the online tracking automatic spraying technique adds the offset to the track planning through interpolation during the spraying process, so that even if the object keeps moving, the robot can spray along the actual track of the object.
In step S2, a light curtain is used to acquire the data required for the object plate, as follows (a minimal sketch of this acquisition step follows the list):
a: first, a connection between the robot and the light curtain is established over RS485 communication, and the transmitting end of the light curtain sends optical beams to the receiving end;
b: when an optical axis is blocked by an object, the stored value for that axis is 1;
c: when an optical axis is not blocked, the stored value for that axis is 0;
d: the object data are obtained in this way; the corresponding data are transmitted to the robot over RS485, and the robot converts them into a grayscale image.
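The sketch below illustrates this acquisition step in the C style of the formulas later in this description. The frame size NUM_AXES, the helper read_light_curtain_frame() and the 0/255 grayscale mapping are assumptions made only for illustration and are not specified by the patent.

#include <stdint.h>

#define NUM_AXES 64   /* number of optical axes in the light curtain (assumed) */

/* Hypothetical RS485 read: fills 'axes' with 1 (blocked) or 0 (clear) for one scan. */
extern int read_light_curtain_frame(uint8_t axes[NUM_AXES]);

/* Convert one scan of axis states into a grayscale image row:
   a blocked axis (object present) becomes 255, a clear axis becomes 0. */
static void scan_to_gray_row(const uint8_t axes[NUM_AXES], uint8_t row[NUM_AXES])
{
    for (int i = 0; i < NUM_AXES; ++i)
        row[i] = axes[i] ? 255u : 0u;
}

/* Accumulate successive scans while the object passes the curtain,
   building a 2-D grayscale image of the plate (one row per scan). */
int acquire_object_image(uint8_t image[][NUM_AXES], int max_rows)
{
    uint8_t axes[NUM_AXES];
    int rows = 0;
    while (rows < max_rows && read_light_curtain_frame(axes) == 0) {
        scan_to_gray_row(axes, image[rows]);
        ++rows;
    }
    return rows; /* number of scan lines captured */
}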
The acquired image data are processed in step S3 as follows (see the sketch after this list):
a: the image is processed into a single object, which contains all the attributes of the workpiece;
b: unnecessary interference points are filtered out, and the processed data are stored in the corresponding structure in preparation for path planning.
The attributes recorded in the object include the length, width and height of the workpiece.
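The following sketch shows one way the scanned image could be reduced to a single object structure holding the attributes named above. The pixel-to-millimetre factors, the interference-point threshold and all identifiers are assumed values used only for illustration.

#include <stdint.h>

typedef struct {
    double length_mm;  /* extent along the conveying direction */
    double width_mm;   /* extent across the light curtain */
    double height_mm;  /* workpiece height (e.g. from a separate measurement) */
} workpiece_t;

/* Assumed conversion factors from pixels to distance. */
#define MM_PER_ROW  2.0     /* conveyor travel per scan line */
#define MM_PER_COL  5.0     /* optical axis pitch of the light curtain */
#define MIN_BLOB_PIXELS 10  /* rows with fewer hits are treated as interference */

/* Reduce the grayscale image to one object: find the bounding box of
   occupied pixels while ignoring sparse interference points. */
workpiece_t extract_workpiece(const uint8_t *image, int rows, int cols, double height_mm)
{
    int first_row = -1, last_row = -1, first_col = cols, last_col = -1;
    for (int r = 0; r < rows; ++r) {
        int hits = 0;
        for (int c = 0; c < cols; ++c)
            if (image[r * cols + c]) ++hits;
        if (hits < MIN_BLOB_PIXELS)      /* filter interference points */
            continue;
        if (first_row < 0) first_row = r;
        last_row = r;
        for (int c = 0; c < cols; ++c) {
            if (!image[r * cols + c]) continue;
            if (c < first_col) first_col = c;
            if (c > last_col)  last_col = c;
        }
    }
    workpiece_t w = { 0.0, 0.0, height_mm };
    if (first_row >= 0 && last_col >= first_col) {
        w.length_mm = (last_row - first_row + 1) * MM_PER_ROW;
        w.width_mm  = (last_col - first_col + 1) * MM_PER_COL;
    }
    return w;
}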
In the online tracking automatic spraying technique of the invention, the object continues to move during the spraying process, so an interpolation calculation of the online tracking deviation is added to the generated standard program. The working principle of the online tracking deviation and the interpolation calculation of the deviation are described below.
As shown in fig. 2, Conveyor Start indicates the moment the suspended spray object reaches the light curtain scanning position, at which the robot records the current encoder value; the Conveyor Start distance indicates the distance from the start of conveying; Active mm indicates the program activation distance. The trajectory planning in step S4 plans the path according to the data acquired in step S3, with the following specific steps (a data-structure sketch follows the list):
a: generating Line and Plane instructions by path planning, and storing the generated instructions into Line and Plane structural bodies at first;
b: after the path planning acquires corresponding data, generating a program according to the format of the target program;
c: taking the CMA robot object program as an example, the program mainly includes a program header, a subprogram header, subprogram data, a command header, and command data, and the data acquired from the light curtain is generated according to the program format described above.
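A data-structure sketch of this generation step is given below. The Line and Plane field names and the text format written by emit_program() are assumptions; the actual CMA target program format is defined by the robot controller and is not reproduced here.

#include <stdio.h>

/* A single Line instruction: move the spray gun from p0 to p1. */
typedef struct { double x0, y0, z0, x1, y1, z1; } line_cmd_t;

/* A Plane instruction: sweep a rectangular area defined by an origin and extents. */
typedef struct { double x, y, z, length, width; } plane_cmd_t;

/* Emit a target program: program header, one subprogram header, then the
   command headers and command data generated from the light-curtain data. */
void emit_program(FILE *out, const line_cmd_t *lines, int n_lines,
                  const plane_cmd_t *planes, int n_planes)
{
    fprintf(out, "PROGRAM SPRAY_TRACK\n");          /* program header */
    fprintf(out, "  SUB MAIN\n");                   /* subprogram header */
    for (int i = 0; i < n_lines; ++i)               /* command header + data */
        fprintf(out, "    LINE %.1f %.1f %.1f -> %.1f %.1f %.1f\n",
                lines[i].x0, lines[i].y0, lines[i].z0,
                lines[i].x1, lines[i].y1, lines[i].z1);
    for (int i = 0; i < n_planes; ++i)
        fprintf(out, "    PLANE %.1f %.1f %.1f L=%.1f W=%.1f\n",
                planes[i].x, planes[i].y, planes[i].z,
                planes[i].length, planes[i].width);
    fprintf(out, "  ENDSUB\n");
    fprintf(out, "ENDPROGRAM\n");
}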
In step S5, when the robot is in automatic mode, it responds with the following behaviours (see the sketch after this list):
a: the robot starts working when the detected encoder value exceeds the sum of the Conveyor Start distance and the Active mm value;
b: when the robot track is detected to exceed the working range, the robot stops working;
wherein the Conveyor active area represents the working range of the robot, and the Conveyor direction indicates whether the conveyor chain runs forward or in reverse.
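The two responses above can be condensed into the following sketch; the parameter names mirror the conveyor quantities described in this section, while the function itself is a hypothetical illustration rather than the controller program.

typedef enum { ROBOT_IDLE, ROBOT_SPRAYING, ROBOT_STOPPED } robot_state_t;

/* Decide the robot response in automatic mode.
   encoder_mm       : conveyor position read from the encoder (mm)
   conv_start_mm    : Conveyor Start distance (object reached the light curtain)
   active_mm        : program activation distance
   conv_active_area : working range of the robot along the conveyor (mm) */
robot_state_t automatic_mode_response(double encoder_mm, double conv_start_mm,
                                      double active_mm, double conv_active_area)
{
    double travel = encoder_mm - conv_start_mm;
    if (travel > conv_active_area)
        return ROBOT_STOPPED;                 /* track exceeds the working range */
    if (encoder_mm > conv_start_mm + active_mm)
        return ROBOT_SPRAYING;                /* start spraying */
    return ROBOT_IDLE;                        /* keep waiting for the object */
}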
When the track is generated automatically, the format of the template program for the CMA robot is determined first, and the track is generated according to that format. This format is the target program format and includes the program header and subprogram headers, so that the generated program can be executed by the robot; if the format is not correct, the generated program is invalid.
As shown in fig. 3, after the trajectory has been generated automatically, the invention applies offset processing to it to complete the online tracking work. Assuming the deviation calculated along the Conveyor direction is lt_shift, it is added to the Cartesian set point before the inverse kinematics during the automatic cycle. The Component 1 position indicates a program position without offset, the Line-Tracking shift indicates the program position after the offset, the Active position indicates the position at which the offset becomes active, and the Conveyor direction indicates whether the conveyor chain runs forward or in reverse.
In step S5, when the robot is within the spraying interval, the actual position of the workpiece is compared with the original position and the Cartesian offset is calculated; during the automatic cycle this offset is added to the Cartesian set point before the inverse kinematics, and the offset value is computed in real time once the robot detects that the object has entered the spraying area.
The calculation steps of the offset value are as follows:
the offset interpolation to be added to the robot tracking trajectory equals the conveyor travel position minus the program-segment start position minus the user-defined initial position from the header file;
the robot offset in the X direction equals the conveyor travel distance multiplied by the sine of the conveyor running angle;
the robot offset in the Y direction equals the conveyor travel distance multiplied by the cosine of the conveyor running angle;
the robot has no offset in the Z direction.
The calculation formula is as follows:
db_active_conv_mm = db_conveyor.pos_mm_flt - db_active_conv_start - (double)auto_cmp_head.lt_conv_pos;
lt_shift.x = db_active_conv_mm * sin(par_conv_angle_rad);
lt_shift.y = db_active_conv_mm * cos(par_conv_angle_rad);
lt_shift.z = 0.0;
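To show how these values would be used during the automatic cycle described above, the sketch below recomputes lt_shift in real time and adds it to the programmed Cartesian point before the inverse kinematics is called. The flattened variable names and the inverse_kinematics() call are assumptions standing in for the controller's internal interfaces.

#include <math.h>

typedef struct { double x, y, z; } cart_t;

/* Conveyor state and program constants, mirroring the names used in the formulas above. */
extern double db_conveyor_pos_mm_flt;    /* current conveyor position from the encoder */
extern double db_active_conv_start;      /* conveyor position at program-segment start */
extern double auto_cmp_head_lt_conv_pos; /* user-defined position from the header file */
extern double par_conv_angle_rad;        /* conveyor running angle */

/* Hypothetical inverse-kinematics call provided by the robot controller. */
extern void inverse_kinematics(const cart_t *set_point, double joints_out[6]);

/* One interpolation tick of the automatic cycle: compute lt_shift in real time
   and add it to the programmed Cartesian point before the kinematics. */
void tracked_set_point(const cart_t *programmed, double joints_out[6])
{
    double db_active_conv_mm = db_conveyor_pos_mm_flt
                             - db_active_conv_start
                             - auto_cmp_head_lt_conv_pos;

    cart_t lt_shift = {
        db_active_conv_mm * sin(par_conv_angle_rad),
        db_active_conv_mm * cos(par_conv_angle_rad),
        0.0                                   /* no offset in the Z direction */
    };

    cart_t shifted = { programmed->x + lt_shift.x,
                       programmed->y + lt_shift.y,
                       programmed->z + lt_shift.z };
    inverse_kinematics(&shifted, joints_out);
}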
the foregoing shows and describes the general principles, essential features, and advantages of the invention. It will be understood by those skilled in the art that the present invention is not limited to the embodiments described above, which are merely illustrative of the principles of the invention, but that various changes and modifications may be made without departing from the spirit and scope of the invention, which fall within the scope of the invention as claimed. The scope of the invention is defined by the appended claims and equivalents thereof.

Claims (8)

1. The automatic spraying method for realizing the online tracking of the furniture industry based on the robot vision is characterized in that: the method comprises the following specific steps:
s1, the robot mainly completes preparation work;
s2, judging whether the image data needs to be acquired:
a: if image data needs to be acquired, the robot proceeds to step S3;
b: if image data do not need to be acquired, the cycle restarts;
S3, processing the image data acquired in step S2: the acquired pixel data are converted into distance data, unnecessary data are filtered out, and the data of the workpiece to be sprayed are generated;
s4, judging whether a robot track is generated:
a: if a robot track needs to be generated, the track is generated from the image data for trajectory planning;
b: if no robot track needs to be generated, the cycle restarts;
S5, after the robot track is confirmed, the track is generated from the image data and the resulting trajectory planning program is sent to the robot, which then performs the spraying work; otherwise, the cycle restarts;
S6, the robot finishes spraying.
2. The automatic spraying method for realizing the online tracking of the furniture industry based on the robot vision according to claim 1, characterized in that: in step S2, a light curtain is used to acquire the data required for the object plate, as follows:
a: first, a connection between the robot and the light curtain is established over RS485 communication, and the transmitting end of the light curtain sends optical beams to the receiving end;
b: when an optical axis is blocked by an object, the stored value for that axis is 1;
c: when an optical axis is not blocked, the stored value for that axis is 0;
d: the object data are obtained in this way; the corresponding data are transmitted to the robot over RS485, and the robot converts them into a grayscale image.
3. The automatic spraying method for realizing the online tracking of the furniture industry based on the robot vision according to claim 1, characterized in that: the acquired image data are processed in step S3 as follows:
a: the image is processed into a single object, which contains all the attributes of the workpiece;
b: unnecessary interference points are filtered out, and the processed data are stored in the corresponding structure in preparation for path planning.
4. The automatic spraying method for realizing the online tracking of the furniture industry based on the robot vision according to claim 3, characterized in that: the attributes recorded in the object include the length, width and height of the workpiece.
5. The automatic spraying method for realizing the online tracking of the furniture industry based on the robot vision according to claim 1, characterized in that: the trajectory planning in step S4 is to plan a path according to the data acquired in step S3, and includes the following steps:
a: generating Line and Plane instructions by path planning, and storing the generated instructions into Line and Plane structural bodies at first;
b: after the path planning acquires corresponding data, generating a program according to the format of the target program;
c: taking the CMA robot object program as an example, the program mainly includes a program header, a subprogram header, subprogram data, a command header, and command data, and the data acquired from the light curtain is generated according to the program format described above.
6. The automatic spraying method for realizing the online tracking of the furniture industry based on the robot vision according to claim 1, characterized in that: in step S5, when the robot is in automatic mode, it responds with the following behaviours:
a: the robot starts working when the detected encoder value exceeds the sum of the Conveyor Start distance and the Active mm value;
b: when the robot track is detected to exceed the working range, the robot stops working.
7. The automatic spraying method for realizing the online tracking of the furniture industry based on the robot vision according to claim 1, characterized in that: in step S5, when the robot is within the spraying interval, the actual position of the workpiece is compared with the original position and the Cartesian offset is calculated; during the automatic cycle this offset is added to the Cartesian set point before the inverse kinematics, and the offset value is computed in real time once the robot detects that the object has entered the spraying area.
8. The automatic spraying method for realizing the online tracking of the furniture industry based on the robot vision according to claim 7, characterized in that: the offset value is calculated as follows:
the offset interpolation to be added to the robot tracking trajectory equals the conveyor travel position minus the program-segment start position minus the user-defined initial position from the header file;
the robot offset in the X direction equals the conveyor travel distance multiplied by the sine of the conveyor running angle;
the robot offset in the Y direction equals the conveyor travel distance multiplied by the cosine of the conveyor running angle;
the robot has no offset in the Z direction.
CN202011510259.2A 2020-12-18 2020-12-18 Automatic spraying method for realizing online tracking of furniture industry based on robot vision Pending CN112720471A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011510259.2A CN112720471A (en) 2020-12-18 2020-12-18 Automatic spraying method for realizing online tracking of furniture industry based on robot vision

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011510259.2A CN112720471A (en) 2020-12-18 2020-12-18 Automatic spraying method for realizing online tracking of furniture industry based on robot vision

Publications (1)

Publication Number Publication Date
CN112720471A (en) 2021-04-30

Family

ID=75603381

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011510259.2A Pending CN112720471A (en) 2020-12-18 2020-12-18 Automatic spraying method for realizing online tracking of furniture industry based on robot vision

Country Status (1)

Country Link
CN (1) CN112720471A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113829344A (en) * 2021-09-24 2021-12-24 深圳群宾精密工业有限公司 Visual guide track generation method, device, equipment and medium suitable for flexible product
CN114474043A (en) * 2021-12-20 2022-05-13 埃夫特智能装备股份有限公司 Method for realizing visual intelligent spraying of bedside

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070179710A1 (en) * 2005-12-31 2007-08-02 Nuctech Company Limited Deviation-correction system for positioning of moving objects and motion tracking method thereof
CN105964453A (en) * 2016-07-14 2016-09-28 青岛金光鸿智能机械电子有限公司 Spraying system and method based on vision positioning
JP2017124468A (en) * 2016-01-14 2017-07-20 キヤノン株式会社 Method of controlling robot, method of manufacturing component, robot device, program, and recording medium
CN107544415A (en) * 2017-09-18 2018-01-05 上海发那科机器人有限公司 A kind of positioning compensation system
CN109848951A (en) * 2019-03-12 2019-06-07 易思维(天津)科技有限公司 Automatic processing equipment and method for large workpiece
CN109958263A (en) * 2019-05-09 2019-07-02 广东博智林机器人有限公司 Spray robot
CN111026164A (en) * 2019-12-24 2020-04-17 南京埃斯顿机器人工程有限公司 Robot target tracking trajectory planning method
CN111841970A (en) * 2020-07-30 2020-10-30 武汉湾流科技股份有限公司 Robot based on laser ranging and optimization method of paint spraying path


Similar Documents

Publication Publication Date Title
US4590577A (en) Welding robot controlling method
CN112720471A (en) Automatic spraying method for realizing online tracking of furniture industry based on robot vision
US8706300B2 (en) Method of controlling a robotic tool
Huang et al. Dynamic compensation robot with a new high-speed vision system for flexible manufacturing
CN113427168A (en) Real-time welding seam tracking device and method for welding robot
CN109794382A (en) A kind of micro- coating robot of 3D and its coating method
Zhou et al. Autonomous acquisition of seam coordinates for arc welding robot based on visual servoing
CN114273726B (en) 3D vision guiding groove cutting method, device, equipment, system and storage medium
CN109623815B (en) Wave compensation double-robot system and method for unmanned salvage ship
CN112007789A (en) Prefabricated welding seam coating robot
CN114769988A (en) Welding control method and system, welding equipment and storage medium
CN116117373A (en) Intelligent welding method and system for small assembly components in ship
CN112720492B (en) Complex track fairing method and device for multi-axis robot, medium and electronic equipment
CN115254537B (en) Track correction method of glue spraying robot
CN108788544B (en) Welding seam initial point detection method based on structured light vision sensor
Luh A scheme for collision avoidance with minimum distance traveling for industrial robots
Wen et al. A 3D path following control scheme for robot manipulators
Nakhaeinia et al. Adaptive robotic contour following from low accuracy RGB-D surface profiling and visual servoing
CN113369686A (en) Intelligent welding system and method based on two-dimensional code visual teaching technology
JP3208722B2 (en) Tracking device for manipulator and tracking control method
CN115515760A (en) Robot control device
CN109807891A (en) Equipment moving processing method and processing device
CN110865657A (en) System and method for controlling contour track tracking on conveyor belt
Rodionov et al. Methods of automatic correction of technological trajectory of laser welding complex by means of computer vision
KR101312003B1 (en) Interpolation method of robot comprising r-axis

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination