CN114463495A - Intelligent spraying method and system based on machine vision technology - Google Patents

Intelligent spraying method and system based on machine vision technology

Info

Publication number
CN114463495A
CN114463495A
Authority
CN
China
Prior art keywords
spraying
spray
point cloud
cloud data
cameras
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
CN202210085256.1A
Other languages
Chinese (zh)
Inventor
冯洲 (Feng Zhou)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Derunfu Environmental Protection Technology Co ltd
Original Assignee
Shenzhen Derunfu Environmental Protection Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Derunfu Environmental Protection Technology Co ltd
Priority: CN202210085256.1A
Publication: CN114463495A
Legal status: Withdrawn

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 17/00: Three-dimensional [3D] modelling, e.g. data description of 3D objects
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B05: SPRAYING OR ATOMISING IN GENERAL; APPLYING FLUENT MATERIALS TO SURFACES, IN GENERAL
    • B05B: SPRAYING APPARATUS; ATOMISING APPARATUS; NOZZLES
    • B05B 13/00: Machines or plants for applying liquids or other fluent materials to surfaces of objects or other work by spraying, not covered by groups B05B1/00 - B05B11/00
    • B05B 13/02: Means for supporting work; arrangement or mounting of spray heads; adaptation or arrangement of means for feeding work
    • B05B 13/04: the spray heads being moved during the spraying operation
    • B05B 13/0431: with spray heads moved by robots or articulated arms, e.g. for applying liquid or other fluent material to 3D surfaces

Abstract

The invention relates to the technical field of intelligent spraying and discloses an intelligent spraying method and system based on machine vision technology, comprising the following steps: S1, acquiring point cloud data of the sprayed part from different orientations; S2, synthesizing the point cloud data of the different orientations to obtain a three-dimensional model of the sprayed part; and S3, based on the three-dimensional model of the sprayed part, calculating in a preset manner, according to the surface characteristics of the model, the spraying path of the spraying area and the orientation of the spray gun during spraying. With this method and system, the spraying robot does not need to be programmed for each sprayed part: the spraying path is generated dynamically from the surface of the sprayed part, so no manual intervention is needed even when the sprayed part is changed. The three-dimensional model of the sprayed part is recognized automatically, without manual intervention, so different sprayed parts can be sprayed without professional modelers having to build a three-dimensional model of each part.

Description

Intelligent spraying method and system based on machine vision technology
Technical Field
The invention relates to the technical field of intelligent spraying, in particular to an intelligent spraying method and system based on a machine vision technology.
Background
Currently, there are two types of spraying robots: teach-playback robots and off-line programmed robots. Both achieve a degree of automation and replace the work of a human painter, but they share the following disadvantages. Their adaptability is poor: they must be reprogrammed for every new sprayed part, and even for the same part if its position or angle changes; the programming process requires skilled manual work and is tedious and inefficient. Precision is also hard to control, since it is largely judged by the eye of the person doing the teaching. Techniques that model sprayed parts with three-dimensional imaging have appeared recently, but they are not yet mature: the modeling process depends on human participation, and the three-dimensional model must be manually edited and optimized in post-processing. On one hand, this means the process cannot be fully automated and requires professional three-dimensional model editors; on the other hand, manual editing inevitably introduces errors, so the model is not accurate enough and the final spraying quality suffers.
Disclosure of Invention
Technical problem to be solved
Aiming at the defects of the prior art, the invention provides an intelligent spraying method and system based on a machine vision technology, and solves the problems in the background technology.
(II) technical scheme
In order to achieve the above purpose, the invention provides the following technical scheme: an intelligent spraying method and system based on machine vision technology, comprising the following steps:
S1, acquiring point cloud data of the sprayed part from different orientations;
S2, synthesizing the point cloud data of the different orientations to obtain a three-dimensional model of the sprayed part;
S3, based on the three-dimensional model of the sprayed part, calculating in a preset manner, according to the surface characteristics of the model, the spraying path of the spraying area and the orientation of the spray gun during spraying;
and S4, controlling the robot to a specified position and moving the spray gun along the generated spraying path, with the computed gun orientation, to spray the spraying area of the sprayed part.
Preferably, the step S1 specifically includes:
capturing the sprayed part from different orientations with a plurality of depth cameras to acquire point cloud data of each orientation, wherein each depth camera captures the point cloud data of one orientation of the sprayed part.
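As an illustration of how each depth camera's frame yields point cloud data for one orientation, the sketch below back-projects a depth image through the pinhole camera model. The intrinsics (fx, fy, cx, cy) and the toy depth map are illustrative assumptions; real values would come from each depth camera's calibrated intrinsics.

```python
import numpy as np

def depth_to_point_cloud(depth, fx, fy, cx, cy):
    """Back-project a depth image (metres) into a camera-frame N x 3 point
    cloud with the pinhole model: X = (u - cx) * Z / fx, Y = (v - cy) * Z / fy."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    pts = np.stack([x, y, depth], axis=-1).reshape(-1, 3)
    return pts[pts[:, 2] > 0]  # drop invalid (zero-depth) pixels

# Toy 2x2 depth map; one pixel has no depth reading.
depth = np.array([[1.0, 1.0],
                  [0.0, 2.0]])
cloud = depth_to_point_cloud(depth, fx=500.0, fy=500.0, cx=0.5, cy=0.5)
```

One such cloud is produced per camera; the calibration in step S0 then relates the clouds of the different cameras to one another.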
Preferably, step S1 is preceded by:
S0, presetting the positions from which the plurality of cameras capture the sprayed part and, by means of a calibration board and a vision algorithm library, calculating the two-dimensional image coordinates of the checkerboard corner points in the image plane and, through the pinhole imaging principle, establishing the mapping relation between each camera's coordinates and world coordinates.
Preferably, the step S0 specifically includes:
designing a checkerboard cube, and defining its body center as the origin of the world coordinate system;
treating every two adjacent cameras as a camera pair, and acquiring, for each pair, images of the checkerboard faces of the cube visible in the pair's common viewing area;
calculating the two-dimensional image coordinates of the corner points from the checkerboard images using the OpenCV calibration algorithm;
calculating 3D point cloud coordinate data from the two-dimensional image coordinates using an algorithm in the OpenKinect library;
calculating the extrinsic transformation matrix between the two cameras of each pair from the corresponding 3D point pairs using the SVD algorithm in the PCL (Point Cloud Library);
and optimizing the camera extrinsic matrices to obtain, for each camera, an extrinsic transformation matrix relative to the reference camera, which yields the three-dimensional spatial relationship between the cameras.
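The pairwise extrinsic estimation step can be sketched with the standard SVD-based (Kabsch) rigid alignment that point cloud libraries such as PCL implement. The NumPy version below is a stand-in for that routine, not the patent's own code, and the corner coordinates and ground-truth pose are synthetic:

```python
import numpy as np

def estimate_extrinsics(src, dst):
    """Least-squares rigid transform (R, t) such that dst is approximately
    R @ src + t, from corresponding 3D checkerboard corners seen by the
    two cameras of a pair (Kabsch algorithm via SVD)."""
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)           # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cd - R @ cs
    return R, t

# Synthetic corners and a known ground-truth pose to check recovery.
src = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0],
                [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]])
R_true = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
t_true = np.array([0.5, -0.2, 1.0])
dst = src @ R_true.T + t_true
R_est, t_est = estimate_extrinsics(src, dst)
```

With noise-free correspondences the pose is recovered exactly (up to floating point); with real corner detections the same formula gives the least-squares best rigid fit.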
Preferably, the step S2 specifically includes:
selecting the pose of one of the cameras as the reference position coordinate, unifying the point cloud data of the different orientations of the sprayed part captured by each camera to that reference through the pairwise transformation relations obtained by calibration, and finally transforming the data into the world coordinate system;
and synthesizing the point clouds transformed into the world coordinate system to obtain the three-dimensional model of the sprayed part.
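Once the extrinsics are known, unifying the per-camera clouds amounts to applying each camera's rigid transform and stacking the results. A minimal sketch, with hypothetical extrinsics for two cameras facing the part from opposite sides:

```python
import numpy as np

def to_world(cloud, R, t):
    """Map an N x 3 camera-frame cloud into the world frame: p_w = R @ p_c + t."""
    return cloud @ R.T + t

# Hypothetical calibrated extrinsics: camera 1 at the world origin, camera 2
# rotated 180 degrees about the y-axis and placed 2 m along the world z-axis.
R1, t1 = np.eye(3), np.zeros(3)
R2 = np.array([[-1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, -1.0]])
t2 = np.array([0.0, 0.0, 2.0])

# The same surface point, 1 m in front of each camera in its own frame.
seen_by_1 = np.array([[0.0, 0.0, 1.0]])
seen_by_2 = np.array([[0.0, 0.0, 1.0]])
merged = np.vstack([to_world(seen_by_1, R1, t1), to_world(seen_by_2, R2, t2)])
```

After the merge the stacked cloud is a single model in world coordinates; both observations above land on the same world point, as expected for overlapping views.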
Preferably, the step S3 specifically includes:
S31, reconstructing the point cloud data of the spraying area with triangular patches, based on the three-dimensional model of the sprayed part, to form the surface of the spraying area;
S32, calculating, according to the surface curvature and the size of the spraying area, the three-dimensional positions of the required spraying control points on that surface and the normal of each control point relative to the spraying surface;
and S33, connecting all the spraying control points, in a manner that approximates a human painter's strokes, to form the spraying path, wherein each control point is a position the spray gun passes through during spraying and the normal direction is the orientation of the spray gun.
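The patent does not fix a particular traversal pattern for step S33, so the sketch below assumes a boustrophedon (back-and-forth) ordering of the control points, which is one common way to approximate a human painter's strokes; the grid of control points and their normals are synthetic.

```python
import numpy as np

def plan_path(points, normals, row_eps=1e-6):
    """Order spray control points into a boustrophedon path: group points
    into rows by their y coordinate, alternate the sweep direction row by
    row, and carry each point's surface normal as the gun orientation."""
    order = np.lexsort((points[:, 0], points[:, 1]))  # sort by y, then x
    pts, nrm = points[order], normals[order]
    path, i, flip = [], 0, False
    while i < len(pts):
        j = i
        while j < len(pts) and abs(pts[j, 1] - pts[i, 1]) < row_eps:
            j += 1                      # extend the current row
        row = list(range(i, j))
        if flip:
            row.reverse()               # alternate sweep direction
        path.extend(row)
        flip, i = not flip, j
    return pts[path], nrm[path]

# Synthetic 2x2 grid of control points on a flat patch, all normals +z.
points = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0],
                   [0.0, 1.0, 0.0], [1.0, 1.0, 0.0]])
normals = np.tile([0.0, 0.0, 1.0], (4, 1))
path_pts, path_nrm = plan_path(points, normals)
```

Here `path_pts` gives the spray gun's positions in visit order (left to right on the first row, right to left on the second) and `path_nrm` its orientation at each stop.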
Preferably, the corresponding system comprises a point cloud data acquisition module for acquiring point cloud data of the sprayed part from different orientations;
a synthesis processing module for synthesizing the point cloud data of the different orientations to obtain a three-dimensional model of the sprayed part;
a spraying path calculation module for calculating, based on the three-dimensional model of the sprayed part and according to its surface characteristics, the spraying path of the spraying area and the orientation of the spray gun during spraying in a preset manner;
and a spraying control module for controlling the robot to a specified position and moving the spray gun along the generated spraying path, with the computed orientation, to spray the spraying area of the sprayed part.
Preferably, a plurality of depth cameras capture the sprayed part from different orientations to acquire point cloud data of each orientation, wherein each depth camera captures the point cloud data of one orientation of the sprayed part.
Preferably, a camera parameter calibration module presets the positions from which the plurality of cameras capture the sprayed part and, by means of a calibration board and a vision algorithm library, calculates the two-dimensional image coordinates of the checkerboard corner points in the image plane and, through the pinhole imaging principle, establishes the mapping relation between each camera's coordinates and world coordinates.
Preferably, the synthesis processing module is specifically configured to:
selecting the pose of one of the cameras as the reference position coordinate, unifying the point cloud data of the different orientations of the sprayed part captured by each camera to that reference through the pairwise transformation relations obtained by calibration, finally transforming the data into the world coordinate system, and synthesizing the point clouds transformed into the world coordinate system to obtain the three-dimensional model of the sprayed part.
Preferably, the spraying path calculating module specifically includes:
a triangular patch reconstruction module for reconstructing the point cloud data of the spraying area into triangular patches, based on the three-dimensional model of the sprayed part, to form the surface of the spraying area;
a control point determination module for determining, according to the surface curvature and the size of the spraying area, the positions and number of the spraying control points on that surface and the spray gun orientation at each control point;
and a path generation module for connecting all the spraying control points into the spraying path and calculating the curvature and normal direction of each control point relative to the path, wherein the curvature gives the direction of travel of the spray gun and the normal direction gives its orientation during spraying.
(III) advantageous effects
Compared with the prior art, the invention provides an intelligent spraying method and system based on a machine vision technology, and the intelligent spraying method and system have the following beneficial effects:
according to the intelligent spraying method and system based on the machine vision technology, the spraying robot does not need to be programmed for the spraying piece, the spraying path is dynamically generated according to the surface of the spraying piece, and even if the spraying piece is replaced, manual intervention is not needed; the three-dimensional model of the spray-coated part can be automatically identified, manual intervention is not needed, spraying work can be carried out on different spray-coated parts, and three-dimensional modeling of the spray-coated part by professional modeling personnel is not needed.
Drawings
Fig. 1 is a flow chart of an intelligent spraying method and system structure based on a machine vision technology provided by the invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Referring to fig. 1, the present invention provides the following technical solution, comprising the steps of: S1, acquiring point cloud data of the sprayed part from different orientations; S2, synthesizing the point cloud data of the different orientations to obtain a three-dimensional model of the sprayed part; S3, based on the three-dimensional model of the sprayed part, calculating in a preset manner, according to the surface characteristics of the model, the spraying path of the spraying area and the orientation of the spray gun during spraying; and S4, controlling the robot to a specified position and moving the spray gun along the generated spraying path, with the computed gun orientation, to spray the spraying area of the sprayed part.

Step S1 specifically comprises: capturing the sprayed part from different orientations with a plurality of depth cameras to acquire point cloud data of each orientation, wherein each depth camera captures the point cloud data of one orientation of the sprayed part.

Step S1 is preceded by step S0: presetting the positions from which the plurality of cameras capture the sprayed part and, by means of a calibration board and a vision algorithm library, calculating the two-dimensional image coordinates of the checkerboard corner points in the image plane and, through the pinhole imaging principle, establishing the mapping relations between the coordinate systems of the cameras and between each camera's coordinates and world coordinates. Step S0 specifically comprises: designing a checkerboard cube and defining its body center as the origin of the world coordinate system; treating every two adjacent cameras as a camera pair, and acquiring, for each pair, images of the checkerboard faces of the cube visible in the pair's common viewing area; calculating the two-dimensional image coordinates of the corner points from the checkerboard images using the OpenCV calibration algorithm; calculating 3D point cloud coordinate data from the two-dimensional image coordinates using an algorithm in the OpenKinect library; calculating the extrinsic transformation matrix between the two cameras of each pair from the corresponding 3D point pairs using the SVD algorithm in the PCL (Point Cloud Library); and optimizing the camera extrinsic matrices to obtain, for each camera, an extrinsic transformation matrix relative to the reference camera, which yields the three-dimensional spatial relationship between the cameras.

Step S2 specifically comprises: selecting the pose of one of the cameras as the reference position coordinate, unifying the point cloud data of the different orientations of the sprayed part captured by each camera to that reference through the pairwise transformation relations obtained by calibration, and finally transforming the data into the world coordinate system; and synthesizing the point clouds transformed into the world coordinate system to obtain the three-dimensional model of the sprayed part.

Step S3 specifically comprises: S31, reconstructing the point cloud data of the spraying area with triangular patches, based on the three-dimensional model of the sprayed part, to form the surface of the spraying area; S32, calculating, according to the surface curvature and the size of the spraying area, the three-dimensional positions of the required spraying control points on that surface and the normal of each control point relative to the spraying surface; and S33, connecting all the spraying control points, in a manner that approximates a human painter's strokes, to form the spraying path, wherein each control point is a position the spray gun passes through during spraying and the normal direction is the orientation of the spray gun.

The corresponding system comprises: a point cloud data acquisition module for acquiring point cloud data of the sprayed part from different orientations; a synthesis processing module for synthesizing the point cloud data of the different orientations to obtain the three-dimensional model of the sprayed part; a spraying path calculation module for calculating, based on the three-dimensional model and its surface characteristics, the spraying path of the spraying area and the orientation of the spray gun in a preset manner; and a spraying control module for controlling the robot to a specified position and moving the spray gun along the generated path, with the computed orientation, to spray the spraying area of the sprayed part. A plurality of depth cameras capture the sprayed part from different orientations, each camera capturing the point cloud data of one orientation. A camera parameter calibration module presets the positions from which the cameras capture the sprayed part and establishes the mapping relation between each camera's coordinates and world coordinates via the calibration board, the vision algorithm library, and the pinhole imaging principle. The synthesis processing module is specifically configured to select the pose of one camera as the reference position coordinate, unify the point clouds captured by each camera to that reference through the calibrated pairwise transformations, transform them into the world coordinate system, and synthesize them into the three-dimensional model of the sprayed part. The spraying path calculation module specifically comprises: a triangular patch reconstruction module for reconstructing the point cloud data of the spraying area into triangular patches to form its surface; a control point determination module for determining, according to the surface curvature and the size of the spraying area, the positions and number of the spraying control points and the spray gun orientation at each one; and a path generation module for connecting all the control points into the spraying path and calculating the curvature and normal direction of each control point relative to the path, wherein the curvature gives the direction of travel of the spray gun and the normal direction gives its orientation during spraying.
Working; s1, point cloud data of different orientations of the sprayed part are obtained;
s2, synthesizing point cloud data of the spray-coated piece in different directions to obtain a three-dimensional model of the spray-coated piece;
s3, calculating a spraying path of a spraying area and the direction of a spray gun during spraying in a preset mode according to the surface characteristics of the three-dimensional model based on the three-dimensional model of the spraying piece;
(S31, reconstructing the point cloud data of the spraying area by a triangular patch based on the three-dimensional model of the spraying piece to form the surface of the spraying area;
s32, calculating the three-dimensional position of the required spraying control point on the surface of the spraying area and the normal of each spraying control point relative to the spraying surface according to the surface curvature and the area size of the spraying area;
and S33, connecting all the spraying control points in a mode of manual spraying by approaching workers to form a spraying path, wherein each spraying control point is the walking position of a spray gun during spraying, and the normal direction is the orientation of the spray gun during spraying. )
And S4, controlling the robot to a specified position, and moving the gun to spray the spraying area of the sprayed part according to the spraying path generated in the spraying area and the direction of the spray gun during spraying.
In summary, with the intelligent spraying method and system based on machine vision technology, the spraying robot does not need to be programmed for each sprayed part: the spraying path is generated dynamically from the surface of the sprayed part, so no manual intervention is needed even when the sprayed part is changed. The three-dimensional model of the sprayed part is recognized automatically, without manual intervention, so different sprayed parts can be sprayed without professional modelers having to build a three-dimensional model of each part.
It is noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
Although embodiments of the present invention have been shown and described, it will be appreciated by those skilled in the art that changes, modifications, substitutions and alterations can be made in these embodiments without departing from the principles and spirit of the invention, the scope of which is defined in the appended claims and their equivalents.

Claims (11)

1. An intelligent spraying method and system based on machine vision technology, characterized by comprising the following steps:
S1, acquiring point cloud data of the sprayed part from different orientations;
S2, synthesizing the point cloud data of the different orientations to obtain a three-dimensional model of the sprayed part;
S3, based on the three-dimensional model of the sprayed part, calculating in a preset manner, according to the surface characteristics of the model, the spraying path of the spraying area and the orientation of the spray gun during spraying;
and S4, controlling the robot to a specified position and moving the spray gun along the generated spraying path, with the computed gun orientation, to spray the spraying area of the sprayed part.
2. The intelligent spraying method and system based on the machine vision technology as claimed in claim 1, wherein: the step S1 specifically includes:
capturing the sprayed part from different orientations with a plurality of depth cameras to acquire point cloud data of each orientation, wherein each depth camera captures the point cloud data of one orientation of the sprayed part.
3. The intelligent spraying method and system based on the machine vision technology as claimed in claim 1, wherein: the step S1 is preceded by:
S0, presetting the positions from which the plurality of cameras capture the sprayed part and, by means of a calibration board and a vision algorithm library, calculating the two-dimensional image coordinates of the checkerboard corner points in the image plane and, through the pinhole imaging principle, establishing the mapping relation between each camera's coordinates and world coordinates.
4. The intelligent spraying method and system based on the machine vision technology as claimed in claim 1, wherein: the step S0 specifically includes:
designing a checkerboard cube, and defining its body center as the origin of the world coordinate system;
treating every two adjacent cameras as a camera pair, and acquiring, for each pair, images of the checkerboard faces of the cube visible in the pair's common viewing area;
calculating the two-dimensional image coordinates of the corner points from the checkerboard images using the OpenCV calibration algorithm;
calculating 3D point cloud coordinate data from the two-dimensional image coordinates using an algorithm in the OpenKinect library;
calculating the extrinsic transformation matrix between the two cameras of each pair from the corresponding 3D point pairs using the SVD algorithm in the PCL (Point Cloud Library);
and optimizing the camera extrinsic matrices to obtain, for each camera, an extrinsic transformation matrix relative to the reference camera, which yields the three-dimensional spatial relationship between the cameras.
5. The intelligent spraying method and system based on the machine vision technology as claimed in claim 1, wherein: the step S2 specifically includes:
selecting the pose of one of the cameras as the reference position coordinate, unifying the point cloud data of the different orientations of the sprayed part captured by each camera to that reference through the pairwise transformation relations obtained by calibration, and finally transforming the data into the world coordinate system;
and synthesizing the point clouds transformed into the world coordinate system to obtain the three-dimensional model of the sprayed part.
6. The intelligent spraying method and system based on the machine vision technology as claimed in claim 1, wherein: the step S3 specifically includes:
S31, reconstructing the point cloud data of the spraying area with triangular patches, based on the three-dimensional model of the sprayed part, to form the surface of the spraying area;
S32, calculating, according to the surface curvature and the size of the spraying area, the three-dimensional positions of the required spraying control points on that surface and the normal of each control point relative to the spraying surface;
and S33, connecting all the spraying control points, in a manner that approximates a human painter's strokes, to form the spraying path, wherein each control point is a position the spray gun passes through during spraying and the normal direction is the orientation of the spray gun.
7. The intelligent spraying method and system based on machine vision technology as claimed in claim 1, wherein the corresponding system comprises a point cloud data acquisition module for acquiring point cloud data of the sprayed part from different orientations;
a synthesis processing module for synthesizing the point cloud data of the different orientations to obtain a three-dimensional model of the sprayed part;
a spraying path calculation module for calculating, based on the three-dimensional model of the sprayed part and according to its surface characteristics, the spraying path of the spraying area and the orientation of the spray gun during spraying in a preset manner;
and a spraying control module for controlling the robot to a specified position and moving the spray gun along the generated spraying path, with the computed orientation, to spray the spraying area of the sprayed part.
8. The intelligent spraying method and system based on the machine vision technology as claimed in claim 1, wherein: the method comprises the steps of shooting different positions of a spraying piece by utilizing a plurality of depth cameras to obtain point cloud data of the spraying piece in different positions, wherein one depth camera correspondingly shoots the point cloud data of one position of the spraying piece.
9. The intelligent spraying method and system based on machine vision technology as claimed in claim 1, wherein: the camera parameter calibration module is used for presetting the positions from which the plurality of cameras photograph the sprayed part, and for establishing the mapping relations between the coordinate systems of the cameras, and between each camera coordinate system and the world coordinate system, by computing the two-dimensional image coordinates of the checkerboard corner points in the image plane using a calibration board and a vision algorithm library, and applying the pinhole imaging principle.
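The pinhole imaging principle cited in the claim maps a 3-D point to pixel coordinates through the intrinsic matrix K and the extrinsic pose [R|t]. In practice the fitting would typically be done by a vision library (e.g. OpenCV's `cv2.findChessboardCorners` and `cv2.calibrateCamera`); the minimal numpy sketch below only shows the projection model itself, with illustrative names:

```python
import numpy as np

def project_pinhole(X_world, K, R, t):
    """Project (N, 3) world points to (N, 2) pixel coordinates with the
    pinhole model: x ~ K (R X + t)."""
    X_cam = X_world @ R.T + t   # world frame -> camera frame
    x = X_cam @ K.T             # apply intrinsics
    return x[:, :2] / x[:, 2:3] # perspective divide
```

Given detected checkerboard corners in several images, calibration amounts to finding the K, R, t that minimize the reprojection error of this model against the measured 2-D corner coordinates.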
10. The intelligent spraying method and system based on the machine vision technology as claimed in claim 1, wherein: the synthesis processing module is specifically configured to:
selecting the pose of one of the cameras as the reference coordinate system, transforming the point cloud data of the different orientations of the sprayed part captured by each camera into the reference coordinate system using the pairwise transformation relations between the cameras obtained by calibration, then transforming the unified point cloud into the world coordinate system, and merging the point cloud data transformed into the world coordinate system to obtain a three-dimensional model of the sprayed part.
11. The intelligent spraying method and system based on machine vision technology as claimed in claim 1, wherein the spraying path calculation module specifically comprises:
the triangular patch reconstruction module, used for performing triangular-patch reconstruction on the point cloud data of the spraying area based on the three-dimensional model of the sprayed part, to form the surface of the spraying area;
the control point determination module, used for determining the positions and number of the spraying control points on the surface of the spraying area, and the spray gun orientation at each control point, according to the surface curvature and area size of the spraying area;
and the path generation module, used for stringing all the spraying control points together to form a spraying path and calculating the curvature and normal direction of each spraying control point relative to the spraying path, wherein the curvature at each control point indicates the spray gun's direction of travel and the normal direction is the spray gun's orientation during spraying.
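The per-control-point curvature the path generation module computes can be discretized in several ways; one common choice, shown here purely as an illustration (the claim does not fix a formula), is the turning angle at each interior point divided by the mean length of its two adjacent segments:

```python
import numpy as np

def polyline_curvature(path):
    """Discrete curvature at each interior control point of an (N, 3)
    polyline: turning angle over mean adjacent segment length.
    Endpoints get curvature 0."""
    kappa = np.zeros(len(path))
    for i in range(1, len(path) - 1):
        v1 = path[i] - path[i - 1]
        v2 = path[i + 1] - path[i]
        cos_a = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
        angle = np.arccos(np.clip(cos_a, -1.0, 1.0))
        kappa[i] = angle / (0.5 * (np.linalg.norm(v1) + np.linalg.norm(v2)))
    return kappa
```

A straight run of control points yields zero curvature (constant gun travel direction), while a sharp turn yields a large value, signalling the robot to slow or reorient the gun.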
CN202210085256.1A 2022-01-25 2022-01-25 Intelligent spraying method and system based on machine vision technology Withdrawn CN114463495A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210085256.1A CN114463495A (en) 2022-01-25 2022-01-25 Intelligent spraying method and system based on machine vision technology

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210085256.1A CN114463495A (en) 2022-01-25 2022-01-25 Intelligent spraying method and system based on machine vision technology

Publications (1)

Publication Number Publication Date
CN114463495A true CN114463495A (en) 2022-05-10

Family

ID=81412387

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210085256.1A Withdrawn CN114463495A (en) 2022-01-25 2022-01-25 Intelligent spraying method and system based on machine vision technology

Country Status (1)

Country Link
CN (1) CN114463495A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115971004A (en) * 2023-01-05 2023-04-18 深圳市泰达机器人有限公司 Intelligent putty spraying method and system for carriage


Similar Documents

Publication Publication Date Title
CN107756408B (en) Robot track teaching device and method based on active infrared binocular vision
CN108717715B (en) Automatic calibration method for linear structured light vision system of arc welding robot
CN109598762B (en) High-precision binocular camera calibration method
CN106457562B (en) Method and robot system for calibration machine people
CN110728715A (en) Camera angle self-adaptive adjusting method of intelligent inspection robot
CN107449374B (en) Visual auxiliary laser galvanometer scanning system with flexible layout and field calibration method thereof
CN111775146A (en) Visual alignment method under industrial mechanical arm multi-station operation
CN111563878B (en) Space target positioning method
CN109712172A (en) A kind of pose measuring method of initial pose measurement combining target tracking
CN106327561A (en) Intelligent spraying method and system based on machine vision technology
CN109940626B (en) Control method of eyebrow drawing robot system based on robot vision
Boochs et al. Increasing the accuracy of untaught robot positions by means of a multi-camera system
WO2018201677A1 (en) Bundle adjustment-based calibration method and device for telecentric lens-containing three-dimensional imaging system
CN104463969B (en) A kind of method for building up of the model of geographical photo to aviation tilt
CN110136204B (en) Sound film dome assembly system based on calibration of machine tool position of bilateral telecentric lens camera
CN111515950B (en) Method, device and equipment for determining transformation relation of robot coordinate system and storage medium
CN111415391A (en) Multi-view camera external orientation parameter calibration method adopting inter-shooting method
CN112907683B (en) Camera calibration method and device for dispensing platform and related equipment
CN113706619B (en) Non-cooperative target attitude estimation method based on space mapping learning
CN109443200A (en) A kind of mapping method and device of overall Vision coordinate system and mechanical arm coordinate system
CN110962127B (en) Auxiliary calibration device for tail end pose of mechanical arm and calibration method thereof
CN115284292A (en) Mechanical arm hand-eye calibration method and device based on laser camera
CN114463495A (en) Intelligent spraying method and system based on machine vision technology
CN114001651B (en) Large-scale slender barrel type component pose in-situ measurement method based on binocular vision measurement and priori detection data
CN113172659B (en) Flexible robot arm shape measuring method and system based on equivalent center point identification

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB03 Change of inventor or designer information

Inventor after: Feng Zhou

Inventor before: Feng Zhou

WW01 Invention patent application withdrawn after publication

Application publication date: 20220510
