CN113680567A - Vehicle paint spraying method based on 3D camera - Google Patents


Info

Publication number
CN113680567A
CN113680567A (application CN202110882962.4A)
Authority
CN
China
Prior art keywords
vehicle
camera
data
spraying
sprayed
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110882962.4A
Other languages
Chinese (zh)
Inventor
钱鑫
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Curve Intelligent Equipment Co ltd
Original Assignee
Beijing Curve Intelligent Equipment Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Curve Intelligent Equipment Co ltd filed Critical Beijing Curve Intelligent Equipment Co ltd
Priority to CN202110882962.4A priority Critical patent/CN113680567A/en
Publication of CN113680567A publication Critical patent/CN113680567A/en
Pending legal-status Critical Current

Classifications

    • B — PERFORMING OPERATIONS; TRANSPORTING
    • B05 — SPRAYING OR ATOMISING IN GENERAL; APPLYING FLUENT MATERIALS TO SURFACES, IN GENERAL
    • B05B — SPRAYING APPARATUS; ATOMISING APPARATUS; NOZZLES
    • B05B13/00 — Machines or plants for applying liquids or other fluent materials to surfaces of objects or other work by spraying, not covered by groups B05B1/00 - B05B11/00
    • B05B13/02 — Means for supporting work; Arrangement or mounting of spray heads; Adaptation or arrangement of means for feeding work
    • B05B13/04 — the spray heads being moved during spraying operation
    • B05B13/0431 — with spray heads moved by robots or articulated arms, e.g. for applying liquid or other fluent material to 3D-surfaces
    • B05B12/00 — Arrangements for controlling delivery; Arrangements for controlling the spray area
    • B05B12/08 — responsive to condition of liquid or other fluent material to be discharged, of ambient medium or of target; responsive to condition of spray devices or of supply means, e.g. pipes, pumps or their drive means
    • B05B12/12 — responsive to conditions of ambient medium or target, e.g. humidity, temperature, position or movement of the target relative to the spray apparatus
    • B05B12/122 — responsive to presence or shape of target

Abstract

The invention provides a vehicle paint spraying method based on a 3D camera, which comprises the following steps: automatically identifying the vehicle type, the position and shape of parts, the vehicle orientation, the vehicle position and the sheet metal pattern to be sprayed for the vehicle to be sprayed; automatically matching these against the vehicle types and parts stored in a database; and generating a spraying track for the spraying robot according to the matching result. A control signal is then output to the spraying robot according to the planned spraying track, and the X-axis carriage assembly, Y-axis carriage assembly, Z-axis carriage assembly and multi-stage manipulator of the spraying robot are controlled, so that automatic paint spraying is realized. The method achieves AI-based automatic matching of the vehicle type and vehicle parts, plans the spraying track of the spraying robot after matching, and then controls the spraying robot to perform mobile spraying; it can obtain more depth detail than traditional binocular vision, and has stronger resistance to ambient-light interference than monocular structured light.

Description

Vehicle paint spraying method based on 3D camera
Technical Field
The invention belongs to the technical field of vehicle paint spraying, and particularly relates to a vehicle paint spraying method based on a 3D camera.
Background
Automobile paint spraying refers to spraying a layer of paint on the surface of an automobile to protect it. Automobile paint spraying has gradually moved from manual spraying to mechanical semi-automatic spraying, which improves spraying efficiency and reduces the labor intensity of spray painters; however, the spraying position of mechanical semi-automatic equipment still needs to be determined manually, so workers are still required to assist the paint spraying equipment.
In order to solve the above technical problems, those skilled in the art have proposed spraying robots: industrial robots capable of automatically spraying paint or other coatings for fully automatic spraying. Spraying robots mostly adopt a 5- or 6-degree-of-freedom articulated structure; the arm has a large movement space, can move along complex trajectories, has the characteristics of high action speed and good explosion-proof performance, and can improve spraying quality and material utilization. In specific use, when selecting a robot it must be ensured that the working trajectory range of the robot completely covers the relevant surfaces or inner cavities of the workpiece to be processed, which requires accurate positioning of the vehicle. However, existing automatic spraying methods cannot position the vehicle accurately, which leads to low spraying efficiency and insufficiently accurate spraying trajectories, affecting film thickness and, in severe cases, producing layered coatings and inconsistent chromaticity.
Disclosure of Invention
In order to solve the technical problem, the invention provides a vehicle paint spraying method based on a 3D camera. The following presents a simplified summary in order to provide a basic understanding of some aspects of the disclosed embodiments. This summary is not an extensive overview and is intended to neither identify key/critical elements nor delineate the scope of such embodiments. Its sole purpose is to present some concepts in a simplified form as a prelude to the more detailed description that is presented later.
The invention adopts the following technical scheme:
a 3D camera based vehicle paint spraying method is provided, comprising: the optical enhancement system projects the structured light onto a surface of a vehicle to be painted parked within a vehicle parking area; the method comprises the following steps that an infrared camera receives a structured light pattern reflected by the surface of a vehicle to be sprayed; the color camera acquires color image information of a vehicle to be sprayed; the embedded type calculation vision processor calculates the three-dimensional surface shape data of the vehicle to be sprayed according to the acquired structured light pattern, simultaneously extracts RGB data in the color image information, aligns the three-dimensional surface shape data with the RGB data, and outputs the data to an upper computer; the upper computer identifies the vehicle type, the position and the shape of the part, the vehicle orientation, the vehicle position and the metal plate pattern sprayed by the vehicle of the vehicle to be sprayed according to the data uploaded by the embedded computing vision processor, automatically matches the vehicle type and the part stored in the database, and generates a spraying track of the spraying robot according to the matching result; and the upper computer outputs a control signal to the spraying robot according to the generated spraying track to control the spraying robot to move.
Further, the process of controlling the spraying robot to move by the upper computer includes: controlling a Y-axis support assembly of the spraying robot to move on the X-axis support assembly; controlling a Z-axis carriage assembly to move on the Y-axis carriage assembly; controlling the spray gun and the multi-stage manipulator to move on the Z-axis support frame assembly; and controlling each rotating shaft on the multistage manipulator to rotate so as to adjust the position and the orientation of the spray gun.
Further, the structured light projected by the optical enhancement system consists of four sinusoidal fringe patterns with phase differences between them; the structured light pattern that the infrared camera receives from the surface of the vehicle to be sprayed is a modulated fringe pattern, and the infrared camera also collects four unmodulated fringe patterns of a reference surface.
Further, the process by which the embedded computing vision processor calculates the three-dimensional surface shape data of the vehicle to be sprayed from the acquired structured light pattern comprises: calculating the modulated phase using a four-step phase shifting method to obtain a truncated phase map, and restoring the truncated phase to a continuous phase; and taking the difference between the modulated continuous phase and the reference continuous phase to obtain a phase difference, which represents the height of the vehicle to be sprayed relative to the reference surface, and substituting the phase difference into a phase-to-height conversion formula to obtain a three-dimensional model.
Further, the vehicle paint spraying method based on the 3D camera further comprises: the upper computer obtains the distance between each point in the image and the infrared camera from the data acquired by the 3D camera, combines this with the two-dimensional coordinates of each point in the 2D image to obtain the three-dimensional space coordinates of each point in the image, and outputs a control signal to the spraying robot according to the obtained three-dimensional space coordinates.
Further, after the color camera acquires the color image information of the vehicle to be sprayed, the method further comprises: preprocessing the image to remove noise points and isolated points in the data; the preprocessing comprises filtering, brightness adjustment and denoising.
Further, the process of aligning the three-dimensional shape data with the RGB data by the embedded computational vision processor comprises: temporal registration and spatial registration.
Further, the time registration is to simultaneously take out two groups of image data when calling the camera API to collect data, and acquire RGB data and depth data according to the output rule of the camera.
Further, the spatial registration is to perform offset calibration on the image data acquired by the infrared camera and the color camera, so that the coordinates of each point on the RGB image correspond to the coordinates of each point of the depth data one to one.
The invention has the following beneficial effects:
The invention can accurately and rapidly identify the vehicle type, the position and shape of parts, the vehicle orientation, the vehicle position and the sheet metal pattern to be sprayed, and plan a reasonable spraying track after matching against the database, thereby controlling the spraying robot to perform mobile spraying and ensuring the uniformity and accuracy of the paint; the method can obtain more depth detail than traditional binocular vision and has stronger resistance to ambient-light interference than monocular structured light; spraying efficiency is greatly improved.
Drawings
FIG. 1 is a schematic flow diagram of a 3D camera based vehicle paint spraying method of the present invention;
FIG. 2 is a schematic structural view of the painting robot of the present invention;
FIG. 3 is a schematic structural view of a leg structure of the painting robot of the present invention;
FIG. 4 is a schematic structural view of a Z-axis carriage assembly of the painting robot of the present invention;
FIG. 5 is a schematic structural view of a multi-stage manipulator of the painting robot of the present invention;
FIG. 6 is a schematic view of the connection of the X-axis carriage assembly to the Y-axis carriage assembly.
Detailed Description
The following description and the drawings sufficiently illustrate specific embodiments of the invention to enable those skilled in the art to practice them. Other embodiments may incorporate structural, logical, electrical, process, and other changes. The examples merely typify possible variations. Individual components and functions are optional unless explicitly required, and the sequence of operations may vary. Portions and features of some embodiments may be included in or substituted for those of others.
As shown in fig. 2-6, the invention provides an intelligent automobile spraying robot which can accurately and rapidly identify the vehicle type, the position and shape of parts, the vehicle orientation, the vehicle position and the sheet metal pattern to be sprayed, and can plan a reasonable spraying track after matching against a database, so as to control the spraying robot to perform mobile spraying.
the spraying robot includes: an X-axis carriage assembly 1, a Y-axis carriage assembly 2, a Z-axis carriage assembly 3, a multi-stage robot 4, and a spray gun 5.
The X-axis carriage assembly 1 supports the whole apparatus; the Y-axis carriage assembly 2 is disposed on the X-axis carriage assembly 1 and can move along it to realize movement of the spray gun 5 in the X direction. The Z-axis carriage assembly 3 is disposed on the Y-axis carriage assembly 2 and can move along it to realize movement of the spray gun 5 in the Y direction. The spray gun 5 sprays paint onto the vehicle and is connected to the Z-axis carriage assembly 3 through the multi-stage manipulator 4; the spray gun 5 can move along the Z-axis carriage assembly 3, i.e. the Z-axis carriage assembly 3 can drive the multi-stage manipulator 4, and thus the spray gun 5, in the Z direction. The multi-stage manipulator 4 allows the spray gun 5 to be adjusted through multiple angles.
The X-axis carriage assembly 1 includes: a guide beam 101, leg structures 102 and a first linear guide 103.
There are two guide beams 101, disposed on either side of the spraying vehicle parking area 6. The two ends of the Y-axis carriage assembly 2 are connected to the two guide beams 101 so that the Y-axis carriage assembly 2 is located above the parking area 6 and the spray gun 5, mounted on the Y-axis carriage assembly 2 through the Z-axis carriage assembly 3, can cover the entire surface of the vehicle to be sprayed.
Each guide beam 101 is supported by three leg structures 102 arranged at regular intervals. As shown in fig. 3, the leg structure 102 comprises: a support platform 1021, a main body 1022, a substrate 1023, and a triangular reinforcing wing 1024.
The support platform 1021 is disposed on top of the main body 1022, the substrate 1023 is disposed at the bottom of the main body 1022, and the substrate 1023 is bolted to the ground. The triangular reinforcing wing 1024 is a right-triangle structure; one of its right-angle sides is connected to the main body 1022 and the other to the substrate 1023, which ensures the stability of the leg structure 102 and hence of the X-axis carriage assembly 1, with a simple and stable construction.
The first linear guide 103 is laid on the upper surface of the guide beam 101, and the two ends of the Y-axis carriage assembly 2 are connected to the two guide beams 101 through the first linear guides 103; specifically, the supporting rail of the first linear guide 103 is arranged on the guide beam 101 and the two ends of the Y-axis carriage assembly 2 are connected to the moving part of the first linear guide 103, so that the Y-axis carriage assembly 2 can move linearly on the X-axis carriage assembly 1.
The Y-axis carriage assembly 2 includes: a suspension beam 201, a second linear guide 202 and a bearing plate 203.
Both ends of the suspension beam 201 are provided with a bearing plate 203; the bearing plates 203 are connected to the moving parts of the first linear guides 103 and support the suspension beam 201. The second linear guide 202 is arranged on the side surface of the suspension beam 201, and the Z-axis carriage assembly 3 is connected to the suspension beam 201 through the second linear guide 202; specifically, the bearing part of the second linear guide 202 is arranged on the suspension beam 201 and its moving part is connected to the Z-axis carriage assembly 3, so that the Z-axis carriage assembly 3 can move linearly on the Y-axis carriage assembly 2. The structure is simple and stable.
As shown in fig. 4, the Z-axis bracket assembly 3 includes: a third linear guide rail 301, a moving reference plate 302, a vertical beam 303, a mounting plate 304 and an adapter 305.
The third linear guide 301 is arranged on the vertical beam 303. One surface of the moving reference plate 302 is connected to the moving part of the second linear guide 202, and the other surface carries an adapter 305 through which the moving reference plate 302 is connected to the third linear guide 301. Specifically, the bearing part of the third linear guide 301 is arranged on the vertical beam 303 and its moving part is connected to the adapter 305, so that the vertical beam 303 can move up and down; the mounting plate 304 is arranged at the bottom of the vertical beam 303 and the top of the multi-stage manipulator 4 is connected to the mounting plate 304, so movement of the vertical beam 303 drives the spray gun 5 up and down.
As shown in fig. 5, the multistage robot 4 is a six-axis robot including: the spray gun comprises a first rotating mechanism 401, a first turnover mechanism 402, a second turnover mechanism 403, a second rotating mechanism 404, a third turnover mechanism 405 and a third rotating mechanism 406 which are sequentially connected, wherein the first rotating mechanism 401 is connected with a mounting plate 304 through a connecting seat 407, and the third rotating mechanism 406 is connected with the spray gun 5.
The first rotating mechanism 401, the first turnover mechanism 402, the second turnover mechanism 403, the second rotating mechanism 404, the third turnover mechanism 405 and the third rotating mechanism 406 can each be implemented with an existing rotary motor that produces relative rotation between the two connected components.
Preferably, flexible cable protection tracks 7 are provided on the guide beam 101, the suspension beam 201 and the vertical beam 303. Power supply lines and signal transmission lines can be fixed in the internal channel of a flexible cable protection track 7 and move along with each moving part, which avoids frictional contact between the wire harness and other parts of the equipment that would eventually wear the harness.
In some illustrative embodiments, as shown in fig. 1, the present invention provides a 3D camera based vehicle paint spraying method wherein 3D cameras 8 are arranged at both ends of a guide beam 101, the present invention comprising the steps of:
s1: the optical enhancement system projects structured light onto a surface of a vehicle to be painted parked within a vehicle parking area to assist in calculating depth data. The structured light generally adopts invisible infrared laser with specific wavelength as a light source, the optical enhancement system is a functional component capable of emitting the structured light, the light emitted by the optical enhancement system is projected on a vehicle to be sprayed through a certain code, and the structured light generally has stripe structure light, coded structured light and speckle structure light according to different coded patterns.
Specifically, the structured light projected by the optical enhancement system of the invention consists of four sinusoidal fringe patterns with phase differences between them; in practice the sinusoidal fringe patterns can be generated by programming. Since the phase is subsequently extracted from the deformed fringe patterns, and various phase-extraction algorithms exist, the invention adopts the four-step phase shifting method.
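As a hedged illustration of how such patterns could be generated by programming, the sketch below produces four sinusoidal fringe images with pi/2 phase steps; the resolution, fringe period and step size are assumptions for the example, since the patent does not fix these values:

```python
import numpy as np

def make_fringe_patterns(width=640, height=480, period=32):
    """Generate four sinusoidal fringe patterns with pi/2 phase steps.

    The resolution and fringe period are illustrative; the patent only
    states that four phase-shifted sinusoidal patterns are projected.
    """
    x = np.arange(width)
    carrier = 2 * np.pi * x / period            # carrier phase along x
    shifts = [0.0, np.pi / 2, np.pi, 3 * np.pi / 2]
    patterns = []
    for s in shifts:
        row = 0.5 + 0.5 * np.cos(carrier + s)   # intensity in [0, 1]
        patterns.append(np.tile(row, (height, 1)))
    return patterns

patterns = make_fringe_patterns()
```

In a real projector pipeline these arrays would be quantized to the projector's bit depth before display.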
S2: the infrared camera receives the structured light pattern reflected by the surface of the vehicle to be sprayed, structured light with specific wavelength irradiates the surface of the vehicle to be sprayed, the reflected light is received by the infrared camera with filtering, and then the distortion of the returned coding pattern can be calculated through a certain algorithm to obtain the position and depth information of the surface of the vehicle to be sprayed.
The structured light pattern that the infrared camera receives from the surface of the vehicle to be sprayed is a modulated fringe pattern; at the same time, the infrared camera collects four unmodulated fringe patterns of a reference surface. The four sinusoidal fringe patterns are projected onto the vehicle to be sprayed, yielding four modulated fringe patterns, and four reference-surface fringe patterns, which are not modulated, are also acquired.
S3: the color camera acquires color image information of the vehicle to be sprayed.
After acquiring the color image, the output of the color camera needs to be preprocessed to remove noise points and isolated points in the data, ensure that the quality of the output image meets requirements, and reduce interference.
The preprocessing comprises: filtering, brightness adjustment and denoising. Preprocessing is an important step, and performing it improves the accuracy of the subsequent identification steps. Common preprocessing algorithms include linear filters such as mean filtering and Gaussian filtering; nonlinear filters such as median filtering and bilateral filtering; morphological filters such as erosion, dilation, opening and closing; and also gamma correction, brightness adjustment, image enhancement based on histogram statistics, and so on.
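As a minimal, hedged sketch of two of the steps named above (a 3x3 median filter for isolated noise points, plus gamma correction for brightness), assuming grayscale images normalized to [0, 1]; a production pipeline would typically use an image library instead:

```python
import numpy as np

def median_filter3(img):
    """3x3 median filter; border pixels are left unchanged."""
    h, w = img.shape
    out = img.astype(float).copy()
    # stack the nine shifted neighborhoods, then take the per-pixel median
    neigh = np.stack([img[y:h - 2 + y, x:w - 2 + x]
                      for y in range(3) for x in range(3)])
    out[1:h - 1, 1:w - 1] = np.median(neigh, axis=0)
    return out

def gamma_correct(img, gamma=2.2):
    """Brightness adjustment via gamma correction on a [0, 1] image."""
    return np.clip(img, 0.0, 1.0) ** (1.0 / gamma)
```

An isolated bright outlier surrounded by dark pixels is replaced by the neighborhood median, which is exactly the "isolated point" removal the description calls for.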
S4: and the embedded computing vision processor computes the three-dimensional surface shape data of the vehicle to be sprayed according to the acquired structured light pattern, and simultaneously extracts RGB data in the color image information.
The process of calculating the three-dimensional surface shape data comprises the following steps:
First, the modulated phase is calculated using the four-step phase shifting method to obtain a truncated phase map, and the truncated phase is restored to a continuous phase. The modulated phase is calculated from the four acquired modulated fringe patterns; the resulting phase map is a truncated phase map, because the result of the four-step phase shifting algorithm is computed by an arctangent function, i.e. the phase restarts whenever its value exceeds the principal range. After these jumps are eliminated, the truncated phase is restored to a continuous phase;
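Under the usual four-step convention (intensities I1..I4 taken at phase shifts of 0, pi/2, pi and 3pi/2), this step can be sketched as follows; the intensity model is an assumption consistent with the description, and `np.unwrap` stands in for the jump-elimination step:

```python
import numpy as np

def wrapped_phase(i1, i2, i3, i4):
    """Four-step phase shifting: for I_n = A + B*cos(phi + n*pi/2),
    I1 - I3 = 2B*cos(phi) and I4 - I2 = 2B*sin(phi), so the arctangent
    recovers the truncated phase in (-pi, pi]."""
    return np.arctan2(i4 - i2, i1 - i3)

# simulate a modulated phase ramp and recover it
true_phase = np.linspace(0.0, 4 * np.pi, 200, endpoint=False)
frames = [0.5 + 0.4 * np.cos(true_phase + n * np.pi / 2) for n in range(4)]
truncated = wrapped_phase(*frames)
continuous = np.unwrap(truncated)   # eliminate the 2*pi jumps
```

Because the arctangent only returns values in one 2*pi interval, the recovered phase "restarts" at each jump exactly as the description says; unwrapping restores the continuous ramp.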
Then the difference between the modulated continuous phase and the reference continuous phase is taken to obtain a phase difference, which represents the height of the vehicle to be sprayed relative to the reference surface. Substituting the phase difference into a phase-to-height conversion formula yields a three-dimensional model; an existing formula can be adopted for the phase-to-height conversion, and the parameters common to such conversions need to be calibrated.
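One commonly used reference-plane conversion formula is h = L * dphi / (dphi + 2*pi*d / p); this particular formula and its parameter values are assumptions for the sketch, since the patent only states that an existing formula with calibrated parameters is used:

```python
import numpy as np

def phase_to_height(dphi, length=1000.0, baseline=200.0, pitch=10.0):
    """Convert a phase difference to height above the reference plane.

    length (camera-to-reference distance), baseline (camera-projector
    distance) and pitch (fringe pitch on the reference plane) are
    calibration parameters; the values here are illustrative only.
    """
    return length * dphi / (dphi + 2.0 * np.pi * baseline / pitch)
```

Zero phase difference maps to zero height, and the height grows monotonically with the phase difference, which is the behavior the calibration pins down.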
S5: the embedded computing vision processor aligns the three-dimensional surface shape data with the RGB data and outputs the data to an upper computer.
The alignment process comprises: temporal registration and spatial registration. The RGB data and the depth data are acquired during the earlier image acquisition, but they need to be aligned so that the RGB data and depth data of each point correspond to each other and belong to the same moment in time; registration is therefore divided into temporal alignment and spatial alignment.
Temporal registration means taking out both groups of image data simultaneously when calling the camera API to collect data, and obtaining the RGB data and depth data according to the camera's output rule. The infrared camera and the color camera output two sets of data, an RGB image and depth data, collected by two separate modules, so the two sets of image data must be synchronized in time: each set of data output together must have been collected at the same moment, which requires strict time synchronization of the camera, and camera manufacturers generally provide a frame-synchronization API. When the upper-layer application calls the camera API to collect data, it takes out both groups of image data at the same time and obtains the RGB data and depth data according to the camera's output rule: if the camera outputs RGB data followed by depth data, the data should be received in that same order. One cycle constitutes one complete frame of 3D image data, and the scheme for receiving the 3D data is determined by the camera's API interface.
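The receive rule described above (an RGB frame followed by its depth frame within one cycle) can be sketched as a simple pairing of an interleaved frame stream; the actual frame layout and synchronization calls depend on the camera vendor's API, so this is only an illustration of the rule:

```python
def pair_frames(stream):
    """Pair an interleaved [rgb, depth, rgb, depth, ...] stream into
    (rgb, depth) tuples, assuming the camera outputs RGB first."""
    if len(stream) % 2 != 0:
        raise ValueError("stream must contain whole RGB/depth cycles")
    return [(stream[i], stream[i + 1]) for i in range(0, len(stream), 2)]
```

If a particular camera emits depth first, the same pairing applies with the roles swapped, which is why the description insists the receive rule follow the camera's output rule.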
Spatial registration means performing offset calibration on the image data acquired by the infrared camera and the color camera, so that the coordinates of each point on the RGB image correspond one-to-one with the coordinates of each point in the depth data. Because the mounting positions of the infrared camera and the color camera may be physically offset, and the viewing directions of the cameras may also differ, the acquired image data needs offset calibration to bring the RGB and depth coordinates into one-to-one correspondence; camera manufacturers generally provide an API for spatial registration.
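As a deliberately simplified stand-in for the vendor's spatial-registration API (which would model full camera intrinsics and extrinsics), the sketch below estimates a pure-translation offset from matched point pairs and applies it:

```python
import numpy as np

def estimate_offset(depth_pts, rgb_pts):
    """Least-squares translation between matched (u, v) point sets --
    a minimal model of the mounting offset between the two cameras."""
    return np.mean(np.asarray(rgb_pts, float) - np.asarray(depth_pts, float),
                   axis=0)

def depth_to_rgb(u, v, offset):
    """Map a depth-image coordinate onto the RGB image."""
    return (u + offset[0], v + offset[1])
```

A real calibration would also account for differing focal lengths and lens distortion, but the principle is the same: establish a mapping so each depth pixel lands on its RGB counterpart.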
The finally acquired 3D data is generally image data taken from multiple angles; the images taken from these angles are then sorted, registered and combined into one complete set of 3D data, i.e. a complete 3D reconstruction. Physical 3D reconstruction mainly refers to acquiring depth information and RGB information with hardware and reconstructing them into a 3D model; the processed data is RGBD image data, and the RGBD data can be converted into point cloud data for storage. Point cloud registration solves a problem inherent in point cloud acquisition: multiple frames of 3D images acquired at different angles and at different times are superimposed onto the same coordinate system, the world coordinate system, to form one complete set of point cloud data.
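Superimposing frames captured at different angles onto one world coordinate system, as described, amounts to applying each frame's rigid pose and concatenating the results; in this sketch the camera-to-world poses (R, t) are assumed already known from registration:

```python
import numpy as np

def merge_clouds(frames):
    """frames: iterable of (points, R, t), where points is an (N, 3)
    camera-frame cloud and (R, t) its camera-to-world rigid pose.
    Returns one combined world-frame point cloud."""
    parts = [np.asarray(pts) @ np.asarray(R).T + np.asarray(t)
             for pts, R, t in frames]
    return np.concatenate(parts, axis=0)
```

Estimating the poses themselves is the hard part of point cloud registration (e.g. via ICP); once they are known, merging is just this transform-and-concatenate step.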
S6: the upper computer identifies the vehicle type, the position and the shape of the part, the vehicle orientation, the vehicle position and the metal plate pattern sprayed by the vehicle of the vehicle to be sprayed according to the data uploaded by the embedded computing vision processor, automatically matches the vehicle type and the part stored in the database, and generates a spraying track of the spraying robot according to the matching result; and the upper computer outputs a control signal to the spraying robot according to the generated spraying track to control the spraying robot to move.
The process by which the upper computer controls the movement of the spraying robot includes: controlling the Y-axis carriage assembly of the spraying robot to move on the X-axis carriage assembly; controlling the Z-axis carriage assembly to move on the Y-axis carriage assembly; controlling the spray gun and multi-stage manipulator to move on the Z-axis carriage assembly; and controlling each rotating shaft on the multi-stage manipulator to rotate so as to adjust the position and orientation of the spray gun.
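As a purely illustrative sketch of decomposing one trajectory waypoint into the three gantry axes (the real controller interface, units and zero pose are not specified in the patent, so all names here are hypothetical):

```python
def axis_setpoints(target, home=(0.0, 0.0, 0.0)):
    """Split a target spray-gun position into per-axis travel commands
    for the X, Y and Z carriage assemblies. The manipulator's rotary
    axes would additionally set the gun orientation."""
    return {axis: t - h for axis, t, h in zip("XYZ", target, home)}
```

A full trajectory would be a sequence of such waypoints, streamed to the motion controller together with orientation commands for the six manipulator axes.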
The upper computer obtains the distance between each point in the image and the infrared camera from the data acquired by the 3D camera, combines this with the two-dimensional coordinates of each point in the 2D image to obtain the three-dimensional space coordinates of each point, and outputs a control signal to the spraying robot according to the obtained three-dimensional space coordinates.
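Combining a pixel's 2D coordinates with its camera distance, as described, is the standard pinhole back-projection; the intrinsics fx, fy, cx, cy below are calibration values assumed for the sketch:

```python
def pixel_to_xyz(u, v, z, fx=500.0, fy=500.0, cx=320.0, cy=240.0):
    """Back-project pixel (u, v) with depth z (distance along the
    optical axis) to a 3D point in the camera coordinate frame."""
    return ((u - cx) * z / fx, (v - cy) * z / fy, z)
```

The resulting camera-frame coordinates would still need the camera-to-robot transform applied before being used as spray-gun targets.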
Those of skill would further appreciate that the various illustrative logical blocks, modules, circuits, and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both. To clearly illustrate this interchangeability of hardware and software, various illustrative components, blocks, modules, circuits, and steps have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the overall system. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present disclosure.

Claims (9)

1. A 3D camera based vehicle paint spraying method comprising:
the optical enhancement system projects the structured light onto a surface of a vehicle to be painted parked within a vehicle parking area;
the method comprises the following steps that an infrared camera receives a structured light pattern reflected by the surface of a vehicle to be sprayed;
a color camera acquires color image information of the vehicle to be sprayed;
an embedded computational vision processor calculates three-dimensional surface shape data of the vehicle to be sprayed from the acquired structured light pattern, simultaneously extracts RGB data from the color image information, aligns the three-dimensional surface shape data with the RGB data, and outputs the data to an upper computer;
the upper computer identifies, from the data uploaded by the embedded computational vision processor, the vehicle type of the vehicle to be sprayed, the positions and shapes of its parts, the vehicle orientation, the vehicle position, and the sheet-metal pattern to be sprayed, automatically matches these against the vehicle types and parts stored in a database, and generates a spraying trajectory for a spraying robot according to the matching result;
and the upper computer outputs a control signal to the spraying robot according to the generated spraying trajectory to control the motion of the spraying robot.
2. The 3D camera-based vehicle paint spraying method according to claim 1, wherein the process by which the upper computer controls the motion of the spraying robot comprises: controlling a Y-axis support assembly of the spraying robot to move along an X-axis support assembly; controlling a Z-axis support assembly to move along the Y-axis support assembly; controlling a spray gun and a multi-stage manipulator to move along the Z-axis support assembly; and controlling each rotating shaft of the multi-stage manipulator to rotate so as to adjust the position and orientation of the spray gun.
3. The 3D camera-based vehicle paint spraying method according to claim 2, wherein the structured light projected by the optical enhancement system consists of four sinusoidal fringe patterns with phase differences between them; the structured light pattern received by the infrared camera, reflected from the surface of the vehicle to be sprayed, is a modulated fringe pattern; at the same time, the infrared camera collects four unmodulated fringe patterns of a reference surface.
4. The 3D camera-based vehicle paint spraying method according to claim 3, wherein the embedded computational vision processor calculating the three-dimensional surface shape data of the vehicle to be sprayed from the acquired structured light pattern comprises:
calculating the modulated phase by a four-step phase-shifting method to obtain a truncated phase map, and restoring the truncated phase to a continuous phase;
and subtracting the modulated continuous phase from the reference continuous phase to obtain a phase difference, the obtained phase difference representing the height of the vehicle to be sprayed relative to the reference surface, and substituting the phase difference into a phase-to-height conversion formula to obtain the three-dimensional model.
5. The 3D camera-based vehicle paint spraying method according to claim 4, further comprising: the upper computer obtaining, from the data acquired by the 3D camera, the distance between each point in the image and the infrared camera; combining this with each point's two-dimensional coordinates in the 2D image to obtain the three-dimensional spatial coordinates of each point in the image; and outputting a control signal to the spraying robot according to the obtained three-dimensional spatial coordinates.
6. The 3D camera-based vehicle paint spraying method according to claim 5, further comprising, after the color camera acquires the color image information of the vehicle to be sprayed: preprocessing the image to remove noise points and isolated points from the data; the preprocessing comprises filtering, brightness adjustment and denoising.
7. The 3D camera-based vehicle paint spraying method according to claim 6, wherein the process of the embedded computational vision processor aligning the three-dimensional surface shape data with the RGB data comprises: temporal registration and spatial registration.
8. The 3D camera-based vehicle paint spraying method according to claim 7, wherein the temporal registration means that, when the camera API is called to collect data, two sets of image data are fetched simultaneously, and the RGB data and the depth data are distinguished according to the camera's output rule.
9. The 3D camera-based vehicle paint spraying method according to claim 8, wherein the spatial registration is an offset calibration between the image data collected by the infrared camera and the color camera, such that the coordinates of each point in the RGB image correspond one-to-one with the coordinates of each point in the depth data.
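The four-step phase-shifting demodulation described in claims 3 and 4 can be illustrated with a short sketch. The arctangent formula for four fringes shifted by π/2 is the standard demodulation; the phase-to-height conversion shown is a simplified stand-in for the calibrated conversion formula the claim refers to, and its parameters (`wavelength_mm`, `scale`) are illustrative assumptions.

```python
import numpy as np

def four_step_phase(i1, i2, i3, i4):
    """Wrapped (truncated) phase from four fringe images shifted by pi/2:
    I_k = A + B*cos(phi + k*pi/2), so phi = atan2(I4 - I2, I1 - I3)."""
    return np.arctan2(i4 - i2, i1 - i3)

def unwrap_rows(phase):
    """Restore a continuous phase along each scan line (1-D unwrap per row)."""
    return np.unwrap(phase, axis=1)

def height_from_phase(dphi, wavelength_mm, scale):
    """Toy phase-to-height conversion: height proportional to the phase
    difference between the object and reference surfaces. A real system
    substitutes the calibrated triangulation geometry here."""
    return scale * dphi * wavelength_mm / (2 * np.pi)
```

Running the demodulation on both the modulated fringes and the reference-surface fringes, unwrapping each, and differencing the two continuous phases yields the phase difference that claim 4 converts to height.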
CN202110882962.4A 2021-08-02 2021-08-02 Vehicle paint spraying method based on 3D camera Pending CN113680567A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110882962.4A CN113680567A (en) 2021-08-02 2021-08-02 Vehicle paint spraying method based on 3D camera

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110882962.4A CN113680567A (en) 2021-08-02 2021-08-02 Vehicle paint spraying method based on 3D camera

Publications (1)

Publication Number Publication Date
CN113680567A true CN113680567A (en) 2021-11-23

Family

ID=78578621

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110882962.4A Pending CN113680567A (en) 2021-08-02 2021-08-02 Vehicle paint spraying method based on 3D camera

Country Status (1)

Country Link
CN (1) CN113680567A (en)


Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070115484A1 (en) * 2005-10-24 2007-05-24 Peisen Huang 3d shape measurement system and method including fast three-step phase shifting, error compensation and calibration
CN102509094A (en) * 2011-11-25 2012-06-20 哈尔滨工业大学深圳研究生院 Structured-light-based embedded 3D (three dimensional) fingerprint collecting method and system
CN104487174A (en) * 2012-05-21 2015-04-01 Cmo索迪尼蒂诺和C.S.N.C.公司 Method and apparatus for painting objects
CN107899814A (en) * 2017-12-20 2018-04-13 芜湖哈特机器人产业技术研究院有限公司 A kind of robot spraying system and its control method
CN108592788A (en) * 2018-03-29 2018-09-28 湖南大学 A kind of 3D intelligent camera systems towards spray-painting production line and workpiece On-line Measuring Method
CN109870129A (en) * 2019-03-25 2019-06-11 中国计量大学 A kind of wafer surface roughness detection device based on phase deviation principle
CN212059961U (en) * 2019-12-20 2020-12-01 苏州聚悦信息科技有限公司 Circuit board defect detection device of dot matrix infrared light imaging technology
CN112964262A (en) * 2021-03-26 2021-06-15 南京理工大学 Data acquisition and processing system and method for unmanned vehicle-mounted sensor
CN113000263A (en) * 2021-03-06 2021-06-22 麦特汽车服务股份有限公司 Method for adjusting angle of spray gun of automatic paint spraying equipment for automobile repair


Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114618704A (en) * 2022-02-23 2022-06-14 深圳远荣智能制造股份有限公司 3D vision-guided robot programming-free spraying method and system thereof
CN114733683A (en) * 2022-04-25 2022-07-12 广汽本田汽车有限公司 Automatic spraying method and system for automobile body
CN115592688A (en) * 2022-12-14 2023-01-13 中铭谷智能机器人(广东)有限公司(Cn) Paint spraying track control method and system for paint spraying robot arranged on truss manipulator
CN115592688B (en) * 2022-12-14 2023-03-10 中铭谷智能机器人(广东)有限公司 Paint spraying track control method and system for paint spraying robot arranged on truss manipulator

Similar Documents

Publication Publication Date Title
CN113680567A (en) Vehicle paint spraying method based on 3D camera
CN108571971B (en) AGV visual positioning system and method
CN109483539A (en) Vision positioning method
CN106392267B (en) A kind of real-time welding seam tracking method of six degree of freedom welding robot line laser
CN108747132B (en) Autonomous mobile welding robot vision control system
EP3011362B1 (en) Systems and methods for tracking location of movable target object
CN206263418U (en) A kind of real-time seam tracking system of six degree of freedom welding robot line laser
KR101549103B1 (en) Detection apparatus, Detection method and manipulator
CN104175330B (en) A kind of six joint industrial robot real-time servo tracking means based on aiming mechanism
CN111192307A (en) Self-adaptive deviation rectifying method based on laser cutting of three-dimensional part
WO2021093288A1 (en) Magnetic stripe-simulation positioning method and device based on ceiling-type qr codes
CN108032011B (en) Initial point guiding device and method are stitched based on laser structure flush weld
ZA200607676B (en) Optical method of determining a physical attribute of a moving object
CN104408408A (en) Extraction method and extraction device for robot spraying track based on curve three-dimensional reconstruction
CN110136211A (en) A kind of workpiece localization method and system based on active binocular vision technology
CN107340788A (en) Industrial robot field real-time temperature compensation method based on visual sensor
CN114535825B (en) Laser marking vehicle identification code system based on manipulator
CN112099442A (en) Parallel robot vision servo system and control method
KR101452437B1 (en) Method for setting the mobile manipulator onto the workbench
CN112170124B (en) Visual positioning method and device for vehicle body and vehicle frame
CN115790366A (en) Visual positioning system and method for large array surface splicing mechanism
CN115014338A (en) Mobile robot positioning system and method based on two-dimensional code vision and laser SLAM
CN113000263B (en) Method for adjusting angle of spray gun of automatic paint spraying equipment for automobile repair
CN113664831B (en) Welding robot system and method for acquiring weld point cloud information
CN112085796A (en) Automatic coating method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20211123