CN111123853B - Control method of robot for detecting and remedying spraying on inner surface of automobile - Google Patents
Control method of robot for detecting and remedying spraying on inner surface of automobile
- Publication number
- CN111123853B (application CN201911166209.4A)
- Authority
- CN
- China
- Prior art keywords
- spraying
- information
- target frame
- robot
- model
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B19/00—Programme-control systems
- G05B19/02—Programme-control systems electric
- G05B19/418—Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM]
- G05B19/41875—Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM] characterised by quality surveillance of production
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/32—Operator till task planning
- G05B2219/32368—Quality control
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02P—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
- Y02P90/00—Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
- Y02P90/02—Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]
Abstract
The invention provides a control method of a robot for detecting and remedying the spraying of the inner surface of an automobile, used for inspecting the interior of the automobile frame and filling in missing paint. The type of the target frame is identified quickly and accurately through identification information arranged on the frame. A second three-dimensional contour model is generated from the first three-dimensional contour model and the first spraying data information of the target frame, so that a person or a robot can readily judge how the interior of the sprayed frame should look. Pictures of each spraying area are acquired with a shooting range slightly larger than the corresponding area, which makes it easy to extract the area's outline accurately from the picture while avoiding the range and color recognition errors that arise when an overly large shooting range captures too much of the neighboring spraying areas; detection accuracy is thereby improved.
Description
Technical Field
The invention relates to the field of automatic control of intelligent machines, in particular to a control method of a robot for detecting and remedying the spraying of the inner surface of an automobile.
Background
After the end of the Cold War, a large number of painting processes and pieces of equipment were converted to civilian use. To make the painting process more flexible and efficient, the automobile industry began introducing robots in the 1990s to replace dedicated painting machines, using them to paint interior surfaces automatically instead of manually. Various spray-coating lines built around intelligent robots have since been developed.
Spraying processes are now critical to products such as refrigerators and automobiles: different spray paints contribute to temperature resistance, corrosion resistance, and appearance. For automobiles in particular, these processes bear directly on vehicle quality.
A spraying robot sprays accurately within a controllable range, saving a large amount of industrial coating, producing less paint waste, and reducing harm to the environment. New spraying robots and their control technologies have therefore become research hotspots, and many patents such as the "workpiece spraying system" of application No. CN201811569171.0 have appeared. Because the hardware of spraying robots is largely similar, current research mainly focuses on directions such as trajectory planning, spraying control, and machine vision.
In actual painting, however, especially for custom-built private vehicles whose outer and inner surfaces carry many spraying areas that must be painted in patterns of different colors, even a painting robot is prone to missed or incorrect spraying. The inner surface of an automobile is also difficult to inspect because of poor lighting, limited viewing angles, and cramped space. Manual inspection is time- and labor-intensive, and conventional machine inspection often misses defects. A control method for a robot that detects and remedies inner-surface spraying with high detection accuracy and spraying efficiency, quickly confirms mis-sprayed areas, and performs compensating spraying in time has therefore become necessary.
Disclosure of Invention
In order to solve the technical problem, the invention provides a control method of a robot for detecting and remedying the spraying of the inner surface of an automobile, which is used for detecting the inner part of an automobile frame and filling paint.
The invention provides a control method of a robot for detecting and remedying the spraying of the inner surface of an automobile, which comprises the following steps: S1, acquiring identification information of the target frame and determining the type of the target frame from it; S2, acquiring a first three-dimensional contour model of the interior of the target frame from a preset automobile frame model library according to that type; S3, acquiring first spraying data information corresponding to the first three-dimensional contour model from a preset spraying data information base; S4, inputting the first spraying data information into the first three-dimensional contour model to obtain a second three-dimensional contour model that displays the intended sprayed effect; S5, acquiring the space coordinates of each position point of the target frame corresponding to the second three-dimensional contour model; S6, obtaining, with the second three-dimensional contour model, the space coordinate information of each spraying area of the target frame, and acquiring first picture information of each spraying area; S7, judging whether spraying has been omitted by comparing the first picture information of each spraying area with the corresponding displayed effect in the second three-dimensional contour model; and S8, when spraying omission is confirmed for an area, generating a first trajectory along which the robot spray gun performs remedial spray painting over that range.
Further, the identification information of the target frame in step S1 is printed on an electronic label arranged on one side of the target frame, and may include one or more of a character string, a two-dimensional code, and a bar code.
Further, the vehicle frame model library stores a plurality of three-dimensional contour models, the identification information of each target frame corresponding to one model. When the library contains no first three-dimensional contour model for the target frame, the model can be entered through upper-computer software, and the entered model is then stored in the library.
Further, the first spraying data information stored in the preset spraying data information base comprises paint color information, paint type information and paint spraying thickness information of each spraying area.
Further, step S5 of acquiring the space coordinates of each position point of the target frame corresponding to the second three-dimensional contour model specifically comprises: S51, taking the initial position of the center of the spraying robot's spray-gun nozzle as the origin of the space coordinate system; S52, selecting several edge points of the target frame from multiple directions as reference points and acquiring their space coordinates with a laser positioner at the spray gun; and S53, entering the space coordinates of each reference point into the second three-dimensional contour model at the reference point's relative position in the model, thereby obtaining the space coordinates of the other position points of the target frame corresponding to the model.
Further, step S6 specifically includes: s61, combining the second three-dimensional contour model to obtain the coordinates of the edge feature points of all the areas of the target frame needing to be sprayed; s62, preliminarily classifying all position points of all areas needing to be sprayed according to the paint color information, the paint type information and the paint spraying thickness information; s63, merging the position points which are identical in information and are directly adjacent into a single independent spraying area; and S64, sequentially controlling the cameras of the robot to approach each spraying area, acquiring pictures of each spraying area, wherein the shooting range of the pictures is slightly larger than the range of the corresponding spraying area, and integrating the pictures of each spraying area and the space coordinate information of the spraying area into corresponding first picture information.
Furthermore, when pictures of the spraying areas are acquired, a flash lamp illuminates each area in turn, and the areas are shot in a staggered sequence so that the flash lamps do not interfere with one another and degrade the shooting effect.
Further, step S7 of judging whether spraying has been omitted specifically comprises: S71, extracting corresponding feature points from the first picture information of each spraying area; S72, extracting the actual contour information of each spraying area from its extracted feature points; S73, acquiring the preset contour information of each spraying area from the second three-dimensional contour model and comparing it with the actual contour information; S74, when part of the actual contour information is substantially the same as the preset contour information, extracting the actual chromaticity information of the corresponding spraying area and comparing it with the preset chromaticity information obtained from the second three-dimensional contour model; and S75, when the chromaticity information of the target spraying area matches the preset chromaticity information, concluding that spraying of that area has not been omitted.
The type of the target frame is identified quickly through the identification information arranged on it, avoiding the slow detection and heavy CPU load on the processing equipment that result when the frame type is obtained with a three-dimensional laser scanner. The second three-dimensional contour model is generated from the first three-dimensional contour model and the first spraying data information of the target frame, so that a person or a robot can readily judge how the interior of the sprayed frame should look. Pictures of each spraying area are acquired with a shooting range slightly larger than the corresponding area, which makes it easy to extract the area's outline accurately from the picture while avoiding the range and color recognition errors that arise when an overly large shooting range captures too much of the neighboring spraying areas; detection accuracy is thereby improved.
In the method, the actual photographs of each spraying area are compared with the contours and preset chromaticity information at the corresponding positions in the second three-dimensional contour model, and missed or wrong spraying is judged from that comparison. Compared with generating and checking three-dimensional images of every spraying area in real time, this two-dimensional image check markedly reduces the computation required and raises detection speed, while leaving the detection quality unaffected.
Drawings
FIG. 1 is a flow chart of a method of controlling a robot for inspection and remediation of interior automotive surfaces in accordance with the present invention;
FIG. 2 is a flowchart of step S5 of a method for controlling a robot for painting inspection and remediation on interior surfaces of a vehicle according to the present invention;
FIG. 3 is a flowchart of step S6 of a method for controlling a robot for painting inspection and remediation on interior surfaces of a vehicle according to the present invention;
fig. 4 is a flowchart of step S7 of the method for controlling the robot for detecting and remedying the inner surface painting of the automobile according to the present invention.
Detailed Description
In order to illustrate the embodiments of the present invention and/or the technical solutions in the prior art more clearly, specific embodiments are described below with reference to the accompanying drawings. The drawings described below are clearly only some examples of the invention; a person skilled in the art can derive other drawings and embodiments from them without inventive effort. In addition, terms of orientation merely indicate relative positional relationships between members, not absolute positions.
As shown in FIG. 1, the present invention provides a method for controlling a robot for inspecting and remedying the inner surface painting of an automobile, which is used for inspecting and filling paint inside the automobile frame, and comprises the following steps S1 to S8.
And step S1, acquiring identification information of the target frame, and determining the type of the target frame according to the identification information.
In the invention, to identify the frame conveniently and quickly, the identification information of the target frame is printed on an electronic tag arranged on one side of the frame, and may include one or more of a character string, a two-dimensional code, and a bar code. Reading devices such as RFID readers are arranged in the external detection area to read the identification information and thereby determine the frame information.
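As a minimal illustration of step S1, the payload read from the electronic tag can be resolved to a frame type with a simple lookup table. The tag formats and the entries in `FRAME_TYPES` below are illustrative assumptions, not values from the patent:

```python
# Sketch of step S1: mapping a scanned label payload to a frame type.
# All keys and type names here are hypothetical examples.
FRAME_TYPES = {
    "QR:MODEL-A7": "sedan_frame_a7",
    "BAR:7301254": "suv_frame_x3",
}

def resolve_frame_type(tag_payload: str) -> str:
    """Return the frame type encoded by the electronic label's payload."""
    try:
        return FRAME_TYPES[tag_payload]
    except KeyError:
        # An unknown label means the frame model must be entered manually
        # via the upper-computer software (see step S2).
        raise ValueError(f"unknown frame label: {tag_payload!r}")

print(resolve_frame_type("QR:MODEL-A7"))  # sedan_frame_a7
```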
And S2, acquiring a first three-dimensional contour model of the interior of the target frame from a preset automobile frame model library according to the type of the target frame.
When the first three-dimensional contour model corresponding to the target frame does not exist in the vehicle frame model library, the input can be carried out through upper computer software, and meanwhile, the input first three-dimensional contour model is stored in the vehicle frame model library. The provision of the vehicle frame model library as a rewritable database facilitates an increase in the types of vehicles on which the present invention operates.
And S3, acquiring first spraying data information corresponding to the first three-dimensional contour model from a preset spraying data information base.
The first spraying data information stored in the preset spraying data information base in the method comprises paint color information, paint type information and paint spraying thickness information of each spraying area.
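The per-area record described above (paint color, paint type, spraying thickness) might be organized as follows. The field names, units, and sample values are assumptions for illustration only:

```python
# Sketch of the first spraying data information: one record per spraying
# area, keyed by frame type. Field names and values are hypothetical.
from dataclasses import dataclass

@dataclass(frozen=True)
class SprayRecord:
    region_id: str            # which spraying area this record describes
    color_rgb: tuple          # paint color information
    paint_type: str           # paint type information, e.g. "metallic"
    thickness_um: float       # paint spraying thickness, micrometres

spray_db = {
    "sedan_frame_a7": [
        SprayRecord("inner_door_L", (180, 20, 20), "metallic", 45.0),
        SprayRecord("inner_door_R", (180, 20, 20), "metallic", 45.0),
    ],
}

print(spray_db["sedan_frame_a7"][0].paint_type)  # metallic
```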
And S4, inputting the first spraying data information into the first three-dimensional outline model, and further acquiring a second three-dimensional outline model capable of displaying the sprayed effect.
And S5, acquiring the space coordinates of each position point corresponding to the target frame and the second three-dimensional contour model.
As shown in fig. 2, step S5 of acquiring the space coordinates of each position point of the target frame corresponding to the second three-dimensional contour model specifically comprises: S51, taking the initial position of the center of the spraying robot's spray-gun nozzle as the origin of the space coordinate system; S52, selecting several edge points of the target frame from multiple directions as reference points and acquiring their space coordinates with a laser positioner at the spray gun; and S53, entering the space coordinates of each reference point into the second three-dimensional contour model at the reference point's relative position in the model, thereby obtaining the space coordinates of the other position points of the target frame corresponding to the model. Acquiring these space coordinates maps each position of the target frame onto the second three-dimensional contour model, so the position of each spraying area can be determined from the coordinates of a small number of feature points.
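The registration in steps S51 to S53 can be sketched as follows. To keep the example short it assumes the frame is only translated, not rotated, relative to the model, which is a simplification of the patent's scheme; a full implementation would estimate a rigid transform from the reference points:

```python
# Sketch of S51-S53: the spray-gun nozzle's start pose is the origin;
# a few measured edge points are matched to the same points in the model,
# and the resulting offset maps every other model point into gun
# coordinates. Pure-translation registration is a simplifying assumption.

def estimate_offset(measured, model):
    """Average displacement between matched measured/model reference points."""
    n = len(measured)
    return tuple(
        sum(m[i] - p[i] for m, p in zip(measured, model)) / n
        for i in range(3)
    )

def to_gun_frame(point, offset):
    """Map a model-space point into spray-gun coordinates."""
    return tuple(c + d for c, d in zip(point, offset))

# Two reference points measured by the laser positioner (gun frame) and
# their known positions in the contour model (hypothetical numbers).
measured_refs = [(1.0, 2.0, 0.5), (3.0, 2.0, 0.5)]
model_refs = [(0.0, 0.0, 0.0), (2.0, 0.0, 0.0)]

offset = estimate_offset(measured_refs, model_refs)
print(offset)                                 # (1.0, 2.0, 0.5)
print(to_gun_frame((1.0, 1.0, 0.0), offset))  # (2.0, 3.0, 0.5)
```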
And S6, acquiring space coordinate information corresponding to each spraying area of the target frame by combining the second three-dimensional contour model, and acquiring first picture information of each spraying area.
As shown in fig. 3, step S6 specifically includes: s61, combining the second three-dimensional contour model to obtain the coordinates of the edge feature points of all the areas of the target frame needing to be sprayed; s62, preliminarily classifying all position points of all areas needing to be sprayed according to the paint color information, the paint type information and the paint spraying thickness information; s63, merging the position points which are identical in information and are directly adjacent into a single independent spraying area; and S64, sequentially controlling the cameras of the robot to approach each spraying area, acquiring pictures of each spraying area, wherein the shooting range of the pictures is slightly larger than the range of the corresponding spraying area, and integrating the pictures of each spraying area and the space coordinate information of the spraying area into corresponding first picture information.
When pictures of the spraying areas are acquired, a flash lamp illuminates each area in turn, and the areas are shot in a staggered sequence so that the flash lamps do not interfere with one another and degrade the shooting effect.
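The merging in steps S62 and S63, where directly adjacent position points with identical paint information become one independent spraying area, amounts to connected-component grouping. A minimal sketch over a 2D grid of paint keys (the grid layout itself is an illustrative assumption):

```python
# Sketch of S62-S63: flood-fill grouping of adjacent position points that
# share the same paint information (color, type, thickness) into regions.
from collections import deque

def merge_regions(grid):
    """grid[r][c] is a hashable paint key; returns a list of point sets,
    one per merged spraying region."""
    rows, cols = len(grid), len(grid[0])
    seen, regions = set(), []
    for r in range(rows):
        for c in range(cols):
            if (r, c) in seen:
                continue
            key, region, queue = grid[r][c], set(), deque([(r, c)])
            seen.add((r, c))
            while queue:
                y, x = queue.popleft()
                region.add((y, x))
                # only directly adjacent (4-connected) points are merged
                for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                    if (0 <= ny < rows and 0 <= nx < cols
                            and (ny, nx) not in seen and grid[ny][nx] == key):
                        seen.add((ny, nx))
                        queue.append((ny, nx))
            regions.append(region)
    return regions

grid = [["red", "red", "blue"],
        ["red", "blue", "blue"]]
print(len(merge_regions(grid)))  # 2
```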
In step S7, whether spraying has been omitted is judged by comparing the first picture information of each spraying area with the corresponding displayed effect in the second three-dimensional contour model.
As shown in fig. 4, step S7 of judging whether spraying has been omitted specifically comprises: S71, extracting corresponding feature points from the first picture information of each spraying area; S72, extracting the actual contour information of each spraying area from its extracted feature points; S73, acquiring the preset contour information of each spraying area from the second three-dimensional contour model and comparing it with the actual contour information; S74, when part of the actual contour information is substantially the same as the preset contour information, extracting the actual chromaticity information of the corresponding spraying area and comparing it with the preset chromaticity information obtained from the second three-dimensional contour model; and S75, when the chromaticity information of the target spraying area matches the preset chromaticity information, concluding that spraying of that area has not been omitted.
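The chromaticity check of steps S74 and S75 reduces to comparing measured and preset color values within a tolerance. The per-channel tolerance below stands in for "substantially the same" and is an assumed value, not one given in the patent:

```python
# Sketch of S74-S75: a region passes the check when its measured color is
# within a small per-channel tolerance of the preset color; otherwise it
# is flagged as missed or mis-sprayed. The tolerance is an assumption.

def color_matches(actual_rgb, preset_rgb, tol=12):
    """True when every channel differs by at most `tol`."""
    return all(abs(a - p) <= tol for a, p in zip(actual_rgb, preset_rgb))

print(color_matches((178, 22, 19), (180, 20, 20)))  # True: sprayed correctly
print(color_matches((90, 90, 90), (180, 20, 20)))   # False: flag for remediation
```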
And S8, when the spray omission phenomenon of the corresponding spray area is determined, generating a first track of the corresponding robot spray gun to carry out spray painting remediation on the range.
Generating the first trajectory begins with acquiring the coordinates of the edge points of the spraying area where spraying was omitted or wrongly applied. A spraying trajectory that covers the surface of that area is then generated, with the center of the spray-gun nozzle as origin, according to the paint color, paint type, and paint thickness information at the corresponding position in the second three-dimensional contour model. The foregoing describes the invention in further detail with reference to specific preferred embodiments, and the invention is not to be considered limited to these details. A person skilled in the art may make several simple deductions or substitutions without departing from the spirit of the invention, and all such variants fall within the scope of protection of the invention.
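One common way to cover a patch of surface with a spray pass, consistent with the remediation step S8 above, is a boustrophedon (zig-zag) sweep. The sketch below covers a flat axis-aligned rectangle; a real trajectory would follow the curved inner surface and set the sweep pitch from the required paint thickness, both of which are simplified away here:

```python
# Sketch of S8: a zig-zag pass of waypoints covering a rectangular patch
# of the missed region. Flat geometry and a fixed step are assumptions.

def zigzag_path(x0, y0, x1, y1, step):
    """Waypoints covering [x0, x1] x [y0, y1], alternating sweep direction
    each row so the gun never retraces a pass."""
    path, y, flip = [], y0, False
    while y <= y1:
        row = [(x1, y), (x0, y)] if flip else [(x0, y), (x1, y)]
        path.extend(row)
        flip = not flip
        y += step
    return path

print(zigzag_path(0, 0, 2, 1, 1))
# [(0, 0), (2, 0), (2, 1), (0, 1)]
```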
Claims (8)
1. A control method of a robot for detecting and remedying the spraying of the inner surface of an automobile is used for detecting the inner part of an automobile frame and filling paint, and is characterized by comprising the following steps: s1, acquiring identification information of the target frame, and determining the type of the target frame according to the identification information; s2, acquiring a first three-dimensional outline model of the interior of the target frame from a preset automobile frame model library according to the type of the target frame; s3, acquiring first spraying data information corresponding to the first three-dimensional contour model from a preset spraying data information base; s4, inputting the first spraying data information into the first three-dimensional contour model, and further obtaining a second three-dimensional contour model capable of displaying the sprayed effect; s5, acquiring space coordinates of each position point corresponding to the target frame and the second three-dimensional contour model; s6, combining the second three-dimensional contour model to obtain space coordinate information corresponding to each spraying area of the target frame, and obtaining first picture information of each spraying area; s7, judging whether spraying omission exists or not according to the comparison of the first picture information of each spraying area and the corresponding display effect in the second three-dimensional outline model; and S8, when the spray omission phenomenon of the corresponding spray area is determined, generating a first track of the corresponding robot spray gun to spray paint and remedy the spray area with the spray omission phenomenon.
2. A method of controlling a robot for inspection and remediation of automotive interior surfaces as recited in claim 1, further comprising: the identification information of the target frame in step S1 is information printed on an electronic label provided on one side of the target frame, and the identification information may include one or more of a character string, a two-dimensional code, and a bar code.
3. A method of controlling a robot for inspection and remediation of automotive interior surfaces as recited in claim 1, further comprising: the vehicle frame model base is stored with a plurality of three-dimensional contour models, the identification information of each target vehicle frame corresponds to one three-dimensional contour model, when the vehicle frame model base does not have the first three-dimensional contour model corresponding to the target vehicle frame, the first three-dimensional contour model can be input through the upper computer software, and the input first three-dimensional contour model is stored in the vehicle frame model base.
4. A method of controlling a robot for inspection and remediation of automotive interior surfaces as recited in claim 1, further comprising: the first spraying data information stored in the preset spraying data information base comprises paint color information, paint type information and paint spraying thickness information of each spraying area.
5. The method as claimed in claim 1, wherein the step S5 of obtaining the spatial coordinates of each position point of the target frame corresponding to the second three-dimensional contour model comprises: S51, taking the initial position of the center of the spraying robot's spray-gun nozzle as the origin of the space coordinate system; S52, selecting several edge points of the target frame from multiple directions as reference points and acquiring their space coordinates with a laser positioner at the spray gun; and S53, entering the space coordinates of each reference point into the second three-dimensional contour model at the reference point's relative position in the model, thereby obtaining the space coordinates of the other position points of the target frame corresponding to the model.
6. The method as claimed in claim 1 or 4, wherein the step S6 includes: s61, combining the second three-dimensional contour model to obtain the coordinates of the edge feature points of all the areas of the target frame needing to be sprayed; s62, preliminarily classifying all position points of all areas needing to be sprayed according to the paint color information, the paint type information and the paint spraying thickness information; s63, merging the position points which are identical in information and are directly adjacent into a single independent spraying area; and S64, sequentially controlling the cameras of the robot to approach each spraying area, acquiring pictures of each spraying area, wherein the shooting range of the pictures is slightly larger than the range of the corresponding spraying area, and integrating the pictures of each spraying area and the space coordinate information of the spraying area into corresponding first picture information.
7. The method for controlling the robot for detecting and remedying the inner surface painting of the automobile as claimed in claim 6, wherein: when pictures of all spraying areas are obtained, the flash lamps are used for irradiating all the spraying areas respectively, and the shooting of all the spraying areas is carried out in a staggered shooting mode to prevent the flash lamps from interfering with each other to influence the shooting effect.
8. The method as claimed in claim 6, wherein the step S7 of judging whether spraying has been omitted comprises: S71, extracting corresponding feature points from the first picture information of each spraying area; S72, extracting the actual contour information of each spraying area from its extracted feature points; S73, acquiring the preset contour information of each spraying area from the second three-dimensional contour model and comparing it with the actual contour information; S74, when part of the actual contour information is substantially the same as the preset contour information, extracting the actual chromaticity information of the corresponding spraying area and comparing it with the preset chromaticity information obtained from the second three-dimensional contour model; and S75, when the chromaticity information of the target spraying area matches the preset chromaticity information, concluding that spraying of that area has not been omitted.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201911166209.4A CN111123853B (en) | 2019-11-25 | 2019-11-25 | Control method of robot for detecting and remedying spraying on inner surface of automobile |
PCT/CN2019/124729 WO2021103153A1 (en) | 2019-11-25 | 2019-12-12 | Method for controlling robot for spraying detection and remediation on inner surface of automobile |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201911166209.4A CN111123853B (en) | 2019-11-25 | 2019-11-25 | Control method of robot for detecting and remedying spraying on inner surface of automobile |
Publications (2)
Publication Number | Publication Date |
---|---|
CN111123853A CN111123853A (en) | 2020-05-08 |
CN111123853B true CN111123853B (en) | 2021-05-14 |
Family
ID=70496630
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201911166209.4A Active CN111123853B (en) | 2019-11-25 | 2019-11-25 | Control method of robot for detecting and remedying spraying on inner surface of automobile |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN111123853B (en) |
WO (1) | WO2021103153A1 (en) |
Families Citing this family (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111013883A (en) * | 2019-11-25 | 2020-04-17 | 浙江明泉工业涂装有限公司 | Robot control method for intelligent spraying of multiple vehicle types |
CN111744735B (en) * | 2020-07-08 | 2021-08-31 | 福州大学 | Control method based on surface spraying simulation of artware |
CN114769021B (en) * | 2022-04-24 | 2022-11-25 | 广东天太机器人有限公司 | Robot spraying system and method based on full-angle template recognition |
CN115228317B (en) * | 2022-07-05 | 2024-04-26 | 中国农业科学院烟草研究所(中国烟草总公司青州烟草研究所) | Fertilizer and pesticide preparation device and method for flue-cured tobacco planting |
CN115283164B (en) * | 2022-07-29 | 2023-06-09 | 东风柳州汽车有限公司 | Vehicle paint spraying track generation method, device, equipment and storage medium |
CN116128878B (en) * | 2023-04-14 | 2023-06-23 | 中铭谷智能机器人(广东)有限公司 | Intelligent spraying track generation method and system based on automobile sheet metal |
CN116393273B (en) * | 2023-06-08 | 2023-08-22 | 深圳中宝新材科技有限公司 | Intelligent paint plating adjusting method and equipment for alloy wires and storage medium |
Citations (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001062358A (en) * | 1999-07-31 | 2001-03-13 | Abb Res Ltd | Method for determining distribution of layer thickness of coating layer |
CN106127747A (en) * | 2016-06-17 | 2016-11-16 | 史方 | Vehicle surface damage classification method and device based on deep learning |
CN107020230A (en) * | 2017-04-19 | 2017-08-08 | 天长市金陵电子有限责任公司 | Ray-projection electrostatic spraying repair method |
CN107403424A (en) * | 2017-04-11 | 2017-11-28 | 阿里巴巴集团控股有限公司 | Image-based vehicle damage identification method, apparatus and electronic device |
CN107480378A (en) * | 2017-08-16 | 2017-12-15 | 郑州云海信息技术有限公司 | Vehicle model modification method and device |
CN107833089A (en) * | 2017-10-18 | 2018-03-23 | 深圳开思时代科技有限公司 | Method, apparatus, system and electronic device for obtaining customer vehicle painting requirements |
CN107899814A (en) * | 2017-12-20 | 2018-04-13 | 芜湖哈特机器人产业技术研究院有限公司 | Robot spraying system and control method thereof |
CN108031588A (en) * | 2017-12-29 | 2018-05-15 | 深圳海桐防务装备技术有限责任公司 | Automatic spraying device and automatic spraying method using the same |
CN109325531A (en) * | 2018-09-17 | 2019-02-12 | 平安科技(深圳)有限公司 | Image-based vehicle damage identification method, device, equipment and storage medium |
CN109731708A (en) * | 2018-12-27 | 2019-05-10 | 上海理工大学 | Automatic spray-painting method for vehicle repair based on image recognition |
CN110013934A (en) * | 2019-04-02 | 2019-07-16 | 清华大学 | Fully automatic laser scanning and spraying inspection integrated system for automobile body-in-white |
CN110032995A (en) * | 2019-05-16 | 2019-07-19 | 安徽三众智能装备有限公司 | Body-in-white automatic identification system and identification method |
Family Cites Families (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
ATE280239T1 (de) * | 1995-07-26 | 2004-11-15 | Mixis France Sa | Homologous recombination in eukaryotic cells with an inactivated mismatch repair system |
CN1166459C (en) * | 2002-11-21 | 2004-09-15 | 西安交通大学 | Manufacturing method of intelligent integral dies for arc spraying and brush plating |
US7181856B1 (en) * | 2005-11-09 | 2007-02-27 | Hanchett Michael T | Laser measurement system |
CN104385640A (en) * | 2014-10-20 | 2015-03-04 | 合肥斯科尔智能科技有限公司 | High-precision repair method for three-dimensional printed products |
CN104634787B (en) * | 2015-02-13 | 2017-06-06 | 东华大学 | Automatic detection device and method for paint flaws on automobile body outer panels |
CN105983502B (en) * | 2015-03-19 | 2018-06-08 | 何守印 | Integrated intelligent robot for automobile painting and paint baking |
JP6862460B2 (en) * | 2016-03-09 | 2021-04-21 | ユニリーバー・ナームローゼ・ベンノートシヤープ | Modeling system |
CN107225075A (en) * | 2017-04-19 | 2017-10-03 | 天长市金陵电子有限责任公司 | Electrostatic powder coating repair method based on three-dimensional modeling |
CN107894423B (en) * | 2017-11-08 | 2021-04-20 | 安吉汽车物流股份有限公司 | Automatic detection equipment and method for vehicle body surface quality loss and intelligent vehicle inspection system |
CN108020557B (en) * | 2017-12-18 | 2020-11-24 | 北京航天测控技术有限公司 | Adaptive detection method for vehicle body spraying quality based on laser scanning |
US10323932B1 (en) * | 2017-12-28 | 2019-06-18 | Ford Motor Company | System for inspecting vehicle bodies |
CN109239086B (en) * | 2018-10-22 | 2023-11-17 | 上海易清智觉自动化科技有限公司 | Vehicle paint surface and appearance flaw detection system |
CN109461149A (en) * | 2018-10-31 | 2019-03-12 | 泰州市创新电子有限公司 | Intelligent inspection system and method for painted surface defects |
2019
- 2019-11-25: CN application CN201911166209.4A (patent CN111123853B), status: Active
- 2019-12-12: WO application PCT/CN2019/124729 (publication WO2021103153A1), status: Application Filing
Also Published As
Publication number | Publication date |
---|---|
CN111123853A (en) | 2020-05-08 |
WO2021103153A1 (en) | 2021-06-03 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN111123853B (en) | Control method of robot for detecting and remedying spraying on inner surface of automobile | |
CN111013883A (en) | Robot control method for intelligent spraying of multiple vehicle types | |
CN114549519B (en) | Visual detection method and system for automobile spraying production line and readable storage medium | |
US9561593B2 (en) | Working method using sensor and working system for performing same | |
US20180350056A1 (en) | Augmented reality application for manufacturing | |
US6714831B2 (en) | Paint defect automated seek and repair assembly and method | |
CN101666619B (en) | Method for calculating absolute coordinates of work piece | |
JPH10253322A (en) | Method and apparatus for designating position of object in space | |
US10197987B2 (en) | Use of manufacturing compounds to create fiducial marks | |
CA3102241A1 (en) | Method and plant for locating points on a complex surface in the space | |
CN114720475A (en) | Intelligent detection and polishing system and method for automobile body paint surface defects | |
US6597967B2 (en) | System and method for planning a tool path along a contoured surface | |
CN116852374B (en) | Intelligent robot control system based on machine vision | |
Lee et al. | Implementation of a robotic arm with 3D vision for shoes glue spraying system | |
JPH1063324A (en) | Picture input-type robot system | |
CN106558070A (en) | Visual tracking method and system for a Delta-based robot | |
CN111928852B (en) | Indoor robot positioning method and system based on LED position coding | |
KR101048467B1 (en) | Position measuring method of conveyor-based body | |
CN107600313A (en) | Detail design system for body-section derusting and spray painting by robot | |
CN111906770A (en) | Workpiece mounting method and system, computer readable storage medium | |
CN113500594B (en) | Binocular vision positioning method suitable for automatic mounting system of automobile windshield | |
US20230103030A1 (en) | Process for Painting a Workpiece Comprising Generating a Trajectory Suitable for the Actual Workpiece | |
CN116148259B (en) | Vehicle defect positioning system, method, device and storage medium | |
Gülırmak et al. | Determining Robot Trajectory Planning Using Image Processing for Wood Painting | |
WO2024008705A1 (en) | Method and apparatus for generating robot path data to automatically coat at least part of a surface of a spatial substrate with at least one coating material |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||