KR101271620B1 - Method of detecting painted surface and painting method using the same - Google Patents


Info

Publication number
KR101271620B1
KR101271620B1 (Application No. KR1020100140352A)
Authority
KR
South Korea
Prior art keywords
histogram
cumulative distribution
distribution function
ideal
pixel
Prior art date
Application number
KR1020100140352A
Other languages
Korean (ko)
Other versions
KR20120078143A (en)
Inventor
박영준
조기용
최두진
이재상
최승준
배성준
최윤규
Original Assignee
삼성중공업 주식회사
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 삼성중공업 주식회사 (Samsung Heavy Industries Co., Ltd.)
Priority to KR1020100140352A
Publication of KR20120078143A
Application granted
Publication of KR101271620B1

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Image Analysis (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Biochemistry (AREA)
  • General Health & Medical Sciences (AREA)
  • Immunology (AREA)
  • Pathology (AREA)

Abstract

A painting surface detection method is presented. According to an exemplary embodiment, the method may include enhancing an image of a painted surface, classifying the colors of the enhanced image, extracting the boundaries of the painting areas divided according to the classified colors, and setting nodes at those boundaries.

Description

Coating surface detection method and coating method using the same {METHOD OF DETECTING PAINTED SURFACE AND PAINTING METHOD USING THE SAME}

The present invention relates to a coating surface detection method and a coating method using the same.

When painting hull blocks or working in other unfavorable environments, it is common to operate unmanned using a painting robot. In such an unmanned environment, the painting robot must be moved to the correct position so that the desired color is painted in the desired area.

When a person performs the work, the painter visually checks the painting surface and paints at the desired position. A painting robot, by contrast, predicts the painting environment and moves along a route programmed accordingly. If the predicted environment differs from the actual physical environment, or an error occurs, the robot may fail to reach the desired location.

One embodiment of the present invention enables the painting robot to move to the correct position on the painting surface to perform the painting operation.

A painting surface detection method according to an embodiment of the present invention may include enhancing an image of the painting surface, classifying the colors of the enhanced image, extracting the boundaries of the painting areas divided according to the classified colors, and setting nodes at the boundaries.

The image enhancement step may include performing histogram specification.

The histogram specification step may include: converting the color coordinates of each pixel of the image into the YCrCb color coordinate system; calculating an actual histogram and an actual cumulative distribution function for the illuminance, which is the Y coordinate of the YCrCb coordinate system; generating an ideal histogram and an ideal cumulative distribution function for the illuminance; matching the ideal histogram and ideal cumulative distribution function against the actual histogram and actual cumulative distribution function to calculate a final histogram and a final cumulative distribution function; and converting the color coordinates of each pixel based on the final histogram.

Calculating the final histogram and the final cumulative distribution function may include: for each gray level, finding the ideal cumulative distribution function value (hereinafter, the "ideal function value") closest to the actual cumulative distribution function value of that gray level; finding the gray level having that ideal function value in the ideal cumulative distribution function; finding the value of that gray level in the ideal histogram (hereinafter, the "ideal histogram value"); generating the final histogram by assigning the ideal histogram value to each gray level; and obtaining the final cumulative distribution function from the final histogram.

The coordinate conversion step may include converting the Y coordinate of each pixel to the gray level having the ideal function value.

The color classification step may include converting the converted color coordinates of each pixel into the RGB color coordinate system, normalizing the RGB color coordinates of each pixel, calculating the distance between the normalized RGB color coordinates of each pixel and the normalized RGB color coordinates of the reference colors, and replacing the normalized RGB color coordinates of each pixel with those of the reference color at the closest distance.

The reference color may be a color painted on the painted surface.

The boundary extraction step may use a Laplacian edge.

The node setting step may include setting a start point and an end point of the boundary.

A painting method according to an embodiment of the present invention comprises photographing the painting surface, enhancing the photographed image, classifying the colors of the enhanced image, extracting the boundaries of the painting areas divided according to the classified colors, setting nodes at the boundaries, moving a painting module according to the nodes, and painting the painting surface using the painting module.

The image enhancement step may include converting the color coordinates of each pixel of the image into the YCrCb color coordinate system, calculating an actual histogram and an actual cumulative distribution function for the illuminance, which is the Y coordinate of the YCrCb coordinate system, generating an ideal histogram and an ideal cumulative distribution function for the illuminance, matching the ideal histogram and ideal cumulative distribution function against the actual histogram and actual cumulative distribution function to calculate a final histogram and a final cumulative distribution function, and converting the color coordinates of each pixel based on the final histogram.

The color classification step may include converting the converted color coordinates of each pixel into the RGB color coordinate system, normalizing the RGB color coordinates of each pixel, calculating the distance between the normalized RGB color coordinates of one or more reference colors painted on the painting surface and the normalized RGB color coordinates of each pixel, and replacing the normalized RGB color coordinates of each pixel with those of the reference color at the closest distance.

In the painting method according to an embodiment of the present invention, the painting surface is photographed and the image is processed to find the target position of the painting module, so the painting module can be moved to a more accurate position.

FIG. 1 is a schematic diagram of a painting apparatus according to an embodiment of the present invention.
FIG. 2 is a schematic diagram of a painting module according to an embodiment of the present invention.
FIG. 3 is a flowchart of a painting operation using the painting apparatus according to an embodiment of the present invention.
FIG. 4 is a view showing the painting surface of a work object according to an embodiment of the present invention.
FIG. 5 is a flowchart illustrating the processing of a photographed image of the painting surface in a painting method according to an embodiment of the present invention.
FIG. 6 is a view for explaining the image enhancement step in the painting method according to an embodiment of the present invention.
FIG. 7 is a graph showing the actual histogram and cumulative distribution function in the image enhancement step.
FIG. 8 is a graph showing the ideal histogram and cumulative distribution function in the image enhancement step.
FIG. 9 is a graph showing the final histogram and cumulative distribution function in the image enhancement step.
FIG. 10 is a diagram for describing the process of obtaining the final histogram in the image enhancement step.
FIG. 11 is a view for explaining the color classification step in the painting method according to an embodiment of the present invention.

The present invention will be described more fully hereinafter with reference to the accompanying drawings, in which exemplary embodiments of the invention are shown. The present invention may, however, be embodied in many different forms and is not limited to the embodiments described herein. In order to clearly illustrate the present invention, parts not related to the description are omitted, and the same or similar components are denoted by the same reference numerals throughout the specification.

First, the structure of the painting apparatus according to an embodiment of the present invention will be described in detail with reference to FIGS. 1 and 2.

FIG. 1 is a schematic diagram of a painting apparatus according to an embodiment of the present invention, and FIG. 2 is a schematic diagram of a painting module according to an embodiment of the present invention.

Referring to FIG. 1, the painting apparatus 100 according to the present embodiment includes a frame 10, a painting module 20, a plurality of wires 30, an autonomous movement control module 40, and a controller 50.

The painting module 20 paints the surface of the workpiece 60 in the frame 10.

The painting module 20 includes a platform 21 movable within the frame 10, a work robot 22 and a camera 25 mounted on the platform 21, and a sprayer 23 mounted on the work robot 22.

The platform 21 is moved by a plurality of wires 30 connected between the frame 10 and the platform 21. One end of each wire 30 is fixed to the frame 10 and the other end to the platform 21. The wires 30 number, for example, four pairs, and the position of the platform 21 can be set by adjusting the length of each wire 30. The length adjustment is performed by the autonomous movement control module 40.
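The relationship between wire lengths and platform position admits a simple geometric sketch (illustrative only, not part of the disclosure): if each wire runs from a fixed frame anchor to an anchor on the platform, the required wire length for a target platform position is the straight-line distance between the two anchors. The function name and the no-rotation assumption below are ours, not the patent's.

```python
import math

def wire_lengths(frame_anchors, platform_anchors, platform_pos):
    """Required length of each wire for a desired platform position.

    frame_anchors    : fixed attachment points on the frame, one per wire
    platform_anchors : attachment points on the platform, relative to its center
    platform_pos     : desired (x, y, z) of the platform center
    """
    lengths = []
    for (fx, fy, fz), (px, py, pz) in zip(frame_anchors, platform_anchors):
        # World position of the platform-side anchor (platform rotation ignored)
        wx, wy, wz = platform_pos[0] + px, platform_pos[1] + py, platform_pos[2] + pz
        lengths.append(math.dist((fx, fy, fz), (wx, wy, wz)))
    return lengths
```

For example, a platform anchor that must reach (3, 4, 0) from a frame anchor at the origin needs a 5-unit wire.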

The work robot 22 may include a multi-joint manipulator whose position on the platform 21 can be changed and, in some cases, whose position and angle can be adjusted.

The sprayer 23 is provided with a plurality of spray nozzles 24 to spray the paint 29 onto the surface of the work object 60.

The camera 25 photographs the painted surface of the work object 60.

The controller 50 is connected to the work robot 22, the camera 25, and the autonomous movement control module 40 of the painting module 20 through a cable 55 and controls them. Specifically, the controller 50 separates the painting areas in the image of the painting surface photographed by the camera 25, extracts the boundaries between the painting areas, and moves the painting module 20 based on the boundaries to perform the painting operation. This will be described in detail with reference to FIGS. 3 and 4 together with FIGS. 1 and 2.

FIG. 3 is a flowchart of a painting operation using the painting apparatus according to an embodiment of the present invention, and FIG. 4 is a view showing the painting surface of a work object according to an embodiment of the present invention.

Referring to FIG. 3, the work robot 22, sprayer 23, camera 25, and the like are first installed on the platform 21, the sprayer 23 is filled with paint of the color to be applied, and the equipment is initialized (S10).

Next, the painting surface of the work object 60 is photographed through the camera 25 and the image is processed. Based on the result, the target position to which the painting module 20 should move is determined and compared with the current position to determine the movement path (S20).

For example, as in the photograph of the painting surface of FIG. 4, suppose that the upper three painting areas 61, 63, and 65 were completed in previous work, while the lowermost painting area 67 has only its base coat and its main color has not yet been painted. Assume also that the upper two painting areas 61 and 63 are coated with paints of different colors, and that the painting area 65 below them is coated with the paint currently filled in the painting module 20. The controller 50 obtains a clean image by preprocessing the photographed image, identifies the color painted on each of the painting areas 61, 63, 65, and 67 against the reference colors, extracts the boundaries between the painting areas, and determines that the point below the boundary between the third painting area 65 and the lowest painting area 67 is the target position. The target position is then compared with the current position to determine the movement path. This image processing will be described in more detail later.

When the movement path is determined, the painting module 20 is moved to the target position along the corresponding path (S30).

The painting surface of the work object 60 is photographed once again and the target position is determined through substantially the same process as above (S40); the current position of the painting module 20 is then compared with the target position to determine whether they are the same (S50). If so, the filled paint is sprayed onto the work object 60 to perform the painting operation (S60); otherwise, the process returns to step S30 to correct the position of the painting module 20.

Painting is usually performed by spraying the paint while the painting module 20 moves from the right end of the painting surface to the left end, or in the opposite direction. In FIG. 4 the boundaries between the painting areas 61, 63, 65, and 67 are curved, but they may also be straight and parallel to each other, in which case the painting module 20 can move along a straight line without changing height.

When the painting module 20 starting from one end reaches the opposite end, it is determined whether the painting work is completed (S70). If the painting work is completed, the equipment is dismantled (S80), otherwise, the process returns to step S30.

Next, the process of processing the photographed image of the painting surface will be described in detail with reference to FIGS. 5 to 11.

FIG. 5 is a flowchart illustrating the processing of a photographed image of the painting surface, FIG. 6 is a view for explaining the image enhancement step, FIG. 7 is a graph showing the actual histogram and cumulative distribution function, FIG. 8 is a graph showing the ideal histogram and cumulative distribution function, FIG. 9 is a graph showing the final histogram and cumulative distribution function, FIG. 10 is a diagram illustrating the process of obtaining the final histogram, and FIG. 11 is a view for explaining the color classification step, each in the painting method according to an embodiment of the present invention.

First, the image photographed by the camera 25 is digitized so that each position on the painting surface becomes a pixel, and the color at that position is stored in a storage device (not shown) as the color coordinates of the pixel.

Referring to FIG. 5, first, the controller 50 reads an image stored in a memory device (S21).

Next, an image enhancement operation using histogram specification is performed on the captured image to reduce distortion caused by light reflection or shadow (S22). For example, the input image shown in FIG. 6(a) has unclear color at a corner of the painting surface due to shadow and in the lower-right middle area due to light reflection. Preprocessing the input image with histogram specification yields an image such as FIG. 6(b): although the overall image becomes darker, the colors are clearly displayed in all areas.

To describe histogram specification, the color coordinates of each pixel are first converted from the RGB color coordinate system to the YCrCb color coordinate system, so that each pixel is represented by the Y coordinate, corresponding to the illuminance, and two chroma coordinates. The illuminance may be expressed, for example, in 256 gray levels.
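The patent does not give the conversion formula; as an illustrative sketch, the widely used ITU-R BT.601 coefficients (an assumption here) convert one 8-bit RGB pixel as follows:

```python
def rgb_to_ycrcb(r, g, b):
    """Convert one 8-bit RGB pixel to YCrCb (ITU-R BT.601, chroma offset 128)."""
    y = 0.299 * r + 0.587 * g + 0.114 * b   # luma: the "illuminance" channel
    cr = 0.713 * (r - y) + 128              # red-difference chroma
    cb = 0.564 * (b - y) + 128              # blue-difference chroma
    return y, cr, cb
```

A neutral pixel keeps both chroma channels at the 128 midpoint; only Y varies with brightness, which is why the histogram below is taken over Y alone.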

Then the number of pixels at each illuminance value, that is, at each gray level, is counted, and a histogram of the illuminance and the corresponding cumulative distribution function are generated. In FIG. 7, (a) is the actual histogram and (b) the actual cumulative distribution function; the actual cumulative distribution function P_o(j) for gray level j is

P_o(j) = Σ_{i=0}^{j} p_o(i),

where p_o(i) is the number of pixels having the i-th gray level.
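The histogram and its running-sum cumulative distribution function can be sketched as follows (illustrative only; the function name is ours, and the CDF is kept as a raw sum of pixel counts, matching the summation above):

```python
def histogram_and_cdf(gray_levels, n_levels=256):
    """Histogram p_o(i) (pixel count per gray level) and CDF P_o(j) = sum_{i<=j} p_o(i)."""
    hist = [0] * n_levels
    for g in gray_levels:
        hist[g] += 1
    cdf, total = [], 0
    for count in hist:
        total += count
        cdf.append(total)   # running sum over gray levels 0..j
    return hist, cdf
```

The last CDF entry equals the total pixel count, which is what makes the actual and ideal CDFs comparable level by level.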

Next, the mean and standard deviation of the gray levels of the pixels are obtained, and an ideal histogram following a normal distribution with that mean and standard deviation, together with its cumulative distribution function, is generated. In FIG. 8, (a) is the ideal histogram and (b) the corresponding cumulative distribution function; the ideal cumulative distribution function P_id(j) for gray level j is

P_id(j) = Σ_{i=0}^{j} p_id(i),

where p_id(i) is the number of pixels having the i-th gray level in the ideal histogram.
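The ideal histogram can be sketched as a normal-distribution curve built from the gray-level mean and standard deviation, scaled so its total matches the image's pixel count (the scaling is our assumption; the patent does not specify it):

```python
import math

def ideal_histogram(mean, std, n_pixels, n_levels=256):
    """Normal-distribution histogram with the given mean/std, scaled to n_pixels total."""
    weights = [math.exp(-((i - mean) ** 2) / (2 * std ** 2)) for i in range(n_levels)]
    scale = n_pixels / sum(weights)
    hist = [w * scale for w in weights]
    cdf, total = [], 0.0
    for h in hist:
        total += h
        cdf.append(total)   # ideal CDF, same running-sum form as the actual one
    return hist, cdf
```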

The final histogram and corresponding cumulative distribution function shown in FIGS. 9(a) and 9(b) are obtained by matching the ideal histogram and cumulative distribution function thus obtained against the actual histogram and cumulative distribution function. In FIG. 9, the dotted lines represent the actual histogram and actual cumulative distribution function.

To explain this process in detail: first, for each gray level, the ideal cumulative distribution function value closest to the actual cumulative distribution function value of that gray level is found. In FIG. 10, assume the left graph is the actual cumulative distribution function and the right graph is the ideal one. In the left graph, the actual cumulative distribution function value of gray level k is P_o(k). In the right graph there are two candidate values, P_id(m) and P_id(n), of which the one closest to P_o(k) is P_id(m).

Next, the gray level having that ideal cumulative distribution function value is found; here it is m. Then the value for gray level m in the ideal histogram, that is, the pixel count p_id(m), is found, and the value p_f(k) for gray level k in the final histogram is set to p_id(m).

The final cumulative distribution function is then calculated from the final histogram in the same way the actual cumulative distribution function was obtained from the actual histogram.

Since gray level k in the actual histogram has thus been mapped to gray level m in the ideal histogram, the last step is to change the gray level of every pixel whose gray level is k to m in the input image.
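The matching procedure of the preceding paragraphs — for each gray level k, find the gray level m whose ideal CDF value is closest to P_o(k), take p_f(k) = p_id(m), and remap pixels k → m — can be sketched as follows (function and variable names are ours):

```python
def match_histograms(actual_cdf, ideal_cdf, ideal_hist):
    """For each gray level k, find m minimizing |P_id(m) - P_o(k)|.

    Returns the pixel mapping k -> m and the final histogram p_f(k) = p_id(m).
    """
    mapping, final_hist = [], []
    for p_ok in actual_cdf:
        # Gray level m whose ideal CDF value is closest to P_o(k)
        m = min(range(len(ideal_cdf)), key=lambda j: abs(ideal_cdf[j] - p_ok))
        mapping.append(m)
        final_hist.append(ideal_hist[m])
    return mapping, final_hist
```

Applying `mapping[k]` to every pixel of gray level k completes the enhancement step.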

Referring to FIG. 5 again, when the image enhancement operation is completed, color classification is performed (S23). To do this, the color coordinates of each pixel, represented in the YCrCb color coordinate system, are first converted back to the RGB coordinate system and normalized. The distance between the normalized RGB color coordinates of each pixel and the normalized RGB color coordinates of the reference colors already painted (for example, the four reference colors painted on the four painting areas 61, 63, 65, and 67 in FIG. 4) is calculated, and the color coordinates of each pixel are replaced with those of the nearest reference color, completing the color classification step. For example, the image shown in FIG. 11(a) is converted into the image shown in FIG. 11(b), in which each color and its boundaries appear clearly.
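The color classification step can be sketched as follows. The patent only says the RGB coordinates are "normalized"; chromaticity normalization (dividing each channel by the channel sum) is one common reading, assumed here, as is the squared-Euclidean distance:

```python
def normalize_rgb(r, g, b):
    """Chromaticity normalization: divide each channel by the channel sum.

    One possible reading of the patent's 'normalized RGB'; an assumption.
    """
    s = (r + g + b) or 1          # avoid division by zero for black pixels
    return (r / s, g / s, b / s)

def classify_colors(pixels, reference_colors):
    """Replace each normalized RGB pixel with the nearest reference color
    (squared Euclidean distance in normalized RGB space)."""
    def nearest(p):
        return min(reference_colors,
                   key=lambda c: sum((a - b) ** 2 for a, b in zip(p, c)))
    return [nearest(p) for p in pixels]
```

After this snap-to-nearest step every pixel carries one of the few reference colors, so adjacent painting areas meet along crisp, extractable boundaries.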

Subsequently, the boundaries between the painting areas 61, 63, 65, and 67 are extracted using a Laplacian edge detector or the like (S24). In FIG. 4, the extracted boundaries 62, 64, and 66 of the painting areas 61, 63, 65, and 67 are indicated by white lines.
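A minimal sketch of the boundary extraction, using a 4-neighbour discrete Laplacian over the color-label image (the patent names the Laplacian edge but not the kernel, so this discrete form is an assumption):

```python
def laplacian_edges(img):
    """4-neighbour Laplacian on a 2-D grid of color labels.

    A nonzero response marks a boundary pixel between painting areas;
    the one-pixel border is left unmarked for simplicity.
    """
    h, w = len(img), len(img[0])
    edges = [[0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            lap = (img[y - 1][x] + img[y + 1][x] + img[y][x - 1] + img[y][x + 1]
                   - 4 * img[y][x])
            edges[y][x] = 1 if lap != 0 else 0
    return edges
```

Because the color classification step already snapped every pixel to a reference color, the Laplacian is exactly zero inside each area and nonzero only at area boundaries.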

Finally, referring to FIG. 4, nodes such as start points a, c, and e and end points b, d, and f are set on the extracted boundaries 62, 64, and 66 (S25). These start and end points serve as references for setting the movement position of the painting module 20.
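The node-setting step can be sketched by taking the leftmost and rightmost pixels of each extracted boundary as its start and end points (an assumption consistent with the left-to-right painting motion described above; the patent does not define the nodes more precisely):

```python
def boundary_nodes(edge_pixels):
    """Start and end node of one extracted boundary.

    edge_pixels: (x, y) coordinates of the boundary's pixels.
    Returns the leftmost and rightmost pixels, assuming a roughly
    horizontal boundary traversed left to right.
    """
    ordered = sorted(edge_pixels)     # sort by x, then y
    return ordered[0], ordered[-1]
```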

As described above, in the painting surface detection method according to the present embodiment, the painting surface is photographed and the image processed to find the target position of the painting module, so the painting module can be moved to a more accurate position.

While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, the invention is not limited to the disclosed embodiments; on the contrary, it is intended to cover various modifications and equivalent arrangements included within the spirit and scope of the appended claims.

10: frame
20: painting module
21: platform
22: working robot
23: sprayer
24: spray nozzle
25: camera
29: paint
30: wire
40: autonomous movement control module
50: control unit
55: cable
60: work object
61, 63, 65, 67: paint area
62, 64, 66: boundary of paint zone

Claims (12)

[Claims 1 and 2: deleted]
A painting surface detection method comprising:
Enhancing an image of the painted surface;
Classifying colors of the enhanced image;
Extracting a boundary of the painting area divided according to the classified colors; and
Setting a node at the boundary,
wherein the image enhancement step includes performing histogram specification.
The histogram specification step,
Converting a color coordinate of each pixel of the enhanced image into a YCrCb color coordinate system;
Generating an actual histogram and an actual cumulative distribution function for illuminance which is a Y coordinate of the YCrCb color coordinate system for the enhanced image;
Generating an ideal histogram and an ideal cumulative distribution function for the illumination of the enhanced image,
Matching the ideal histogram and the ideal cumulative distribution function with the actual histogram and the cumulative distribution function to obtain a final histogram and a final cumulative distribution function, and
And converting a color coordinate of each pixel based on the final histogram.
4. The method of claim 3,
Obtaining the final histogram and the final cumulative distribution function,
Finding, for each gray level, the ideal cumulative distribution function value (hereinafter, the "ideal function value") closest to the actual cumulative distribution function value of that gray level,
Finding the gray level having the ideal function value in the ideal cumulative distribution function,
Finding the value of that gray level in the ideal histogram (hereinafter, the "ideal histogram value"),
Generating the final histogram by assigning the ideal histogram value to each of the gray levels, and
Obtaining the final cumulative distribution function from the final histogram.
5. The method of claim 4,
The coordinate conversion step includes converting the Y coordinate of each pixel to the gray level having the ideal function value.
6. The method according to any one of claims 3 to 5,
The color separation step,
Converting the coordinate converted color coordinates for each pixel into an RGB color coordinate system;
Normalizing the RGB color coordinates of each pixel;
Calculating a distance between the normalized RGB color coordinates of each pixel and the normalized RGB color coordinates of a reference color; and
Replacing the normalized RGB color coordinates of each pixel with the normalized RGB color coordinates of the reference color at the closest distance.
The method of claim 6,
wherein the reference color is a color painted on the painted surface.
The method of claim 7,
wherein the boundary extraction step uses a Laplacian edge.
9. The method of claim 8,
The node setting step includes setting a start point and an end point of the boundary.
[Claim 10: deleted]
A painting method comprising:
Photographing the painted surface,
Reinforcing the photographed image of the painted surface;
Distinguishing colors of the enhanced image;
Extracting a boundary of the painting area divided according to the divided colors;
Establishing a node at the boundary,
Moving the painting module according to the node, and
Painting the painted surface using the painting module;
wherein
The image enhancement step,
Converting a color coordinate of each pixel of the enhanced image into a YCrCb color coordinate system;
Generating an actual histogram and an actual cumulative distribution function for illuminance which is a Y coordinate of the YCrCb color coordinate system for the enhanced image;
Generating an ideal histogram and an ideal cumulative distribution function for the illumination of the enhanced image,
Matching the ideal histogram and the ideal cumulative distribution function with the actual histogram and the cumulative distribution function to obtain a final histogram and a final cumulative distribution function, and
And converting the color coordinates of each pixel based on the final histogram.
12. The method of claim 11,
The color separation step,
Converting the coordinate converted color coordinates for each pixel into an RGB color coordinate system;
Normalizing the RGB color coordinates of each pixel;
Calculating a distance between the normalized RGB color coordinates of at least one reference color painted on the painted surface and the normalized RGB color coordinates of each pixel, and
Replacing the normalized RGB color coordinates of each pixel with the normalized RGB color coordinates of the reference color at the closest distance.
KR1020100140352A 2010-12-31 2010-12-31 Method of detecting painted surface and painting method using the same KR101271620B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020100140352A KR101271620B1 (en) 2010-12-31 2010-12-31 Method of detecting painted surface and painting method using the same


Publications (2)

Publication Number Publication Date
KR20120078143A KR20120078143A (en) 2012-07-10
KR101271620B1 true KR101271620B1 (en) 2013-06-11

Family

Family ID: 46711532

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020100140352A KR101271620B1 (en) 2010-12-31 2010-12-31 Method of detecting painted surface and painting method using the same

Country Status (1)

Country Link
KR (1) KR101271620B1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101868905B1 (en) * 2017-05-25 2018-06-19 금오공과대학교 산학협력단 Method and Electronic Apparatus for Detecting of Painting Error using Image Data

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH05223522A (en) * 1992-02-07 1993-08-31 Mazda Motor Corp Method and apparatus for detecting mending position of vehicle painting
JP2003254719A (en) * 2002-03-04 2003-09-10 Nitto Denko Corp Method of inspecting coating area
KR20090023785A (en) * 2007-09-03 2009-03-06 한국도로공사 Steel bridge coating inspection system using image processing and the processing method for the same


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Journal article, June 2008 *

Also Published As

Publication number Publication date
KR20120078143A (en) 2012-07-10


Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E701 Decision to grant or registration of patent right
GRNT Written decision to grant
FPAY Annual fee payment

Payment date: 20190502

Year of fee payment: 7