US20160110840A1 - Image processing method, image processing device, and robot system - Google Patents
Image processing method, image processing device, and robot system
- Publication number
- US20160110840A1 (application US 14/883,033)
- Authority
- US
- United States
- Prior art keywords
- edge
- image
- model
- edge image
- similarity
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T1/00—General purpose image data processing
- G06T1/0014—Image feed-back for automatic industrial control, e.g. robot with camera
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1694—Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
- B25J9/1697—Vision controlled systems
-
- G06T7/0085—
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/13—Edge detection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
- G06T7/74—Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/46—Descriptors for shape, contour or point-related descriptors, e.g. scale invariant feature transform [SIFT] or bags of words [BoW]; Salient regional features
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/74—Image or video pattern matching; Proximity measures in feature spaces
- G06V10/75—Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
- G06V10/752—Contour matching
-
- H04N5/23229—
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/39—Robotics, robotics to robotics hand
- G05B2219/39391—Visual servoing, track end effector with camera image feedback
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/39—Robotics, robotics to robotics hand
- G05B2219/39393—Camera detects projected image, compare with reference image, position end effector
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10004—Still image; Photographic image
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y10—TECHNICAL SUBJECTS COVERED BY FORMER USPC
- Y10S—TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y10S901/00—Robots
- Y10S901/02—Arm motion controller
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y10—TECHNICAL SUBJECTS COVERED BY FORMER USPC
- Y10S—TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y10S901/00—Robots
- Y10S901/27—Arm part
- Y10S901/28—Joint
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y10—TECHNICAL SUBJECTS COVERED BY FORMER USPC
- Y10S—TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y10S901/00—Robots
- Y10S901/46—Sensing device
- Y10S901/47—Optical
Definitions
- As illustrated in FIG. 6, in step S20, the model edge image generation unit 55 inputs the generated original model edge image 10e to the RAM 51. In step S21, the model edge image generation unit 55 captures the captured model image 11 including the work 6 with the camera 4 and inputs the captured model image 11 to the RAM 51. It is desirable that the captured model image 11 be an image influenced by noise such as adhering dust or dirt, variations in illumination, and individual variability; alternatively, an artificial edge or artificial noise may be added thereto. In step S22, by employing the same edge extraction method described in step S13, the model edge image generation unit 55 generates the captured model edge image 11e from the captured model image 11. In step S23, the model edge image generation unit 55 executes the pattern matching of the original model edge image 10e and the captured model edge image 11e.
- The processing of the pattern matching executed in step S23 will be described in detail with reference to the flowchart (i.e., subroutine) illustrated in FIG. 7. In step S30, the model edge image generation unit 55 sets a detection position for matching the pattern of the model edge image 10e across the entire region within the captured model edge image 11e in a pixel unit (see FIG. 8). The upper left end position in the captured model edge image 11e is set as a first detection position; the detection position is then sequentially moved from the upper left to the upper right end position, brought down by one pixel, and again sequentially set from the left end position to the right end position.
- In step S31, the model edge image generation unit 55 calculates a score at each detection position. A score Sij at an optional detection position (i, j) is calculated by the following formula 3.
- Sij = (1/N) Σk=1..N sk <Formula 3>
- In the above, Sij represents the score at the detection position (i, j), N represents the number of edge points in the model edge image 10e, and sk represents a local score. The local score sk is a score calculated at each edge point of the model edge image 10e, namely, the cosine of the difference between the edge direction of the captured model edge image 11e and the edge direction of the model edge image 10e at that edge point. The local score sk is calculated by the following formula 4.
- sk = cos(θTk − θMk) <Formula 4>
- In the above, θTk represents an edge direction of the captured model edge image 11e, and θMk represents an edge direction of the model edge image 10e. A range of values the local score sk can take is −1 to +1. Because the summation of the local scores sk is normalized by being divided by the number of edge points N, the range of values the score Sij can take is also −1 to +1.
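- Formulas 3 and 4 can be written down directly as a sliding-window search. The following Python sketch is illustrative rather than the patented implementation: the (dx, dy, θM) edge-point representation, the handling of positions where the captured image has no effective edge (scored here as −1), and all names and threshold values are assumptions.

```python
import numpy as np

def match(model_edges, captured_edge_img, threshold):
    """Sliding-window matching of steps S30-S35 (naive O(H*W*N) sketch).

    model_edges: list of (dx, dy, theta_M) tuples, the pixel offsets and
        edge directions of the N edge points of the model edge image.
    captured_edge_img: H x W x 2 array; channel 0 is edge magnitude
        (0 for non-effective pixels), channel 1 is edge direction.
    Returns (S_ij, (i, j), local_scores) of the best matching candidate
    point, or None if no detection position reaches the threshold.
    """
    H, W, _ = captured_edge_img.shape
    candidates = []
    for i in range(H):                       # step S30: detection positions,
        for j in range(W):                   # scanned in pixel units
            s = []
            for dx, dy, theta_M in model_edges:
                y, x = i + dy, j + dx
                if 0 <= y < H and 0 <= x < W and captured_edge_img[y, x, 0] > 0:
                    theta_T = captured_edge_img[y, x, 1]
                    s.append(float(np.cos(theta_T - theta_M)))  # Formula 4
                else:
                    s.append(-1.0)           # assumption: worst local score
            S_ij = sum(s) / len(s)           # Formula 3 (step S31)
            if S_ij >= threshold:            # step S32
                candidates.append((S_ij, (i, j), s))            # step S33
    # step S35: candidate with the greatest score, if any
    return max(candidates, key=lambda c: c[0]) if candidates else None
```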
- In step S32, the model edge image generation unit 55 determines whether the calculated score Sij is equal to or greater than a predetermined threshold value. In a case where the model edge image generation unit 55 determines that the calculated score Sij is equal to or greater than the predetermined threshold value (YES in step S32), the processing proceeds to step S33. In step S33, the model edge image generation unit 55 sets the detection position as a matching candidate point, and stores the detection position (i, j), the score Sij, and the local scores sk at respective edge points.
- In step S34, the model edge image generation unit 55 determines whether calculation of the score Sij has been completed for all of the detection positions. If it has not (NO in step S34), the processing returns to step S30 so that the model edge image generation unit 55 calculates the score Sij again by setting the next detection position through the processing in steps S30 to S33. If it has (YES in step S34), the processing proceeds to step S35, in which the model edge image generation unit 55 outputs the information of the matching candidate point having the greatest score Sij from among the matching candidate points, i.e., its detection position (i, j), its score Sij, and its local scores sk at respective edge points. Then, the model edge image generation unit 55 returns the processing to the original routine, so that the processing proceeds to step S24 in FIG. 6. If no matching candidate point exists, the processing in step S35 is not executed.
- In step S24, the model edge image generation unit 55 stores the local scores sk at respective edge points output in step S35. Accordingly, one set of local scores sk at respective edge points is acquired with respect to a single captured model image 11. The local scores sk correspond to the similarity at respective edge points in the model edge image 10e in the pattern matching of the captured model edge image 11e and the model edge image 10e.
- In step S25, the model edge image generation unit 55 determines whether the processing for acquiring the local scores sk at respective edge points has been completed for all of the captured model images 11. It is desirable that the processing be executed on a statistically reliable number of captured model images 11. In a case where the model edge image generation unit 55 determines that the processing has not been completed for all of the captured model images 11 (NO in step S25), the processing is executed from step S21 again. In a case where the model edge image generation unit 55 determines that the processing has been completed (YES in step S25), the processing proceeds to step S26.
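- Collecting one set of local scores per captured model image (steps S21 to S25) might then look as follows. This is a sketch under stated assumptions: captured_model_images, model_edges, and the threshold values are placeholders, the match helper is the sketch above, and extract_edges is an edge-extraction helper like the one sketched at the end of this description.

```python
# Steps S21-S25 (sketch): one set of local scores per captured model image.
local_score_sets = []
for img in captured_model_images:          # M noisy captures of the work 6
    edge_img = extract_edges(img, threshold=50.0)         # step S22
    result = match(model_edges, edge_img, threshold=0.7)  # step S23
    if result is not None:
        S_ij, pos, s = result
        local_score_sets.append(s)         # step S24: store the local scores
```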
- In step S26, the model edge image generation unit 55 selects the edge point that is to be eliminated from the original model edge image 10e. In other words, the elimination-target edge point is selected from among the edge points in the model edge image 10e based on the local scores sk (i.e., the similarity).
- The processing for selecting the edge point executed in step S26 will be described in detail with reference to the flowchart (i.e., subroutine) illustrated in FIG. 9. In step S40, the model edge image generation unit 55 calculates the average of the local scores at respective edge points in the original model edge image 10e. Through the processing executed in steps S21 to S25, the local scores sLk for the M captured model images 11 have been acquired. An average mk of the local scores sLk at respective edge points is calculated by the following formula 5.
- mk = (1/M) ΣL=1..M sLk <Formula 5>
- In step S41, a variance σk² of the local scores sLk at respective edge points is calculated by the following formula 6.
- σk² = (1/M) ΣL=1..M (sLk − mk)² <Formula 6>
- Based on the average mk and the variance σk², the model edge image generation unit 55 can select the edge points while eliminating the edge points easily influenced by the noise.
- In step S42, the model edge image generation unit 55 determines whether calculation of the average mk and the variance σk² has been completed for all of the edge points. In a case where the model edge image generation unit 55 determines that the calculation has not been completed for all of the edge points (NO in step S42), the processing returns to step S40 so that the model edge image generation unit 55 executes the calculation for the next edge point through the processing in steps S40 to S41. In a case where the model edge image generation unit 55 determines that the calculation has been completed for all of the edge points (YES in step S42), the processing proceeds to step S43. In step S43, the model edge image generation unit 55 selects the edge point by executing threshold determination based on the calculated average mk and the variance σk².
- In other words, the model edge image generation unit 55 selects the elimination-target edge point based on at least one of the average mk and the variance σk² of the local scores sLk (i.e., the similarity) with the model edge image 10e at the respective edge points of a plurality of captured model edge images 11e. For example, the model edge image generation unit 55 previously sets threshold values with respect to the average mk and the variance σk², and determines and selects the edge point based on those threshold values: an edge point whose average mk is low or whose variance σk² is high is determined to be influenced by the noise and is eliminated. Alternatively, the model edge image generation unit 55 may sort the averages mk of the local scores sLk at respective edge points in a descending order while sorting the variances σk² thereof in an ascending order, and eliminate an optional percentage (e.g., 20%) of the edge points from the lowest order. Then, the model edge image generation unit 55 returns the processing to the original routine, so that the processing proceeds to step S27 in FIG. 6.
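- Formulas 5 and 6 together with the threshold determination of step S43 amount to per-edge-point statistics over the M score sets. A minimal sketch; the threshold values, and applying both criteria jointly rather than "at least one", are assumptions:

```python
import numpy as np

def select_edge_points(local_score_sets, mean_thresh=0.6, var_thresh=0.05):
    """Steps S40-S43 (sketch): keep edge points whose local scores are
    consistently high across the M captured model images."""
    s = np.asarray(local_score_sets)   # shape (M, N): M images, N edge points
    m_k = s.mean(axis=0)               # Formula 5: average per edge point
    var_k = s.var(axis=0)              # Formula 6: variance (divides by M)
    keep = (m_k >= mean_thresh) & (var_k <= var_thresh)
    return keep                        # False marks elimination-target points
```

- Step S27 then reduces to filtering the model edge points with this boolean mask, e.g. keeping only the entries of model_edges flagged True.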
- In step S27, the model edge image generation unit 55 eliminates the selected elimination-target edge points from the original model edge image 10e to generate the final model edge image. Then, the pattern matching unit 56 uses the generated final model edge image to execute the pattern matching of the final model edge image and the search edge image 12e through the processing described in step S6.
- As described above, the control device 5 selects the edge points to be eliminated based on the local scores sLk from among the edge points in the model edge image 10e, and generates the edge image acquired by eliminating the selected edge points as the final model edge image. Therefore, even if the work 6 has a different surface condition because of the influence of various kinds of noise, lowering of the detection accuracy of the work 6 can be suppressed. Further, the pattern matching of the final model edge image and the search edge image 12e can be executed with high accuracy by suppressing the lowering of the detection accuracy of the work 6.
- In the above-described present exemplary embodiment, a final model edge image is generated and the pattern matching is executed by using that final model edge image. However, the present exemplary embodiment is not limited to the above; for example, the generated model edge image 10e may be registered in a library.
- The respective processing operations of the above-described exemplary embodiment are specifically executed by the model edge image generation unit 55 and the pattern matching unit 56. Therefore, a storage medium storing a software program that realizes the above-described functions may be supplied to the model edge image generation unit 55 and the pattern matching unit 56. In other words, the CPU 50 constituting the model edge image generation unit 55 may read and execute the model edge image generation program 52a stored in the storage medium, and the CPU 50 constituting the pattern matching unit 56 may read and execute the pattern matching program 52b stored in the storage medium, to achieve the functions. In this case, the program itself read from the storage medium realizes the functions of the above-described exemplary embodiment, and thus the program itself and the storage medium storing that program constitute the present invention.
- Further, in the present exemplary embodiment, a computer-readable storage medium serves as the ROM 52, and the model edge image generation program 52a and the pattern matching program 52b are stored in the ROM 52. However, the configuration is not limited to the above; the above-described programs can be stored in a computer-readable storage medium of any type. For example, a hard disk drive (HDD), an external storage device, or a storage disk may be employed as the storage medium for supplying the programs.
- Embodiment(s) of the present invention can also be realized by a computer of a system or apparatus that reads out and executes computer executable instructions (e.g., one or more programs) recorded on a storage medium (which may also be referred to more fully as a ‘non-transitory computer-readable storage medium’) to perform the functions of one or more of the above-described embodiment(s) and/or that includes one or more circuits (e.g., application specific integrated circuit (ASIC)) for performing the functions of one or more of the above-described embodiment(s), and by a method performed by the computer of the system or apparatus by, for example, reading out and executing the computer executable instructions from the storage medium to perform the functions of one or more of the above-described embodiment(s) and/or controlling the one or more circuits to perform the functions of one or more of the above-described embodiment(s).
- The computer may comprise one or more processors (e.g., central processing unit (CPU), micro processing unit (MPU)) and may include a network of separate computers or separate processors to read out and execute the computer executable instructions. The computer executable instructions may be provided to the computer, for example, from a network or the storage medium. The storage medium may include, for example, one or more of a hard disk, a random-access memory (RAM), a read only memory (ROM), a storage of distributed computing systems, an optical disk (such as a compact disc (CD), digital versatile disc (DVD), or Blu-ray Disc (BD)™), a flash memory device, a memory card, and the like.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- General Physics & Mathematics (AREA)
- Physics & Mathematics (AREA)
- Multimedia (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Computing Systems (AREA)
- Software Systems (AREA)
- Medical Informatics (AREA)
- General Health & Medical Sciences (AREA)
- Evolutionary Computation (AREA)
- Databases & Information Systems (AREA)
- Artificial Intelligence (AREA)
- Health & Medical Sciences (AREA)
- Image Analysis (AREA)
Abstract
An image processing method suppresses lowering of the detection accuracy of a detection target object even if the detection target object has a different surface condition because of the influence of various kinds of noise. The image processing method includes generating a captured model edge image by executing edge extraction processing on a captured model image acquired by capturing a detection target object, executing pattern matching of the captured model edge image and a model edge image, calculating similarity at respective edge points in the model edge image in that pattern matching, selecting an edge point to be eliminated based on the similarity from among the respective edge points in the model edge image, and generating an edge image acquired by eliminating the selected edge point as a final model edge image.
Description
- 1. Field of the Invention
- The present invention relates to an image processing method in which a model edge image that is generated based on a detection target object used for pattern matching is used as a final model edge image. Further, the present invention also relates to an image processing device and a robot system.
- 2. Description of the Related Art
- Conventionally, in the field of image processing, a pattern matching method has been known as a method for detecting a position and an orientation of a detection target object such as a work. In particular, a method known as shape pattern matching (hereinafter, referred to as “pattern matching”) has been widely employed because the method has high robustness in terms of variations in illumination and an object with a hidden or void portion.
- In the pattern matching, because the similarity is calculated by using shape characteristics of a model (i.e., reference image) and a search image of a detection target object, it is necessary to extract the shape characteristics of the detection target object from the images. Generally, an edge extraction method such as the Sobel filter or the Canny filter is known as a method for extracting the shape characteristics. Accordingly, a calculation method has been known in which such an edge extraction method is applied to a model and a search image to generate a model edge image and a search edge image, and the similarity between the model edge image and the search edge image is calculated.
- However, in practice, because of the influence of noise such as adhering dust or dirt, variations in illumination, or individual variability, the detection target object may have a surface condition different from the ideal surface condition. Therefore, a position and an edge direction may differ between an edge point of the detection target object in the search image and the corresponding edge point of the model. In particular, because the similarity is lowered when the position and the edge direction differ significantly, the detection target object may be mistakenly determined to have low similarity. As a result, there arises a problem in that detection accuracy of the detection target object is lowered.
- Japanese Patent Application Laid-Open No. 2010-97438 discusses a method for generating a model edge image based on a model. In the method, a long edge that is deemed less influenced by the noise is kept while a short edge that can be considerably influenced by the noise is eliminated in generating a model edge image. According to the above-described generation method of the model edge image, the similarity can be prevented from being lowered and the detection accuracy of the detection target object can be improved in comparison to the case where the similarity is determined by using a model edge image that is generated without using the above-described generation method.
- However, according to the generation method of the model edge image described in Japanese Patent Application Laid-Open No. 2010-97438, determination on whether to eliminate the edge point is simply made based on a length of the edge. Therefore, for example, in a case where an edge has a short length, the edge is eliminated even if the edge is not influenced by the noise, so that the similarity is lowered. Therefore, there is a problem in that detection accuracy of the detection target object is lowered. On the other hand, in a case where an edge has a long length, the edge is not eliminated even if the edge is influenced by the noise, so that the similarity thereof is lowered. Therefore, there is a problem in that detection accuracy of the detection target object is lowered.
- According to an aspect of the present invention, an image processing method includes generating a captured model edge image by executing edge extraction processing on a captured model image acquired by capturing a detection target object, executing pattern matching of the captured model edge image and a model edge image related to the detection target object, calculating similarity at respective edge points in the model edge image in the pattern matching, selecting an edge point to be eliminated based on the similarity from among the respective edge points in the model edge image, generating an edge image acquired by eliminating the selected edge point as a final model edge image, and executing pattern matching of the final model edge image and a search edge image related to a search image acquired by capturing a search object.
- According to the present invention, of the edge points in a model edge image, an elimination-target edge point is selected based on the similarity, and an edge image acquired by eliminating the selected edge point is generated as a final model edge image. Therefore, even if the detection target object has a different surface condition because of the influence of various kinds of noise, lowering of the detection accuracy of the detection target object can be suppressed. Further, pattern matching of the final model edge image and a search edge image can be executed with high accuracy by suppressing the lowering of the detection accuracy of the detection target object.
- Further features of the present invention will become apparent from the following description of exemplary embodiments with reference to the attached drawings.
- FIGS. 1A, 1B, and 1C are diagrams illustrating a general configuration of a robot system according to a present exemplary embodiment. FIG. 1A is a side view of the entire robot system, FIG. 1B is a plan view of a work held by a hand, and FIG. 1C is a captured image of the work held by the hand illustrated in FIG. 1B.
- FIG. 2 is a block diagram illustrating a control device according to a present exemplary embodiment.
- FIG. 3 is a flowchart illustrating processing of pattern matching executed by a pattern matching unit according to a present exemplary embodiment.
- FIG. 4 is a flowchart illustrating the processing for generating an original model edge image executed by a model edge image generation unit according to a present exemplary embodiment.
- FIGS. 5A, 5B, and 5C are diagrams illustrating processing for generating a model edge image according to a present exemplary embodiment. FIG. 5A is a diagram illustrating a reference image in which a clip image is clipped out, FIG. 5B is a diagram illustrating a vector of one pixel, and FIG. 5C is a diagram illustrating a model edge image generated from the clip image.
- FIG. 6 is a flowchart illustrating the processing that is to be executed when a model edge image is generated by the model edge image generation unit according to a present exemplary embodiment.
- FIG. 7 is a flowchart illustrating the processing of pattern matching that is to be executed when a model edge image is generated by the model edge image generation unit according to a present exemplary embodiment.
- FIG. 8 is a diagram illustrating a captured model edge image according to a present exemplary embodiment.
- FIG. 9 is a flowchart illustrating the processing for selecting an edge point that is to be executed when a model edge image is generated by the model edge image generation unit according to a present exemplary embodiment.
- Hereinafter, an exemplary embodiment embodying the present invention will be described in detail with reference to the appended drawings.
- As illustrated in FIG. 1A, a robot system 1 is configured of a robot main body 2, a camera 4, and a control device 5 of the robot main body 2, and the control device 5 includes an image processing device having a model edge image generation unit and a pattern matching unit. The model edge image generation unit and the pattern matching unit may be respectively configured of hardware, or may be configured of software as a part of the control device 5.
- In the present exemplary embodiment, the robot system 1 includes a work supply device 3 that supplies a work 6 (i.e., a detection target object) to the robot main body 2.
- The robot main body 2 can operate the work 6, and the control device 5 controls the robot main body 2 and the camera 4. As illustrated in FIG. 1B, for example, the work 6 is formed into a circular-ring shape, a part of which has a projection 6a externally projected in a diameter direction as a phase reference. Herein, although the phase reference of the work 6 is represented by the projection 6a, the phase reference is not limited to the above; for example, a mark may be used as the phase reference. The camera 4 is fixed onto a camera fixing base 40, so that an image of the work 6 supplied to the robot main body 2 from the work supply device 3, i.e., the work 6 gripped by a hand 23, can be captured by the camera 4 from above. For example, as illustrated in FIG. 1C, a search image 12 can be acquired when the work 6 and its periphery are captured as a search object. In addition, the work 6 can be formed into any shape. Hereinafter, in order to describe the present exemplary embodiment with ease, the work 6 is formed into a triangular-prism shape as illustrated in FIG. 5A.
main body 2 includes a 6-axis vertical multi-joint arm (hereinafter, referred to as “arm”) 22 and thehand 23 serving as an end effector. - The
hand 23 is attached to and supported by a leading-edge link 60 of thearm 22, so that at least one degree of freedom of a position and an orientation thereof can be adjusted according to the operation of thearm 22. Thehand 23 includes twofingers 23 a and a handmain body 23 b that supports thefingers 23 a and enables thefingers 23 a to increase or decrease the space therebetween, so that thehand 23 can hold thework 6 by moving thefingers 23 a close together. In the present exemplary embodiment, thehand 23 is employed as an end effector. However, the end effector is not limited to the above, and any tools capable of holding thework 6 can be employed therefor. - For example, the
arm 22 includes seven links and six joints that swingably or rotatably connect respective links to each other. Links having fixed lengths are employed for the respective links. However, for example, a link that is extensible and retractable with a linear actuator may be also employed therefor. As illustrated inFIG. 2 , each of the joints includes amotor 80 for driving the joint, anencoder 81 for detecting a rotation angle of themotor 80, and amotor control unit 82 for transmitting and receiving a signal to/from thecontrol device 5 to control themotor 80 and theencoder 81. In the present exemplary embodiment, the 6-axis vertical multi-joint arm is employed as thearm 22. However, a number of axes may be changed as appropriate according to the usage or purpose. - The
control device 5 is configured of a computer in order to control the robotmain body 2. Thecontrol device 5 includes a central processing unit (CPU) 50 serving as a calculation unit, a random access memory (RAM) 51 serving as a storage unit capable of temporarily storing data, a read only memory (ROM) 52 for storing a program for controlling respective units, and an input/output interface (I/F)circuit 53 that enables thecontrol device 5 to communicate with the robotmain body 2. - Further, the
control device 5 functions as a model edge image generation unit for generating amodel edge image 10 e as well as functioning as a pattern matching unit for executing pattern matching. In other words, the model edge image generation unit according to the present exemplary embodiment is configured of theRAM 51 capable of storing themodel edge image 10 e and theCPU 50 for generating themodel edge image 10 e. Further, the pattern matching unit according to the present exemplary embodiment is configured of theRAM 51 capable of storing at least asearch edge image 12 e and themodel edge image 10 e and theCPU 50 for executing pattern matching of thesearch edge image 12 e and a final model edge image. Furthermore, in the present exemplary embodiment, although thecontrol device 5 includes respective functions of the model edge image generation unit and the pattern matching unit, the model edge image generation unit and the pattern matching unit may be provided separately from thecontrol device 5. - The
ROM 52 stores below-described programs such as a model edgeimage generation program 52 a, apattern matching program 52 b, a robot control program for controlling the operation of the robotmain body 2, and an arithmetic program relating to the calculation of a positional orientation of thework 6 executed by theCPU 50. Unless theCPU 50 writes or deletes data, the data stored in theROM 52 can be saved therein even if the power of thecontrol device 5 is turned off. TheRAM 51 temporarily stores the below-described data such as areference image 10, amodel edge image 10 e of thereference image 10, a capturedmodel image 11, an edge image of the captured model image 11 (i.e., capturedmodel edge image 11 e), asearch image 12, and an edge image of the search image 12 (i.e.,search edge image 12 e). - The
CPU 50 includes arobot control unit 54 for controlling an operation of the robotmain body 2, a model edgeimage generation unit 55, and apattern matching unit 56. Therobot control unit 54 executes the robot control program to control the operation of the robotmain body 2. - While details will be described below, the model edge
image generation unit 55 executes the model edgeimage generation program 52 a to generate themodel edge image 10 e before executing the pattern matching of themodel edge image 10 e and thesearch edge image 12 e related to thework 6. - Specifically, the model edge
image generation unit 55 executes processing for extracting an edge from the capturedmodel image 11 acquired by capturing thework 6 to generate the capturedmodel edge image 11 e. Then, the model edgeimage generation unit 55 executes the pattern matching of the capturedmodel edge image 11 e and themodel edge image 10 e. Further, the model edgeimage generation unit 55 calculates similarity at respective edge points in themodel edge image 10 e when the pattern of the capturedmodel edge image 11 e matches themodel edge image 10 e. Then, of the respective edge points in themodel edge image 10 e, the model edgeimage generation unit 55 selects an edge point that is to be eliminated based on the similarity, and generates an edge image acquired by eliminating the selected edge point as a final model edge image. - While details will be described below, the
pattern matching unit 56 executes thepattern matching program 52 b to execute the pattern matching of thesearch edge image 12 e and the final model edge image. - Specifically, the
pattern matching unit 56 executes the pattern matching of the final model edge image related to thework 6 and thesearch edge image 12 e related to thesearch image 12 that is acquired by capturing thework 6. Herein, the final model edge image is generated by the above-described model edgeimage generation unit 55. - Subsequently, pattern matching of the
work 6 executed by the above-describedcontrol device 5 of the robotmain body 2 will be described with reference to the flowchart inFIG. 3 . Herein, a processing flow will be briefly described, and the processing in respective steps will be described below in detail. - First, in step S1, the model edge
image generation unit 55 uses therobot system 1 to capture thework 6 under various conditions to acquire the capturedmodel image 11, and generates a final model edge image from the originalmodel edge image 10 e. The processing for generating the final model edge image will be described below. - Then, in step S2, the model edge
image generation unit 55 inputs the generated final model edge image to theRAM 51. Then, in step S3, thepattern matching unit 56 uses therobot system 1 to capture thesearch image 12 including thework 6. Further, in step S4, thepattern matching unit 56 extracts an edge from thesearch image 12. In step S5, thepattern matching unit 56 inputs thesearch edge image 12 e acquired from thesearch image 12 to theRAM 51. - Then, in step S6, the
pattern matching unit 56 executes the pattern matching of the final model edge image and thesearch edge image 12 e stored in theRAM 51. TheCPU 50 can detect a position and an orientation of thework 6 based on the result of the pattern matching, so that the robotmain body 2 can be controlled as appropriate based on that detection result. - Next, the processing for generating the
model edge image 10 e executed in step S1 will be described in detail with reference to the flowcharts inFIGS. 4, 6, 7 , and 9. In the present exemplary embodiment, themodel edge image 10 e as an original is firstly generated (seeFIG. 4 ), so that the final model edge image is generated based on the originalmodel edge image 10 e (seeFIG. 6 ). Herein, a configuration will be described in which themodel edge image 10 e is generated by using thecamera 4 and thecontrol device 5 according to the present exemplary embodiment. However, the configuration is not limited to the above, and another camera or computer may be used therefor. Further, in the present exemplary embodiment, although the originalmodel edge image 10 e is generated based on the image captured by thecamera 4, the configuration is not limited thereto. For example, CAD (computer-aided design) data to which an artificial edge is applied may be used therefor. - As illustrated in
FIG. 4 , in step S10, in order to prepare theoriginal reference image 10, thework 6 is placed on the ideal positional orientation under the ideal illumination condition, and an image of thework 6 is captured by thecamera 4. In step S11, the model edgeimage generation unit 55 inputs the reference image 10 (seeFIG. 5A ) captured by thecamera 4 to theRAM 51. The model edgeimage generation unit 55 displays the reference image on a display monitor (not illustrated), so that an operator sets a rectangular region in a periphery of thework 6 as a detection-target. In order to set the rectangular region, as illustrated inFIG. 5A , the operator uses a mouse (not illustrated) provided on thecontrol device 5 to click two points at the upper left and the lower right of a region that includes thework 6 in thereference image 10 displayed on the display monitor. In step S12, by making the two clicked-points as the upper left and the lower right corners of the rectangular region, only the rectangular region is clipped from thereference image 10 as aclip image 10 a. - The model edge
- The model edge image generation unit 55 calculates a gradient magnitude and a gradient direction of luminance at each pixel of the clip image 10a. The gradient magnitude is calculated by using the Sobel filter in the x-axis and y-axis directions. First, as illustrated in FIG. 5A, the model edge image generation unit 55 calculates a gradient magnitude 71 in the x-axis direction and a gradient magnitude 72 in the y-axis direction at a target pixel 70. Then, as illustrated in FIG. 5B, the model edge image generation unit 55 calculates a gradient magnitude 73 of the target pixel 70 as the square root of the sum of the squared gradient magnitudes 71 and 72. In other words, the gradient magnitude 73 can be acquired through the following Formula 1.
E = \sqrt{E_x^2 + E_y^2} <Formula 1>
- In the above, "E" represents the gradient magnitude, "E_x" represents the gradient magnitude in the x-axis direction, and "E_y" represents the gradient magnitude in the y-axis direction.
- At this time, the gradient direction θ is calculated through the following Formula 2 by using the gradient magnitude E_x in the x-axis direction and the gradient magnitude E_y in the y-axis direction.
\theta = \tan^{-1}(E_y / E_x) <Formula 2>
- In the above, "θ" represents the gradient direction.
- After calculating the gradient magnitude E and the gradient direction θ of all of the pixels in the
clip image 10a, in step S13, the model edge image generation unit 55 extracts each pixel having a gradient magnitude E equal to or greater than a predetermined threshold value as an edge, and generates the original model edge image 10e. Herein, an "edge" is a pixel whose gradient magnitude E is equal to or greater than the predetermined threshold value. Hereinafter, for descriptive purposes, the coordinate, gradient magnitude, and gradient direction of an extracted pixel are respectively referred to as an edge position coordinate, an edge magnitude, and an edge direction, while an image having the edge position coordinates, edge magnitudes, and edge directions is referred to as an edge image.
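- The edge extraction of steps S12 and S13 can be summarized in a short Python/OpenCV sketch. The 3x3 Sobel kernel size, the threshold value, and the use of arctan2 for the gradient direction are assumptions made for illustration, not values taken from the embodiment.

```python
import cv2
import numpy as np

def extract_edges(gray, threshold=50.0):
    # Gradient magnitudes in the x-axis and y-axis directions (Sobel filter).
    ex = cv2.Sobel(gray, cv2.CV_64F, 1, 0, ksize=3)
    ey = cv2.Sobel(gray, cv2.CV_64F, 0, 1, ksize=3)
    magnitude = np.hypot(ex, ey)    # Formula 1: E = sqrt(Ex^2 + Ey^2)
    direction = np.arctan2(ey, ex)  # Formula 2 (atan2 form), in radians
    # A pixel is regarded as an edge when its magnitude clears the threshold.
    mask = magnitude >= threshold
    return magnitude, direction, mask
```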
- In order to store the data of the model edge image 10e, a two-channel data region is provided for each of the pixels in the image. Then, as illustrated in FIG. 5C, pixels 74 having an edge magnitude equal to or greater than the threshold value, indicated by the hatched region in the model edge image 10e, are regarded as effective pixels, so that the values of the edge magnitude and the edge direction are stored in the first and second channels, respectively. On the other hand, pixels 75 having an edge magnitude less than the threshold value, illustrated in white in FIG. 5C, are regarded as non-effective pixels, so that invalid values (such as "0") are stored therein. In the present exemplary embodiment, although a two-channel data region is provided for each of the pixels, the configuration is not limited to the above. For example, two images, an edge magnitude image that stores only the edge magnitude and an edge direction image that stores only the edge direction, may form a pair to store the data. Further, three images, an edge magnitude image that stores only the edge magnitude, an X-direction edge magnitude image that stores only the X-direction edge magnitude, and a Y-direction edge magnitude image that stores only the Y-direction edge magnitude, may form a set to store the data. Furthermore, in the present exemplary embodiment, the Sobel filter is used to calculate the edge magnitude and the edge direction. However, the configuration is not limited thereto, and another edge extraction filter such as the Canny filter may be used.
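- As a sketch of the two-channel storage described above, the outputs of the extract_edges() sketch can be packed into one array per image, with channel 0 holding the edge magnitude, channel 1 the edge direction, and the invalid value 0 at non-effective pixels; this is only one possible layout, as the text itself notes.

```python
import numpy as np

def pack_edge_image(magnitude, direction, mask):
    # Channel 0 stores the edge magnitude, channel 1 the edge direction;
    # non-effective pixels store the invalid value 0 in both channels.
    edge_image = np.zeros(magnitude.shape + (2,), dtype=np.float64)
    edge_image[..., 0] = np.where(mask, magnitude, 0.0)
    edge_image[..., 1] = np.where(mask, direction, 0.0)
    return edge_image
```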
- Subsequently, with reference to the flowchart illustrated in FIG. 6, description will be given of the processing for generating the final model edge image based on the original model edge image 10e and the captured model image 11. The processing illustrated in FIG. 6 corresponds to an image processing method characterized by a model edge image generation method. First, in step S20, the model edge image generation unit 55 inputs the generated original model edge image 10e to the RAM 51. In step S21, the model edge image generation unit 55 captures the captured model image 11 including the work 6 with the camera 4 and inputs the captured model image 11 to the RAM 51. Herein, taking into consideration the search image 12 that will be input in the pattern matching, it is preferable that the captured model image 11 be an image influenced by noise such as adhering dust or dirt, variations in illumination, and individual variability of the work. Alternatively, instead of using the image captured by the camera 4 as it is, an artificial edge or artificial noise may be added to it. - In step S22, by employing the same edge extraction method described in step S13, the model edge
image generation unit 55 generates the captured model edge image 11e from the captured model image 11. In step S23, the model edge image generation unit 55 executes the pattern matching of the original model edge image 10e and the captured model edge image 11e. - Herein, processing of the pattern matching executed in step S23 will be described in detail with reference to the flowchart (i.e., subroutine) illustrated in
FIG. 7. - First, in step S30, the model edge
image generation unit 55 sets detection positions for matching the model edge image 10e across the entire region of the captured model edge image 11e in units of pixels (see FIG. 8). In the present exemplary embodiment, as illustrated in FIG. 8, the upper left end position in the captured model edge image 11e is set as the first detection position. The detection position is then moved and set sequentially from the upper left to the upper right end position. Thereafter, the detection position is brought down by one pixel and set sequentially from the left end position to the right end position. Then, in step S31, the model edge image generation unit 55 calculates a score at each detection position. In the present exemplary embodiment, the score S_ij at an arbitrary detection position (i, j) is calculated by the following Formula 3.
S_{ij} = \frac{1}{N} \sum_{k=1}^{N} s_k <Formula 3>
- In the above, "S_ij" represents the score at the detection position (i, j), "N" represents the number of edge points in the model edge image 10e, and "s_k" represents a local score.
- The local score s_k is calculated at each edge point of the model edge image 10e as the cosine of the difference between the edge direction of the captured model edge image 11e and the edge direction of the model edge image 10e at that edge point. The local score s_k is calculated by the following Formula 4.
s_k = \cos(\theta_{Tk} - \theta_{Mk}) <Formula 4>
- In the above, "k" is a value equal to 1 to N (k=1, . . . , N) and represents the edge point index of the model edge image 10e, "θ_Tk" represents the edge direction of the captured model edge image 11e, and "θ_Mk" represents the edge direction of the model edge image 10e.
- The local score s_k takes values from −1 to +1. Because the summation of the local scores s_k is normalized by dividing it by the number of edge points N, the score S_ij also takes values from −1 to +1.
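- Formulas 3 and 4 translate directly into NumPy. The sketch below assumes a data layout chosen only for illustration: an integer (N, 2) array of model edge point coordinates, an (N,) array of model edge directions, and a dense 2-D array of edge directions for the captured model edge image.

```python
import numpy as np

def score_at(model_points, model_dirs, captured_dirs, i, j):
    # Place the N model edge points at detection position (i, j) and look up
    # the edge directions of the captured model edge image at those pixels.
    rows = model_points[:, 0] + i
    cols = model_points[:, 1] + j
    theta_t = captured_dirs[rows, cols]
    # Formula 4: local score s_k = cos(theta_Tk - theta_Mk), each in [-1, +1].
    local = np.cos(theta_t - model_dirs)
    # Formula 3: S_ij is the mean of the N local scores, also in [-1, +1].
    return local.mean(), local
```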
- In step S32, the model edge image generation unit 55 determines whether the calculated score S_ij is equal to or greater than a predetermined threshold value. In a case where the model edge image generation unit 55 determines that the calculated score S_ij is equal to or greater than the threshold value (YES in step S32), the processing proceeds to step S33. In step S33, the model edge image generation unit 55 sets the detection position as a matching candidate point, and stores the detection position (i, j), the score S_ij, and the local scores s_k at the respective edge points. After the matching candidate point is set in step S33, or in a case where the model edge image generation unit 55 determines that the score S_ij is less than the threshold value (NO in step S32), the processing proceeds to step S34. In step S34, the model edge image generation unit 55 determines whether calculation of the score S_ij has been completed for all of the detection positions. - In a case where the model edge
image generation unit 55 determines that calculation of the score S_ij has not been completed for all of the detection positions (NO in step S34), the processing returns to step S30, and the model edge image generation unit 55 calculates the score S_ij again by setting the next detection position through the processing in steps S30 to S33. In a case where the model edge image generation unit 55 determines that calculation of the score S_ij has been completed for all of the detection positions (YES in step S34), the processing proceeds to step S35. In step S35, the model edge image generation unit 55 outputs the information of the matching candidate point having the greatest score S_ij from among the matching candidate points, specifically, its detection position (i, j), its score S_ij, and the local scores s_k at its respective edge points. Then, the model edge image generation unit 55 returns the processing to the original routine, so that the processing proceeds to step S24 in FIG. 6. If no matching candidate point exists, the processing in step S35 is not executed.
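- Steps S30 to S35 then amount to sweeping every detection position, keeping the positions whose score clears the threshold, and reporting the best one. The sketch below reuses score_at() from the previous sketch; the threshold of 0.8 is an illustrative assumption.

```python
def best_match(model_points, model_dirs, captured_dirs, threshold=0.8):
    h, w = captured_dirs.shape
    # Only detection positions at which the model fits inside the image.
    rows = h - int(model_points[:, 0].max())
    cols = w - int(model_points[:, 1].max())
    best = None
    for i in range(rows):        # brought down one pixel at a time
        for j in range(cols):    # moved from the left end to the right end
            s_ij, local = score_at(model_points, model_dirs, captured_dirs, i, j)
            if s_ij >= threshold and (best is None or s_ij > best[0]):
                best = (s_ij, (i, j), local)  # matching candidate point
    return best  # None when no matching candidate point exists
```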
- As illustrated in FIG. 6, in step S24, the model edge image generation unit 55 stores the local scores s_k at the respective edge points output in step S35. Through the above-described processing in steps S21 to S24, a set of local scores s_k at the respective edge points can be acquired for a single captured model image 11. In other words, according to the present exemplary embodiment, the local scores s_k correspond to the similarity at the respective edge points in the model edge image 10e in the pattern matching of the captured model edge image 11e and the model edge image 10e. - Then, in step S25, the model edge
image generation unit 55 determines whether the processing for acquiring the local scores s_k at the respective edge points has been completed for all of the captured model images 11. Herein, it is preferable that the processing be executed on a statistically significant number of captured model images 11. In a case where the model edge image generation unit 55 determines that the processing has not been completed for all of the captured model images 11 (NO in step S25), the processing is executed from step S21 again. - In a case where the model edge
image generation unit 55 determines that the processing has been completed for all of the captured model images 11 (YES in step S25), the processing proceeds to step S26. In step S26, the model edge image generation unit 55 selects the edge points to be eliminated from the original model edge image 10e. In the present exemplary embodiment, the elimination-target edge points are selected from among the edge points in the model edge image 10e based on the local scores s_k (i.e., the similarity). - Herein, the edge point selection processing executed in step S26 will be described in detail with reference to the flowchart (i.e., subroutine) illustrated in
FIG. 9. - First, in step S40, the model edge
image generation unit 55 calculates the average of the local scores s_k at the respective edge points in the original model edge image 10e. In the present exemplary embodiment, it is assumed that the local scores s_k for M captured model images 11 have been acquired through the processing executed in steps S21 to S25, and that the repetitive processing index L over the M captured model images 11 takes values equal to 1 to M (L=1, . . . , M). At this time, if the local scores s_Lk are grouped by the edge point index k of the original model edge image 10e, the average m_k of the local scores s_Lk at each edge point is calculated by the following Formula 5.
m_k = \frac{1}{M} \sum_{L=1}^{M} s_{Lk} <Formula 5>
- In the above, the edge point index k is a value equal to 1 to N (k=1, . . . , N).
- In other words, the calculation of the average m_k of the local scores s_Lk at one edge point is repeated as many times as there are edge points. Similarly, in step S41, the variance σ_k^2 of the local scores s_Lk at each edge point can be calculated by the following Formula 6.
\sigma_k^2 = \frac{1}{M} \sum_{L=1}^{M} (s_{Lk} - m_k)^2 <Formula 6>
- In the above, the summation index L runs over the M captured model images (L=1, . . . , M), and the edge point index k is again a value equal to 1 to N (k=1, . . . , N).
formula 5 has a small value while the variance σk 2 thereof has a large value. In other words, according to the values of the average mk and the variance σk 2, the model edgeimage generation unit 55 can select the edge point by eliminating the edge point easily influenced by the noise. - Then, in step S42, the model edge
image generation unit 55 determines whether calculation of the average m_k and the variance σ_k^2 has been completed for all of the edge points. In a case where the model edge image generation unit 55 determines that the calculation has not been completed for all of the edge points (NO in step S42), the processing returns to step S40, and the model edge image generation unit 55 executes the calculation for the next edge point through the processing in steps S40 to S41. In a case where the model edge image generation unit 55 determines that the calculation has been completed for all of the edge points (YES in step S42), the processing proceeds to step S43. In step S43, the model edge image generation unit 55 selects the edge points by executing a threshold determination based on the calculated average m_k and variance σ_k^2. - In the present exemplary embodiment, the model edge
image generation unit 55 selects the elimination-target edge points based on at least one of the average m_k and the variance σ_k^2 of the local scores s_Lk (i.e., the similarity) with the model edge image 10e at the respective edge points of a plurality of captured model edge images 11e. Herein, the model edge image generation unit 55 sets threshold values in advance with respect to the average m_k and the variance σ_k^2, and determines and selects the edge points based on those threshold values. Specifically, when the average m_k has a value equal to or less than the set threshold value, or when the variance σ_k^2 has a value equal to or greater than the set threshold value, the model edge image generation unit 55 determines that the edge point is influenced by noise and eliminates that edge point. Alternatively, instead of using threshold values, the model edge image generation unit 55 may, for example, sort the averages m_k of the local scores s_Lk at the respective edge points in descending order while sorting the variances σ_k^2 in ascending order, and eliminate an arbitrary percentage (e.g., 20%) of the edge points from the worst end of each ordering. Then, the model edge image generation unit 55 returns the processing to the original routine, so that the processing proceeds to step S27 in FIG. 6.
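- With the local scores collected into an (M, N) array (M captured model images, N edge points), steps S40 to S43 reduce to per-column statistics. Both selection variants described above are sketched below; the threshold values and the 20% fraction are illustrative assumptions.

```python
import numpy as np

def select_by_thresholds(local_scores, mean_thresh=0.5, var_thresh=0.1):
    m_k = local_scores.mean(axis=0)   # Formula 5: average per edge point
    var_k = local_scores.var(axis=0)  # Formula 6: variance per edge point
    # An edge point is judged noise-affected when its average similarity is
    # low or its similarity fluctuates strongly across the captured images.
    return (m_k <= mean_thresh) | (var_k >= var_thresh)

def select_by_fraction(local_scores, fraction=0.2):
    # Threshold-free variant: eliminate the lowest-average fraction of points.
    m_k = local_scores.mean(axis=0)
    n_drop = int(len(m_k) * fraction)
    eliminate = np.zeros(len(m_k), dtype=bool)
    eliminate[np.argsort(m_k)[:n_drop]] = True  # ascending sort: worst first
    return eliminate
```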
- As illustrated in FIG. 6, in step S27, the model edge image generation unit 55 eliminates the selected elimination-target edge points from the original model edge image 10e to generate the final model edge image. Then, the pattern matching unit 56 uses the generated final model edge image to execute the pattern matching of the final model edge image and the search edge image 12e through the processing described in step S6.
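- Applying the selection of step S27 is then a single masking step; continuing the sketches above, where model_points and model_dirs are the assumed model arrays and eliminate is the boolean array returned by either selection function:

```python
# Step S27: drop the selected edge points from the original model edge image
# to obtain the final model edge image (here kept as point/direction arrays).
final_points = model_points[~eliminate]
final_dirs = model_dirs[~eliminate]
```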
- As described above, the control device 5 according to the present exemplary embodiment selects the edge points to be eliminated based on the local scores s_Lk from among the edge points in the model edge image 10e, and generates the edge image acquired by eliminating the selected edge points as the final model edge image. Therefore, even if the work 6 shows a different surface condition because of the influence of various kinds of noise, a decrease in the detection accuracy of the work 6 can be suppressed, and the pattern matching of the final model edge image and the search edge image 12e can be executed with high accuracy. - In the above-described
control device 5 according to the present exemplary embodiment, the final model edge image is generated and the pattern matching is executed by using it. However, the present exemplary embodiment is not limited to the above. For example, the generated model edge image 10e may instead be registered in a library. - The respective processing operations of the above-described exemplary embodiment are specifically executed by the model edge
image generation unit 55 and the pattern matching unit 56. Accordingly, a storage medium storing a software program that realizes the above-described functions may be supplied to the model edge image generation unit 55 and the pattern matching unit 56. Then, the CPU 50 constituting the model edge image generation unit 55 may read and execute the model edge image generation program 52a stored in the storage medium to achieve the functions. Alternatively, the CPU 50 constituting the pattern matching unit 56 may read and execute the pattern matching program 52b stored in the storage medium to achieve the functions. In such a case, the program itself read from the storage medium realizes the functions of the above-described exemplary embodiments, and thus the program itself and the storage medium storing that program constitute the present invention. - Further, according to the configuration described in the present exemplary embodiment, a computer readable storage medium serves as the
ROM 52, and the model edge image generation program 52a and the pattern matching program 52b are stored in the ROM 52. However, the configuration is not limited to the above. The above-described programs can be stored in a computer readable storage medium of any type. For example, a hard disk drive (HDD), an external storage device, or a storage disk may be employed as the storage medium for supplying the programs. - Embodiment(s) of the present invention can also be realized by a computer of a system or apparatus that reads out and executes computer executable instructions (e.g., one or more programs) recorded on a storage medium (which may also be referred to more fully as a 'non-transitory computer-readable storage medium') to perform the functions of one or more of the above-described embodiment(s) and/or that includes one or more circuits (e.g., application specific integrated circuit (ASIC)) for performing the functions of one or more of the above-described embodiment(s), and by a method performed by the computer of the system or apparatus by, for example, reading out and executing the computer executable instructions from the storage medium to perform the functions of one or more of the above-described embodiment(s) and/or controlling the one or more circuits to perform the functions of one or more of the above-described embodiment(s). The computer may comprise one or more processors (e.g., central processing unit (CPU), micro processing unit (MPU)) and may include a network of separate computers or separate processors to read out and execute the computer executable instructions. The computer executable instructions may be provided to the computer, for example, from a network or the storage medium. The storage medium may include, for example, one or more of a hard disk, a random-access memory (RAM), a read only memory (ROM), a storage of distributed computing systems, an optical disk (such as a compact disc (CD), digital versatile disc (DVD), or Blu-ray Disc (BD)™), a flash memory device, a memory card, and the like.
- While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.
- This application claims the benefit of Japanese Patent Application No. 2014-211412, filed Oct. 16, 2014, which is hereby incorporated by reference herein in its entirety.
Claims (16)
1. An image processing method comprising:
generating a captured model edge image by executing edge extraction processing on a captured model image acquired by capturing a detection target object;
executing pattern matching of the captured model edge image and a model edge image related to the detection target object;
calculating similarity at respective edge points in the model edge image in the pattern matching;
selecting an edge point to be eliminated based on the similarity from among the respective edge points in the model edge image;
generating an edge image acquired by eliminating the selected edge point as a final model edge image; and
executing pattern matching of the final model edge image and a search edge image related to a search image acquired by capturing a search object.
2. The image processing method according to claim 1,
wherein a plurality of the captured model images is acquired, and in the selecting an edge point to be eliminated based on the similarity, the edge point to be eliminated is selected based on at least one of an average and a variance of the similarity with the model edge image at respective edge points in the captured model edge images respectively acquired from the plurality of the captured model images.
3. The image processing method according to claim 2,
wherein an edge point in which at least one of the average and the variance of the similarity has a value smaller than a predetermined value is selected as the edge point to be eliminated.
4. The image processing method according to claim 2,
wherein an edge point in which at least one of the average and the variance of the similarity is included in a predetermined percentage from a smallest order is selected as the edge point to be eliminated from among all of the edge points.
5. A non-transitory computer readable storage medium storing a program for causing a computer to execute respective operations of the image processing method comprising:
generating a captured model edge image by executing edge extraction processing on a captured model image acquired by capturing a detection target object;
executing pattern matching of the captured model edge image and a model edge image related to the detection target object;
calculating similarity at respective edge points in the model edge image in the pattern matching;
selecting an edge point to be eliminated based on the similarity from among the respective edge points in the model edge image;
generating an edge image acquired by eliminating the selected edge point as a final model edge image; and
executing pattern matching of the final model edge image and a search edge image related to a search image acquired by capturing a search object.
6. The non-transitory computer readable storage medium according to claim 5,
wherein a plurality of the captured model images is acquired, and in the selecting an edge point to be eliminated based on the similarity, the edge point to be eliminated is selected based on at least one of an average and a variance of the similarity with the model edge image at respective edge points in the captured model edge images respectively acquired from the plurality of the captured model images.
7. The non-transitory computer readable storage medium according to claim 6,
wherein an edge point in which at least one of the average and the variance of the similarity has a value smaller than a predetermined value is selected as the edge point to be eliminated.
8. The non-transitory computer readable storage medium according to claim 6,
wherein an edge point in which at least one of the average and the variance of the similarity is included in a predetermined percentage from a smallest order is selected as the edge point to be eliminated from among all of the edge points.
9. An image processing device comprising:
an image generation unit configured to generate a captured model edge image by executing edge extraction processing on a captured model image acquired by capturing a detection target object, execute pattern matching of the captured model edge image and a model edge image related to the detection target object, calculate similarity at respective edge points in the model edge image in the pattern matching, select an edge point to be eliminated based on the similarity from among the respective edge points in the model edge image, and generate an edge image acquired by eliminating the selected edge point as a final model edge image; and
a calculation unit configured to execute image processing;
wherein the calculation unit executes pattern matching of a search edge image related to a search image acquired by capturing a search object and the final model edge image.
10. The image processing device according to claim 9,
wherein a plurality of the captured model images is acquired, and in the selecting an edge point to be eliminated based on the similarity, the edge point to be eliminated is selected based on at least one of an average and a variance of the similarity with the model edge image at respective edge points in the captured model edge images respectively acquired from the plurality of the captured model images.
11. The image processing device according to claim 10,
wherein an edge point in which at least one of the average and the variance of the similarity has a value smaller than a predetermined value is selected as the edge point to be eliminated.
12. The image processing device according to claim 10,
wherein an edge point in which at least one of the average and the variance of the similarity is included in a predetermined percentage from a smallest order is selected as the edge point to be eliminated from among all of the edge points.
13. A robot system comprising:
a robot main body configured to operate a work;
a camera configured to capture the work; and
the image processing device according to claim 9 configured to generate the final model edge image based on an image acquired by capturing the work with the camera.
14. The robot system according to claim 13,
wherein, in the image processing device, a plurality of the captured model images is acquired, and in the selecting an edge point to be eliminated based on the similarity, the edge point to be eliminated is selected based on at least one of an average and a variance of the similarity with the model edge image at respective edge points in the captured model edge images respectively acquired from the plurality of the captured model images.
15. The robot system according to claim 14,
wherein, in the image processing device, an edge point in which at least one of the average and the variance of the similarity has a value smaller than a predetermined value is selected as the edge point to be eliminated.
16. The robot system according to claim 14,
wherein, in the image processing device, an edge point in which at least one of the average and the variance of the similarity is included in a predetermined percentage from a smallest order is selected as the edge point to be eliminated from among all of the edge points.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2014-211412 | 2014-10-16 | ||
JP2014211412A (granted as JP6075888B2) | 2014-10-16 | 2014-10-16 | Image processing method, robot control method |
Publications (2)
Publication Number | Publication Date |
---|---|
US20160110840A1 (en) | 2016-04-21 |
US10207409B2 (en) | 2019-02-19 |
Family
ID=55749434
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/883,033 Active 2035-12-23 US10207409B2 (en) | 2014-10-16 | 2015-10-14 | Image processing method, image processing device, and robot system |
Country Status (2)
Country | Link |
---|---|
US (1) | US10207409B2 (en) |
JP (1) | JP6075888B2 (en) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP7049983B2 (en) * | 2018-12-26 | 2022-04-07 | 株式会社日立製作所 | Object recognition device and object recognition method |
CN111571591B (en) * | 2020-05-22 | 2021-07-30 | 中国科学院自动化研究所 | Four-eye bionic eye device, four-eye bionic eye device and target searching method thereof |
JP2023175326A (en) | 2022-05-30 | 2023-12-12 | 横河電機株式会社 | Detecting apparatus, analyzing method, and analyzing program |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0528273A (en) * | 1991-05-13 | 1993-02-05 | Nikon Corp | Method and device for processing picture |
JP5080416B2 (en) * | 2008-10-15 | 2012-11-21 | ファナック株式会社 | Image processing apparatus for detecting an image of a detection object from an input image |
JP5816148B2 (en) * | 2012-09-14 | 2015-11-18 | 株式会社神戸製鋼所 | Output value prediction apparatus, output value prediction method, and program thereof |
- 2014-10-16: JP application JP2014211412A filed (granted as JP6075888B2; status: active)
- 2015-10-14: US application US14/883,033 filed (granted as US10207409B2; status: active)
Patent Citations (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6763125B2 (en) * | 1999-09-29 | 2004-07-13 | Fujitsu Ten Limited | Image recognition apparatus and image processing apparatus |
US6937761B2 (en) * | 2001-06-07 | 2005-08-30 | Commissariat A L'energie Atomique | Process for processing images to automatically extract semantic features |
US6898316B2 (en) * | 2001-11-09 | 2005-05-24 | Arcsoft, Inc. | Multiple image area detection in a digital image |
US7038577B2 (en) * | 2002-05-03 | 2006-05-02 | Donnelly Corporation | Object detection system for vehicle |
US7679498B2 (en) * | 2002-05-03 | 2010-03-16 | Donnelly Corporation | Object detection system for vehicle |
US7409092B2 (en) * | 2002-06-20 | 2008-08-05 | Hrl Laboratories, Llc | Method and apparatus for the surveillance of objects in images |
US7146057B2 (en) * | 2002-07-10 | 2006-12-05 | Northrop Grumman Corporation | System and method for image analysis using a chaincode |
US7672507B2 (en) * | 2004-01-30 | 2010-03-02 | Hewlett-Packard Development Company, L.P. | Image processing methods and systems |
US7292263B2 (en) * | 2005-03-16 | 2007-11-06 | The Regents Of The University Of California | Robotic CCD microscope for enhanced crystal recognition |
US8014590B2 (en) * | 2005-12-07 | 2011-09-06 | Drvision Technologies Llc | Method of directed pattern enhancement for flexible recognition |
US8050509B2 (en) * | 2006-11-21 | 2011-11-01 | Samsung Electronics Co., Ltd. | Method of and apparatus for eliminating image noise |
US8086020B2 (en) * | 2007-12-24 | 2011-12-27 | Hong Fu Jin Precision Industry (Shenzhen) Co., Ltd. | System and method for analyzing impurities of an object |
US8363728B2 (en) * | 2008-04-18 | 2013-01-29 | Sony Corporation | Block based codec friendly edge detection and transform selection |
US8358307B2 (en) * | 2008-04-21 | 2013-01-22 | Sharp Kabushiki Kaisha | Image processing device, display device, image processing method, program, and storage medium |
US8120679B2 (en) * | 2008-08-01 | 2012-02-21 | Nikon Corporation | Image processing method |
US8406527B2 (en) * | 2008-08-09 | 2013-03-26 | Keyence Corporation | Pattern model positioning method in image processing, image processing apparatus, image processing program, and computer readable recording medium |
US8155473B2 (en) * | 2008-10-16 | 2012-04-10 | Keyence Corporation | Method for deciding image data reduction ratio in image processing, pattern model positioning method in image processing, pattern model creating method in image processing, image processing apparatus, image processing program, and computer readable recording medium |
US8401305B2 (en) * | 2008-10-16 | 2013-03-19 | Keyence Corporation | Contour-information extracting method by use of image processing, pattern model creating method in image processing, pattern model positioning method in image processing, image processing apparatus, image processing program, and computer readable recording medium |
US8253829B2 (en) * | 2009-05-27 | 2012-08-28 | Sony Corporation | Image processing apparatus, imaging apparatus, and image processing method |
US8593335B2 (en) * | 2010-02-23 | 2013-11-26 | Furuno Electric Company Limited | Method and device for processing echo signal, radar device and echo signal processing program |
US8649592B2 (en) * | 2010-08-30 | 2014-02-11 | University Of Illinois At Urbana-Champaign | System for background subtraction with 3D camera |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11200632B2 (en) | 2018-11-09 | 2021-12-14 | Canon Kabushiki Kaisha | Image processing method and image processing apparatus |
US11389965B2 (en) * | 2019-07-26 | 2022-07-19 | Mujin, Inc. | Post-detection refinement based on edges and multi-dimensional corners |
US20220297305A1 (en) * | 2019-07-26 | 2022-09-22 | Mujin, Inc. | Post-detection refinement based on edges and multi-dimensional corners |
US11850760B2 (en) * | 2019-07-26 | 2023-12-26 | Mujin, Inc. | Post-detection refinement based on edges and multi-dimensional corners |
CN115294162A (en) * | 2022-10-09 | 2022-11-04 | 腾讯科技(深圳)有限公司 | Target identification method, device, equipment and storage medium |
Also Published As
Publication number | Publication date |
---|---|
JP6075888B2 (en) | 2017-02-08 |
JP2016081264A (en) | 2016-05-16 |
US10207409B2 (en) | 2019-02-19 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10207409B2 (en) | Image processing method, image processing device, and robot system | |
US11400598B2 (en) | Information processing apparatus, method, and robot system | |
US10997465B2 (en) | Information processing device, information processing method, and storage medium | |
US10059002B2 (en) | Image processing apparatus, image processing method, and non-transitory computer-readable medium | |
US10636165B2 (en) | Information processing apparatus, method and non-transitory computer-readable storage medium | |
US9495750B2 (en) | Image processing apparatus, image processing method, and storage medium for position and orientation measurement of a measurement target object | |
US9639942B2 (en) | Information processing apparatus, information processing method, and storage medium | |
US9984291B2 (en) | Information processing apparatus, information processing method, and storage medium for measuring a position and an orientation of an object by using a model indicating a shape of the object | |
US10572762B2 (en) | Image processing method for performing pattern matching for detecting a position of a detection target | |
US9135519B2 (en) | Pattern matching method and pattern matching apparatus | |
US20070183665A1 (en) | Face feature point detecting device and method | |
US9747023B2 (en) | Information processing apparatus and method thereof | |
US20130243251A1 (en) | Image processing device and image processing method | |
JPWO2018154709A1 (en) | Motion learning device, skill discrimination device and skill discrimination system | |
US20170345184A1 (en) | Three-dimensional information restoration device, three-dimensional information restoration system, and three-dimensional information restoration method | |
US10623629B2 (en) | Imaging apparatus and imaging condition setting method and program | |
JP2018036770A (en) | Position attitude estimation device, position attitude estimation method, and position attitude estimation program | |
JP2014170368A (en) | Image processing device, method and program and movable body | |
US20210174062A1 (en) | Image processing device, image processing method, and recording medium | |
CN111199533B (en) | Image processing apparatus and method | |
US10521653B2 (en) | Image processing device, image processing method, and storage medium | |
JP2009216480A (en) | Three-dimensional position and attitude measuring method and system | |
JP2015085434A (en) | Robot, image processing method and robot system | |
US10671881B2 (en) | Image processing system with discriminative control | |
US11200632B2 (en) | Image processing method and image processing apparatus |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
 | AS | Assignment | Owner name: CANON KABUSHIKI KAISHA, JAPAN. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNOR: ODAGIRI, JUN; REEL/FRAME: 037360/0449. Effective date: 20151002 |
 | STCF | Information on status: patent grant | Free format text: PATENTED CASE |
 | MAFP | Maintenance fee payment | Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY. Year of fee payment: 4 |