US7084900B1 - Image processing apparatus - Google Patents
- Publication number
- US7084900B1 (application US09/546,214)
- Authority
- US
- United States
- Prior art keywords
- image
- posture
- processing apparatus
- workpiece
- robot
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Lifetime
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
- G06T7/75—Determining position or orientation of objects or cameras using feature-based methods involving models
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/74—Image or video pattern matching; Proximity measures in feature spaces
- G06V10/75—Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
- G06V10/751—Comparing pixel values or logical combinations thereof, or feature values having positional relevance, e.g. template matching
- G06V10/7515—Shifting the patterns to accommodate for positional errors
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30108—Industrial image inspection
- G06T2207/30164—Workpiece; Machine component
Definitions
- the present invention relates to an image processing apparatus for detecting the three-dimensional position and posture (orientation) of an object, and in particular to an image processing apparatus suitable for use in a bin-picking operation for taking out workpieces one by one from a stack of workpieces using an industrial machine such as a robot.
- An operation of taking out individual workpieces, which have identical shapes but different three-dimensional positions/postures, from a randomly arranged stack of workpieces or from an aggregation of workpieces contained in a container of a predetermined size has conventionally been performed manually.
- even with a dedicated robot, it has been impossible to directly take out an individual workpiece one by one from the randomly arranged stack of workpieces, so it has been necessary to rearrange the workpieces in advance so that they can be picked out by the robot. In this rearrangement operation, it has been necessary to take out individual workpieces from the stack manually.
- An object of the present invention is to provide an image processing apparatus capable of detecting the three-dimensional position and posture of individual objects, which have identical shapes and different three-dimensional positions/postures, in a randomly arranged stack or in an aggregation contained within a predetermined region.
- An image processing apparatus of the present invention includes an image capturing device; and a memory storing reference models created based on image data of a reference object captured by the image capturing device in a plurality of directions, and storing information of the capturing directions to be respectively associated with the reference models.
- the reference object may be an object of detection itself or an object having a shape identical to that of the object of detection.
- the image processing apparatus also includes a processor to perform matching processing between image data containing an image of the object of detection captured by the image capturing device and the reference models, to select an image of an object matched with one of the reference models, and to obtain the posture, or the posture and position, of the object based on the selected image of the object, said one reference model and the information of the direction associated with said one reference model.
- the reference models may be a part of the image data of the reference object or obtained by processing the image data of the reference object.
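- as a concrete illustration of this stored structure, the following Python sketch pairs each reference model's image data with its associated capturing-direction information; the class and field names are illustrative assumptions, not taken from the patent.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class ReferenceModel:
    """One stored reference model: image data of the reference object
    captured from one direction, plus the capturing-direction information
    associated with it (here kept as a relative camera-to-workpiece pose)."""
    image: np.ndarray      # image data (or a part of it, or processed data)
    rel_pose: np.ndarray   # 4x4 relative position/posture of the workpiece w.r.t. the camera
```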
- the image capturing device may be attached to a wrist of a robot. Further, the image data of the reference object can be captured in a place different from a place where the detection of the object is performed, and supplied to the image processing apparatus on line or off line.
- FIG. 1 is a diagram for showing a picking operation by a robot to take out an individual workpiece from a stack of workpieces using an image processing apparatus according to an embodiment of the present invention
- FIGS. 2a–2d show examples of reference models
- FIG. 3 is a block diagram of a principal part of a robot controller
- FIG. 4 is a block diagram of the image processing apparatus according to an embodiment of the present invention.
- FIG. 5 is a flowchart of the processing for creating reference models
- FIG. 6 is a flowchart of the processing for the picking operation
- FIG. 7 is a diagram showing an example of scanning motion of a visual sensor capable of obtaining distance data
- FIG. 8 is a diagram of the two-dimensional arrangement data containing distance data as image data obtained by the visual sensor
- FIG. 9 is a flowchart of processing for obtaining the two-dimensional arrangement data.
- an image processing apparatus of the present invention is used in combination with a robot system.
- an image of a stack of workpieces, which are objects of detection having identical shapes and randomly arranged as shown in FIG. 1, is captured by an image capturing device (camera or visual sensor) 20, and the position and posture (orientation) of the individual workpieces are detected based on the captured image.
- images of a reference object, which is one of the workpieces W subjected to the picking operation or an object having a shape identical to that of the workpiece W, are captured from different directions by the image capturing device, and reference models are created from the image data obtained by the image capturing and stored in advance.
- matching processing between the image data obtained by capturing the image of the stack of workpieces and the reference models is executed to select an image of one workpiece matched with one of the reference models, and the position/posture of the selected workpiece is determined based on the selected image of the workpiece in the field of view, the selected reference model and the position/posture information associated with that reference model.
- FIG. 3 is a block diagram showing a principal part of a robot controller 10 for use in the embodiment of the present invention.
- a main processor 1, a memory 2 including a RAM, a ROM and a nonvolatile memory (such as an EEPROM), an interface 3 for a teaching operating panel, an interface 6 for external devices, an interface 7 for an image processing apparatus, and a servo control section 5 are connected to a bus 8.
- a teaching operating panel 4 is connected to the interface 3 for a teaching operating panel.
- a system program for supporting basic functions of the robot RB and the robot controller 10 is stored in the ROM of the memory 2.
- robot operation programs and their related data, which are taught in accordance with various operations, are stored in the nonvolatile memory of the memory 2.
- the RAM of the memory 2 is used for temporary storage of data for various arithmetic operations performed by the processor 1.
- the servo control section 5 includes servo controllers 5a1 to 5an (n: the total number of axes of the robot, including additional movable axes of a tool attached to a wrist of the robot), each composed of a processor, a ROM, a RAM, etc.
- each servo controller performs position/velocity loop control and also current loop control for its associated servomotor for driving an axis, functioning as a so-called digital servo controller that performs loop control of position, velocity and current by software.
- each servomotor M1–Mn for driving an axis is drivingly controlled according to outputs of the associated servo controller 5a1–5an through the associated servo amplifier 5b1–5bn.
- a position/velocity detector is attached to each servomotor M1–Mn, and the position and velocity of each servomotor detected by the associated position/velocity detector are fed back to the associated servo controller 5a1–5an.
- To the input-output interface 6 are connected sensors of the robot, and actuators and sensors of peripheral devices.
- FIG. 4 is a block diagram of the image processing apparatus 30 connected to the interface 7 of the robot controller 10 .
- the image processing apparatus 30 includes a processor 31, to which are connected a ROM 32 for storing a system program to be executed by the processor 31, an image processor 33, an image-capturing-device interface 34 connected to the image capturing device 20, an MDI 35 with a display such as a CRT or a liquid crystal display for inputting and outputting various commands and data, a frame memory 36, a nonvolatile memory 37, a RAM 38 for temporary storage of data, and a communication interface 39 for the robot controller.
- An image captured by the camera 20 is stored in the frame memory 36 .
- the image processor 33 performs image processing of images stored in the frame memory 36 on demand of the processor 31 so as to recognize an object.
- the architecture and function of the image processing apparatus 30 itself are in no way different from those of a conventional image processing apparatus.
- the image processing apparatus 30 of the present invention is different from the conventional one in that reference models as described later are stored in the nonvolatile memory 37 and pattern matching processing is performed on an image of a stack of workpieces W captured by the image capturing device 20 using the reference models to obtain a position and posture of a workpiece W.
- the image capturing device 20 is used for obtaining image data, as described later, and may be a CCD camera for obtaining two-dimensional image data or a visual sensor capable of obtaining three-dimensional image data including distance data.
- in the case of the CCD camera, the image data is obtained by a conventional method based on two-dimensional images, but in the case of the visual sensor capable of obtaining three-dimensional data including distance data, two-dimensional arrangement data with distance data between the sensor and an object is obtained.
- a visual sensor for obtaining three-dimensional data including distance data is known, for example, from the three-dimensional visual sensor of a spot light scanning type disclosed in Japanese Patent Publication No. 7-270137; a summary of this three-dimensional visual sensor is given below.
- this visual sensor detects the three-dimensional position of an object by projecting a light beam to form a light spot on the object, scanning the object in two different directions (the X direction and the Y direction), and detecting the light reflected from the object with a position sensitive detector (PSD).
- the three-dimensional position of the object is measured by calculation using the respective inclination angles θx, θy of the scanning mirrors and the incident positions of the reflected light beam on the PSD.
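- for orientation only, the following is a much-simplified planar triangulation sketch of how one mirror angle and a PSD reading can yield depth; the actual two-mirror geometry of the sensor in the above publication is more involved, and all names and the pinhole model here are assumptions.

```python
import numpy as np

def spot_depth(theta, u, baseline, focal):
    """Planar (one-axis) active-triangulation sketch: an emitter at the
    origin deflects the beam by angle theta from the optical (z) axis,
    and a pinhole PSD with focal length `focal`, offset by `baseline`
    along x, images the light spot at PSD coordinate u.

    The spot lies at (z*tan(theta), z), so u = focal*(z*tan(theta) - baseline)/z;
    solving for z gives the depth returned below."""
    return focal * baseline / (focal * np.tan(theta) - u)
```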
- a scanning range (measuring range) on the object is set in advance, and the inclination angles θx, θy of the mirrors are controlled discretely.
- the scanning is performed from a point (1, 1) to a point (1, n), from a point (2, 1) to a point (2, n), . . . , and from a point (m, 1) to a point (m, n) on the X-Y plane within the scanning range, to measure the three-dimensional position of each reflection point on the object.
- a distance Z (i, j) between the sensor and the reflection point (i, j) on the object is obtained and stored in the RAM 38 of the image processing apparatus 30 .
- the image data is obtained as two-dimensional arrangement data including the distance data Z(i, j) between the sensor and the reflection point on the object, as shown in FIG. 8.
- FIG. 9 is a flowchart of processing to be executed by the processor 31 of the image processing apparatus 30 for obtaining the image data.
- the indexes i and j are respectively set to "1" (Step 300), the inclination angle (θx, θy) of the mirrors is set to (x1, y1) so as to point at the start point (1, 1), and an irradiation command with this inclination angle is sent to the sensor 20 (Steps 301–303).
- the sensor irradiates a light beam with the mirrors set at the inclination angle.
- the signal representing the image captured by the PSD is sent to the image processing apparatus 30 .
- the processor 31 of the image processing apparatus 30 calculates the position of the reflection point on the object from the signal from the PSD and the inclination angle (θx, θy) of the mirrors, to obtain the distance Z(i, j) between the sensor and the position of the reflection point on the object.
- this value Z(i, j) is stored in the RAM 38 as the two-dimensional arrangement data [i, j] (Steps 304, 305).
- the calculation for obtaining the position of the reflection point and the distance Z (i, j) may be performed by the sensor 20 .
- the index i is then incremented (Steps 306, 307), and it is determined whether or not the index i exceeds the set value n (Step 308). If the index i does not exceed the set value n, the procedure returns to Step 303 and the processing from Step 303 to Step 308 is executed to obtain the distance Z(i, j) of the next point. The processing of Steps 303–308 is repeatedly executed until the index i exceeds the set value n, to obtain and store the distances Z(i, j) of the respective points (1, 1) to (1, n) shown in FIG. 7.
- if it is determined in Step 308 that the index i exceeds the set value n, the index i is reset to "1" and the index j is incremented by "1" to increase the inclination angle θy of the mirror for Y-axis direction scanning (Steps 309–311). Then, it is determined whether or not the index j exceeds the set value m (Step 312), and if the index j does not exceed the set value m, the procedure returns to Step 302 to repeatedly execute the processing of Step 302 and the subsequent Steps.
- the processing of Step 302 to Step 312 is repeatedly executed until the index j exceeds the set value m. When the index j exceeds the set value m, all the points in the measurement range (scanning range) shown in FIG. 7 have been measured; the distance data Z(1, 1)–Z(m, n) are stored in the RAM 38 as two-dimensional arrangement data, and the image data obtaining processing is terminated. A part of the two-dimensional arrangement image data, or a reduced number of distance data, can be obtained by appropriately omitting the measurement of the distance for some values of the index i.
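- the scan loop of FIG. 9 (Steps 300–312) can be summarized by the following sketch, which fills the two-dimensional arrangement data with distance values; the `sensor` object and its `irradiate` method are hypothetical stand-ins for the irradiation command and the distance calculation described above.

```python
import numpy as np

def acquire_range_image(sensor, angles_x, angles_y):
    """Fill the two-dimensional arrangement data Z with the distance from
    the sensor to the reflection point at each scan point (i, j)."""
    n, m = len(angles_x), len(angles_y)
    Z = np.zeros((m, n))
    for j in range(m):          # Y-direction scan lines (index j, Steps 309-312)
        for i in range(n):      # points along one scan line (index i, Steps 303-308)
            # set the mirror inclination and irradiate; the sensor (or the
            # processor 31) returns the distance to the reflection point
            Z[j, i] = sensor.irradiate(angles_x[i], angles_y[j])
    return Z                    # corresponds to the data stored in the RAM 38
```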
- FIG. 5 is a flowchart showing processing for teaching reference models to the image processing apparatus 30 according to the present invention.
- One reference workpiece (one of the workpieces W as objects of the robot operation, or a workpiece having a three-dimensional shape identical to that of the workpiece W) is prepared for creating the reference models.
- a first (0-th) position/posture of the reference workpiece at which the camera 20 attached to a distal end of a robot wrist captures the image of the object is set, and an axis of rotation and rotation angles with respect to the first (0-th) position/posture are set in order to determine the subsequent positions/postures of the reference workpiece.
- the number of positions/postures of the workpiece at which the camera 20 captures the image of the object is also set.
- in this embodiment, information of both position and posture is used; however, reference models may be created using only posture (orientation) information if the demanded precision of position is not high.
- images of the reference workpiece are captured from four different directions and reference models are created based on the four image data.
- an image of the reference workpiece is captured from the direction of the Z-axis of a world coordinate system at the 0-th position/posture to create the 0-th reference model.
- an axis perpendicular to the optical axis of the camera and passing through a central point of the workpiece (an origin of a work coordinate system set to the workpiece), and rotation angles of the workpiece about that rotation axis, are set for this camera position.
- an axis parallel to either the X-axis or the Y-axis of the world coordinate system, each of which is perpendicular to the Z-axis, can be selected as the rotation axis, and the workpiece is rotated around this axis at the workpiece position.
- in this example, an axis parallel to the X-axis of the world coordinate system is set as the rotation axis, and for the position/posture shown in FIG. 2b, a rotation angle of 30° is set to rotate the workpiece by 30° with respect to the camera about the rotation axis.
- the 1st reference model is created based on the image data of the workpiece at the position/posture shown in FIG. 2b.
- similarly, the workpiece is rotated by 60° and 90°, respectively, about the rotation axis for capturing images of the workpiece to create the 2nd and 3rd reference models.
- rotation angles of zero degree, 30 degrees, 60 degrees and 90 degrees are set for creating four reference models.
- the division of the rotation angle range may be set more finely and/or the range of the rotation angle may be set greater, to create more reference models for more precise detection of the position/posture of the workpiece.
- the 0-th position/posture of the robot at which the camera 20 captures the image of the object, and the rotation axis and the rotation angles with respect to the 0-th position/posture, are set in advance in order to determine the subsequent positions/postures of the reference workpiece; the number of subsequent positions/postures of the workpiece is also set.
- it is assumed that the optical axis of the camera is parallel to the Z-axis of the world coordinate system, and a position where the X-axis and Y-axis coordinate values are identical to those of the reference workpiece and only the Z-axis coordinate value differs from that of the position of the reference workpiece is taught to the robot as the 0-th image capturing position for obtaining the 0-th reference model.
- the positions of the robot where the camera is rotated with respect to the reference workpiece by 30 degrees, 60 degrees and 90 degrees about the axis passing through the central point of the reference workpiece and parallel to the X-axis of the world coordinate system are set as the 1st, 2nd and 3rd image capturing positions, and the number N of the image capturing positions is set to "4".
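- as a sketch, these capturing positions can be derived from the 0-th camera pose by rotating it about the axis through the workpiece center parallel to the world X-axis; the function below assumes 4x4 homogeneous poses and uses scipy for the rotation, neither of which is prescribed by the patent.

```python
import numpy as np
from scipy.spatial.transform import Rotation as R

def capture_poses(T0_world_cam, work_center, angles_deg=(0, 30, 60, 90)):
    """Rotate the 0-th camera pose about an axis through `work_center`
    parallel to the world X-axis, giving the 0th-3rd capturing poses."""
    poses = []
    for a in angles_deg:
        rot = np.eye(4)
        rot[:3, :3] = R.from_rotvec(np.deg2rad(a) * np.array([1.0, 0.0, 0.0])).as_matrix()
        to_center = np.eye(4); to_center[:3, 3] = -np.asarray(work_center)
        back = np.eye(4);      back[:3, 3] = np.asarray(work_center)
        # rotate about the workpiece center rather than the world origin
        poses.append(back @ rot @ to_center @ T0_world_cam)
    return poses
```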
- the processor 1 of the robot controller 10 sets a counter M for counting the number of the image capturing to “0” (Step 100 ).
- the robot is operated to have the M-th position/posture and a command for image capturing is outputted to the image processing apparatus 30 (Step 101 ).
- the image processing apparatus 30 performs capturing of an image of the reference workpiece with the camera 20 and the captured image data is stored in the frame memory 36 (Step 102 ).
- the relative position/posture of the workpiece with respect to the camera is obtained and stored in the nonvolatile memory 37 as the relative position/posture of the M-th reference model, and a data-captured signal is sent to the robot controller (Step 103).
- the position/posture of the workpiece in a camera coordinate system set to the camera is obtained from the position/posture of the camera and the position/posture of the reference workpiece in the world coordinate system at the time of capturing the image, and is stored as the relative position/posture of the workpiece with respect to the camera.
- the position/posture of the workpiece in the camera coordinate system is stored as [x0, y0, z0, α0, β0, γ0]c, where α, β and γ mean rotation angles around the X-, Y- and Z-axes, and "c" means the camera coordinate system.
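- a sketch of turning this six-component representation into a 4x4 homogeneous transform follows; the patent does not fix the rotation order, so the X-then-Y-then-Z convention used here is an assumption.

```python
import numpy as np

def pose_to_matrix(x, y, z, alpha, beta, gamma):
    """Convert [x, y, z, alpha, beta, gamma]c (angles in radians) into a
    4x4 homogeneous transform, composing rotations about X, Y, then Z."""
    ca, sa = np.cos(alpha), np.sin(alpha)
    cb, sb = np.cos(beta), np.sin(beta)
    cg, sg = np.cos(gamma), np.sin(gamma)
    Rx = np.array([[1, 0, 0], [0, ca, -sa], [0, sa, ca]])
    Ry = np.array([[cb, 0, sb], [0, 1, 0], [-sb, 0, cb]])
    Rz = np.array([[cg, -sg, 0], [sg, cg, 0], [0, 0, 1]])
    T = np.eye(4)
    T[:3, :3] = Rz @ Ry @ Rx   # rotation part
    T[:3, 3] = [x, y, z]       # translation part
    return T
```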
- the processor 1 of the robot controller 10 increments the value of the counter M by "1" (Step 104), and the processing of Steps 101–104 is repeated until the counter M reaches a set value N.
- in this way the camera is successively turned by 30 degrees around the axis parallel to the X-axis of the world coordinate system and passing through the workpiece position, and successively captures the image of the workpiece; the reference models and the relative positions/postures of the camera with respect to the workpiece at each image capturing are stored.
- the reference models and the relative position/posture of the workpiece W and the camera 20 are stored in the nonvolatile memory 37 of the image processing apparatus 30 .
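- the teaching loop of FIG. 5 (Steps 100–104) reduces to the following sketch; `robot`, `camera` and the dictionary layout are hypothetical stand-ins, and poses are taken to be 4x4 homogeneous transforms in the world coordinate system.

```python
import numpy as np

def teach_reference_models(robot, camera, capture_poses, T_world_work):
    """For each capturing pose, move the robot, capture an image, and store
    the image together with the workpiece pose in the camera frame."""
    models = []
    for M, T_world_cam in enumerate(capture_poses):       # counter M (Step 100)
        robot.move_to(T_world_cam)                        # M-th position/posture (Step 101)
        image = camera.capture()                          # image data -> frame memory (Step 102)
        # relative position/posture of the workpiece w.r.t. the camera (Step 103)
        T_cam_work = np.linalg.inv(T_world_cam) @ T_world_work
        models.append({"image": image, "rel_pose": T_cam_work})
    return models                                         # stored in the nonvolatile memory 37
```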
- the reference models are created using a robot, however, the reference models may be created by a manual operation without using a robot.
- the reference workpiece is arranged within a field of view of the camera connected to the image processing apparatus 30 , and the images of the workpiece with different postures are captured by the camera.
- the reference models are created based on the image data and on the manually inputted relative positions/postures of the camera and the workpiece at the time of image capturing, and are stored with the respective relative positions/postures.
- the reference models may be created from a part of the image data of the reference object, and may be created by processing the image data of the reference object.
- the reference models may be created based on the stored image data of the reference workpiece when detecting the position/posture of the objective workpiece, without creating and storing the reference models in advance.
- FIG. 6 is a flowchart of processing for the picking operation.
- the processor 1 operates the robot RB to move the camera attached to the robot wrist to an image capturing position where a stack of workpieces is within a field of view of the camera 20 (Step 200 ).
- the three-dimensional position/posture of the camera 20 in the world coordinate system at this image capturing position is outputted to the image processing apparatus 30, and an image capturing command is outputted (Step 201).
- the processor 31 of the image processing apparatus 30 captures an image of the stack of the workpieces W, to obtain image data of some workpieces W and store it in the frame memory 36 (Step 202 ).
- pattern matching processing is performed on the image data stored in the frame memory 36, using one of the reference models (e.g., the 0-th reference model) stored in the nonvolatile memory 37, so as to detect a workpiece W (Step 203).
- in this pattern matching processing, matching of the image data of the reference model with the image data of the workpieces is performed with respect to position, rotation (turn) and scale. It is determined whether or not an object has a matching value equal to or greater than a set value (Step 204). If an object having a matching value equal to or greater than the set value is not detected, the procedure proceeds to Step 205 to determine whether or not the pattern matching has been performed using all the reference models (the 0-th to 3rd reference models). If the pattern matching using all of the reference models has not yet been performed, further pattern matching is performed using another reference model (Step 206).
- if it is determined in Step 204 that an object having a matching value equal to or greater than the set value with respect to any of the reference models is detected, the procedure proceeds to Step 207 to perform matching processing on the two-dimensional data of the detected workpieces W, using every taught model.
- the reference model having the highest matching value in the pattern matching processing is selected, and the relative position/posture of the workpiece W with respect to the camera 20 is determined based on the relative position/posture of the camera and the reference workpiece stored for the selected reference model, and on the position, rotation angle and scale of the image of the workpiece in the matching processing (Step 208).
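- a minimal sketch of Steps 203–208 follows, using OpenCV template matching over position only; the patent's matching also searches rotation (turn) and scale, and the threshold, model layout and return value here are assumptions.

```python
import cv2

def find_best_model(scene, models, threshold):
    """Try each reference model against the captured scene and keep the
    one with the highest matching value at or above the set value."""
    best = None
    for idx, model in enumerate(models):                       # Steps 203-206
        res = cv2.matchTemplate(scene, model["image"], cv2.TM_CCOEFF_NORMED)
        _, max_val, _, max_loc = cv2.minMaxLoc(res)
        if max_val >= threshold and (best is None or max_val > best[1]):
            best = (idx, max_val, max_loc)                     # Steps 207-208
    return best   # None corresponds to "no object detected" at Step 205
```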
- the position and posture (orientation) of the detected workpiece on the world coordinate system is determined from the position and posture of the camera 20 in the world coordinate system, which has been set in Step 201 , and the relative position/posture of the workpiece W with respect to the camera 20 , and is outputted (Step 209 ).
- the position and posture (orientation) of the detected workpiece W in the world coordinate system is obtained by an arithmetic operation of coordinate transformation using the data of the position/posture of the workpiece W in the camera coordinate system and the position/posture of the camera 20 in the world coordinate system (Step 209 ).
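- expressed as a coordinate transformation, Step 209 is simply the composition of the two poses (written here with 4x4 homogeneous transforms, an assumed representation):

```python
import numpy as np

def workpiece_in_world(T_world_cam: np.ndarray, T_cam_work: np.ndarray) -> np.ndarray:
    """Pose of the detected workpiece in the world coordinate system, from
    the camera pose in the world system and the workpiece pose in the
    camera system (Step 209)."""
    return T_world_cam @ T_cam_work
```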
- the reference model having the highest matching value is selected in this embodiment, however, a reference model of the rotation angle of zero degree (the 0-th reference model) may be selected with precedence, or an object having the highest expansion rate of scale (the object which is nearest to the camera, i.e. located at the summit of the stack in this example) may be selected with precedence.
- the robot controller 10 operates the robot to perform a picking operation to grip and hold the detected workpiece W and move the held workpiece W to a predetermined position, based on the three-dimensional position/posture of the workpiece W (Step 210 ). Then, the procedure returns to Step 202 to repeatedly execute the processing of Step 202 and subsequent Steps.
- the procedure may return to Step 200 when it is determined “Yes” in Step 205 , to move the camera to another position/posture at which an image of the objective workpiece can be captured.
- the robot controller 10 may store the three-dimensional position/posture of the camera without outputting it to the image processing apparatus 30 in Step 201 and the relative position/posture of the workpiece and the camera may be outputted from the image processing apparatus 30 to the robot controller 10 in Step 208 to execute the processing of Step 209 in the robot controller 10 .
- the camera may be translated, in accordance with the position of the workpiece in the field of view of the camera, to a position right above the workpiece so as to eliminate the influence of parallax; at this position the image capturing processing of Step 201 and the subsequent Steps in FIG. 6 is performed, so that false judgment is prevented.
- the camera is arranged so that a stack of workpieces, or a region containing the objective workpiece, is within its field of view; the position/posture of the camera in the world coordinate system is inputted to the image processing apparatus 30 and an object detection command is issued to it, making the image processing apparatus 30 execute Steps 202–209 of FIG. 6.
- the image data for creating the reference models may be obtained at a place different from the place where the robot is installed.
- the image data may be supplied to the image processing apparatus on line through a communication interface provided in the image processing apparatus, or may be supplied off line through a disk drive for reading a floppy disk, etc.
- a position/posture of an objective workpiece in a randomly arranged stack of workpieces or an aggregation of workpieces gathered in a predetermined region which have identical shapes and different three-dimensional positions/postures is detected, to thereby enable a robot to automatically pick out an individual workpiece from such a stack or an aggregation.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- General Physics & Mathematics (AREA)
- Physics & Mathematics (AREA)
- Computing Systems (AREA)
- Evolutionary Computation (AREA)
- General Health & Medical Sciences (AREA)
- Medical Informatics (AREA)
- Software Systems (AREA)
- Databases & Information Systems (AREA)
- Artificial Intelligence (AREA)
- Health & Medical Sciences (AREA)
- Multimedia (AREA)
- Manipulator (AREA)
- Image Analysis (AREA)
- Image Processing (AREA)
Abstract
Description
Claims (10)
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP10188599A JP3377465B2 (en) | 1999-04-08 | 1999-04-08 | Image processing device |
Publications (1)
Publication Number | Publication Date |
---|---|
US7084900B1 true US7084900B1 (en) | 2006-08-01 |
Family
ID=14312402
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US09/546,214 Expired - Lifetime US7084900B1 (en) | 1999-04-08 | 2000-04-10 | Image processing apparatus |
Country Status (3)
Country | Link |
---|---|
US (1) | US7084900B1 (en) |
EP (1) | EP1043689A3 (en) |
JP (1) | JP3377465B2 (en) |
Families Citing this family (30)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3768174B2 (en) | 2002-07-24 | 2006-04-19 | ファナック株式会社 | Work take-out device |
JP3842233B2 (en) * | 2003-03-25 | 2006-11-08 | ファナック株式会社 | Image processing apparatus and robot system |
JP3834297B2 (en) | 2003-05-12 | 2006-10-18 | ファナック株式会社 | Image processing device |
DE102004046584A1 (en) * | 2003-09-26 | 2005-05-19 | Micro-Epsilon Messtechnik Gmbh & Co Kg | Contactless optical method for determining three-dimensional position of object e.g. for production line automation in manufacturing plant, uses evaluation of object image provided by camera |
JP4401727B2 (en) * | 2003-09-30 | 2010-01-20 | キヤノン株式会社 | Image display apparatus and method |
JP2005111619A (en) * | 2003-10-08 | 2005-04-28 | Tsubakimoto Chain Co | Method and apparatus for arranging article |
JP2005279804A (en) * | 2004-03-29 | 2005-10-13 | Advanced Telecommunication Research Institute International | Object recognition device |
JP2006035397A (en) | 2004-07-29 | 2006-02-09 | Fanuc Ltd | Conveyance robot system |
SE529377C2 (en) * | 2005-10-18 | 2007-07-24 | Morphic Technologies Ab Publ | Method and arrangement for locating and picking up items from a carrier |
JP4640290B2 (en) | 2005-11-21 | 2011-03-02 | 日産自動車株式会社 | Method and system for workpiece transfer |
EP1840053B1 (en) | 2006-02-27 | 2010-02-10 | Nissan Motor Company Limited | Workpiece transfer method, system and device |
JP4940715B2 (en) * | 2006-03-15 | 2012-05-30 | 日産自動車株式会社 | Picking system |
KR100920931B1 (en) | 2007-11-16 | 2009-10-12 | 전자부품연구원 | Object posture recognition method of robot using TFT camera |
JP2010152664A (en) * | 2008-12-25 | 2010-07-08 | Nissei Corp | Sensorless motor-driven robot using image |
JP5381166B2 (en) * | 2009-03-04 | 2014-01-08 | オムロン株式会社 | Model image acquisition support apparatus, model image acquisition support method, and model image acquisition support program |
JP5544464B2 (en) * | 2009-03-11 | 2014-07-09 | 本田技研工業株式会社 | 3D position / posture recognition apparatus and method for an object |
EP2383696A1 (en) * | 2010-04-30 | 2011-11-02 | LiberoVision AG | Method for estimating a pose of an articulated object model |
JP5767464B2 (en) | 2010-12-15 | 2015-08-19 | キヤノン株式会社 | Information processing apparatus, information processing apparatus control method, and program |
JP6265784B2 (en) * | 2014-03-06 | 2018-01-24 | 株式会社メガチップス | Posture estimation system, program, and posture estimation method |
JP2015202544A (en) * | 2014-04-15 | 2015-11-16 | 株式会社安川電機 | Robot control system, information communication module, robot controller, computer program and robot control method |
JP2017227463A (en) * | 2016-06-20 | 2017-12-28 | 清水建設株式会社 | Position and attitude determination device |
JP6825454B2 (en) * | 2017-03-30 | 2021-02-03 | 愛知製鋼株式会社 | Item number counting device |
CN108827371B (en) * | 2018-03-12 | 2021-11-02 | 中国电力科学研究院有限公司 | Appearance detection device of a communication unit |
CN109205332A (en) * | 2018-10-24 | 2019-01-15 | 山东科技大学 | A kind of packaged type tire stacker crane tool hand and method |
JP7275759B2 (en) * | 2019-03-28 | 2023-05-18 | セイコーエプソン株式会社 | OBJECT DETECTION METHOD, OBJECT DETECTION DEVICE, AND ROBOT SYSTEM |
JP2020166371A (en) * | 2019-03-28 | 2020-10-08 | セイコーエプソン株式会社 | Information processing method, information processing device, object detection device and robot system |
CN112591356A (en) * | 2020-12-04 | 2021-04-02 | 佛山隆深机器人有限公司 | Stacking visual detection system in closed or semi-closed limited space |
EP4070922A3 (en) | 2021-04-06 | 2023-01-11 | Canon Kabushiki Kaisha | Robot system, control method, image processing apparatus, image processing method, method of manufacturing products, program, and recording medium |
JP7324923B1 (en) | 2022-11-14 | 2023-08-10 | 株式会社 日立産業制御ソリューションズ | Object recognition device and object recognition method |
CN115497087B (en) * | 2022-11-18 | 2024-04-19 | 广州煌牌自动设备有限公司 | Tableware gesture recognition system and method |
-
1999
- 1999-04-08 JP JP10188599A patent/JP3377465B2/en not_active Expired - Fee Related
-
2000
- 2000-04-10 US US09/546,214 patent/US7084900B1/en not_active Expired - Lifetime
- 2000-04-10 EP EP00303009A patent/EP1043689A3/en not_active Withdrawn
Patent Citations (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB2085629A (en) | 1980-10-17 | 1982-04-28 | Micro Consultants Ltd | Object recognition |
US4680802A (en) * | 1984-03-26 | 1987-07-14 | Hitachi, Ltd. | Posture judgement system in image processing |
US4985846A (en) * | 1989-05-11 | 1991-01-15 | Fallon Patrick J | Acoustical/optical bin picking system |
US5550928A (en) * | 1992-12-15 | 1996-08-27 | A.C. Nielsen Company | Audience measurement system and method |
US5559727A (en) * | 1994-02-24 | 1996-09-24 | Quad Systems Corporation | Apparatus and method for determining the position of a component prior to placement |
US5897611A (en) * | 1994-08-11 | 1999-04-27 | Cyberoptics Corporation | High precision semiconductor component alignment systems |
US5845048A (en) | 1995-02-06 | 1998-12-01 | Fujitsu Limited | Applicable recognition system for estimating object conditions |
US5909504A (en) * | 1996-03-15 | 1999-06-01 | Cognex Corporation | Method of testing a machine vision inspection system |
US6445814B2 (en) * | 1996-07-01 | 2002-09-03 | Canon Kabushiki Kaisha | Three-dimensional information processing apparatus and method |
US6026189A (en) * | 1997-11-13 | 2000-02-15 | National Research Council Of Canada | Method of recognizing objects within two-dimensional and three-dimensional images |
US6437784B1 (en) * | 1998-03-31 | 2002-08-20 | General Mills, Inc. | Image producing system for three-dimensional pieces |
US6424745B1 (en) * | 1998-05-19 | 2002-07-23 | Lucent Technologies Inc. | Method and apparatus for object recognition |
US6266442B1 (en) * | 1998-10-23 | 2001-07-24 | Facet Technology Corp. | Method and apparatus for identifying objects depicted in a videostream |
Non-Patent Citations (7)
Title |
---|
Ichiro Masaki, "Industrial Vision Systems Based on Application-Specific IC Chips", IEICE Transactions, Institute of Electronics, vol. E74 No. 6, Jun. 1, 1991, pp. 1728-1734. |
J. Hornegger et al, "Statistical Learning, Localization, and Identification of Objects", IEEE, Jun. 20, 1995, pp. 914-919. |
Juan Andrade-Cetto et al., Object Recognition, "Wiley Encyclopedia of Electrical Engineering", Apr. 1, 2000, pp. 449-470. |
Kohtaro Ohba et al, "Recognition of the Multi Specularity Objects for Bin-picking Task", IEEE, Nov. 4, 1996, pp. 1440-1447. |
Michael Magee et al, "An Industrial Model Based Computer Vision System", Journal of Manufacturing Systems, Society of Manufacturing Engineers, vol. 14 No. 3, 1995, pp. 169-186. |
Sarah Wang et al, "Model-Based Vision for Robotic Manipulation of Twisted Tubular Parts: Using Affine Transforms and Heuristic Search", Robotics and Automation, IEEE, May 8, 1994, pp. 208-215. |
Toshiyuki Amano et al, "Eigenspace Approach for Object Recognition and Its Pose Detection", Systems and Computers in Japan, vol. 31 No. 11, Oct. 2000, pp. 60-69. |
Cited By (56)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20040172164A1 (en) * | 2002-01-31 | 2004-09-02 | Babak Habibi | Method and apparatus for single image 3D vision guided robotics |
US8095237B2 (en) | 2002-01-31 | 2012-01-10 | Roboticvisiontech Llc | Method and apparatus for single image 3D vision guided robotics |
US7391178B2 (en) * | 2002-07-18 | 2008-06-24 | Kabushiki Kaisha Yaskawa Denki | Robot controller and robot system |
US20060108960A1 (en) * | 2002-07-18 | 2006-05-25 | Michiharu Tanaka | Robot controller and robot system |
US7274812B2 (en) * | 2002-08-28 | 2007-09-25 | Keyence Corporation | Image processing device and method |
US20040041907A1 (en) * | 2002-08-28 | 2004-03-04 | Kazuhito Saeki | Image processing device and method |
US20050213818A1 (en) * | 2003-04-28 | 2005-09-29 | Sony Corporation | Image recognition device and method, and robot device |
US7627178B2 (en) * | 2003-04-28 | 2009-12-01 | Sony Corporation | Image recognition device using feature points, method for recognizing images using feature points, and robot device which recognizes images using feature points |
US7212293B1 (en) * | 2004-06-01 | 2007-05-01 | N&K Technology, Inc. | Optical determination of pattern feature parameters using a scalar model having effective optical properties |
US20060222260A1 (en) * | 2005-03-30 | 2006-10-05 | Casio Computer Co., Ltd. | Image capture apparatus, image processing method for captured image, and recording medium |
US7760962B2 (en) * | 2005-03-30 | 2010-07-20 | Casio Computer Co., Ltd. | Image capture apparatus which synthesizes a plurality of images obtained by shooting a subject from different directions, to produce an image in which the influence of glare from a light is reduced |
US20070276539A1 (en) * | 2006-05-25 | 2007-11-29 | Babak Habibi | System and method of robotically engaging an object |
US20070293986A1 (en) * | 2006-06-15 | 2007-12-20 | Fanuc Ltd | Robot simulation apparatus |
CN101100060B (en) * | 2006-07-04 | 2010-06-16 | 发那科株式会社 | Device and method for preparing robot program |
US20080069435A1 (en) * | 2006-09-19 | 2008-03-20 | Boca Remus F | System and method of determining object pose |
US8437535B2 (en) | 2006-09-19 | 2013-05-07 | Roboticvisiontech Llc | System and method of determining object pose |
US20080109184A1 (en) * | 2006-11-06 | 2008-05-08 | Canon Kabushiki Kaisha | Position and orientation measurement method and apparatus |
US7698094B2 (en) * | 2006-11-06 | 2010-04-13 | Canon Kabushiki Kaisha | Position and orientation measurement method and apparatus |
US20090033655A1 (en) * | 2007-08-02 | 2009-02-05 | Boca Remus F | System and method of three-dimensional pose estimation |
US7957583B2 (en) * | 2007-08-02 | 2011-06-07 | Roboticvisiontech Llc | System and method of three-dimensional pose estimation |
US20120028585A1 (en) * | 2007-08-17 | 2012-02-02 | Mayumi Takada | Communication system and communication program |
US8559699B2 (en) | 2008-10-10 | 2013-10-15 | Roboticvisiontech Llc | Methods and apparatus to facilitate operations in image based systems |
US20100324737A1 (en) * | 2009-06-19 | 2010-12-23 | Kabushiki Kaisha Yaskawa Denki | Shape detection system |
US8660697B2 (en) * | 2009-06-19 | 2014-02-25 | Kabushiki Kaisha Yaskawa Denki | Shape detection system |
EP2446223A4 (en) * | 2009-06-25 | 2017-02-22 | Canon Kabushiki Kaisha | Information processing apparatus, information processing method, and program |
WO2010150515A1 (en) | 2009-06-25 | 2010-12-29 | Canon Kabushiki Kaisha | Information processing apparatus, information processing method, and program |
US20110141251A1 (en) * | 2009-12-10 | 2011-06-16 | Marks Tim K | Method and System for Segmenting Moving Objects from Images Using Foreground Extraction |
US8941726B2 (en) * | 2009-12-10 | 2015-01-27 | Mitsubishi Electric Research Laboratories, Inc. | Method and system for segmenting moving objects from images using foreground extraction |
US9026234B2 (en) * | 2012-03-09 | 2015-05-05 | Canon Kabushiki Kaisha | Information processing apparatus and information processing method |
US9156162B2 (en) | 2012-03-09 | 2015-10-13 | Canon Kabushiki Kaisha | Information processing apparatus and information processing method |
US20130238128A1 (en) * | 2012-03-09 | 2013-09-12 | Canon Kabushiki Kaisha | Information processing apparatus and information processing method |
US9102053B2 (en) | 2012-03-09 | 2015-08-11 | Canon Kabushiki Kaisha | Information processing apparatus and information processing method |
US9503704B2 (en) * | 2013-11-05 | 2016-11-22 | Fanuc Corporation | Apparatus and method for picking up article disposed in three-dimensional space using robot |
US20150124056A1 (en) * | 2013-11-05 | 2015-05-07 | Fanuc Corporation | Apparatus and method for picking up article disposed in three-dimensional space using robot |
US9778650B2 (en) * | 2013-12-11 | 2017-10-03 | Honda Motor Co., Ltd. | Apparatus, system and method for kitting and automation assembly |
US10520926B2 (en) | 2013-12-11 | 2019-12-31 | Honda Motor Co., Ltd. | Apparatus, system and method for kitting and automation assembly |
US20150160650A1 (en) * | 2013-12-11 | 2015-06-11 | Honda Motor Co., Ltd. | Apparatus, system and method for kitting and automation assembly |
US9656388B2 (en) * | 2014-03-07 | 2017-05-23 | Seiko Epson Corporation | Robot, robot system, control device, and control method |
USRE47553E1 (en) * | 2014-03-07 | 2019-08-06 | Seiko Epson Corporation | Robot, robot system, control device, and control method |
US20150251314A1 (en) * | 2014-03-07 | 2015-09-10 | Seiko Epson Corporation | Robot, robot system, control device, and control method |
US9604364B2 (en) * | 2014-05-08 | 2017-03-28 | Toshiba Kikai Kabushiki Kaisha | Picking apparatus and picking method |
US20150321354A1 (en) * | 2014-05-08 | 2015-11-12 | Toshiba Kikai Kabushiki Kaisha | Picking apparatus and picking method |
US9832373B2 (en) | 2014-06-24 | 2017-11-28 | Cyberlink Corp. | Systems and methods for automatically capturing digital images based on adaptive image-capturing templates |
US20160283792A1 (en) * | 2015-03-24 | 2016-09-29 | Canon Kabushiki Kaisha | Information processing apparatus, information processing method, and storage medium |
US9984291B2 (en) * | 2015-03-24 | 2018-05-29 | Canon Kabushiki Kaisha | Information processing apparatus, information processing method, and storage medium for measuring a position and an orientation of an object by using a model indicating a shape of the object |
US10099380B2 (en) * | 2015-06-02 | 2018-10-16 | Seiko Epson Corporation | Robot, robot control device, and robot system |
CN105196103A (en) * | 2015-09-29 | 2015-12-30 | 佛山市利迅达机器人系统有限公司 | Automatic material grabbing system |
CN105196103B (en) * | 2015-09-29 | 2017-09-05 | 佛山市利迅达机器人系统有限公司 | A kind of automated material grasping system |
US20170312921A1 (en) * | 2016-04-28 | 2017-11-02 | Seiko Epson Corporation | Robot and robot system |
US10532461B2 (en) * | 2016-04-28 | 2020-01-14 | Seiko Epson Corporation | Robot and robot system |
US9996805B1 (en) * | 2016-09-30 | 2018-06-12 | Amazon Technologies, Inc. | Systems and methods for automated shipping optimization |
US11200695B2 (en) | 2016-12-05 | 2021-12-14 | Sony Interactive Entertainment Inc. | System, jig, information processing device, information processing method, and program |
US10360531B1 (en) * | 2016-12-19 | 2019-07-23 | Amazon Technologies, Inc. | Robot implemented item manipulation |
DE102018009836B4 (en) * | 2017-12-21 | 2021-01-07 | Fanuc Corporation | Object inspection system and object inspection method |
US20240185455A1 (en) * | 2021-05-20 | 2024-06-06 | Fanuc Corporation | Imaging device for calculating three-dimensional position on the basis of image captured by visual sensor |
US20240193808A1 (en) * | 2021-05-27 | 2024-06-13 | Fanuc Corporation | Imaging device for calculating three-dimensional position on the basis of image captured by visual sensor |
Also Published As
Publication number | Publication date |
---|---|
EP1043689A2 (en) | 2000-10-11 |
EP1043689A3 (en) | 2003-07-16 |
JP3377465B2 (en) | 2003-02-17 |
JP2000293695A (en) | 2000-10-20 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US7084900B1 (en) | Image processing apparatus | |
US7177459B1 (en) | Robot system having image processing function | |
US7200260B1 (en) | Teaching model generating device | |
US9604363B2 (en) | Object pickup device and method for picking up object | |
EP1607194B1 (en) | Robot system comprising a plurality of robots provided with means for calibrating their relative position | |
US7532949B2 (en) | Measuring system | |
JP3556589B2 (en) | Position and orientation recognition device | |
JP3242108B2 (en) | Target mark recognition and tracking system and method | |
EP0528054B1 (en) | Detected position correcting method | |
KR100693262B1 (en) | Image processing unit | |
JP5743499B2 (en) | Image generating apparatus, image generating method, and program | |
EP1584426A1 (en) | Tool center point calibration system | |
US20170270631A1 (en) | Automated guidance system and method for a coordinated movement machine | |
US7502504B2 (en) | Three-dimensional visual sensor | |
US11454498B2 (en) | Coordinate measuring system | |
JPS6332306A (en) | Non-contact three-dimensional automatic dimension measuring method | |
CN109983299A (en) | The measuring system and method for industrial robot | |
JP2003136465A (en) | Method for determining 3D position / posture of detection target and visual sensor for robot | |
JP3516668B2 (en) | Three-dimensional shape recognition method, apparatus and program | |
JPH02110788A (en) | Shape recognition method for 3D objects | |
JP7372161B2 (en) | Manipulators, automation methods and programs | |
JPH0617795B2 (en) | Three-dimensional position measuring device | |
JPH05329793A (en) | Visual sensor | |
JPH0425907A (en) | Correction device for aberration of automatic unit | |
Yu et al. | Temporal and spatial sensor fusion in a robotic manufacturing workcell |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: FANUC LTD., JAPAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:WATANABE, ATSUSHI;ARIMATSU, TARO;REEL/FRAME:010868/0907 Effective date: 20000424 |
|
AS | Assignment |
Owner name: FANUC LTD., JAPAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:WATANABE, ATSUSHI;ARIMATSU, TARO;REEL/FRAME:010947/0440 Effective date: 20000623 |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
CC | Certificate of correction | ||
FEPP | Fee payment procedure |
Free format text: PAYER NUMBER DE-ASSIGNED (ORIGINAL EVENT CODE: RMPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
CC | Certificate of correction | ||
FPAY | Fee payment |
Year of fee payment: 4 |
|
FPAY | Fee payment |
Year of fee payment: 8 |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 12TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1553) Year of fee payment: 12 |