GB2123172A - A robot control system
- Publication number
- GB2123172A (application GB08219569A)
- Authority
- GB
- United Kingdom
- Prior art keywords
- pattern
- sequence
- store
- job sequence
- robot
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1694—Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
- B25J9/1697—Vision controlled systems
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/37—Measurements
- G05B2219/37572—Camera, tv, vision
Landscapes
- Engineering & Computer Science (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Manipulator (AREA)
Abstract
A robot control system, by means of which a robot can be programmed by demonstration to repeat a job sequence, comprises a camera 1 which senses a sequence of patterns during the demonstration; these patterns are labelled and stored in a job sequence store 5 and a disc store 6. A scoring unit 4 compares subsequent incoming images, sensed through the camera 1, with the stored and labelled images to identify a particular required object, and generates error signals which are passed to mechanical servos provided on an arm 2 of the robot to cause movement thereof in accordance with the stored job sequence. There can be two arms and two cameras.
Description
SPECIFICATION
A robot control system
This invention relates to a robot control system.
Robots, with arms capable of being moved to specified co-ordinates in accordance with a program, are already known. Robots incorporating pattern recognition devices are also known. These devices are capable of recognising unknown patterns, which may be images of solid objects, by generating information indicative of the unknown pattern for comparison with stored information indicative of known patterns. The comparison yields a "score" indicating the extent to which the unknown pattern resembles each of the known patterns for which information is stored, and the unknown pattern is identified with the known pattern which it most resembles, provided that the relevant score is sufficiently good.
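By way of illustration only, the following sketch shows the kind of highest-score classification described above. The use of normalised cross-correlation as the scoring measure, the threshold value and all names are assumptions made for the example and are not taken from the specification.

```python
import numpy as np

def best_match(unknown, stored_patterns, threshold=0.8):
    """Compare an unknown image with stored, labelled patterns and return the
    label of the best-scoring one, provided the score is sufficiently good
    (illustrative sketch only; the scoring measure is assumed)."""
    best_label, best_score = None, -1.0
    for label, pattern in stored_patterns.items():
        # Normalised cross-correlation stands in for the unspecified "score".
        a = (unknown - unknown.mean()) / (unknown.std() + 1e-9)
        b = (pattern - pattern.mean()) / (pattern.std() + 1e-9)
        score = float((a * b).mean())
        if score > best_score:
            best_label, best_score = label, score
    if best_score < threshold:
        return None, best_score   # no known pattern resembles it well enough
    return best_label, best_score
```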
However, in order to be economical, a robot should be able to work at speeds which are at least comparable with those attainable by a human worker and the pattern recognition techniques employed heretofore in the field of robotics may not operate at a sufficiently high speed to enable full use of the robot's capabilities to be made.
Another problem which may arise is that an operator will generally require a substantial knowledge of computer techniques before he is capable of programming the robot to perform even a simple operation.
It is therefore an object of the present invention to provide an improved robot control system which substantially alleviates at least the afore-mentioned problems.
According to the invention there is provided a robot control system, wherein a robot can be programmed by demonstration to repeat a job sequence, comprising a pattern recognition device including means for sensing and labelling image and position co-ordinates of known objects in various attitudes and various manipulations, means for storing the sensed and labelled images, means for searching for and recognising an image of a particular required object at a certain stage of the job sequence by comparing the image of the required object with the sensed and labelled image thereof, and means for manipulating the required object to a desired location in accordance with the job sequence by the use of reference positions in a set of chosen co-ordinates, in which the robot works during the job sequence.
The robot preferably senses the images shown by an operator via a vidicon television camera or, alternatively, a flying spot scanner, using eight variables, which have been shown to work in simulation therewith. It may be convenient to mount the camera on an arm of the robot which is to perform an operation, or alternatively the camera may be fixed.
The co-ordinates in which the robot works need not be expressed in any absolute dimensions, as any arbitrary scale that is convenient may be used. For example, if it were required to insert a bolt into a hole, visual recognition and location of the hole or its surroundings would provide one set of co-ordinates, visual recognition and location of the bolt in an artificial hand of the robot would provide a second set of co-ordinates, and the difference between these co-ordinates would provide the necessary data for the servo control of the robot.
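As a minimal illustration of this bolt-and-hole example, the servo data is simply the difference between the two visually obtained positions, expressed in the same arbitrary co-ordinates. The names and numerical values below are invented for the sketch.

```python
def servo_error(hole_xy, bolt_xy):
    # Both positions are in the same arbitrary (unitless) image co-ordinates;
    # only their difference matters to the servo control of the robot.
    return (hole_xy[0] - bolt_xy[0], hole_xy[1] - bolt_xy[1])

# hole located at (120, 85), bolt in the artificial hand located at (97, 80)
dx, dy = servo_error((120, 85), (97, 80))   # -> (23, 5)
```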
Three modes of operation, i.e. a search mode, a recognition mode and a manipulation mode, are attainable with the control system in accordance with the present invention.
The preferred type of recognition mode is similar to that described in Patent No. 1,234,941, but including some modification to provide a substantially more automatic program procedure. The search mode and the manipulation mode utilised in the present invention are both extensions of the above-mentioned patent.
It is assumed that, in normal operation, objects which are either components to be included in an assembly or things which require some operation as part of a manufacturing process will be placed within reach of the robot.
In order to proceed the robot must find the correct object, in accordance with the job sequence, and discern what attitude it is in.
The recognition logic of Patent No. 1,234,941 can recognise an object as seen from one point of view and can allow for perspective effects of a single plane. Thus, in practice, most solid objects could be turned through ±20° away from the original view and still be correctly recognised. However, this is only an approximate solution, as the perspective allowed for is only that associated with a single plane. In general, a plane containing important features can be selected through the object and the distortion of those parts of the object outside the plane can be ignored, for recognition purposes, provided the total tilt does not exceed, say, ±20°.
As long as the lengths or diameters of the objects are significantly different, it is quite adequate for recognition purposes to store four or five images of each object at chosen angles to the axis thereof.
The above-mentioned patent provides a method which can be arranged to accommodate a ±45° rotation about the camera axis, together with all perspective effects of the predominant plane of the object, apart from small errors at the end thereof.
Thus, with five stored views of each object, the present control system can recognise an object in any attitude.
In order to carry out this recognition procedure, the robot starts with an image from the camera which contains at least a large part of the object, preferably towards the centre of the field.
At a particular stage in the assembly or fabrication process, in accordance with the job sequence, the system is required to recognise only one particular object, for example a particular type of bolt. The recognition system need therefore only classify two types of object - the correct type and all wrong types.
In order to recognise the correct object from all others, the system may require images of the others for comparison. It is preferable that these should each be given an indicium of some kind, for example, a name or a number, so that, if the system is later required to select a different object, it can make use of some of these comparison patterns.
An iterative teaching process is used, wherein the objects are shown, in turn, to the robot until it is able to distinguish between each object.
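The iterative teaching process may be pictured roughly as follows. This is a sketch under the assumption that some classifier improves as further labelled views are stored; the round limit and all names are illustrative.

```python
def teach(objects, store, classify, max_rounds=20):
    """Show each labelled object in turn; whenever the system confuses one
    object with another, store an extra labelled view of it, and repeat
    until every object is distinguished (or the round limit is reached)."""
    for _ in range(max_rounds):
        confusions = 0
        for label, view in objects:              # operator shows each object in turn
            if classify(view, store) != label:   # confused with some other object
                store.setdefault(label, []).append(view)
                confusions += 1
        if confusions == 0:
            break                                # all objects now distinguished
    return store
```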
The television camera used in the present invention is provided with two rasters, which scan the image in accordance with certain parameters, similar to those used in Patent No. 1,234,941.
During the search mode, a first raster remains locked on to an object being worked on, using error signals to hold an image of the part seen in registration.
A second raster is locked on to a tool or the object being added to the assembly. The image of this is kept in registration by means of error signals derived from the difference between the image of the tool (or other object) and a stored image, by moving the arm of the robot so as to correct the errors.
During the teaching phase an operator causes the second raster to move and be altered so as to view those positions to which he wishes the tool to go.
Once the process has started, small movements can be achieved simply by altering the scan raster.
Larger movements may involve changes of attitude beyond those which can be simulated by changing the raster, or the size of the raster may become either too big or too small to be convenient. It is also possible that parts of other objects may obscure direct vision. To overcome these difficulties the system is arranged so that it automatically stores new patterns.
In order to make the error signals produce the correct arm motions, it is necessary to know the approximate positions of the arms and a small subsidiary computer is provided to carry out the necessary trigonometry. Amplitudes need not be very accurate, but it is important to get the correct sign for each component.
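The role of the subsidiary computer can be illustrated by a rough image-error-to-joint-motion mapping. The matrix values below are invented and serve only to show that the signs, rather than the amplitudes, are what matter.

```python
import numpy as np

# Invented, approximate mapping from a 2-D image error to three joint
# increments. The amplitudes are rough; the visual servo loop corrects the
# magnitude, but each component must carry the correct sign.
J_approx = np.array([[ 0.5,  0.0],    # joint 1 mostly shifts the image in x
                     [ 0.0, -0.7],    # joint 2 mostly shifts the image in y
                     [ 0.1,  0.1]])   # joint 3 couples weakly into both

def joint_correction(image_error_xy):
    """Map an (x, y) image error to small joint motions (sketch only)."""
    return J_approx @ np.asarray(image_error_xy, dtype=float)
```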
A pattern that has just been stored should be substantially identical, apart from random noise, with the incoming television signal.
The operator should place the indicium (e.g. name or number) of the tool that should be in use on a register, so that new stored patterns are always labelled and so that, if another pattern which is the image of some other tool starts to fit better than the correct image, the error can be readily detected by noting the disagreement between the labels.
The parameters associated with each pattern in the store are also attached to the pattern. These parameters include the relative position of the raster concerned to the other raster being used as a reference position on the object being worked on.
In the search mode, the first raster is locked on to a prominent feature, such as the corner of a tray from which components are to be taken. The second raster is then made to assume, in sequence, each of a series of different positions relative to the first raster. This series of positions is originally chosen by the operator. A scoring mechanism is provided, which detects a sufficiently good fit with a stored pattern, and then this pattern is selected and scanned in order to produce error signals indicating how to improve its matching with the incoming television signal. Over a limited range, these error signals are allowed to act so as to improve said matching by alteration of the selected stored pattern.
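A skeleton of this search procedure might look as follows. The callables and the threshold are stand-ins for the camera, the scoring mechanism and the operator-chosen tolerances, none of which are specified numerically in the description.

```python
def search(candidate_offsets, grab_patch, score_against_store, threshold):
    """Step the second raster through a series of operator-chosen positions
    relative to the locked first raster, and stop at the first position whose
    best stored-pattern score is good enough (illustrative sketch only)."""
    for offset in candidate_offsets:             # series originally chosen by the operator
        patch = grab_patch(offset)               # image seen by the second raster
        label, score = score_against_store(patch)
        if score >= threshold:                   # sufficiently good fit detected
            return offset, label, score          # this pattern is then servoed into register
    return None                                  # nothing recognised at any candidate position
```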
From then on, location and identification is effected as described in Patent No. 1,234,941, except that it is desirable to automate the learning procedure to a greater degree. This is achieved by providing the operator with means of adjusting the second raster to a suitable size and bringing the object to be recognised into view with the register loaded with the appropriate indicium. He then releases the pattern to move under its own servo action.
From the patterns already in the store, the one with the highest score provides the error signals for position. If the position moves towards the position indicated by the operator as the correct one, the pattern is allowed to settle. The height and width servo is now allowed to change the pattern and, if it remains within certain tolerances, the parallelogram is then allowed to servo. This process is continued until all the parameters have been put under servo control. If, during this process, the raster starts to move out of the agreed tolerances for position, size, parallelogram distortion, etc., then a new pattern is stored. That new pattern is marked with a normalising signal, which causes the system to jump to the position and size selected by the operator.
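The staged release of parameters to servo control can be summarised in outline as below. The grouping of parameters follows the description, while the object interface and the step limit are assumptions made for the sketch.

```python
def settle(pattern, servo_step, within_tolerance, store_new_pattern, max_steps=1000):
    """Release groups of raster parameters to servo control one at a time;
    if the raster drifts outside the agreed tolerances, store a new pattern
    marked with a normalising signal (illustrative sketch only)."""
    for group in ("position", "height_width", "parallelogram"):
        for _ in range(max_steps):
            if pattern.settled(group):
                break
            servo_step(pattern, group)             # let this group of parameters servo
            if not within_tolerance(pattern):      # outside the agreed tolerances
                return store_new_pattern(pattern)  # new pattern, with normalising signal
    return pattern                                 # all parameters under servo control
```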
If the pattern settles under servo control but has the wrong indicium, then another with the correct indicium will be stored.
This routine is now tested by moving the raster away from the correct position and checking that it returns under servo control and identifies the correct pattern by the highest score. If it does not, then the necessary extra patterns must be inserted. This routine can be provided automatically under computer control, which moves the raster away from the correct position by perhaps half its width in all directions in order to check for correct servo action and identification.
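The automatic check can be pictured as displacing the raster by about half its width and height in each direction and confirming that the servo returns it with the correct identification. This is a sketch; the interface is assumed.

```python
def self_test(expected_label, classify_and_servo, width, height):
    """Perturb the raster by roughly half its width/height in all directions
    and verify it servoes back and identifies the expected pattern each time
    (illustrative sketch only)."""
    offsets = [(dx, dy) for dx in (-width // 2, 0, width // 2)
                        for dy in (-height // 2, 0, height // 2)
                        if (dx, dy) != (0, 0)]
    for dx, dy in offsets:
        label, returned = classify_and_servo(dx, dy)
        if not returned or label != expected_label:
            return False        # extra patterns must be inserted
    return True
```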
It can now be appreciated that the present invention has a substantial number of advantages, which include that no knowledge of computer programming is required, because the operator can easily programme the robot to perform a sequence of movements by merely demonstrating the sequence himself.
The present invention can therefore also enable a robot to be programmed to do a simple operation in about the time it takes a man to demonstrate that operation and thereafter to repeat it at substantially the same speed.
Moreover, the control system in accordance with the invention can be made from standard integrated circuits, so that the system can be sold at a reasonable price.
The invention will now be further described by way of example only with reference to the accompanying drawing, the single figure of which shows a block circuit diagram of one embodiment of the present invention.
Referring to the figure, a television camera 1 has a programmable scan raster and in general the raster has two properties which make it different from a normal camera. These properties are the ability to scan two areas of the light sensitive screen with alternate frames, and a preferable option of scanning each area with lines in two directions at right angles, or at least at very different angles.
A mechanical arm 2 is provided with as many degrees of freedom as are considered necessary for the jobs it must do. The arm 2 also has rough position sensors, which are connected to a computer 3 and from which it is possible to compute the relationship between a required change in an image and a particular movement of individual joints of the arm.
That is to say it is possible to compute how the arm 2 should move in order to correct any particular combination of error signals.
The system works in three modes of operation, i.e. a search mode, a recognition mode and a manipulation mode, by using stored patterns. An operator generates control signals, conveyed by a line 8, to determine the original sequence of stored patterns, since they represent views which are rather like frames from a cine-film showing the sequence of events during the assembly or fabrication process, or during successive stages of recognition. However, fewer frames are required in the present invention than are required in a cine-film, as each frame represents an image which can be altered to allow for motion, including perspective.
A scoring unit 4 is a special computer which holds the patterns in a set of dynamic buffer stores. A number of patterns, say 64, are available in these stores and parallel scoring devices are at all times comparing all the patterns with the incoming frame from the camera 1.
At the same time, logic built into the scoring unit 4 provides error signals, in the manner described in Patent No. 1,234,941, which are either sent to the mechanical servos on the arm 2 or to the camera 1.
The computer 3 is needed to change error signals from picture co-ordinates to arm space co-ordinates when the arm is receiving the error signals.
The operator originally controls the television raster during the teaching phase. The first and last values for all parameters such as "x" position, "y" position, vertical scale, horizontal scale, etc. are recorded as part of the label on each pattern, together with the name of the object concerned and an arbitrary number for the purpose of identifying it in the store.
The same pattern may be found to be useful at a later stage in the sequence of operations, so it may be convenient to provide an additional store (not shown in the figure) to which the labels may be transferred, so that different instructions may be associated with any one pattern on different occasions. Thus, there may be several entries during the sequence of operations for each stored pattern.
When the arm 2 is being controlled by the system, the pattern with the highest score gives the values for the raster position and dimensions, which are then interpolated between the first and the last values so as to provide the camera raster with smoothly changing parameters. The position is usually with respect to another raster, this latter raster being locked on to a reference pattern holding the position of a prominent local feature, which is used as a reference point.
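The interpolation of raster parameters between the recorded first and last values might be sketched as follows. A linear law is assumed purely for illustration, since the description does not specify one, and the parameter values are invented.

```python
def interpolate_raster(first, last, t):
    """Blend each raster parameter between its recorded first and last value,
    with t running from 0.0 to 1.0 over the move (sketch; linear law assumed)."""
    return {key: (1.0 - t) * first[key] + t * last[key] for key in first}

first = {"x": 10.0, "y": 40.0, "h_scale": 1.0, "v_scale": 1.0}   # invented values
last  = {"x": 55.0, "y": 12.0, "h_scale": 0.6, "v_scale": 0.6}
halfway = interpolate_raster(first, last, 0.5)   # smoothly changing camera raster
```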
Thus, during normal servo operations, two lots of information, i.e. servo information and parameters defining the second raster, go to the camera 1 on alternate frames, together with the relative position of the two rasters.
The servo information holds the reference pattern in position against the image of the reference point.
The parameters defining the second raster are simply the operator's instruction reproduced, and they provide co-ordinate data to move the second raster in relation to the first raster. Thus, any motion of the camera, accidental or otherwise, does not affect the relative position of the two rasters.
Servo error signals derived from the second raster are passed to the mechanical servos, which are thus not affected by small movements of the camera.
This procedure has the advantage that it is largely self testing since the operator is made aware at once if the arm servos fail to respond to their error signals as he intends. The system will closely repeat its operation as the interpolation can be made to reproduce the operator's original values very accurately.
A convenient method of passing the information to the system, so as to alter the raster in the camera during the teaching mode, is to provide a dummy object to represent the tool or the piece which is to be assembled. For example, to mount a ball on the end of lazy tongs, first the lazy tongs may be attached to a pillar in front of the operator. By holding the ball and moving it in three dimensions relative to the pillar, the operator can express the intention of moving the tool in the corresponding direction relative to the camera. Change of length of the lazy tongs causes the image in the camera to change size. Change of direction from side to side produces an "x" shift of the image and an up or down motion produces a "y" shift. A switch enables the connection to be broken if the operator wishes to continue the motion beyond that which can be initiated by the lazy tongs.
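The correspondence between movements of the ball on the lazy tongs and changes to the raster could be summarised as below. The gain constant is an invented "fine control" factor, and the sign conventions are chosen arbitrarily for the sketch; neither is taken from the description.

```python
def tongs_to_raster(extension, side, up_down, gain=0.2):
    """Translate the operator's movement of the dummy object into raster
    parameter changes (sketch). Lengthening the tongs changes the apparent
    size (range); side-to-side motion gives an "x" shift; up/down a "y" shift."""
    return {
        "size_change": gain * extension,   # change of tongs length -> change of image size
        "x_shift":     gain * side,
        "y_shift":     gain * up_down,
    }
```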
Movement of the ball on the lazy tongs need not be on the same scale as the motion of the tool. It can be helpful to have a "fine control" in which a large movement of the ball gives a small movement of the tool.
The ball itself is rotatable. Two modes are provided for imparting rotation to the tool.
A rotation about an axis perpendicular to its long axis, if it has one, can be imparted by changing the raster in the camera so as to simulate the desired movement, which is the same method as for "x" and "y" positions and for range.
If it is necessary to rotate about the axis of a long thin object, this method is not applicable and an alternative mode is provided in which the error signals are sent via the computer 3 to the arm 2 without using the camera 1 for this purpose. During this mode of operation the system stores new patterns and the labels of these record the signals which were sent to the servos. However, the process needs to be controlled with direct feedback from the arm or with tactile feedback, as visual feedback is not adequate.
The job sequence store 5 records the sequence of patterns used in the teaching routine. When the system is in its manipulation mode, the normal sequence of patterns involves each pattern using a different sized raster from the one before. Furthermore, the transfer from one pattern to another uses information stored in the job sequence store 5 to give the identity of the pattern and its position and size. As soon as the transfer is complete, the highest score system is used to check that the correct workpiece or tool is in view. A warning is sounded if the number on the pattern with the highest score differs from the number given by the job sequence store 5.
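In outline, a transfer from one pattern to the next under the job sequence store 5, with the highest-score check and the warning, might look like this. The data layout and the interface of the scoring unit are assumptions made for the sketch.

```python
def advance(job_sequence, step, scoring_unit, warn):
    """Load the identity, position and size stored for the next step, then use
    the highest score to confirm the correct workpiece or tool is in view
    (illustrative sketch only)."""
    expected = job_sequence[step]          # e.g. {"id": 12, "pos": (40, 60), "size": (32, 32)}
    scoring_unit.set_raster(expected["pos"], expected["size"])
    best_id, _score = scoring_unit.highest_score()
    if best_id != expected["id"]:
        warn("pattern %s seen where %s was expected" % (best_id, expected["id"]))
    return expected
```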
In general the highest score determines the next pattern to be used. On some occasions, particularly in the recognition and search modes, the next pattern chosen may not be the next pattern in the sequence. This does not generally matter provided the one chosen is substantially close to it.
On some occasions part of a sequence will be repeated in a later sequence on the same job, for example, many parts which are all alike may be taken from a tray to be put into different places on an assembly.
At the end of the repeated sequence the system must be prevented from repeating the first sequence, which it would do if the highest score routine were followed. This may be carried out by marking those patterns which must be followed by the pattern indicated by the job sequence store 5. This mark is deposited when the pattern is used for the second time and the sequences diverge.
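The divergence rule can be stated compactly: normally the highest score chooses the next pattern, but a pattern marked on its second use must instead be followed by the pattern dictated by the job sequence store 5. The names below are illustrative.

```python
def next_pattern(current, highest_score_choice, job_sequence_choice, marked):
    """Follow the highest score unless the current pattern is marked, in which
    case the job sequence store dictates the successor so that the repeated
    sub-sequence is not re-entered (illustrative sketch only)."""
    if current in marked:
        return job_sequence_choice(current)
    return highest_score_choice(current)
```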
A disc store 6 feeds the scoring unit 4 with patterns well in advance of when they are needed.
During the teaching phase the patterns which are first recorded from the camera 1 in the scoring unit 4 are passed to the disc store 6 so that they can be reproduced when needed. The job sequence store 5 indicates which ones are going to be needed.
The operator can ask for a particular pattern in the store 5 to be used as the next one in a sequence by entering the number on a keyboard (not shown in the figure). A monitor 7 displays any pattern called for by the operator so that he can inspect parts of any past routine.
If a routine is already learnt by the system the operator may ask for it as a subroutine by putting the first number of that routine on the pattern in use at the time the subroutine is to be entered. For this to work the arm 2 must be in a suitable position so that it starts the routine correctly.
In certain limited circumstances it is possible to program using mainly existing subroutines, which would be called upon by number.
As a programming facility, the operator may ask for a different sized raster in a new position at any time and with the system in any mode. When this is done, the identity of the new pattern to be used, and the size and position of the raster to be used with it, are recorded in the job sequence store 5 at the appropriate place in the sequence.
Once the data from the job sequence store 5 has been used to determine the pattern and its initial position, the scoring system is given power to decide if this pattern is to be kept for use in providing servo error signals, and the servo error signals are given control of keeping the image in registration, either by moving the camera raster within the camera or by moving the arm with the tool or other object on it.
The camera 1 may be attached to the mechanical arm 2, so as to move with the arm, or alternatively, it may be fixed, as in the present embodiment.
Preferably there should be two arms and two cameras provided on the robot, as many jobs require the use of two hands.
CLAIMS (Filed on 17.6.83.)
1. A robot control system, wherein a robot can be programmed by demonstration to repeat a job sequence, comprising a pattern recognition device including means for sensing and labelling image and position co-ordinates of known objects in various attitudes and various manipulations, means for storing the sensed and labelled images, means for searching for and recognising an image of a particular required object at a certain stage of the job sequence by comparing the image of the required object with the sensed and labelled image thereof, and means for manipulating the required object to a desired location in accordance with the job sequence by the use of reference positions in a set of chosen co-ordinates, in which the robot works during the job sequence.
2. A robot control system as claimed in claim 1 wherein said means for sensing images comprises a television camera having a programmable scan raster, which can scan first and second areas of a light sensitive screen of the camera during alternate frames.
3. A robot control system as claimed in claim 2 wherein said television camera includes means for scanning each of said areas with first and second lines substantially orthogonal to each other.
4. A robot control system as claimed in any preceding claim wherein said means for storing the sensed and labelled images includes a job sequence store, in which a sequence of patterns, sensed during said demonstration, is stored, each of said patterns having a label stored therewith.
5. A robot control system as claimed in any preceding claim wherein said means for searching for and recognising an image of a particular required object comprises a scoring unit which generates error signals to control movement of the robot, in accordance with the job sequence.
6. A robot control system as claimed in any preceding claim including display means for displaying a selected part or parts of the job sequence.
7. A robot control system substantially as herein described with reference to the accompanying drawing.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB08219569A GB2123172B (en) | 1982-07-06 | 1982-07-06 | A robot control system |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB08219569A GB2123172B (en) | 1982-07-06 | 1982-07-06 | A robot control system |
Publications (2)
Publication Number | Publication Date |
---|---|
GB2123172A (en) | 1984-01-25
GB2123172B GB2123172B (en) | 1986-10-22 |
Family
ID=10531507
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
GB08219569A Expired GB2123172B (en) | A robot control system | 1982-07-06 | 1982-07-06
Country Status (1)
Country | Link |
---|---|
GB (1) | GB2123172B (en) |
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB1534167A (en) * | 1975-08-20 | 1978-11-29 | Bendix Corp | Method and apparatus for transferring parts |
GB1518244A (en) * | 1975-11-28 | 1978-07-19 | Bendix Corp | Method and apparatus for calibrating mechanical visual part manipulation system |
GB2063514A (en) * | 1978-05-26 | 1981-06-03 | Auto Place Inc | Programmable robot with video system |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP0301586A1 (en) * | 1987-07-29 | 1989-02-01 | Phoenix Software Development Co. | Vision system and method for automated painting equipment |
US4941182A (en) * | 1987-07-29 | 1990-07-10 | Phoenix Software Development Co. | Vision system and method for automated painting equipment |
US7200260B1 (en) * | 1999-04-08 | 2007-04-03 | Fanuc Ltd | Teaching model generating device |
Also Published As
Publication number | Publication date |
---|---|
GB2123172B (en) | 1986-10-22 |
Similar Documents
Publication | Title |
---|---|
US6763284B2 (en) | Robot teaching apparatus | |
CN108724179A (en) | control device, robot and robot system | |
US4891767A (en) | Machine vision system for position sensing | |
CN108177162B (en) | The interference region setting device of mobile robot | |
CN105518486B (en) | The system and method for following the trail of the orientation of movable object object | |
US4305130A (en) | Apparatus and method to enable a robot with vision to acquire, orient and transport workpieces | |
US4831548A (en) | Teaching apparatus for robot | |
RU2587104C2 (en) | Method and device to control robot | |
US20090234502A1 (en) | Apparatus for determining pickup pose of robot arm with camera | |
US4658193A (en) | Sensing arrangement | |
JP5427566B2 (en) | Robot control method, robot control program, and robot hand used in robot control method | |
JPH012882A (en) | Robot control method | |
GB2123172A (en) | A robot control system | |
Zhang et al. | Vision-guided robotic assembly using uncalibrated vision | |
Graefe et al. | The sensor-control Jacobian as a basis for controlling calibration-free robots | |
US7764386B2 (en) | Method and system for three-dimensional measurement and method and device for controlling manipulator | |
JPS63196358A (en) | Work line following method | |
Kent et al. | Eyes for automatons: Faster, more accurate, and more consistent than humans can hope to be, machine vision systems are on their way to broad application | |
JPH01193902A (en) | Coordinate calibration method for robot with terminal visual sense | |
US20240001533A1 (en) | Display system and teaching system | |
RU2800443C1 (en) | Method of object manipulation | |
JP2915979B2 (en) | Position and rotation angle detecting device, pointing device thereof, and robot operation teaching device using the same | |
JP2519444B2 (en) | Work line tracking device | |
JP2021003782A (en) | Object recognition processing device, object recognition processing method and picking apparatus | |
Freeman | Survey of image processing applications in industry |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PCNP | Patent ceased through non-payment of renewal fee |