WO2007085330A1 - Method and system for supervising a work area including an industrial robot - Google Patents

Method and system for supervising a work area including an industrial robot

Info

Publication number
WO2007085330A1
Authority
WO
WIPO (PCT)
Prior art keywords
marking
robot
changes
images
human
Prior art date
Application number
PCT/EP2006/069682
Other languages
English (en)
Inventor
Torgny BROGÅRDH
Original Assignee
Abb Ab
Priority date
Filing date
Publication date
Application filed by Abb Ab filed Critical Abb Ab
Publication of WO2007085330A1

Classifications

    • F - MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F16 - ENGINEERING ELEMENTS AND UNITS; GENERAL MEASURES FOR PRODUCING AND MAINTAINING EFFECTIVE FUNCTIONING OF MACHINES OR INSTALLATIONS; THERMAL INSULATION IN GENERAL
    • F16P - SAFETY DEVICES IN GENERAL; SAFETY DEVICES FOR PRESSES
    • F16P3/00Safety devices acting in conjunction with the control or operation of a machine; Control arrangements requiring the simultaneous use of two or more parts of the body
    • F16P3/12Safety devices acting in conjunction with the control or operation of a machine; Control arrangements requiring the simultaneous use of two or more parts of the body with means, e.g. feelers, which in case of the presence of a body part of a person in or near the danger zone influence the control or operation of the machine
    • F16P3/14Safety devices acting in conjunction with the control or operation of a machine; Control arrangements requiring the simultaneous use of two or more parts of the body with means, e.g. feelers, which in case of the presence of a body part of a person in or near the danger zone influence the control or operation of the machine the means being photocells or other devices sensitive without mechanical contact
    • F16P3/142Safety devices acting in conjunction with the control or operation of a machine; Control arrangements requiring the simultaneous use of two or more parts of the body with means, e.g. feelers, which in case of the presence of a body part of a person in or near the danger zone influence the control or operation of the machine the means being photocells or other devices sensitive without mechanical contact using image capturing devices
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B25 - HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J - MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1674Programme controls characterised by safety, monitoring, diagnostic
    • B25J9/1676Avoiding collision or forbidden zones

Definitions

  • The present invention relates to a method and a system for supervising a work area including an industrial robot having at least two defined security levels.
  • the robot is often placed in a robot cell.
  • the robot cell encloses a dangerous area in which there is a risk of collisions with the robot.
  • the robot cell is often enclosed by a fence having a gate, or by light barriers.
  • Different security levels are applied if a human is outside or inside the robot cell. If a human enters the robot cell, the security level is increased and certain safety rules are applied.
  • One safety rule is that the safety equipment of a portable operator control unit, generally denoted a Teach Pendant Unit, must function within the robot cell, and the controller must be in a safety state in which the robot is not allowed to run faster than 250 mm/sec. At the same time, the operator must use a three-position safety switch on the Teach Pendant in order to be able to run or create programs.
  • the robot cell is provided with detecting means detecting when a person is entering or leaving the cell. If a fence and gates enclose the robot cell, it is, for example, detected when the gates are opened and closed. Thereby, it is possible to change the security level of the robot in dependence on whether there is a human visiting the cell or not.
  • the object of the present invention is to provide a solution to the above-mentioned problem on how to safely supervise a work area including an industrial robot without using fences and gates.
  • This object is achieved with a method as defined in claim 1.
  • At least one visible marking indicating a potentially dangerous region is provided in the vicinity of the robot, the marking having at least one unique feature relative to other objects in the dangerous region.
  • The method comprises, repeatedly during operation of the robot: capturing at least one image of the potentially dangerous region including the marking, detecting changes in the marking based on the image, and deciding whether the security level of the robot should be changed based on the detected changes.
  • the visible marking marks a region of a certain security level.
  • the marked region may, for example, correspond to the reach of the robot.
  • the security level must be changed if a human crosses the marking. If the human enters the dangerous region, the security level must be increased, and if the human leaves the dangerous region the security level must be decreased.
  • the marking is, for example, in the form of a line or a chain of patterns on the floor of the work area.
  • the marking can, for example, be painted on the floor, consist of tape or coloured sheets of a material, or even be part of the floor as a mosaic pattern.
  • the colour and shape of the marking must be such that the contrast between the marking and its background is good enough for the pattern recognition software.
  • the marking must have at least one unique feature, for example colour or shape, relative to other objects in the dangerous region in order to ensure that the pattern recognition does not mix up the marking with any other object, for example a cable, in the region.
  • The visible marking has two functions. One function of the marking is that it outlines dangerous parts of a work area, with respect to risk of collision with a moving robot, so that people in the work area will know where the dangerous parts are located.
  • The other function of the marking is to be used for pattern recognition in order to detect if a human is entering or leaving the dangerous region. Thus, it is possible to detect if a human gets inside a potentially dangerous region and if so to increase the security level of the robot. Different levels of security mean, for example, a reduction of the speed of the robot, a limitation of the operating range of the robot, or a stop of the robot upon detecting that a human is crossing the marking.
  • The main idea of the present invention is that when the images are processed, only properties of the marking are used and information on the surroundings is discarded, whereby simple and reliable algorithms with high redundancy and selectivity can be adopted.
  • the marking must have at least one unique feature relative to other objects in the dangerous region. That makes it possible to distinguish the marking from the other objects in the region with a very high reliability. If a human is crossing the marking, parts of the marking will be missing.
  • changes in the marking are detected and based on the changes it is decided if the security level is to be changed or not. For example, changes of position, width, and length of the missing parts of the marking are detected.
  • Detecting changes in the marking includes: determining at least one parameter for the marking based on said images, and comparing the determined parameter with at least one previously determined parameter and based thereon detecting changes in the marking.
  • The parameter of the marking is based on one or more of the following properties: colour, area, length, width, centre of gravity position, pattern complexity measures, roundness, spatial frequency components, pattern domain descriptors, contrast relative to the surroundings, etc. Changes in those parameters carry valuable information enabling a safe detection of a human crossing the marking, and also information that can be used to detect malfunction of the hardware and software of the supervising system.
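  • As an illustration only (not taken from the patent), this parameter step can be sketched in a few lines of Python/numpy, assuming a binary mask of pixels already classified as belonging to the marking by colour segmentation; the descriptor set and the tolerance are illustrative assumptions:

```python
import numpy as np

def marking_parameters(mask: np.ndarray) -> dict:
    """Compute simple redundant descriptors of the visible marking pixels."""
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return {"area": 0.0, "cx": 0.0, "cy": 0.0, "width": 0.0, "length": 0.0}
    return {
        "area": float(xs.size),               # number of marking pixels
        "cx": float(xs.mean()),               # centre-of-gravity x
        "cy": float(ys.mean()),               # centre-of-gravity y
        "width": float(xs.max() - xs.min()),  # horizontal extent
        "length": float(ys.max() - ys.min()), # vertical extent
    }

def marking_changed(current: dict, baseline: dict, rel_tol: float = 0.02) -> bool:
    """Flag a change when any descriptor deviates beyond the noise level."""
    return any(abs(current[k] - v) > rel_tol * max(abs(v), 1.0)
               for k, v in baseline.items())
```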
  • the method includes determining, based on the detected changes in the marking, whether it is likely that the changes in the marking originate from a human crossing the marking and based thereon deciding whether the security level of the robot should be changed or not.
  • Changes in the marking can be caused by factors other than a human crossing the marking, for example wear of the marking, dirt on the marking, a small animal, such as a mouse or a cat, crossing the marking, another moving object inside the robot cell, such as the robot itself, crossing the marking, or a stationary object which has been put down on the marking.
  • Information obtained from the detected changes in the marking is used to judge whether it is likely that the changes originate from a human or whether it is impossible that the changes arise from a human. If it is likely, or at least cannot be ruled out, that the changes are caused by a human crossing the markings in a direction towards the robot, the security level is increased.
  • two images of the dangerous region seen from two different angles are captured and changes in the marking are detected based on both images.
  • An advantage of having two images is that they provide redundancy.
  • The two images can be used to detect errors in any of the cameras used to capture the images. Due to the redundancy a very high reliability is obtained.
  • This embodiment also makes it possible to determine the height of the object that causes the changes in the marking.
  • The method comprises: estimating the height of an object by registration of changes in the marking based on said two images, and determining based on the estimated height of the object whether or not it is possible that the changes in the marking originate from a human crossing the marking, and based thereon deciding whether the security level of the robot should be changed or not. If a tall object, such as a human, is crossing the marking, there will be a large difference between the two images in the positions of the missing parts of the marking. If the position of a missing part is the same in both images, the height of the object causing the missing part is zero. This is, for example, the case when the object is a liquid flowing on the floor, or when the missing part is due to dirt or wear of the marking.
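  • Under a simple pinhole assumption with both cameras at the same height H above the floor and separated by a baseline b, the two floor projections of the same occlusion edge are displaced by d = b·h/(H − h), so the object height is h = H·d/(b + d). A minimal sketch (names are illustrative, not from the patent):

```python
def object_height(gap_pos_cam1: float, gap_pos_cam2: float,
                  baseline: float, cam_height: float) -> float:
    """Estimate the height of the object occluding the marking from the
    floor positions of the same missing part as seen by two cameras at
    the same height (similar triangles). A disparity near zero means the
    change lies flat on the floor (liquid, dirt, wear)."""
    d = abs(gap_pos_cam1 - gap_pos_cam2)  # disparity of the gap on the floor
    return cam_height * d / (baseline + d)
```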
  • Slow 2-dimensional changes in the marking are detected based on previous and current images, and a warning is generated when the changes exceed a threshold value.
  • By 2-dimensional changes are meant changes caused by an object with no height. If the changes develop slowly and the height of the object causing the missing parts is approximately zero, slow 2-dimensional changes have been detected.
  • This embodiment detects if the visibility of the marking has been reduced due to wear and dirt and generates an alarm if the visibility of the marking has significantly decreased. Thus, the reliability of the method is increased.
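  • One possible implementation of this wear supervision (an illustrative sketch, not the patent's prescribed method) low-pass filters the visible marking area so that fast changes such as a passing human are ignored while slow degradation accumulates; the filter constant and warning threshold are assumptions:

```python
class WearMonitor:
    """Track slow 2-dimensional loss of marking visibility (wear, dirt)."""

    def __init__(self, reference_area: float, alpha: float = 0.001,
                 warn_ratio: float = 0.8):
        self.reference_area = reference_area  # marking area at calibration
        self.smoothed_area = reference_area
        self.alpha = alpha                    # slow filter ignores fast events
        self.warn_ratio = warn_ratio

    def update(self, flat_area: float) -> bool:
        """flat_area: marking pixels whose missing parts have ~zero height.
        Returns True when a cleaning/restoration warning should be raised."""
        self.smoothed_area += self.alpha * (flat_area - self.smoothed_area)
        return self.smoothed_area < self.warn_ratio * self.reference_area
```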
  • the marking or markings are colour-coded in at least two different colours.
  • A coloured marking is an advantage, both in terms of telling people the warning level of the marking and in making it possible to perform pattern recognition in different colours, which greatly increases the reliability of the camera-based supervision. It is possible to detect errors in the hardware and software of the system based on the different colours. Using more than one colour on the marking gives it a unique appearance, which makes it easier to distinguish the marking from the surroundings and further increases the reliability of the method.
  • At least two markings in the form of lines or a chain of patterns are provided at different distances from the robot.
  • This embodiment makes it possible to have areas with different security levels, which are marked with different markings. Which security level is applied depends on which type of marking the human crosses.
  • the colours and shapes of the markings can be made in many different ways, but they should be organized in such a way that the meaning of them is intuitively evident.
  • The appearance of the markings, for example their colour, will tell the people in the work area that another security level will be applied if they pass these markings.
  • A higher security level is determined if changes in a marking at a shorter distance from the robot are detected, and a lower security level is determined if changes in a marking line at a longer distance from the robot are detected. For example, if people pass a first line they will enter an area in which the robot may reach them, and the robot will be stopped if the robot arm is too close to them; if they pass a second line the robot will immediately stop independently of where the robot arm is located. This embodiment also makes it possible to supervise the position of the robot if one of the markings is positioned inside the working range of the robot.
  • the method further comprises: estimating the position of the robot based on said changes in the marking, comparing the estimated robot position with the robot position from the control system of the robot, and indicating a supervision error if the difference between the positions is larger than a threshold value.
  • the method further comprises: estimating the position of a human in the work area based on said changes in the marking, receiving information on the next planned movement of the robot, and determining whether or not the robot will move in a direction towards the human during the next movement, and based thereon deciding whether the next movement of the robot will be allowed or blocked.
  • This embodiment increases the security for a human visiting an area in the vicinity of the robot.
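  • A minimal sketch of such a movement gate, assuming the human position and the next programmed target are available as 2D floor coordinates in metres; the blocking rule and the margin are illustrative assumptions:

```python
import numpy as np

def allow_next_move(robot_pos, next_target, human_pos,
                    margin: float = 1.5) -> bool:
    """Return False (block the move) when the next target both approaches
    the human and ends up inside the safety margin."""
    robot_pos = np.asarray(robot_pos, dtype=float)
    next_target = np.asarray(next_target, dtype=float)
    human_pos = np.asarray(human_pos, dtype=float)
    dist_now = float(np.linalg.norm(human_pos - robot_pos))
    dist_after = float(np.linalg.norm(human_pos - next_target))
    moves_towards_human = dist_after < dist_now
    too_close = dist_after < margin
    return not (moves_towards_human and too_close)
```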
  • this object is achieved by a system as defined in claim 14.
  • Such a system comprises: at least one visible marking indicating a potentially dangerous region in the vicinity of the robot, the marking having at least one unique feature relative to other objects in the dangerous region, at least one camera adapted to repeatedly capture, during operation of the robot, images of the potentially dangerous region including said marking, and a computer unit adapted to detect changes in the marking based on said images, and to decide whether the security level of the robot should be changed based on the detected changes in the marking.
  • the object is achieved by a computer program product directly loadable into the internal memory of a computer or a processor, comprising software code portions for performing the steps of the method according to the appended set of method claims, when the program is run on a computer.
  • the computer program is provided either on a computer-readable medium or through a network, such as the Internet.
  • The object is achieved by a computer-readable medium having a program recorded thereon, where the program is to make a computer perform the steps of the method according to the appended set of method claims when the program is run on the computer.
  • Fig. 1 shows an example of a prior art robot cell seen from above.
  • Fig. 2 shows a robot cell seen from above using camera supervision according to a first embodiment of the invention.
  • Fig. 3 shows a robot cell according to a second embodiment of the invention.
  • Fig. 4 shows a part of the cell in figure 3 in perspective.
  • Fig. 5 shows a human entering the cell in figures 3 and 4 in a view as seen from the camera.
  • Fig. 6 shows a robot cell according to a third embodiment of the invention.
  • Fig. 7 shows a pattern of the robot cell in figure 6 after pattern recognition.
  • Fig. 8 shows the pattern in figure 6 after pattern restoration using 3D.
  • Fig. 9 shows the pattern in figure 6 after pattern restoration using a simple robot model.
  • Fig. 10a-b show an example of stereo measurements using two cameras.
  • Fig. 11 shows an example of software design for robot cell supervision.
  • Fig. 12 shows an example of a flow diagram for the robot cell supervision.
  • Fig. 13 shows an example of geometric redundancy tests of a pattern on the floor.
  • Fig. 14 shows an example of geometric redundancy tests of a 3D pattern.
  • Figs. 15a-b show examples of colour-coded markings.
  • Figs. 16a-c show examples of shape-coded markings.
  • Fig. 17 shows an example of a supervision system according to the invention.
  • Fig. 18 shows an example of how the supervision system according to the invention can also be used to improve the safety of the robot control.
  • Fig. 19 shows a further example of markings on the floor.
  • Figure 1 shows a typical prior art robot cell with an industrial robot 1, two work objects 2, 3, two positioners 4, 5, a cell loading/unloading space 6 and a fence 7.
  • the purpose of the fence is to prevent people from entering the robot cell when the robot is working.
  • the fence has two gates. Those gates work in such a way that as soon as one of the gates is opened the robot stops. As can be seen from the figure, considerable floor space is wasted because of the fence.
  • Figure 2 shows a possible way of using cameras in the cell of figure 1 to avoid fences and gates.
  • the cell equipment is the same as in figure 1 but the fence and gates have been replaced with four cameras 9a, 9b, 9c and 9d.
  • The broken line circle 8 corresponds to the reach of the robot and thus the cameras must detect if a human gets inside this circle. To do so, the camera lenses and the height of the camera mountings are selected in such a way that the cameras cover the critical parts of the scene as shown by the continuous circles 10a, 10b, 10c and 10d. In some way the people in the workshop must know where the dangerous parts of the work area are with respect to risk of collision with a moving robot. This is achieved by providing a visible marking indicating the dangerous parts of the work area.
  • In Figure 3 the dangerous parts of the work area are marked using elongated markings 11, 12 on the floor in the vicinity of the robot.
  • Markings 11, 12 can be painted on the floor or consist of tape or coloured sheets of material or even be part of the floor as a mosaic pattern. In an environment where dust and dirt would rapidly cover the markings, these can be kept free by air flow and even the nozzles of an air system can be used as markings.
  • a second marking 12 in the form of a circular line of a second colour, for example red, tells the people in the workshop that inside this red area the robot will immediately stop.
  • the colours and shapes of the markings can be made in other ways, but should be organised in such a way that the meaning of them is intuitively evident for the people working in the workshop.
  • the cameras 9a-d are connected to a computer unit 22 adapted to receive the images and to detect changes in the marking based on the images, and to decide whether the security level of the robot should be changed based on the detected changes in the marking line.
  • Figure 4 shows part of the cell in figure 3 in perspective. As exemplified in the figure the scene for the camera 9 can be quite complex and dynamic.
  • Inside the view circle 10 of the camera there is a conveyor 13 with a moving item 15. There is also a loading station 14 with varying height depending on how many objects are stacked on it, and sometimes with a bright reflection 15 on a mirror-like object surface. Moreover, there is oil 16, dirt 20 and a robot cable 21 on the floor, which will make a safe detection even more difficult.
  • Figure 5 shows the same part of the cell as in figure 4, but this time from the camera view. As can be seen, a human 17 is entering the robot cell walking on the floor 18 towards the robot 1.
  • the main idea with the camera-based supervision concept is that when the camera frames are processed, only the properties of the pattern on the floor are used, whereby simple and reliable algorithms with high redundancy and selectivity can be adopted.
  • The pattern does not need to consist of circular arcs, even if this is best for a robot with a circular workspace.
  • the markings in the case of a gantry robot should have a rectangular pattern to match the rectangular workspace.
  • the pattern recognition system will only detect the markings and not the other objects in the work area. As for the example shown in figure 6, the pattern recognition system will only detect circular formed markings with the colours yellow, red and blue. The scene detected will look as in Figure 7.
  • The circular markings 11a-c, 12a-b, 19 are interrupted by the objects 15, 16, 20 and 21 and by the arm 1a of the robot and by the legs of the human 17.
  • The objective of the camera-based safety supervision system is to safely detect the broken markings caused by the human, which means to distinguish the missing parts of the marking between the marking segments 11a:2, 11a:3 and 11a:4 from all other missing parts of the markings.
  • All the other missing parts of the markings 11a-c, 12a-b and 19 must be ruled out as being a human with very high probability to avoid false stops of the robot.
  • One way to increase the reliability when detecting the human or another safety-critical object is to use two cameras separated from each other as in a stereo vision setup. Then the flat and low-height objects 16, 20 and 21 can be separated from a human being, who is always above a certain height. Thereby, the missing parts 23-26 of the markings can be reconstructed and it can be ascertained with full confidence that the cause is not a safety-critical object such as a human, see Figure 8.
  • The object 15 in figure 5 has such a height that it could be a human, but since this object never moves and is static, the missing part between marking segment 11a:1 and marking segment 11a:2 can be ruled out, based on its time history, some time after the start of the camera supervision system.
  • If the object 15 starts moving it will be detected as a human, which is correct since all moving objects may be connected to a human and should therefore not be tolerated in the supervised area around the robot.
  • the low height objects 16, 20 and 21 could also be ruled out since they are static.
  • Two camera systems are needed to get redundancy with respect to both hardware and software. Having two camera systems, as much redundancy as possible should be made use of in order to find the difference between a human entering the robot area and objects already there.
  • Both their static and low-height properties should be used. This may be important, for example, if reflections or light spots on the floor that disturb the detection of the markings move because of changing light conditions, or if a dark liquid flows on the floor over a marking. Even if these are dynamic objects, the height detection will still rule them out as humans.
  • The blue marking 19 is used to detect the angle of the robot arm 1a, and an extrapolation with this angle in a radial direction from the centre of the robot will then give the possibly broken red and yellow markings; if such lacking markings are found, it can be concluded that this is because of the robot arm, and the missing marking parts 28a-d can also be ruled out as being a human.
  • One redundancy here is that the position of axis one of the robot can be used to verify that the angle is the same as detected by the blue markings. Of course, with more than one blue marking the redundancy is increased. It should also be pointed out that with two cameras in a 3D constellation the robot arms will give more complicated shadows on the markings, which further increases the probability of differentiating between a robot and, for example, a human.
  • the movement pattern of a robot arm is different from that of a human, which can be used to further increase the redundancy in the robot arm detection.
  • The marking used to measure the robot angle can have another colour, preferably not the same colour as the robot arm.
  • this situation can be handled by using multi-coloured markings as exemplified in Figure 15.
  • Figure 10a-b exemplifies how two close marking lines 11x and 11y are broken as seen from the two cameras 30a and 30b mounted above the floor, in between the inner yellow marking 11y and the centre of the robot.
  • The light rays 31, 32 corresponding to the ends of the human shadowing of the markings are shown for the left camera 30a and the right camera 30b.
  • the missing parts of the markings as seen from the two cameras are marked out in Figure 10b with 33 and 34 for camera 30a and 35 and 36 for camera 30b.
  • The positions of the missing parts can be calculated in a global coordinate system from a calibration of the cameras using the pattern on the floor.
  • the floor can be marked with unique symbols 29a, 29b, 29c known by the camera software and each camera can measure the distances, as 35, 36, 37 and 38, from the symbols to the missing parts of the markings and it can be checked if these distances relate to each other as expected from camera calibration.
  • Specific features of a human could be checked, for example whether the size and shape are realistic.
  • The most important feature for the detection of safety-critical objects like a human is motion, meaning dynamic changes in the missing parts of the markings and a deviation of the missing parts of the markings in relation to an initial static capture of the marking pattern.
  • Figure 11 shows an example of the software architecture of the camera-based safety system.
  • three cameras are mounted over the area to be supervised, one camera to the left 30a, one in the middle 30b and one to the right 30c.
  • A common clock 40 triggers the reading of camera frames into the respective memory partitions 39v, 39m and 39h. Then the images are separated into different colours and each colour is processed individually in processing units 41a-i. Of course, more colours can be used to increase the redundancy of the detection algorithms.
  • After 2D filtering, the marking geometry is detected either by binary or grey-scale methods, and properties of the geometric pattern domains are obtained, for example the width, length and radius of a circular marking.
  • Figure 12 shows a flow diagram of some of the actions of the software in figure 11.
  • The cameras are calibrated by means of the pattern on the floor, block 50. This can be done in different ways, for example by measuring the distances between markings and inputting these values to the camera system, or by putting a marking on the robot and moving this marking to different places in the workspace with the robot.
  • All the markings are recognized and their positions and parameters are calculated and saved, block 51. This can be done automatically if standard markings already programmed into the pattern recognition software are used. Data on the contrast of the pattern relative to the background are also detected and stored as a reference for supervising the condition of the pattern. When the markings become too dirty or have been partly destroyed, the system will give a warning and ask for cleaning/restoration of the pattern on the floor.
  • the initially determined static pattern parameters are stored for use during the workspace supervision, block 52.
  • The dynamic pattern recognition starts, block 53.
  • The continuously captured scene marking parameters are calculated after pattern recognition (in the same way as for the initial static pattern), and the parameters for the pattern geometries with their lacking parts are compared with the initial static pattern parameters stored after starting the system, block 54. Examples of parameters are marking segment colours, contrast, lengths and widths, and the position on the floor of the missing parts of the pattern. If there is no or only a very small parameter difference (the level decided by the noise level, which is calculated from several camera frames during the calculation of the initial static parameters), new scene frames will be captured and processed until there is a difference in the parameters, block 55.
  • If there is a parameter difference, block 55, and there is no robot arm detected, block 56, it is tested whether there are other moving objects, and if that is the case a comparison is made with the initially captured frame again and with earlier frames of the history of a moving object, block 62. If the change in pattern is large enough, a new alarm level is set, meaning that a safety-critical object is in the supervised area, block 64, and if the pattern where the critical object is detected is a red pattern, block 65, the camera supervision system orders the robot controller to stop the robot, block 59. If the pattern is not red, a further test is made of whether the robot is in the same area as the critical object, block 66, and if that is true the robot will be stopped, block 59; otherwise a new camera capture will be made.
  • A waiting level of the alarm is set, block 68, and after that it is tested how long the alarm waiting level has been active, block 69; if this is too long, it is decided that the moving object is not safety-critical and a new static pattern will be set, block 70.
  • The original static pattern, which is used for determining the degradation of the pattern, will not be changed. If the difference between the new static pattern and the original static pattern from camera calibration is above a certain level (test not shown in the figure), an error signal is given for restoring the markings. After restoration, a new start-up of the system is made, giving a new original static pattern.
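  • The flow of blocks 50-70 can be condensed into a supervision loop. In the sketch below every block is an injected callable, since the patent does not prescribe implementations; all names, the difference measure and the time-out are illustrative assumptions:

```python
import time

def supervise_loop(capture_params, robot_arm_detected, object_is_dynamic,
                   in_red_zone, robot_near_object, stop_robot,
                   noise_level: float, wait_limit_s: float = 5.0):
    """capture_params() returns a dict of current marking parameters; the
    other arguments are callables standing in for blocks of figure 12."""
    static = capture_params()                    # blocks 50-52: calibration
    waiting_since = None
    while True:
        current = capture_params()               # block 53
        diff = max((abs(current[k] - static[k]) for k in static),
                   default=0.0)                  # block 54
        if diff <= noise_level:                  # block 55: no change
            waiting_since = None
            continue
        if robot_arm_detected(current):          # block 56: arm explains gap
            continue
        if object_is_dynamic(current):           # block 62: moving object
            if in_red_zone(current) or robot_near_object(current):  # 65, 66
                stop_robot()                     # block 59
        elif waiting_since is None:              # block 68: waiting level
            waiting_since = time.monotonic()
        elif time.monotonic() - waiting_since > wait_limit_s:  # block 69
            static = capture_params()            # block 70: new static pattern
            waiting_since = None
```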
  • Figure 13 exemplifies the redundancy used for tests with respect to measurements of the height of the cameras.
  • 46a and 47a are the camera lenses and 48a is the image plane of the cameras and 49a the floor plane with the pattern 50a, which in this simple 2D figure is just a line.
  • The distance between the cameras is 2*d and the length of the line on the floor is L.
  • Two redundant tests are exemplified, one for the height and one for the parameters of the offset between the pattern and the cameras in the horizontal plane.
  • residuals can be calculated and when the residuals are larger than the expected noise with respect to the camera measurements and the geometric model errors, the system is stopped.
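  • For the simple 2D geometry of figure 13 (pinhole cameras at height H above a floor line of known length L), each camera's imaged line length l gives an independent height estimate H = f·L/l, and the two estimates must agree within the measurement noise. A sketch of such a residual test, with illustrative names and tolerance:

```python
def camera_height_from_line(f: float, line_length: float,
                            imaged_length: float) -> float:
    """Pinhole camera looking straight down: l = f*L/H, so H = f*L/l."""
    return f * line_length / imaged_length

def residual_ok(h_left: float, h_right: float, noise: float = 0.02) -> bool:
    """Both cameras must agree on the geometry; a larger residual means a
    camera has moved or the optics have changed, and the system is stopped."""
    return abs(h_left - h_right) <= noise
```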
  • the pattern of the marking can be uniquely multi-coloured as exemplified in Figure 15.
  • Each marking includes parts of different colours. The different colours can be processed in parallel and the obtained patterns in each colour can be fused to obtain the global pattern. The interference between objects on the floor and the pattern can be obtained in different colours and residuals can be formed between the detected interference in the different colours, thus obtaining redundant detection of static and dynamic objects in the scene.
  • The shapes of the pattern can also be made unique as exemplified in Figure 16. In the same way as described for the colour pattern, unique shapes can be used to avoid mixing up the safety pattern and objects in the cell.
  • the pattern can also be made with pointing elements as the triangles in figure 16 to show in what direction the robot is located, giving redundancy in detecting the direction of movement of a critical object.
  • The redundancy with respect to cameras may be necessary when objects like conveyors or tables prevent one of the cameras from registering the whole pattern. In this case a third camera, as in figure 11, will be needed. Simultaneously this increases the redundancy for the rest of the pattern.
  • the main feature of a critical object is its dynamics, which means that there will be a difference between the interruptions of the patterns both relative to the initially captured scene and relative to a buffer of history scenes.
  • The evolution of the changes in the interruptions in the history scenes gives redundant information on the direction of movement, the speed of movement and changes in the structure of the interruptions of the patterns. All of these registrations of changes can be obtained with redundant measurements in different colours if the pattern is colour-coded, and in the interruption of different pattern shapes if the pattern is shape-coded.
  • For test purposes, standard images with different patterns should be stored in the system, and pattern recognition and parameter calculations should be made for these stored patterns to test the software. These tests could be repeated at decided time intervals and should give the correct stored parameter results. If there are errors in the results when processing the standard images, the software has problems and the system must be shut down. These test patterns should include different cases with static and dynamic objects in the scene.
  • - Pattern recognition must be the same in different colours for the same pattern, which means that, for example, the position, curvature, width and length of two lines close to each other should be the same even if these parameters are calculated independently by different software codes in different computers (see the sketch after this list).
  • - Pattern recognition will test that pattern elements have correct shape features.
  • Shape features of a marking in the captured image are compared with the same features of a stored version of the same reference marking.
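  • A sketch of the cross-colour consistency test mentioned in the first item of this list; in a real system the per-colour feature parameters would come from independently coded modules, possibly on different computers, and the tolerance is an illustrative assumption:

```python
def colours_consistent(params_by_colour: dict, rel_tol: float = 0.01) -> bool:
    """params_by_colour maps a colour name to a dict of features such as
    position, curvature, width and length; returns False on disagreement
    between any two colour channels."""
    channels = list(params_by_colour.values())
    if len(channels) < 2:
        return True
    reference = channels[0]
    return all(abs(other[k] - v) <= rel_tol * max(abs(v), 1.0)
               for other in channels[1:] for k, v in reference.items())
```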
  • Standard markings can also be mounted on the top of the upper robot arm to get a redundant measurement of the robot upper arm position.
  • The cameras can also be used to measure the relative distance between a human and the markings, which will be a redundant way to obtain the position of the human.
  • Figure 17 shows an example of a system architecture adapted to safe camera supervision by testing redundant calculations in steps like those outlined above.
  • 70 and 71 are cameras, and 72 and 73 are memories for read-out of the camera image devices. The cameras are separated in order to obtain 3D information from the scene.
  • In 76, 78, 79 and 80, pattern calculations are made in different colours, and for each camera the resulting feature parameters in the different colours are tested in 77 and 81. If the differences are larger than a given alarm limit, an alarm will be given.
  • 75 is a database with calibration camera views and correct pattern feature parameters, and test camera views with test patterns and test pattern parameters.
  • The calibration and test camera views are fed to the pattern calculation modules 78 and 79, and in 82 and 96 the pattern feature parameters for these patterns are compared with the corresponding feature parameters calculated for the calibration and test patterns stored in 75.
  • In 84, 3D calculations on known pattern elements are made in order to test that the cameras have not been moved or the optics changed.
  • the geometry parameters obtained are compared with the same parameters calculated during the calibration of the camera system and stored in 75.
  • the calculations in 84 are moreover tested by the use of standard patterns given to 78 and 79 from 75 with test conditions in 85.
  • In box 87, 3D calculations from pattern gaps are used, whereby 2D objects are discriminated as well as static 3D objects.
  • Object parameters are stored in the data container 86, and box 87 makes use of the data in 86 to decide what objects are static and dynamic.
  • In box 83, normal 3D feature identification is made without the use of the patterns, and in box 88 the results of the normal 3D recognition are compared with the 3D recognition based on the reference patterns.
  • Tests are made of the robot position: 89 calculates the robot position from pattern gaps and 90 from controller signals, with a comparison in box 93, and boxes 91 and 92 calculate the position of the human (dynamic object) from the camera view calibration and the position relative to the pattern, with a test in 94. Knowing the positions of the robot and the human, a decision is made in 95 on what safety action to take.
  • Figure 18 shows how the safe camera supervision system can also be used to make the robot control safer when the robot is controlled in its manual state.
  • the same robot cell 97 as shown before is used, but now the operator is close to the robot when he programs the robot and tests programs.
  • The safe camera supervision is able to detect both the operator and the robot arm with redundant measurements and calculations, and is therefore able to tell the controller the position of the operator in relation to the robot arm. If the robot controller has a high-safety implementation, the robot program coming from memory 98 is fed to the program executor 99, which generates position targets for the motion control 100, including the servo that controls the robot.
  • The camera supervision system 108 sends the position of the operator to the controller, and in 100 the positions in the program to be executed are compared with the position of the operator; if there is a risk of collision, the dangerous program position will not be executed.
  • The camera supervision system software in memory 108 could be run in the same computer as the redundant robot control software 101-103.
  • Figure 19 exemplifies that the pattern on the floor does not need to be circular; a good design adapts the shape of the lines to the shape of the work envelope projected on the floor.
  • 111 are cameras that detect the link system of the parallel robot 110 as well as operators coming into the workspace.
  • the present invention is not limited to the embodiments disclosed but may be varied and modified within the scope of the following claims.
  • The shape and colour of the marking on the floor can be varied in many different ways.
  • the number of markings can vary from one to a large number of markings.
  • The computer unit of the supervision system can be a separate unit or the computer unit of the control system of the robot.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Robotics (AREA)
  • Manipulator (AREA)

Abstract

The present invention relates to a system and a method for supervising a work area including an industrial robot having at least two defined security levels. The system comprises at least one visible marking (11, 12) indicating a potentially dangerous region in the vicinity of the robot, the marking having at least one unique feature relative to other objects in the dangerous region, at least one camera (9a-d) adapted to repeatedly capture, during operation of the robot, images of the potentially dangerous region including said marking, and a computer unit (22) adapted to receive said images, to detect changes in the marking based on said images, and to decide whether or not the security level should be changed based on the detected changes in the marking line.
PCT/EP2006/069682 2006-01-30 2006-12-13 Procédé et système permettant la supervision d'une zone de travail comportant un robot industriel WO2007085330A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US76289506P 2006-01-30 2006-01-30
US60/762,895 2006-01-30

Publications (1)

Publication Number Publication Date
WO2007085330A1 (fr) 2007-08-02

Family

ID=37771018

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2006/069682 WO2007085330A1 (fr) 2006-01-30 2006-12-13 Procédé et système permettant la supervision d'une zone de travail comportant un robot industriel

Country Status (1)

Country Link
WO (1) WO2007085330A1 (fr)

Cited By (40)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2447772A (en) * 2007-03-23 2008-09-24 Truetzschler Gmbh & Co Kg Apparatus for monitoring and securing danger zones on power-driven textile machines
WO2009043369A1 (fr) * 2007-10-01 2009-04-09 Abb Technology Ab Procédé de commande d'une pluralité d'axes dans un système robotisé industriel et système robotisé industriel
EP2053538A1 (fr) * 2007-10-25 2009-04-29 Sick Ag Protection d'une zone de surveillance et support visuel d'un usinage automatisé
EP2113344A1 (fr) * 2008-04-30 2009-11-04 KUKA Roboter GmbH Procédé et dispositif de surveillance d'un manipulateur
EP2136223A1 (fr) * 2008-06-17 2009-12-23 Sick Ag Procédé et dispositif destinés à la saisie d'un objet
US7832059B2 (en) 2007-04-27 2010-11-16 Truetzschler Gmbh & Co. Kg Device on a spinning preparation machine, for example a draw frame, carding machine, combing machine or the like, having a drafting system
DE102009035755A1 (de) * 2009-07-24 2011-01-27 Pilz Gmbh & Co. Kg Verfahren und Vorrichtung zum Überwachen eines Raumbereichs
WO2011104199A1 (fr) * 2010-02-23 2011-09-01 Ifm Electronic Gmbh Système de surveillance
WO2011128117A2 (fr) 2010-04-16 2011-10-20 Fraunhofer-Gesellschaft Zur Förderung Der Angewandten Forschung E.V. Dispositif de surveillance d'au moins une zone de sécurité tridimensionnelle
DE102012007940A1 (de) * 2012-04-24 2013-10-24 Thyssenkrupp Millservices & Systems Gmbh Verfahren und Vorrichtung zum Schützen von gefährdeten Objekten im Bewegungsbereich einer Krananlage
DE202013104264U1 (de) 2013-09-18 2015-01-09 Daimler Ag Arbeitsstation
DE102013110905A1 (de) * 2013-10-01 2015-04-02 Daimler Ag MRK Planungs- und Überwachungstechnologie
JP2015526309A (ja) * 2012-08-31 2015-09-10 リシンク ロボティクス インコーポレイテッド 安全ロボット動作のためのシステムおよび方法
JP2016087785A (ja) * 2014-11-07 2016-05-23 コマウ・ソシエタ・ペル・アチオニComau Societa Per Azioni 産業用ロボット及び産業用ロボットの制御方法
DE102014226691A1 (de) * 2014-12-19 2016-06-23 Carl Zeiss Industrielle Messtechnik Gmbh Verfahren zur Überwachung eines Koordinatenmessgeräts
JP2016120529A (ja) * 2014-12-24 2016-07-07 セイコーエプソン株式会社 ロボット、ロボットシステム、制御装置、及び制御方法
JP2016159367A (ja) * 2015-02-26 2016-09-05 ファナック株式会社 ロボットの動作モードを自動的に切替えるロボット制御装置
JP2017013205A (ja) * 2015-07-03 2017-01-19 株式会社デンソーウェーブ ロボットシステム
JP2017013204A (ja) * 2015-07-03 2017-01-19 株式会社デンソーウェーブ ロボットシステム
JP2017013203A (ja) * 2015-07-03 2017-01-19 株式会社デンソーウェーブ ロボットシステム
WO2017025551A1 (fr) * 2015-08-10 2017-02-16 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Dispositif permettant de protéger une zone de sécurité autour d'au moins une machine fonctionnant de manière autonome
DE102016004902A1 (de) * 2016-04-22 2017-10-26 Kuka Roboter Gmbh Überwachung eines Roboters
DE102016007519A1 (de) * 2016-06-20 2017-12-21 Kuka Roboter Gmbh Überwachung einer Anlage mit wenigstens einem Roboter
DE102016007520A1 (de) * 2016-06-20 2017-12-21 Kuka Roboter Gmbh Überwachung einer Roboteranordnung
EP3186046A4 (fr) * 2014-08-26 2018-05-30 Kando Innovation Limited Amélioration de la productivité d'une scie à ruban
JP2019010704A (ja) * 2017-06-30 2019-01-24 Idec株式会社 照光表示装置
JP2019209407A (ja) * 2018-06-01 2019-12-12 セイコーエプソン株式会社 ロボット、制御装置およびロボットの制御方法
CN110561432A (zh) * 2019-08-30 2019-12-13 广东省智能制造研究所 一种基于人机共融的安全协作方法及装置
CN111113374A (zh) * 2018-10-31 2020-05-08 发那科株式会社 机器人系统
CN111226178A (zh) * 2017-10-24 2020-06-02 罗伯特·博世有限公司 监视设备、工业系统、用于监视的方法及计算机程序
CN111464967A (zh) * 2019-01-22 2020-07-28 恩格尔奥地利有限公司 用于调整安全区域的方法
CN111613022A (zh) * 2019-06-05 2020-09-01 北新集团建材股份有限公司 一种生产监测系统
WO2020216569A1 (fr) * 2019-04-26 2020-10-29 Kuka Deutschland Gmbh Procédé et système pour faire fonctionner un robot
DE102018121388B4 (de) * 2017-09-07 2020-12-10 Fanuc Corporation Robotersystem
JP2021003764A (ja) * 2019-06-26 2021-01-14 株式会社ケー・デー・イー 作業ロボット安全システム
WO2021038282A1 (fr) * 2019-08-30 2021-03-04 Telefonaktiebolaget Lm Ericsson (Publ) Système de sécurité adaptatif pour environnement dangereux
JP2021053801A (ja) * 2020-12-11 2021-04-08 Idec株式会社 照光表示装置
EP3812863A1 (fr) * 2019-10-24 2021-04-28 Sick Ag Machine mobile
EP3865257A1 (fr) 2020-02-11 2021-08-18 Ingenieurbüro Hannweber GmbH Équipement et procédés de surveillance et de commande d'un système de travail technique
DE102016000565B4 (de) 2015-01-27 2021-08-19 Fanuc Corporation Robotersystem, bei welchem die Helligkeit des Installationstisches für Roboter verändert wird

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH05297944A (ja) * 1992-04-24 1993-11-12 Fujitsu Ltd 移動ロボットの障害物回避方式
DE19938639A1 (de) * 1999-08-14 2001-02-22 Pilz Gmbh & Co Vorrichtung zur Absicherung eines Gefahrenbereichs, insbesondere des Gefahrenbereichs einer automatisiert arbeitenden Maschine
DE10000287A1 (de) * 2000-01-07 2001-07-19 Leuze Lumiflex Gmbh & Co Vorrichtung und Verfahren zur Überwachung eines Erfassungsbereichs an einem Arbeitsmittel
WO2002041272A2 (fr) * 2000-11-17 2002-05-23 Honeywell International Inc. Detection d'objets
WO2002073086A1 (fr) * 2001-03-14 2002-09-19 Honeywell International Inc. Detection d'objets
EP1457730A2 (fr) * 2003-03-13 2004-09-15 Omron Corporation Dispositif de détection d'intrusion d'objets
DE10320343A1 (de) * 2003-05-07 2004-12-02 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Verfahren zur überwachten Kooperation zwischen einer Robotereinheit und einem Menschen

Cited By (65)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2447772A (en) * 2007-03-23 2008-09-24 Truetzschler Gmbh & Co Kg Apparatus for monitoring and securing danger zones on power-driven textile machines
US8214072B2 (en) 2007-03-23 2012-07-03 Truetzschler Gmbh & Co. Kg Apparatus for monitoring and securing danger zones on power-driven textile machines
GB2447772B (en) * 2007-03-23 2011-09-28 Truetzschler Gmbh & Co Kg Apparatus for monitoring and securing danger zones on power-driven textile machines
US7832059B2 (en) 2007-04-27 2010-11-16 Truetzschler Gmbh & Co. Kg Device on a spinning preparation machine, for example a draw frame, carding machine, combing machine or the like, having a drafting system
US8452443B2 (en) 2007-10-01 2013-05-28 Abb Research Ltd Method for controlling a plurality of axes in an industrial robot system and an industrial robot system
WO2009043369A1 (fr) * 2007-10-01 2009-04-09 Abb Technology Ab Procédé de commande d'une pluralité d'axes dans un système robotisé industriel et système robotisé industriel
EP2053538A1 (fr) * 2007-10-25 2009-04-29 Sick Ag Protection d'une zone de surveillance et support visuel d'un usinage automatisé
DE102008021671B4 (de) * 2008-04-30 2013-04-11 Kuka Laboratories Gmbh Verfahren und Vorrichtung zur Überwachung eines Manipulators
EP2113344A1 (fr) * 2008-04-30 2009-11-04 KUKA Roboter GmbH Procédé et dispositif de surveillance d'un manipulateur
EP2136223A1 (fr) * 2008-06-17 2009-12-23 Sick Ag Procédé et dispositif destinés à la saisie d'un objet
CN102483846A (zh) * 2009-07-24 2012-05-30 皮尔茨公司 用于监控空间区域的方法和设备
US20120182419A1 (en) * 2009-07-24 2012-07-19 Wietfeld Martin Method and device for monitoring a spatial region
WO2011009933A1 (fr) * 2009-07-24 2011-01-27 Pilz Gmbh & Co. Kg Procédé et dispositif de surveillance d’un volume
US9292924B2 (en) 2009-07-24 2016-03-22 Pilz Gmbh & Co. Kg Method and device for monitoring a spatial region
DE102009035755A1 (de) * 2009-07-24 2011-01-27 Pilz Gmbh & Co. Kg Verfahren und Vorrichtung zum Überwachen eines Raumbereichs
WO2011104199A1 (fr) * 2010-02-23 2011-09-01 Ifm Electronic Gmbh Système de surveillance
US10245728B2 (en) 2010-02-23 2019-04-02 pmdtechnologies ag Monitoring system
DE102010002250B4 (de) 2010-02-23 2022-01-20 pmdtechnologies ag Überwachungssystem
WO2011128117A2 (fr) 2010-04-16 2011-10-20 Fraunhofer-Gesellschaft Zur Förderung Der Angewandten Forschung E.V. Dispositif de surveillance d'au moins une zone de sécurité tridimensionnelle
DE102012007940A1 (de) * 2012-04-24 2013-10-24 Thyssenkrupp Millservices & Systems Gmbh Verfahren und Vorrichtung zum Schützen von gefährdeten Objekten im Bewegungsbereich einer Krananlage
JP2015526309A (ja) * 2012-08-31 2015-09-10 リシンク ロボティクス インコーポレイテッド 安全ロボット動作のためのシステムおよび方法
US9981394B2 (en) 2013-09-18 2018-05-29 Kuka Systems Gmbh Workstation
CN105555490A (zh) * 2013-09-18 2016-05-04 库卡系统有限责任公司 工作站
DE202013104264U1 (de) 2013-09-18 2015-01-09 Daimler Ag Arbeitsstation
WO2015040071A1 (fr) * 2013-09-18 2015-03-26 Kuka Systems Gmbh Station de travail
CN105555490B (zh) * 2013-09-18 2019-01-08 库卡系统有限责任公司 工作站
DE102013110905A1 (de) * 2013-10-01 2015-04-02 Daimler Ag MRK Planungs- und Überwachungstechnologie
EP3186046A4 (fr) * 2014-08-26 2018-05-30 Kando Innovation Limited Amélioration de la productivité d'une scie à ruban
JP2016087785A (ja) * 2014-11-07 2016-05-23 コマウ・ソシエタ・ペル・アチオニComau Societa Per Azioni 産業用ロボット及び産業用ロボットの制御方法
DE102014226691A1 (de) * 2014-12-19 2016-06-23 Carl Zeiss Industrielle Messtechnik Gmbh Verfahren zur Überwachung eines Koordinatenmessgeräts
US10451400B2 (en) 2014-12-19 2019-10-22 Carl Zeiss Industrielle Messtechnik Gmbh Machine and method for monitoring a coordinate measuring device
JP2016120529A (ja) * 2014-12-24 2016-07-07 セイコーエプソン株式会社 ロボット、ロボットシステム、制御装置、及び制御方法
DE102016000565B4 (de) 2015-01-27 2021-08-19 Fanuc Corporation Robotersystem, bei welchem die Helligkeit des Installationstisches für Roboter verändert wird
JP2016159367A (ja) * 2015-02-26 2016-09-05 ファナック株式会社 ロボットの動作モードを自動的に切替えるロボット制御装置
JP2017013203A (ja) * 2015-07-03 2017-01-19 株式会社デンソーウェーブ ロボットシステム
JP2017013204A (ja) * 2015-07-03 2017-01-19 株式会社デンソーウェーブ ロボットシステム
US10434666B2 (en) 2015-07-03 2019-10-08 Denso Wave Incorporated Industrial robot system optically indicating motion area of robot
JP2017013205A (ja) * 2015-07-03 2017-01-19 株式会社デンソーウェーブ ロボットシステム
US10569431B2 (en) 2015-07-03 2020-02-25 Denso Wave Incorporated Industrial robot system optically indicating motion area of robot
US11310887B2 (en) 2015-08-10 2022-04-19 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e. V. Device for securing a safety area around at least one automatically operating machine
WO2017025551A1 (fr) * 2015-08-10 2017-02-16 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Dispositif permettant de protéger une zone de sécurité autour d'au moins une machine fonctionnant de manière autonome
DE102016004902A1 (de) * 2016-04-22 2017-10-26 Kuka Roboter Gmbh Überwachung eines Roboters
DE102016007519A1 (de) * 2016-06-20 2017-12-21 Kuka Roboter Gmbh Überwachung einer Anlage mit wenigstens einem Roboter
DE102016007520A1 (de) * 2016-06-20 2017-12-21 Kuka Roboter Gmbh Überwachung einer Roboteranordnung
JP2019010704A (ja) * 2017-06-30 2019-01-24 Idec株式会社 照光表示装置
DE102018121388B4 (de) * 2017-09-07 2020-12-10 Fanuc Corporation Robotersystem
US11440188B2 (en) 2017-09-07 2022-09-13 Fanuc Corporation Robot system
CN111226178A (zh) * 2017-10-24 2020-06-02 罗伯特·博世有限公司 监视设备、工业系统、用于监视的方法及计算机程序
CN111226178B (zh) * 2017-10-24 2023-12-08 罗伯特·博世有限公司 监视设备、工业系统、用于监视的方法及计算机程序
JP2019209407A (ja) * 2018-06-01 2019-12-12 セイコーエプソン株式会社 ロボット、制御装置およびロボットの制御方法
JP7206638B2 (ja) 2018-06-01 2023-01-18 セイコーエプソン株式会社 ロボット、制御装置およびロボットの制御方法
CN111113374A (zh) * 2018-10-31 2020-05-08 发那科株式会社 机器人系统
CN111464967A (zh) * 2019-01-22 2020-07-28 恩格尔奥地利有限公司 用于调整安全区域的方法
CN111464967B (zh) * 2019-01-22 2022-09-13 恩格尔奥地利有限公司 用于调整安全区域的方法
WO2020216569A1 (fr) * 2019-04-26 2020-10-29 Kuka Deutschland Gmbh Procédé et système pour faire fonctionner un robot
CN113966265A (zh) * 2019-04-26 2022-01-21 库卡德国有限公司 用于运行机器人的方法和系统
CN111613022A (zh) * 2019-06-05 2020-09-01 北新集团建材股份有限公司 一种生产监测系统
JP2021003764A (ja) * 2019-06-26 2021-01-14 株式会社ケー・デー・イー 作業ロボット安全システム
JP7162345B2 (ja) 2019-06-26 2022-10-28 株式会社ケー・デー・イー 作業ロボット安全システム
WO2021038282A1 (fr) * 2019-08-30 2021-03-04 Telefonaktiebolaget Lm Ericsson (Publ) Système de sécurité adaptatif pour environnement dangereux
CN110561432A (zh) * 2019-08-30 2019-12-13 广东省智能制造研究所 一种基于人机共融的安全协作方法及装置
EP3812863A1 (fr) * 2019-10-24 2021-04-28 Sick Ag Machine mobile
EP3865257A1 (fr) 2020-02-11 2021-08-18 Ingenieurbüro Hannweber GmbH Équipement et procédés de surveillance et de commande d'un système de travail technique
JP2021053801A (ja) * 2020-12-11 2021-04-08 Idec株式会社 照光表示装置
JP7137609B2 (ja) 2020-12-11 2022-09-14 Idec株式会社 照光表示装置

Similar Documents

Publication Publication Date Title
WO2007085330A1 (fr) Procédé et système permettant la supervision d'une zone de travail comportant un robot industriel
JP6822719B2 (ja) 自動パッケージスキャンおよび登録メカニズムを備えたロボットシステム、ならびにその動作方法
JP7284953B2 (ja) 高度化したスキャンメカニズムを有するロボットシステム
JP6971223B2 (ja) 自律移動ロボットと自律移動ロボットの基地局とを有するシステム、自律移動ロボットの基地局、自律移動ロボットのための方法、自律移動ロボットの基地局への自動ドッキング方法
US11046530B2 (en) Article transfer apparatus, robot system, and article transfer method
CN1922473B (zh) 用于设计检测路径及用于确定待检测区域的方法
KR102056664B1 (ko) 센서를 이용한 작업 방법 및 이를 수행하는 작업 시스템
US20130238124A1 (en) Information processing apparatus and information processing method
CN108942916B (zh) 工件取出系统
Khazetdinov et al. Embedded ArUco: a novel approach for high precision UAV landing
CN104842347A (zh) 防止搬送对象物的落下事故的机器人系统
JP6424560B2 (ja) 異常原因推定装置、ピッキング装置及びピッキング装置における異常原因推定方法
CN106643661A (zh) 基于机器视觉的轨道式起重机吊具位姿检测系统及方法
Bostelman et al. Dynamic metrology performance measurement of a six degrees-of-freedom tracking system used in smart manufacturing
Frese et al. Multi-sensor obstacle tracking for safe human-robot interaction
CN205472298U (zh) 一种集装箱起重机自动化检测标定系统
Vogel et al. A projection-based sensor system for ensuring safety while grasping and transporting objects by an industrial robot
Janković et al. System for indoor localization of mobile robots by using machine vision
Grabowski et al. Vision safety system based on cellular neural networks
KR100312641B1 (ko) 자동화 용접장치의 센싱 시스템
Sansoni et al. Combination of 2D and 3D vision systems into robotic cells for improved flexibility and performance
JP2562047B2 (ja) 対象物体の位置姿勢認識方法
Qiao et al. Auto-calibration for vision-based 6-D sensing system to support monitoring and health management for industrial robots
Ramer et al. Work space surveillance of a robot assistance system using a ToF camera
Halmheu et al. Laser scanner detection and localization of successively arranged mobile robots

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 06819955

Country of ref document: EP

Kind code of ref document: A1