WO2008047872A1 - Manipulateur - Google Patents

Manipulateur

Info

Publication number
WO2008047872A1
WO2008047872A1 (PCT/JP2007/070360; JP2007070360W)
Authority
WO
WIPO (PCT)
Prior art keywords
image
manipulator
gripping
image input
arm
Prior art date
Application number
PCT/JP2007/070360
Other languages
English (en)
Japanese (ja)
Inventor
Saku Egawa
Original Assignee
Hitachi, Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hitachi, Ltd. filed Critical Hitachi, Ltd.
Priority to CN2007800378723A priority Critical patent/CN101522377B/zh
Priority to JP2008539872A priority patent/JPWO2008047872A1/ja
Publication of WO2008047872A1 publication Critical patent/WO2008047872A1/fr

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1674Programme controls characterised by safety, monitoring, diagnostic
    • B25J9/1676Avoiding collision or forbidden zones
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1694Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697Vision controlled systems
    • EFIXED CONSTRUCTIONS
    • E02HYDRAULIC ENGINEERING; FOUNDATIONS; SOIL SHIFTING
    • E02FDREDGING; SOIL-SHIFTING
    • E02F9/00Component parts of dredgers or soil-shifting machines, not restricted to one of the kinds covered by groups E02F3/00 - E02F7/00
    • E02F9/20Drives; Control devices
    • E02F9/2025Particular purposes of control systems not otherwise provided for
    • E02F9/2033Limiting the movement of frames or implements, e.g. to avoid collision between implements and the cabin

Definitions

  • the present invention relates to a manipulator that holds an object with an arm and performs positioning and transportation.
  • A manipulator is a device in which joints and arms are combined like a human arm; the term is a generic name for devices that grip an object and perform positioning and transportation.
  • manipulators include a gripping mechanism that grips an object and an arm mechanism that moves the gripping mechanism.
  • Manipulators include those whose arm motion is controlled automatically and those whose arm is operated by a person.
  • Examples of manipulators whose arms are automatically controlled include industrial robot arms used for parts transportation and assembly in factories, and the arms of service robots that perform tasks such as housework and nursing care in public spaces, offices, and homes.
  • Examples of manipulators whose arms are operated by humans include construction machines that handle large and heavy objects, master-slave manipulators used in space environments and nuclear facilities, and surgical support manipulators.
  • A method has been proposed in which the object to be grasped is fitted to a simple shape by image recognition, its size and orientation are calculated, and a grasping method for an object of arbitrary shape is reliably determined on that basis.
  • A robot apparatus is also disclosed in which an ultrasonic sensor provided on the arm-type robot main body detects surrounding moving objects and decelerates the robot arm when the distance from the main body falls within a certain range.
  • A robot apparatus, and a learning method therefor, are also disclosed in which an object region image of a learning target object is cut out by moving the object while a movable part such as an arm unit is in contact with it, and features extracted from the object region image are registered in an object model database.
  • The means for extracting the image of the target object uses a method of extracting, from the captured images, the area that changed before and after the target object was moved.
  • In a general environment, however, surrounding objects other than the object to be gripped may also move.
  • As a method for extracting a moving object from an image, extracting the changed portion from a plurality of images taken at different times is known; however, if surrounding objects also move, the moving background is mistaken for the object and the object cannot be extracted correctly.
  • It has therefore been difficult to reliably recognize the object to be grasped in a general environment.
  • Patent Document 1 Japanese Patent Application Laid-Open No. 2000-202790
  • Patent Document 2 Japanese Patent Laid-Open No. 2005-128959
  • The present invention has been made in view of the above. Its object is to provide a manipulator that, even when the gripped object and surrounding objects can move arbitrarily, reliably recognizes the shape of an unknown gripped object, performs work based on the recognized shape, and prevents the gripped object from contacting surrounding objects or people, thereby improving safety.
  • To this end, the manipulator of the present invention includes an arm; arm driving means for driving the arm; a gripping portion provided on the arm; image input means for acquiring an image of the surroundings of the gripping portion; gripping-portion relative position detection means for detecting the relative position of the gripping portion with respect to the image input means; storage means for storing a plurality of images acquired by the image input means together with the relative positions of the gripping portion detected by the gripping-portion relative position detection means; and means for detecting the position and shape of the gripped object based on the plurality of stored images and the stored relative positions of the gripping portion with respect to the image input means.
  • FIG. 1 is a schematic diagram showing an example of the overall configuration according to an embodiment of the present invention.
  • FIG. 2 is a block diagram showing a system configuration example of a manipulator device according to an embodiment of the present invention.
  • FIG. 3 is a flowchart showing an overall processing example of contact possibility determination processing according to an embodiment of the present invention.
  • FIG. 4 is a flowchart showing a processing example of a gripped object position/shape determination process according to an embodiment of the present invention.
  • FIG. 5 is an explanatory diagram showing an example of a grayscale image according to an embodiment of the present invention.
  • FIG. 6 is an explanatory view showing an example of image processing in the gripped object position/shape determination processing according to the embodiment of the present invention.
  • FIG. 7 is a flowchart showing an example of a contact possibility determination process according to an embodiment of the present invention.
  • FIG. 8 is a schematic diagram showing an example of the overall configuration according to another embodiment of the present invention.
  • FIG. 1 shows an example of the overall configuration of the manipulator in the present embodiment
  • FIG. 2 shows an example of the system configuration of the manipulator in the present embodiment.
  • The manipulator 101 in this embodiment includes a base 105, a plurality of arms 103 and 104 connected in series from the base, and a grip portion 102 attached to the tip of the arm.
  • Although two arms are shown here, the number of arms is not limited to two.
  • Each joint has an actuator 132, 133, or 134 that drives it and an angle sensor 122, 123, or 124 that measures the angle of the corresponding joint 112, 113, or 114 in order to measure the amount of change of the movable part (in FIG. 1, the actuators and angle sensors are shown overlaid on the joints 112, 113, and 114).
  • the angle sensor acquires rotation angles in a plurality of directions.
  • A control device 106 is provided that receives the angle information acquired by the angle sensors and controls the actuators to move the manipulator 101.
  • The installation position of the control device 106 varies depending on the use and shape of the manipulator 101.
  • FIG. 1 shows the control device 106 in the vicinity of the manipulator 101.
  • The manipulator mainly controls the position of the gripping part by using the arms connected in series from the base, and changes the posture of the gripping part by rotating the joint between the arm at the tip and the gripping part.
  • The arm 103 immediately preceding the grip portion 102 is referred to as the forearm 103, by analogy with the human arm.
  • The joint 112 between the forearm 103 and the grip portion 102 is called the wrist joint 112. When the wrist joint 112 rotates, the position of the tip of the grip portion 102 also moves, but since the grip portion is shorter than the arms 103 and 104, the resulting change in position is relatively small.
  • The part corresponding to the wrist joint may be composed of multiple joints and short arms; here, these are collectively regarded as one wrist joint.
  • The manipulator 101 of the present embodiment is provided with an image input device 2, which is an image input unit that acquires a three-dimensional image of objects near the grip portion 102, and a surroundings monitoring device 3, which is a surroundings monitoring unit that can detect the position of a surrounding object 108.
  • the image input device 2 is attached to the forearm 103, and the surroundings monitoring device 3 is provided on the base 105 of the manipulator 101 so as to acquire a wider range of surrounding images than the image input device 2.
  • The image input device 2 may be provided on the arm 104 as long as it can acquire an image of the vicinity of the grip portion 102, but since the joint 113 would then be interposed between the device and the grip portion, it is preferably provided on the forearm 103. Further, an angle sensor 122 that measures the angle of the wrist joint 112 is used as a gripper relative position detection unit that detects changes in the position of the gripper 102 relative to the image input device 2.
  • The image storage unit 301 is a storage unit that stores the stereoscopic image data acquired by the image input device 2 and the angle information of the angle sensor 122 of the wrist joint 112 acquired from the control device 106.
  • The image extraction unit 302 extracts the image of the gripped object 109 using the plurality of images stored in the image storage unit 301 and the relative position and posture information of the gripping unit, and detects the position and shape of the gripped object. The contact possibility determination device 4, which is a contact possibility determination means, determines the possibility of contact between the gripped object 109 and the surrounding object 108 based on the detected position and shape of the gripped object 109 and on the position information of the surrounding object 108 detected by the surroundings monitoring device 3.
  • An alarm device 5, which is an alarm means, notifies of the possibility of contact by sound or image.
  • the contact possibility determination device 4 notifies the control device 106 of the possibility of contact, thereby restricting the operation of the manipulator and preventing contact.
  • The contact possibility determination device 4 and the alarm device 5 are illustrated separately from the manipulator 101 and the control device 106, but they may be incorporated into the manipulator 101 and the control device 106.
  • The image input device 2 has a function of acquiring a grayscale image together with a distance image that contains the distance to each point of the image.
  • The image input device 2 can be a stereo camera that measures distance from the image disparity between two or more cameras, or a laser radar that scans a laser beam in one or two dimensions and measures distance from the round-trip time until the beam hits an object and returns.
  • a combination of a plurality of the above image input devices may be used.
  • the surroundings monitoring device 3 has a function of acquiring a distance image.
  • As the alarm device 5, a buzzer, a lamp, a display, or the like is used.
  • the distance image is a term used in the field of image processing technology, and means data in which the value of the distance to the object is stored for each pixel of the image.
  • A normal image (grayscale image) is a two-dimensional array of brightness values, that is, the intensity of light reflected by objects, over a large number of directions divided in a grid pattern.
  • the distance image stores the distance to the object in that direction instead of the brightness of the light for each point in the image.
  • a distance image is two-dimensional array data that stores the distance to an object, and is generally used to store the output of a sensor that can obtain depth information such as a stereo camera.
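  • As an illustration of this data structure, the sketch below (an assumption added for illustration, not taken from the patent) converts such a depth map into camera-frame 3-D points, one per pixel, using a simple pinhole model with focal length f and principal point (cx, cy):

```python
import numpy as np

def distance_image_to_points(depth, f, cx, cy):
    """depth: 2-D array, depth[v, u] = distance along the optical axis (assumed)."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))   # pixel coordinate grids
    x = (u - cx) * depth / f                         # lateral offset from the optical axis
    y = (v - cy) * depth / f
    return np.dstack([x, y, depth])                  # (h, w, 3) points in the camera frame

# Example: a 4x4 depth map in which every pixel sees an object 0.5 m away.
points = distance_image_to_points(np.full((4, 4), 0.5), f=300.0, cx=2.0, cy=2.0)
```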
  • the manipulator 101 has a function of recognizing the position and shape of the gripping object 109 when the gripping unit 102 is gripping the target gripping object 109.
  • The following description assumes that the gripped object 109 is already held.
  • In the case of an automatic manipulator, the target gripping object 109 is recognized from the image information acquired by the image input device 2 using its shape features, the position of an easy-to-grip point on the object is measured, and the gripping part 102 is positioned at that point to grip the object.
  • In the case of an operated manipulator, the operator visually determines the position and shape of the gripping object 109 and moves the manipulator by manual operation to grip it.
  • The manipulator 101 in this embodiment moves while the gripped object 109 is held, predicts how the gripped object will move in the image near the gripping part 102 from the change in the relative position between the gripping part 102 and the image input device 2, and extracts from the image of the image input device 2 the parts that move as predicted; such parts are, with high probability, the gripped object 109.
  • FIG. 3 is a flowchart showing an overall outline of a processing example of the image storage unit 301, the image extraction unit 302, and the contact possibility determination device 4.
  • This process starts when the gripping object 109 is gripped by the gripping part 102 of the manipulator 101.
  • First, stereoscopic image information around the grip portion 102, that is, a grayscale image 12 and a distance image 13, is acquired from the image input device 2, and at the same time the angle 14 of the wrist joint 112 measured by the angle sensor 122 is obtained through the control device 106; both are stored in the image storage unit 301 (step S1).
  • The information acquired before the movement of the manipulator 101 is referred to in the subsequent processing as the grayscale image 12a, the distance image 13a, and the wrist joint angle 14a.
  • Next, the manipulator 101 starts moving, and the process waits until the gripping part 102 has moved slightly (step S2).
  • The same information is then acquired again and stored (step S3); the information acquired after the movement is referred to in the subsequent processing as the grayscale image 12b, the distance image 13b, and the wrist joint angle 14b.
  • Next, the position and shape of the grasped object 109 are determined by the image extraction unit 302 based on the information acquired in step S1 and step S3 (step S4).
  • The contact possibility determination device 4 then acquires the position information of surrounding objects from the surroundings monitoring device 3 and compares it with the position of the gripped object 109 determined in step S4 (step S5).
  • The result of the contact possibility determination process of step S5 is then examined (step S6), and if there is a possibility of contact, an alarm is output by the alarm device 5 (step S7).
  • As an alarm output method, for example, in the case of an automatic manipulator the control device 106 is notified and the manipulator is stopped, while in the case of an operated manipulator the operator can be informed by voice output or by lighting a warning lamp.
  • If the contact possibility determination process finds no possibility of contact, the process proceeds to the next step. Finally, it is determined whether or not the grip portion 102 of the manipulator 101 is still gripping the gripped object 109 (step S8).
  • If the object is still being gripped, the grayscale image 12b, the distance image 13b, and the angle 14b of the wrist joint 112 are substituted for the pre-movement information (step S9), the process returns to step S2, and the loop is repeated so that the possibility of contact between the gripped object and surrounding objects is monitored continuously. If it is determined in step S8 that the gripper 102 is no longer gripping the object, the process ends.
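  • The flow of steps S1–S9 can be summarized as the Python skeleton below. It is a minimal sketch only: every device object and callback name (camera.acquire, robot.get_wrist_angle, determine_grip_shape, contact_possible, alarm.warn) is a hypothetical placeholder, not part of the patent or of any real API.

```python
import time

def monitoring_loop(robot, camera, surroundings, alarm,
                    determine_grip_shape, contact_possible):
    """Skeleton of the loop in FIG. 3; all objects and callbacks are placeholders."""
    gray_a, dist_a = camera.acquire()          # step S1: images before the motion
    wrist_a = robot.get_wrist_angle()
    while robot.is_gripping():                 # step S8: continue while the object is held
        time.sleep(0.1)                        # step S2: wait for a small motion
        gray_b, dist_b = camera.acquire()      # step S3: images after the motion
        wrist_b = robot.get_wrist_angle()
        points = determine_grip_shape(gray_a, dist_a, wrist_a,
                                      gray_b, dist_b, wrist_b)   # step S4
        obstacles = surroundings.get_obstacle_points()           # step S5
        if contact_possible(points, obstacles):                  # step S6
            alarm.warn()                       # step S7: warn the operator / stop
        gray_a, dist_a, wrist_a = gray_b, dist_b, wrist_b        # step S9
```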
  • Next, a processing example of the gripped object position/shape determination process (step S4) in the image extraction unit 302 will be described with reference to the flowchart of FIG. 4.
  • In this process, the position and shape of the grasped object 109 are determined based on the grayscale images, the distance images, and the wrist joint angle information acquired in step S1 and step S3.
  • First, the post-movement grayscale image 12b and distance image 13b are divided into grid-like blocks (step S31).
  • The size of the blocks is determined in advance: the smaller the block, the higher the positional resolution, and the larger the block, the better the accuracy of the image matching described later. Typically, the block size is set to about 5 × 5 to 25 × 25 pixels.
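  • A minimal sketch of this block division (step S31), assuming a 16 × 16-pixel block size within the range mentioned above:

```python
import numpy as np

def split_into_blocks(image, block=16):
    """Yield (top, left, block_image) for every full block of a grayscale image."""
    h, w = image.shape[:2]
    for top in range(0, h - block + 1, block):
        for left in range(0, w - block + 1, block):
            yield top, left, image[top:top + block, left:left + block]

# Example: divide a 480x640 image into 16x16 blocks.
blocks = list(split_into_blocks(np.zeros((480, 640), dtype=np.uint8), block=16))
```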
  • Note that the forearm 103 also appears in the image, as shown in the grayscale image example of FIG. 5.
  • Next, one block of the divided post-movement grayscale image 12b is extracted, and the following processing is performed on it as block B(i) (step S32).
  • The subsequent image processing will be described with reference to the image processing example shown in FIG. 6. First, the spatial position Pb(i) of the point Q(i) in block B(i) is obtained (step S33). That is, the point on the object shown at the center of block B(i) is taken as Q(i), its position on the image (two-dimensional coordinates) is taken as Rb(i), and the spatial position Pb(i) of Q(i) relative to the image input device 2 is obtained from the distance information of the distance image 13b.
  • Next, the spatial position Pa(i) that Q(i) would have occupied before the movement, assuming that the point Q(i) is fixed to the grip portion 102, is obtained (step S34). That is, assuming that the point Q(i) is a point on the grasped object 109 fixed to the grasping part 102, the three-dimensional spatial position (relative to the image input device 2) that results when the angle 14 of the wrist joint 112 is rotated back from the post-movement wrist angle 14b to the pre-movement wrist angle 14a is obtained by coordinate transformation, and this is defined as Pa(i) (J2 in FIG. 6).
  • Next, the position (two-dimensional coordinates) Ra(i) at which the spatial position Pa(i) would appear on the image of the image input device 2 is obtained by projection transformation (step S35) (J3 in FIG. 6). Note that the spatial position Pa(i) is, like Pb(i), a relative position with respect to the image input device 2.
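  • The following is a minimal sketch of steps S33–S35 under strong simplifying assumptions that are not stated in the patent: the wrist is assumed to rotate about a single axis (taken here as the camera y-axis) through a known point wrist_center given in the camera frame, and the image input device 2 is modeled as a pinhole camera with focal length f and principal point (cx, cy).

```python
import numpy as np

def predict_pre_motion_pixel(Pb, wrist_b, wrist_a, wrist_center, f, cx, cy):
    """Rotate Pb about the wrist joint by the wrist-angle change, then project it."""
    Pb = np.asarray(Pb, dtype=float)              # post-movement point, camera frame
    c = np.asarray(wrist_center, dtype=float)     # wrist joint centre, camera frame (assumed known)
    d = wrist_a - wrist_b                         # wrist angle change (rad)
    R = np.array([[ np.cos(d), 0.0, np.sin(d)],   # rotation about the camera y-axis (assumption)
                  [ 0.0,       1.0, 0.0      ],
                  [-np.sin(d), 0.0, np.cos(d)]])
    Pa = R @ (Pb - c) + c                         # assumed pre-movement position Pa(i)
    u = f * Pa[0] / Pa[2] + cx                    # pinhole projection to image coordinates
    v = f * Pa[1] / Pa[2] + cy
    return Pa, np.array([u, v])                   # Pa(i) and Ra(i)
```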
  • Next, the image of block B(i) is compared with the pre-movement image to determine whether the images match (step S36). If they match (step S37), a gripped object mark indicating that the point lies on the gripped object 109 is attached to block B(i) (step S38).
  • The position Ra(i) is where the point Q(i) shown in block B(i) of the post-movement grayscale image 12b is presumed to appear in the pre-movement grayscale image 12a, under the assumption that Q(i) is part of the grasped object 109. Therefore, if the point Q(i) really is part of the grasped object 109, the image of block B(i) and the partial image 21 at Ra(i) should match. Conversely, if Q(i) belongs to a background object rather than to the grasped object 109, the images do not match, because when the manipulator 101 moves, a point on a background object appears at a location different from the position Ra(i) in the pre-movement grayscale image 12a. Therefore, when the images match, it can be determined that the point Q(i) is part of the grasped object 109.
  • An image matching technique such as normalized correlation is used for this determination.
  • Specifically, the normalized correlation value between the image of block B(i) in the post-movement grayscale image 12b and the partial image 21 at the position Ra(i) in the pre-movement grayscale image 12a is obtained, and if the value is larger than a predetermined threshold, it is determined that the images match.
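  • A minimal sketch of this match test (steps S36–S37); the 0.8 threshold is an illustrative assumption, since the patent only specifies a predetermined threshold.

```python
import numpy as np

def blocks_match(block_b, patch_a, threshold=0.8):
    """Normalized cross-correlation between two same-sized grayscale patches."""
    a = patch_a.astype(float).ravel()
    b = block_b.astype(float).ravel()
    a -= a.mean()                       # remove the mean brightness of each patch
    b -= b.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    if denom == 0.0:
        return False                    # texture-less patches cannot be matched reliably
    return float(a @ b) / denom > threshold
```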
  • Finally, it is determined whether all blocks have been processed (step S39). If any block has not yet been processed, steps S32 to S38 above are repeated until all blocks have been processed, and then this process ends. With the above processing, the three-dimensional spatial positions of points on the grasped object are detected, so the position and shape of the grasped object become known.
  • In this way, the manipulator is moved while the object is gripped, and the motion of the gripped object in the image during that movement is predicted using the relative position of the gripping unit and the image input device measured by the sensor.
  • Alternatively, the relative position between the grip unit 102 and the image input device 2 may be measured and the images corrected so that the grip unit 102 shown in the image does not move between before and after the movement of the manipulator 101.
  • Next, a processing example of the contact possibility determination process (step S5) of FIG. 3 will be described with reference to the flowchart of FIG. 7.
  • This process is performed by the contact possibility determination device 4. The positions of the points on the gripped object 109 extracted by the image extraction process described above are compared with the positions of surrounding objects obtained from the surroundings monitoring device 3, and if any point is close to a surrounding object, it is determined that there is a possibility of contact.
  • First, position information of surrounding objects is obtained from the surroundings monitoring device 3 (step S41).
  • Next, one block with the gripped object mark is extracted (step S42).
  • The spatial position Pb(i) corresponding to the extracted block B(i) is compared with the positions of the surrounding objects acquired in step S41 (step S43). If the distance is smaller than a predetermined threshold, the positions are determined to be close (step S44); if even one block is close in position, it is determined that there is a possibility of contact between the gripped object 109 and a surrounding object (step S45), and the process is terminated.
  • If it is determined in step S44 that the positions are not close, it is checked whether all blocks with the gripped object mark have been processed (step S46). When processing of all blocks is complete, it is determined that there is no possibility of contact (step S47) and the processing is terminated. If any block has not yet been processed, steps S42 to S44 above are repeated.
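  • A minimal sketch of the proximity test of FIG. 7, comparing the spatial positions Pb(i) of the marked blocks with the surrounding-object points; both point sets are assumed to be expressed in a common coordinate frame, and the 0.3 m threshold is an illustrative assumption.

```python
import numpy as np

def contact_possible(grip_points, surrounding_points, threshold=0.3):
    """True if any gripped-object point is within `threshold` of any obstacle point."""
    if len(grip_points) == 0 or len(surrounding_points) == 0:
        return False
    g = np.asarray(grip_points, dtype=float)[:, None, :]         # (Ng, 1, 3)
    s = np.asarray(surrounding_points, dtype=float)[None, :, :]  # (1, Ns, 3)
    distances = np.linalg.norm(g - s, axis=2)                    # all pairwise distances
    return bool((distances < threshold).any())
```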
  • As described above, the portion of the gripped object 109 in the image is detected by image matching that exploits the difference between the motion of the gripped object 109 and that of the surrounding object 108 in the image when the manipulator 101 is moved, so the shape and position of the gripped object 109 can be reliably detected even when the background is complicated or contains moving parts. For this reason, even in a general usage environment, the possibility of approach between the gripped object 109 and surrounding objects is reliably determined, and based on that result the operator of the manipulator 101 can be warned if an object is approaching the gripped object 109.
  • Since the image input device 2 is attached to the forearm 103, the grasped object 109 can always be captured near the center of the visual field. For this reason, compared with the case where the image input device 2 is attached to the base 105 or the like, the grasped object 109 does not move much within the field of view (within the image), so it can be detected reliably, and because the viewing angle can be limited, high-resolution information can be obtained. Also, compared with the case where the image input device 2 is attached to the gripping part 102 itself, the distance between the image input device 2 and the gripped object 109 is moderately large, so the overall shape of the gripped object 109 can easily be monitored.
  • In the embodiment described above, the state of approach to the surrounding object 108 is determined using the current position of the gripped object 109 detected from the image information acquired by the image input device 2.
  • In the above configuration the surroundings monitoring device 3 is provided separately from the image input device 2, but the position of the surrounding object 108 may be obtained in other ways.
  • For example, map information prepared in advance may be stored in the contact possibility determination device 4, and the position information of surrounding objects may be acquired by referring to that map information.
  • Alternatively, if the image input device 2 has a function of acquiring an image of the surrounding object 108, the gripped object 109 is extracted from the stereoscopic image input by the image input device 2 by the method described above, the background excluding the extracted gripped object 109 is then extracted as the surrounding object 108, and its three-dimensional coordinates are stored. Since the position of the surrounding object 108 is basically not changed by the operation of the manipulator 101, the image input device 2 or the surroundings monitoring device 3 need only obtain a distance image of the surroundings.
  • In the above embodiment, a stereoscopic image sensor such as a stereo camera is used as the image input device 2, but a monocular camera such as a CCD camera may instead be used as a simpler configuration.
  • In that case, the distance may be estimated from the image of the monocular camera by assuming that the gripped object 109 lies on a virtual plane fixed to the gripping portion 102.
  • The virtual plane is preferably set where the probability of the gripped object 109 being present is high, for example a plane that passes through the tip of the gripping portion 102 and is orthogonal to the gripping portion.
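  • A minimal sketch (an assumption added for illustration, not from the patent text) of how this monocular variant could estimate a 3-D position: the viewing ray of each pixel is intersected with the virtual plane fixed to the gripping portion, where plane_point and plane_normal describe that plane in the camera frame and the pinhole parameters f, cx, cy are assumed.

```python
import numpy as np

def pixel_to_point_on_plane(u, v, f, cx, cy, plane_point, plane_normal):
    """Intersect the viewing ray of pixel (u, v) with a plane given in the camera frame."""
    p0 = np.asarray(plane_point, dtype=float)          # a point on the virtual plane
    n = np.asarray(plane_normal, dtype=float)          # the plane normal
    ray = np.array([(u - cx) / f, (v - cy) / f, 1.0])  # direction of the viewing ray
    denom = ray @ n
    if abs(denom) < 1e-9:
        return None                                    # ray is (nearly) parallel to the plane
    t = (p0 @ n) / denom
    return t * ray if t > 0 else None                  # keep only points in front of the camera
```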
  • In the above embodiment the image input device 2 is attached to the forearm 103 of the manipulator, but it can also be attached to other places.
  • For example, the image input device 2 may be attached to the base 105.
  • In that case, in step S34 of the shape determination process shown in FIG. 4, the spatial position Pa(i) of the point Q(i) before the movement should be obtained by also taking into account the angle changes of joints other than the wrist.
  • Although the grasped object 109 then moves greatly within the field of view, the position of the image input device 2 is fixed, so the structure becomes simple and there is the advantage that the image input device 2 can also serve as the surroundings monitoring device 3.
  • Conversely, if the image input device 2 is attached to the gripping part 102 itself, the spatial position Pa(i) obtained in step S34 is simply equal to the post-movement position Pb(i). This arrangement has the problem that the grasped object 109 is very close to the image input device 2, but it has the advantage that the calculation process is simplified.
  • an angle sensor that measures the angle of the wrist joint is used as the gripper relative position detection unit that detects the relative positional relationship between the image input device and the gripper.
  • the relative position relationship may be obtained by detecting the position of the image input device and the gripping portion by other methods.
  • Methods of measuring the position and posture include, for example, photographing the object to be measured with an externally provided camera and obtaining the position and posture from the image.
  • FIG. 8 shows a configuration example when the manipulator of this embodiment is mounted on a work machine used for forestry or demolition work.
  • The work machine 201 includes, as a manipulator, a grapple 202, which is a gripping part, an arm 203, and an arm 204.
  • The work machine 201 is used for applications such as grabbing an object with the grapple 202 for demolition and transportation.
  • The image input device 2 constituting the manipulator of the present embodiment is attached to the bottom surface of the arm 203, which corresponds to the forearm. This location is suitable for capturing the object gripped by the grapple 202 because its positional relationship with the grapple 202 does not change greatly and it is appropriately separated from the grapple 202.
  • the surrounding monitoring device 3 is attached to the upper part of the cabin 209 to cover a wide field of view.
  • An angle sensor 222 is attached to the joint 212 corresponding to the wrist as a wrist angle sensor.
  • The contact possibility determination device 4 and the alarm device 5 are illustrated as installed inside the cabin 209, but they may be installed in other places as long as they do not interfere with the operation of the manipulator and can communicate with each device.
  • The contact possibility determination device 4 uses the stereoscopic image around the grapple 202 acquired by the image input device 2 and the angle of the joint 212 acquired from the angle sensor 222 to detect the object gripped by the grapple 202, compares it with the positions of the surrounding objects detected by the surroundings monitoring device 3, and determines whether there is a possibility of contact. If there is a possibility of contact, the alarm device 5 informs the operator. Besides notification by voice or image, the alarm device 5 may notify the operator of the possibility of contact by other methods such as vibrating the operation lever. In addition, a series of image information related to the processing in the contact possibility determination device 4 may be displayed by a display means (not shown).
  • By applying the manipulator of this embodiment, the burden on the operator can be reduced. Also, as in this example, in a complex work environment with many obstacles, such as forestry, demolition work, or construction work, warning in advance of the possibility of contact between the grasped object and the surrounding object 108 allows the work to proceed safely and quickly.
  • According to the present invention, the shape of the object gripped by the manipulator can be reliably recognized and the gripped object can be prevented from contacting surrounding objects, thereby increasing the safety of the manipulator.

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Mining & Mineral Resources (AREA)
  • Civil Engineering (AREA)
  • General Engineering & Computer Science (AREA)
  • Structural Engineering (AREA)
  • Manipulator (AREA)

Abstract

The position and shape of an object (109) held by a holding section (102) are detected on the basis of an image of the periphery of the holding section (102) obtained by an image input device (2) provided on an arm (103) of a manipulator (101), and of a change in the position of the manipulator (101) detected by an angle sensor (122) provided at a joint (112) between the holding section (102) and the arm (103). A contact possibility determination device (4) detects the shape and position of the object (109) and compares them with the position of a peripheral object (108) detected by a surroundings monitoring device (3) in order to determine the possibility of contact between the objects. Furthermore, when the object (109) may come into contact with the peripheral object (108), the movement of the manipulator (101) is stopped, or the approach between the held object (109) and the peripheral object (108) is signaled by warning means (5).
PCT/JP2007/070360 2006-10-20 2007-10-18 Manipulateur WO2008047872A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN2007800378723A CN101522377B (zh) 2006-10-20 2007-10-18 机械手
JP2008539872A JPWO2008047872A1 (ja) 2006-10-20 2007-10-18 マニピュレータ

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2006-286419 2006-10-20
JP2006286419 2006-10-20

Publications (1)

Publication Number Publication Date
WO2008047872A1 true WO2008047872A1 (fr) 2008-04-24

Family

ID=39314090

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2007/070360 WO2008047872A1 (fr) 2006-10-20 2007-10-18 Manipulateur

Country Status (3)

Country Link
JP (1) JPWO2008047872A1 (fr)
CN (1) CN101522377B (fr)
WO (1) WO2008047872A1 (fr)

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010127719A (ja) * 2008-11-26 2010-06-10 Canon Inc 作業システム及び情報処理方法
JP2010271978A (ja) * 2009-05-22 2010-12-02 Nippon Telegr & Teleph Corp <Ntt> 行動推定装置
WO2011077693A1 (fr) * 2009-12-21 2011-06-30 Canon Kabushiki Kaisha Système robotisé pour réorienter une pièce retenue
CN102189548A (zh) * 2010-03-05 2011-09-21 发那科株式会社 具有视觉传感器的机器人系统
JP2011200331A (ja) * 2010-03-24 2011-10-13 Fuji Xerox Co Ltd 位置計測システム、位置計測装置及び位置計測プログラム
JP2013036988A (ja) * 2011-07-08 2013-02-21 Canon Inc 情報処理装置及び情報処理方法
JP2013036987A (ja) * 2011-07-08 2013-02-21 Canon Inc 情報処理装置及び情報処理方法
US20130306543A1 (en) * 2010-11-08 2013-11-21 Fresenius Medical Care Deutschland Gmbh Manually openable clamping holder with sensor
US9217636B2 (en) 2012-06-11 2015-12-22 Canon Kabushiki Kaisha Information processing apparatus, information processing method, and a computer-readable storage medium
US20190024348A1 (en) * 2016-03-02 2019-01-24 Kabushiki Kaisha Kobe Seiko Sho (Kobe Steel, Ltd.) Interference prevention device for construction machinery
WO2021070454A1 (fr) * 2019-10-10 2021-04-15 清水建設株式会社 Robot pour travail de construction
JP2021175592A (ja) * 2016-05-20 2021-11-04 グーグル エルエルシーGoogle LLC 物体を取り込む画像に基づき、環境内の将来のロボット運動に関するパラメータに基づいて、ロボットの環境内の物体の動きを予測することに関する機械学習の方法および装置
EP4088888A1 (fr) * 2021-05-14 2022-11-16 Intelligrated Headquarters, LLC Détection de la hauteur des objets pour les opérations de palettisation et de dépalettisation
JP2023029576A (ja) * 2018-04-27 2023-03-03 新明和工業株式会社 作業車両
US11618120B2 (en) 2018-07-12 2023-04-04 Novatron Oy Control system for controlling a tool of a machine

Families Citing this family (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101870110B (zh) 2010-07-01 2012-01-04 三一重工股份有限公司 一种机械铰接臂的控制方法及控制装置
JP5505138B2 (ja) * 2010-07-05 2014-05-28 株式会社安川電機 ロボット装置およびロボット装置による把持方法
DE102010063214A1 (de) * 2010-12-16 2012-06-21 Robert Bosch Gmbh Sicherungseinrichtung für eine Handhabungsvorrichtung, insbesondere einen Industrieroboter, sowie Verfahren zum Betreiben der Sicherungseinrichtung
US9469035B2 (en) * 2011-06-29 2016-10-18 Mitsubishi Electric Corporation Component supply apparatus
FR2982941B1 (fr) * 2011-11-18 2020-06-12 Hexagon Metrology Sas Appareil de mesure comportant un bras a verrouillage indexe
CN103192414B (zh) * 2012-01-06 2015-06-03 沈阳新松机器人自动化股份有限公司 一种基于机器视觉的机器人防撞保护装置及方法
CN103101760A (zh) * 2012-12-28 2013-05-15 长春大正博凯汽车设备有限公司 一种用于工件搬运的视觉搬运系统及其搬运方法
CN104416581A (zh) * 2013-08-27 2015-03-18 富泰华工业(深圳)有限公司 具有报警功能的机械手
CN104802174B (zh) * 2013-10-10 2016-09-07 精工爱普生株式会社 机器人控制系统、机器人、程序以及机器人控制方法
CN108602187A (zh) * 2015-09-09 2018-09-28 碳机器人公司 机械臂系统和物体躲避方法
CN105870814A (zh) * 2016-03-31 2016-08-17 广东电网有限责任公司中山供电局 一种适用于10kV开关紧急分闸的操作装置
JP6548816B2 (ja) * 2016-04-22 2019-07-24 三菱電機株式会社 物体操作装置及び物体操作方法
CN111757796B (zh) * 2018-02-23 2023-09-29 仓敷纺绩株式会社 线状物的前端移动方法、控制装置以及三维照相机
JP7000992B2 (ja) * 2018-05-25 2022-01-19 トヨタ自動車株式会社 マニピュレータおよび移動ロボット
CN108527374A (zh) * 2018-06-29 2018-09-14 德淮半导体有限公司 应用于机械臂的防撞系统和方法
CN113386135A (zh) * 2021-06-16 2021-09-14 深圳谦腾科技有限公司 一种具有2d相机的机械手及其抓取方法

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS63245387A (ja) * 1987-03-30 1988-10-12 豊田工機株式会社 ロボツトの視覚認識装置
JP2004243454A (ja) * 2003-02-13 2004-09-02 Yaskawa Electric Corp ロボットの工具形状指定装置および工具干渉チェック装置

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH05261692A (ja) * 1992-03-17 1993-10-12 Fujitsu Ltd ロボットの作業環境監視装置
DE10319253B4 (de) * 2003-04-28 2005-05-19 Tropf, Hermann Dreidimensional lagegerechtes Zuführen mit Roboter
JP2005001022A (ja) * 2003-06-10 2005-01-06 Yaskawa Electric Corp 物体モデル作成装置及びロボット制御装置

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS63245387A (ja) * 1987-03-30 1988-10-12 豊田工機株式会社 ロボツトの視覚認識装置
JP2004243454A (ja) * 2003-02-13 2004-09-02 Yaskawa Electric Corp ロボットの工具形状指定装置および工具干渉チェック装置

Cited By (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010127719A (ja) * 2008-11-26 2010-06-10 Canon Inc 作業システム及び情報処理方法
JP2010271978A (ja) * 2009-05-22 2010-12-02 Nippon Telegr & Teleph Corp <Ntt> 行動推定装置
WO2011077693A1 (fr) * 2009-12-21 2011-06-30 Canon Kabushiki Kaisha Système robotisé pour réorienter une pièce retenue
US9418291B2 (en) 2009-12-21 2016-08-16 Canon Kabushiki Kaisha Information processing apparatus, information processing method, and computer-readable storage medium
US8326460B2 (en) 2010-03-05 2012-12-04 Fanuc Corporation Robot system comprising visual sensor
JP2011201007A (ja) * 2010-03-05 2011-10-13 Fanuc Ltd 視覚センサを備えたロボットシステム
CN102189548B (zh) * 2010-03-05 2014-06-18 发那科株式会社 具有视觉传感器的机器人系统
CN102189548A (zh) * 2010-03-05 2011-09-21 发那科株式会社 具有视觉传感器的机器人系统
JP2011200331A (ja) * 2010-03-24 2011-10-13 Fuji Xerox Co Ltd 位置計測システム、位置計測装置及び位置計測プログラム
US20130306543A1 (en) * 2010-11-08 2013-11-21 Fresenius Medical Care Deutschland Gmbh Manually openable clamping holder with sensor
US10835665B2 (en) * 2010-11-08 2020-11-17 Fresenius Medical Care Deutschland Gmbh Manually openable clamping holder with sensor
JP2013036988A (ja) * 2011-07-08 2013-02-21 Canon Inc 情報処理装置及び情報処理方法
JP2013036987A (ja) * 2011-07-08 2013-02-21 Canon Inc 情報処理装置及び情報処理方法
US9437005B2 (en) 2011-07-08 2016-09-06 Canon Kabushiki Kaisha Information processing apparatus and information processing method
US9217636B2 (en) 2012-06-11 2015-12-22 Canon Kabushiki Kaisha Information processing apparatus, information processing method, and a computer-readable storage medium
EP3409841A4 (fr) * 2016-03-02 2019-03-20 Kabushiki Kaisha Kobe Seiko Sho (Kobe Steel, Ltd.) Dispositif de prévention d'interférence pour engin de chantier
US20190024348A1 (en) * 2016-03-02 2019-01-24 Kabushiki Kaisha Kobe Seiko Sho (Kobe Steel, Ltd.) Interference prevention device for construction machinery
US11111654B2 (en) 2016-03-02 2021-09-07 Kabushiki Kaisha Kobe Seiko Sho (Kobe Steel, Ltd.) Interference prevention device for construction machinery
JP2021175592A (ja) * 2016-05-20 2021-11-04 グーグル エルエルシーGoogle LLC 物体を取り込む画像に基づき、環境内の将来のロボット運動に関するパラメータに基づいて、ロボットの環境内の物体の動きを予測することに関する機械学習の方法および装置
JP7399912B2 (ja) 2016-05-20 2023-12-18 グーグル エルエルシー 物体を取り込む画像に基づき、環境内の将来のロボット運動に関するパラメータに基づいて、ロボットの環境内の物体の動きを予測することに関する機械学習の方法および装置
JP2023029576A (ja) * 2018-04-27 2023-03-03 新明和工業株式会社 作業車両
JP7427350B2 (ja) 2018-04-27 2024-02-05 新明和工業株式会社 作業車両
US11618120B2 (en) 2018-07-12 2023-04-04 Novatron Oy Control system for controlling a tool of a machine
WO2021070454A1 (fr) * 2019-10-10 2021-04-15 清水建設株式会社 Robot pour travail de construction
JP2021062413A (ja) * 2019-10-10 2021-04-22 清水建設株式会社 建設作業用ロボット
JP7341837B2 (ja) 2019-10-10 2023-09-11 清水建設株式会社 建設作業用ロボット
EP4088888A1 (fr) * 2021-05-14 2022-11-16 Intelligrated Headquarters, LLC Détection de la hauteur des objets pour les opérations de palettisation et de dépalettisation

Also Published As

Publication number Publication date
CN101522377B (zh) 2011-09-14
CN101522377A (zh) 2009-09-02
JPWO2008047872A1 (ja) 2010-02-25

Similar Documents

Publication Publication Date Title
WO2008047872A1 (fr) Manipulateur
JP6436604B2 (ja) 演算システムによって実施される方法及びシステム
CN114728417B (zh) 由远程操作员触发的机器人自主对象学习的方法及设备
JP5216690B2 (ja) ロボット管理システム、ロボット管理端末、ロボット管理方法およびプログラム
JP6567563B2 (ja) 衝突回避および軌道復帰能力を有する人型ロボット
JP4850984B2 (ja) 動作空間提示装置、動作空間提示方法およびプログラム
KR101751405B1 (ko) 작업 기계의 주변 감시 장치
US20120182155A1 (en) Danger presentation device, danger presentation system, danger presentation method and program
JP5276931B2 (ja) 移動体および移動体の位置推定誤り状態からの復帰方法
JP2010120139A (ja) 産業用ロボットの安全制御装置
CN101479082A (zh) 机器人装置和机器人装置的控制方法
JP2019188580A (ja) 情報処理装置、制御方法、ロボットシステム、コンピュータプログラム、及び記憶媒体
KR101615687B1 (ko) 충돌 예측 로봇 원격 제어 시스템 및 그 방법
WO2019146201A1 (fr) Dispositif de traitement d&#39;informations, procédé de traitement d&#39;informations, et système de traitement d&#39;informations
JP5326794B2 (ja) 遠隔操作システム及び遠隔操作方法
CN110856932A (zh) 干涉回避装置以及机器人系统
US20200254610A1 (en) Industrial robot system and method for controlling an industrial robot
US11097414B1 (en) Monitoring of surface touch points for precision cleaning
JP6927937B2 (ja) 三次元骨格表現を生成するためのシステム及び方法
KR20210055650A (ko) 역각 시각화 장치, 로봇 및 역각 시각화 프로그램
JP2019198907A (ja) ロボットシステム
JP2008168372A (ja) ロボット装置及び形状認識方法
JP3565763B2 (ja) マスタアームのリンク位置検出方法
US11926064B2 (en) Remote control manipulator system and remote control assistance system
JP3376029B2 (ja) ロボットの遠隔操作装置

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 200780037872.3

Country of ref document: CN

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 07830094

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2008539872

Country of ref document: JP

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 07830094

Country of ref document: EP

Kind code of ref document: A1