WO2018131237A1 - Système de robot collaboratif et son procédé de commande - Google Patents

Système de robot collaboratif et son procédé de commande

Info

Publication number
WO2018131237A1
Authority
WO
WIPO (PCT)
Prior art keywords
robot arm
worker
arm
robot
sensor
Prior art date
Application number
PCT/JP2017/037221
Other languages
English (en)
Japanese (ja)
Inventor
荒木 宏
良次 澤
浩司 白土
Original Assignee
三菱電機株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 三菱電機株式会社 filed Critical 三菱電機株式会社
Priority to JP2018517660A priority Critical patent/JP6479264B2/ja
Publication of WO2018131237A1 publication Critical patent/WO2018131237A1/fr

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J19/00Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J19/06Safety devices

Definitions

  • The present invention relates to a collaborative robot system in which a robot arm and a worker perform work in the same work space, and to a control method for the collaborative robot system.
  • Such a system has one or more proximity sensors that cover the robot arm over a wide area and detect obstacles such as workers in the operation path of the robot arm. When a danger of interference between the robot arm and the worker is detected, the operation of the robot arm is immediately stopped, or the operation path of the robot arm is changed so as to avoid the worker or the like (see Patent Document 1).
  • In Patent Document 1, an enveloping sphere is set for each link of the manipulator of a robot and for each surrounding object, and interference determination is performed simply by calculating the center-to-center distance between the spheres as the manipulator operates.
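As an aside, the sphere-envelope method attributed to Patent Document 1 reduces to a single comparison per pair of spheres. The sketch below only illustrates that idea; the function name, radii, and margin are hypothetical and not taken from the cited document.

```python
import math

def spheres_interfere(center_a, radius_a, center_b, radius_b, margin=0.0):
    """Sphere-envelope interference test: two enveloping spheres interfere when the
    center-to-center distance is no greater than the sum of their radii (plus a margin)."""
    return math.dist(center_a, center_b) <= radius_a + radius_b + margin

# Example: a manipulator link enveloped by a 0.30 m sphere and an obstacle by a 0.50 m sphere.
if spheres_interfere((0.2, 0.1, 0.8), 0.30, (0.6, 0.1, 0.9), 0.50):
    print("possible interference - stop or re-plan the manipulator motion")
```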
  • In such a configuration, the plurality of proximity sensors must be operated constantly to detect the worker. Therefore, as the number of proximity sensors increases, the processing wait time for worker detection increases and the movement speed or work speed of the robot arm decreases.
  • The present invention has been made to solve the above problems. An object of the present invention is to provide a collaborative robot system, in which a robot arm and a worker work close to each other, that can perform work operations or movement operations at a higher speed than before, and to provide a control method for such a collaborative robot system.
  • A collaborative robot system according to the present invention includes a robot arm and allows a worker to approach the robot arm. The system includes a plurality of proximity sensors that are attached to the robot arm and detect the worker approaching the robot arm, and an arm control unit that selects, from the plurality of proximity sensors, the proximity sensor to be used for worker detection in accordance with the work procedure of the robot arm.
  • Because an arm control unit that selects the proximity sensor is provided, only the proximity sensors necessary for detecting the worker need to be operated. That is, it is not necessary to obtain worker detection information from all of the proximity sensors, and the amount of proximity sensor detection information required for worker detection can be reduced. The processing wait time attributable to the proximity sensors can therefore be shortened. As a result, the approach of the worker to the robot arm can be detected more quickly and reflected in the next work operation, so that the robot arm can be moved or operated at a higher speed.
  • FIG. 2 is a system block diagram assuming hardware and software implementation in the collaborative robot system shown in FIG. 1.
  • FIG. 2 is a diagram showing a coordinate system for calculating a moving speed vector from a change in position coordinates of a robot arm in the collaborative robot system shown in FIG. 1.
  • FIG. 5 is a sequence diagram in which the robot control module, the human detection module, and the sensor ECU share information using a shared memory. FIG. 6 is an operation flowchart at the time of worker detection in the collaborative robot system shown in FIG. 1.
  • FIG. 8 is a characteristic diagram showing the relationship between the received electric field strength of the LF radio wave of the wireless tag and the reception distance in the collaborative robot system shown in FIG. 7.
  • FIG. 8 is a diagram illustrating the received electric field strength of the wireless tag obtained from the first antenna and the reception distance of the wireless tag corresponding to that received electric field strength in the collaborative robot system shown in FIG. 7.
  • FIG. 8 is a diagram showing the received electric field strength of the wireless tag obtained from the second antenna and the reception distance of the wireless tag corresponding to that received electric field strength in the collaborative robot system shown in FIG. 7.
  • FIG. 8 is a diagram illustrating the received electric field strength of the wireless tag obtained from the third antenna and the reception distance of the wireless tag corresponding to that received electric field strength in the collaborative robot system shown in FIG. 7.
  • FIG. 8 is a schematic diagram for explaining position estimation of the wireless tag from the reception distances of the wireless tag obtained from the three antennas in the collaborative robot system shown in FIG. 7.
  • Embodiment 1.
  • A human collaborative robot (hereinafter referred to as a “collaborative robot”) is a work system in which a worker can work in close proximity to a robot arm and perform tasks together with the robot arm.
  • Installing proximity sensors of various detection methods on the robot arm itself is being considered in order to improve work efficiency when the worker approaches and moves away from the robot arm and to ensure the safety of the worker.
  • FIG. 1 shows a schematic configuration of a collaborative robot system 101 according to the embodiment.
  • The collaborative robot system 101 includes, as basic components, a robot arm 1, proximity sensors (non-contact sensors) 2 that are attached to the robot arm 1 and detect the approach and departure of a worker, and an arm control unit 5.
  • The arm control unit 5 has the function of selecting the proximity sensor 2 to be used for worker detection from among the plurality of proximity sensors 2 in accordance with the work procedure of the robot arm 1.
  • The collaborative robot system 101 can further include an environment sensor 3 that detects the worker entering and leaving the work space, and a sensor processing unit 4 that obtains the position of the worker based on the detection information of the proximity sensor 2 selected by the arm control unit 5 and the detection information of the environment sensor 3.
  • Each of these components is described in turn below.
  • The robot arm 1 in this embodiment refers to the entire set of links 1a to 1f including the joints (drive devices), and corresponds to, for example, a 6-axis articulated robot capable of complicated work.
  • The robot arm 1 is not limited to an articulated robot, and may be any robot capable of working cooperatively with a worker.
  • The links 1a to 1f are the constituent elements of the robot arm 1, including its joints.
  • A plurality of proximity sensors 2 are installed on the surfaces (including the joint surfaces) of some or all of the links 1a to 1f constituting the robot arm 1, and detect a worker approaching the robot arm 1.
  • The proximity distance to the worker is assumed to range from 5 m down to 0 m, but it varies depending on the performance of the proximity sensor 2 used.
  • As the proximity sensor 2, an ultrasonic sensor, an optical sensor, a capacitance sensor, a radio wave sensor, or the like can be used; a distance sensor capable of measuring the distance to the worker is used.
  • When a pyroelectric sensor that detects far-infrared radiation from the human body is used as the optical sensor, a sensor capable of measuring the distance to the person may be used in combination.
  • The proximity sensors 2 are installed at the tips of one or more links whose moving speed is particularly high, at positions on the link surfaces or joints defined by the specifications of the robot arm 1. Further, each proximity sensor 2 is used after its detection distance, detection sensitivity, detection time, detection direction, and the like have been calibrated according to its installation position so that the arm control unit 5 can operate it.
  • FIG. 2 shows a case where the proximity sensor 2 is installed at the tip of the link 1a or the like.
  • Ultrasonic sensors 21a to 21e, as an example of the proximity sensor 2, are installed facing the vertical, horizontal, and front-rear directions on a link tip having a rectangular housing.
  • They detect a worker from 2 to 3 m away from the link down to about 20 cm from it. As described above, this detection distance varies depending on the performance of the installed proximity sensor 2.
  • The capacitance sensor 22 is installed on all or part of the surface of the link housing and detects a worker from, for example, about 20 cm away up to contact with the link.
  • This detection distance also varies depending on the performance of the installed proximity sensor 2.
  • The term “proximity sensor 2” is used as a generic term for the above-described ultrasonic sensors 21a to 21e (collectively referred to as the “ultrasonic sensor 21”), the capacitance sensor 22, and the like.
  • Each proximity sensor 2 is assigned a sensor ID, which is a number for identifying it among the plurality of proximity sensors 2.
  • Each of the links 1a to 1f may have a columnar or curved shape, and the installation positions and directions of the proximity sensors 2 described above are relative; they may be any positions defined in the specifications of the robot arm 1, as long as they can be set in the arm control unit 5.
  • The detection distance of the proximity sensors 2 installed on the robot arm 1 is varied according to the moving speed of the robot arm 1, but it may instead be a fixed length.
  • By setting the detection distance of the proximity sensor 2 to a maximum detection distance based on the link length of the robot arm 1, for example a detection distance greater than or equal to the link length, a worker present within the reach of the link can be detected.
  • This also allows the number of proximity sensors 2 installed on the robot arm 1 to be reduced. For example, even if a link is long and has a wide operating range, a proximity sensor 2 with a detection distance equal to or longer than the link length can cover that wide operating range with as few as one proximity sensor 2, so the number of installed proximity sensors 2 can be reduced.
  • In addition, by using proximity sensors 2 with different detection ranges, a proximity range that is a dead zone for one proximity sensor can be covered and complemented by another proximity sensor, so that dead zones in detecting the worker can be eliminated.
  • The environment sensor 3 monitors the entire work space of the robot arm 1 and detects the worker entering and leaving the work space of the robot arm 1.
  • The environment sensor 3 uses area sensors, such as a surveillance camera or a laser scanner, to identify the worker and to detect the worker's moving direction, moving speed, and the like from the acquired images as changes in the work space.
  • By using a plurality of such area sensors, the work space of the robot arm 1 and the worker can be captured as a stereoscopic image. The three-dimensional positions of the worker and the robot arm 1 in the work space can therefore be specified, and the avoidance operation of the robot arm 1 with respect to the worker can be accurately controlled.
  • The sensor processing unit 4 identifies the position of the worker from the distance information between the robot arm 1 and the worker obtained from the proximity sensors 2 and the entry/exit information of the worker in the work space obtained from the environment sensor 3, and notifies the arm control unit 5 of the worker's position information. As will be described later, the distance detection information from the proximity sensors 2 is selected by the arm control unit 5 as necessary. The output signal of the environment sensor 3 may also be supplied directly to the arm control unit 5.
  • The sensor processing unit 4 may be integrated with the arm control unit 5 as a function of the arm control unit 5.
  • The arm control unit 5 controls the work operations of the links 1a to 1f, including the joints, which constitute the working arm of the robot arm 1. That is, when the worker approaches the robot arm 1, the arm control unit 5 selects, based on the operation procedure of the robot arm 1 at that time, one or more proximity sensors 2 arranged in the movement direction in which interference with the worker could cause danger, and notifies the sensor processing unit 4 of the selection.
  • The sensor processing unit 4, notified of the proximity sensor selection, operates the proximity sensors 2 corresponding to the selection information, acquires distance information to the worker from those proximity sensors 2, and notifies the arm control unit 5 of that information.
  • The arm control unit 5 analyzes the interference between the robot arm 1 and the worker from the obtained distance information to the worker, and changes the working state or operating state of the robot arm 1.
  • The “work procedure of the robot arm 1” information used by the arm control unit 5 to select the proximity sensors 2 is record information registered in the arm control unit 5 in accordance with a predetermined work procedure.
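A minimal sketch of how such pre-registered work-procedure information could be held, assuming it is simply a table from procedure step to the sensor IDs to activate; the step names and IDs below are illustrative placeholders, not values from the patent.

```python
# Hypothetical mapping from work-procedure step to the proximity-sensor IDs
# that face the direction the arm moves in during that step.
WORK_PROCEDURE_SENSORS = {
    "pick_part":   [3, 4],      # sensors on the link tip moving toward the parts tray
    "move_to_jig": [1, 2, 5],   # sensors along the transfer path
    "insert_part": [6],         # sensor near the hand during the fine insertion motion
}

def select_sensor_ids(step_name):
    """Return the sensor IDs registered for the given work-procedure step."""
    return WORK_PROCEDURE_SENSORS.get(step_name, [])

print(select_sensor_ids("pick_part"))   # -> [3, 4]
```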
  • FIG. 3 shows a system block configuration of hardware and software to which the present embodiment is applied, and shows a robot arm 1, an arm control unit 5 corresponding to a robot controller, and a sensor processing unit 4.
  • The arm control unit 5 includes a robot control module 51 that performs operation control and work procedure control of the robot arm 1, a human detection module 43, and a human recognition module 44.
  • The sensor processing unit 4 includes a sensor ECU (Electric Control Unit) 41 that performs signal processing for the proximity sensors 2 and the environment sensor 3, and a communication I/F (interface) 42.
  • An ultrasonic sensor 21 and a capacitance sensor 22 as the proximity sensors 2 are connected to the sensor ECU 41.
  • The proximity sensor 2 may instead be a sensor of another detection method.
  • The human detection module 43 and the human recognition module 44 may be included in the sensor processing unit 4.
  • The ultrasonic sensor 21 among the proximity sensors 2 measures the time of flight of ultrasonic echoes to a worker ranging from far away to nearby, and the sensor ECU 41 converts this into distance information and supplies it to the arm control unit 5 via the communication I/F 42. The capacitance sensor 22 among the proximity sensors 2 outputs distance information of about 20 cm or less, as described above.
  • When the robot arm 1 has stopped, the capacitance sensor 22 can also be used as a man-machine interface serving as a work restart switch, operated by intentionally touching the robot arm 1 again.
  • When the capacitance sensor 22 is configured as such a man-machine interface, the sensor processing unit 4 or the arm control unit 5 is provided with a function for restarting the robot arm 1.
  • The stopped robot arm 1 can thus be restarted without the effort of restarting it through the arm control unit 5, and work efficiency can be improved.
  • The arm control unit 5, corresponding to the robot controller, incorporates the software and hardware constituting the robot control module 51, the human recognition module 44, the human detection module 43, and the like.
  • Whereas a conventional robot controller performs only robot control, these added modules enable collaborative work between the robot arm 1 and a human in the collaborative robot system 101.
  • The functions of the added human recognition module 44 and human detection module 43 may be integrated into the robot control module 51, or they may be implemented as a controller configured with hardware separate from the robot controller.
  • The human recognition module 44 is a functional unit, comprising software and hardware, that processes images acquired by a surveillance camera, which is an example of the environment sensor 3, and detects the presence or absence, position, movement, and the like of the worker in a predetermined work space.
  • The human recognition module 44 may be integrated with the environment sensor 3 (surveillance camera).
  • The human detection module 43 selects the proximity sensor 2 installed at a location calculated from the predetermined position coordinates along which the robot arm 1 operates, that is, from the work procedure of the robot arm 1, and outputs the position information of the worker obtained from the selected proximity sensor 2.
  • The functions of the human detection module 43 are as follows:
    - Input the world coordinates (X, Y, Z) of the tip of the robot arm 1
    - Calculate the velocity vector of the tip of the robot arm 1 from the world coordinates (X, Y, Z)
    - Select and instruct the sensor IDs to be measured from the velocity vector
    - Read the distance information of the proximity object (worker) from the proximity sensors with the selected sensor IDs
    - Calculate the speed of the proximity object (worker) from the sensor distance information
    - Record the time stamp determined by the distance information and the speed information
    - Output the measured distance information and time stamp for the input world coordinates (X, Y, Z)
    - Logging of proximity sensor data
    - Self-diagnosis, etc.
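The sketch below mirrors the step list above as a single per-cycle routine. The sensor-ECU interface (`select_ids_for_direction`, `read_distance`) and the update cycle are hypothetical placeholders for illustration, not the module's actual API.

```python
import time

class HumanDetectionModule:
    """Illustrative per-cycle processing corresponding to the function list above."""

    def __init__(self, sensor_ecu, delta_t_ms=50):
        self.sensor_ecu = sensor_ecu          # object that reads the proximity sensors (assumed)
        self.delta_t = delta_t_ms / 1000.0    # sensor update cycle in seconds
        self.prev_tip = None                  # previous arm-tip world coordinates
        self.prev_distance = {}               # previous distance per sensor ID
        self.log = []                         # proximity-sensor data logging

    def cycle(self, tip_xyz):
        # 1. Velocity vector of the arm tip from the change in world coordinates.
        velocity = (0.0, 0.0, 0.0)
        if self.prev_tip is not None:
            velocity = tuple((c - p) / self.delta_t for c, p in zip(tip_xyz, self.prev_tip))
        self.prev_tip = tip_xyz

        # 2. Select the sensor IDs to measure from the velocity vector (placeholder rule).
        sensor_ids = self.sensor_ecu.select_ids_for_direction(velocity)

        # 3. Read distances, derive the worker's speed, time-stamp, and log the data.
        results = []
        for sid in sensor_ids:
            distance = self.sensor_ecu.read_distance(sid)
            speed = 0.0
            if sid in self.prev_distance:
                speed = (self.prev_distance[sid] - distance) / self.delta_t  # > 0: approaching
            self.prev_distance[sid] = distance
            results.append({"sensor_id": sid, "distance": distance,
                            "speed": speed, "timestamp": time.time()})
        self.log.extend(results)
        return results
```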
  • The world coordinates are absolute position coordinates assigned to the work space of the robot arm 1 and correspond to the position coordinates of the actual work space. Alternatively, relative coordinates may be used, with the origin of the work space of the robot arm 1 aligned with the origin of the actual work space.
  • The “calculation of the velocity vector at the tip of the robot arm 1” function is described in detail below.
  • The “tip of the robot arm 1” corresponds to, for example, the tip of the “link 1f” shown in FIG. 1 or the tip of each of the links 1a to 1f.
  • The velocity vector of the tip of the robot arm 1 is calculated as follows: the change in the world coordinates over a time difference is taken, and the velocity vector of the tip of the robot arm 1 is computed from it. The time difference is assumed to be the update cycle Δt (in ms) of the proximity sensor 2, and it can be set and adjusted.
  • Similarly, the sensor ECU 41 takes the change in the distance information it obtains over a time difference and calculates the speed of the proximity object (worker).
  • This time difference is also assumed to be the sensor update cycle Δt (in ms) and can be set and adjusted.
  • The ultrasonic sensor 21 starts distance measurement in response to an output of a specified duration (adjustable) from the sensor ECU 41.
  • After a mask time (adjustable) has elapsed from the input trigger, the sensor ECU 41 measures the time from the start of reading to the end of reading and sets it as the ultrasonic propagation time T [µs].
  • The capacitance sensor 22 starts its proximity detection output in response to an output of a specified duration (adjustable) from the sensor ECU 41.
  • After a mask time (adjustable) has elapsed from the input trigger, the sensor ECU 41 samples the proximity detection output several times between the start of reading and the end of reading, and sets the logic to H for proximity and L for non-proximity. This H/L assignment may be reversed.
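A short sketch of the two ECU-side conversions just described, assuming a round-trip time-of-flight model for the ultrasonic sensor (speed of sound taken as roughly 343 m/s) and a simple majority vote over the sampled capacitance output; both choices are illustrative assumptions, not the patent's specified processing.

```python
SPEED_OF_SOUND_M_PER_S = 343.0  # approximate speed of sound in air at room temperature

def ultrasonic_distance_m(propagation_time_us):
    """Convert the round-trip ultrasonic propagation time T [us] into a one-way distance [m]."""
    t_seconds = propagation_time_us * 1e-6
    return SPEED_OF_SOUND_M_PER_S * t_seconds / 2.0  # divide by 2: the echo travels out and back

def capacitance_proximity_logic(samples):
    """Decide the logic level (H = proximity, L = non-proximity) by majority vote
    over the samples read between the start and end of the reading window."""
    high_count = sum(1 for s in samples if s)
    return "H" if high_count > len(samples) // 2 else "L"

print(ultrasonic_distance_m(5830))                 # ~1.0 m to the worker
print(capacitance_proximity_logic([1, 1, 0, 1]))   # "H": worker within contact range
```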
  • When the sensor ECU 41 is configured separately from the arm control unit 5, the sensor ECU 41 shares sensor information with the robot control module 51 and the human detection module 43 through wired or wireless communication. Information is exchanged among the robot control module 51, the human detection module 43, and the sensor ECU 41 through the shared memory 53.
  • The shared memory 53 has two memories; each side writes to one of them and reads from the other.
  • The human recognition module 44 may also use the shared memory 53. Even when the sensor ECU 41 is configured integrally with the arm control unit 5, information may be shared in the same manner.
  • FIG. 5 shows an information exchange sequence among the robot control module 51, the human detection module 43, and the sensor ECU 41.
  • The current position of the tip of the robot arm 1 is written by the robot control module 51 into the shared memory 53 shared with the human detection module 43, and the human detection module 43 designates the sensor IDs of the proximity sensors 2 to be operated.
  • In designating the sensor IDs, all of the plurality of proximity sensors 2 may be designated, or only some of them (one or more) may be designated.
  • The human detection module 43 records the current position of the tip of the robot arm 1 written in the shared memory 53.
  • The human detection module 43 has an operation counter and records a count value each time the current position of the tip of the robot arm 1 is written or recorded. If this count value keeps incrementing or decrementing, it indicates that the human detection module 43 is operating, and it can be used to detect abnormal operation of the human detection module 43.
  • The human detection module 43 outputs to the sensor ECU 41 an operation instruction for the proximity sensors 2 having the designated sensor IDs.
  • The sensor ECU 41 reports the distance information to the worker sent by the proximity sensors 2 corresponding to the sensor IDs designated by the human detection module 43.
  • The sensor ECU 41 also has an operation counter and records a count value each time it reports distance information. If this count value keeps incrementing or decrementing, it indicates that the sensor ECU 41 is operating, and it can be used to detect abnormal operation of the sensor ECU 41.
  • The human detection module 43 calculates the speed from the difference between the current distance information and the distance information output the previous time.
  • The human detection module 43 writes the information notified by the sensor ECU 41, such as the distance and speed, into the shared memory 53.
  • The robot control module 51 determines from the information in the shared memory 53 whether the worker is approaching or withdrawing, and corrects the position of the robot arm 1 or stops its operation.
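A condensed sketch of the FIG. 5 exchange, with a plain dictionary standing in for the shared memory 53 and the operation counters used as liveness indicators. The field names, the single-process structure, and the minimum-distance aggregation are assumptions made for illustration only.

```python
DELTA_T_S = 0.05  # assumed sensor update cycle Δt (50 ms)

shared_memory = {               # stand-in for the shared memory 53
    "tip_position": None,       # written by the robot control module 51
    "selected_sensor_ids": [],  # written by the human detection module 43
    "distance": None,           # worker distance reported via the sensor ECU 41
    "speed": None,              # approach speed derived from successive distances
    "detection_counter": 0,     # operation counter of the human detection module
    "ecu_counter": 0,           # operation counter of the sensor ECU
}

def robot_control_writes(tip_xyz):
    """Robot control module 51: write the current tip position."""
    shared_memory["tip_position"] = tip_xyz

def human_detection_cycle(select_ids, read_distance):
    """Human detection module 43: select sensors, collect distances via the ECU,
    derive the worker's approach speed, and write the results back."""
    ids = select_ids(shared_memory["tip_position"])
    shared_memory["selected_sensor_ids"] = ids
    shared_memory["detection_counter"] += 1              # proves the module is running

    distance = min(read_distance(i) for i in ids) if ids else None
    shared_memory["ecu_counter"] += 1                    # proves the ECU responded
    prev = shared_memory["distance"]
    if distance is None or prev is None:
        shared_memory["speed"] = None
    else:
        shared_memory["speed"] = (prev - distance) / DELTA_T_S  # > 0: worker approaching
    shared_memory["distance"] = distance

def counters_alive(snapshot):
    """Watchdog: counters that stop changing indicate abnormal operation."""
    return (shared_memory["detection_counter"] != snapshot["detection_counter"]
            and shared_memory["ecu_counter"] != snapshot["ecu_counter"])
```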
  • With the collaborative robot system 101, the approach and departure of the worker with respect to the robot arm 1 can be detected quickly. It is therefore possible to prevent a worker who approaches the robot arm 1 from coming into contact with the working robot arm 1, and to achieve both the safety of the worker and high-speed operation of the robot arm 1.
  • FIG. 6 shows the series of operations performed when the collaborative robot system 101 detects a worker.
  • The arm control unit 5 analyzes the position, operation procedure, operation speed, and operation acceleration of the entire robot arm 1 (step 10b).
  • The operation procedure of the robot arm 1 and the proximity sensors 2 optimal for worker detection are then associated step by step (10c).
  • The arm control unit 5 selects the optimum proximity sensors from the plurality of proximity sensors 2 in order to detect the worker quickly (10e), and collects from the selected proximity sensors 2 the distance information indicating the worker's proximity (10f).
  • The arm control unit 5 (specifically, the human detection module 43) calculates the position, speed, acceleration, and the like of the worker from the collected distance information (10g), and determines the approach of the worker to the link on which the selected proximity sensor 2 is installed (10h).
  • The robot control module 51 analyzes the specified series of work procedures and calculates a motion vector from the current world coordinates of the tip of the robot arm 1 and the world coordinates t seconds later. From the obtained motion vector of the tip of the robot arm 1, the robot control module 51 extracts the proximity sensors 2 on the link in the moving direction and gives the human detection module 43 a detection instruction for those proximity sensors 2. The human detection module 43 acquires the distance and speed detected by the instructed proximity sensors 2. Using the acquired distance and speed to the worker as the target object, the robot control module 51 detects the approach of the worker to the link in the moving direction and determines whether or not the robot arm 1 and the worker will interfere with each other, that is, whether or not the worker is safe.
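Extracting the sensors "on the link in the moving direction" can be illustrated with a dot-product test between the arm-tip motion vector and each sensor's facing direction; the sensor table and the cosine threshold below are assumptions, not the patent's registered values.

```python
import math

# Hypothetical table: sensor ID -> unit vector of the direction the sensor faces (world frame).
SENSOR_DIRECTIONS = {
    1: (1.0, 0.0, 0.0),   # facing +X
    2: (-1.0, 0.0, 0.0),  # facing -X
    3: (0.0, 1.0, 0.0),   # facing +Y
    4: (0.0, 0.0, 1.0),   # facing up
}

def motion_vector(p_now, p_future, seconds):
    """Motion vector of the arm tip from the current world coordinates and those t seconds later."""
    return tuple((f - n) / seconds for n, f in zip(p_now, p_future))

def sensors_in_moving_direction(v, min_cosine=0.5):
    """Select the sensor IDs whose facing direction roughly matches the motion vector."""
    norm = math.sqrt(sum(c * c for c in v))
    if norm == 0.0:
        return []  # arm not moving: nothing to select on this criterion
    unit = tuple(c / norm for c in v)
    return [sid for sid, d in SENSOR_DIRECTIONS.items()
            if sum(u * di for u, di in zip(unit, d)) >= min_cosine]

v = motion_vector((0.4, 0.2, 0.5), (0.7, 0.2, 0.5), 1.0)   # tip moving along +X
print(sensors_in_moving_direction(v))                      # -> [1]
```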
  • Whether the robot arm 1 and the worker will interfere with each other, that is, whether or not the worker is safe, is determined according to the following time condition and distance condition. Either the time condition or the distance condition alone may be used in the determination.
  • Worker speed V1
  • Robot arm 1 speed V2
  • The judgment condition for whether or not the worker is safe is as follows.
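The concrete inequalities are defined in terms of the worker speed V1 and robot arm speed V2 named above. Purely as an illustration of how a combined distance and time criterion could look, the sketch below uses an assumed stopping time and margin; it is not the patent's actual condition.

```python
def is_safe(distance_m, v1_worker, v2_robot, stop_time_s=0.5, margin_m=0.2):
    """Illustrative safety check only (assumed form and constants).

    Distance condition: the current separation must exceed the distance the worker and the
    robot arm can close together while the arm is brought to a stop, plus a margin.
    Time condition: the time until they would meet at the current closing speed must
    exceed the stopping time."""
    closing_speed = v1_worker + v2_robot                      # worst case: both move toward each other
    distance_ok = distance_m > closing_speed * stop_time_s + margin_m
    time_ok = closing_speed <= 0 or distance_m / closing_speed > stop_time_s
    return distance_ok and time_ok

# Worker approaching at 1.5 m/s, arm moving at 0.5 m/s, currently 1.0 m apart:
print(is_safe(1.0, 1.5, 0.5))   # False -> slow down or stop the robot arm
```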
  • In this way, the safety of the worker is enhanced, the safety distance between the robot arm 1 and the worker is secured, and unnecessary deceleration or stopping of the robot arm 1 caused by the movement or position of the worker can be avoided. The work efficiency of both the robot arm 1 and the worker can thereby be improved.
  • Embodiment 2. FIG. 7 shows a schematic configuration of the collaborative robot system 102 according to the second embodiment.
  • A wireless position detection unit 9 is provided in place of the environment sensor 3 of the first embodiment.
  • The wireless position detection unit 9 includes antennas 7 that transmit and receive radio waves; one or more antennas 7 are arranged so as to cover the work space of the robot arm 1 and the worker with radio waves.
  • The worker wears a wireless tag 8 that can be detected by the wireless position detection unit 9.
  • The other components are the same as those of the collaborative robot system 101 of the first embodiment. Therefore, the following description focuses on this difference, and description of the identical components is omitted.
  • The wireless position detection unit 9 detects the two-dimensional or three-dimensional position coordinates of the wireless tag 8 carried by the worker.
  • The radio frequency band used by the wireless position detection unit 9 is the LF band (125 kHz, 21 kHz, etc.), and the unit transmits radio waves in the LF band.
  • The wireless tag 8 receives the LF-band radio wave transmitted from the wireless position detection unit 9 and transmits the received strength of that LF-band radio wave back using a radio frequency band different from the LF band, such as the VHF or UHF band.
  • The wireless position detection unit 9 receives the LF-band reception strength transmitted from the wireless tag 8. These operations are repeated for each of the installed antennas 7.
  • The electric field strength of the LF-band radio wave from the wireless tag 8 has the distance attenuation characteristic shown in FIG. 8. That is, the attenuation is roughly linear when the wireless tag 8 and the wireless position detection unit 9 are close to each other, and as the distance increases the field strength tends to decay in proportion to the cube of the distance from the antenna 7. In addition, LF-band radio propagation is less degraded by shielding from the robot arm 1 or its peripheral devices and equipment.
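A small sketch of turning a received field strength into a distance under the cube-law far-field attenuation just described, assuming a single calibration measurement at a reference distance; the calibration constants are placeholders, not measured values.

```python
def distance_from_field_strength(v_received, v_ref=1.0, d_ref=1.0):
    """Estimate the antenna-to-tag distance from the received LF field strength,
    assuming the strength falls off with the cube of distance beyond the near-field region:
        V = v_ref * (d_ref / d)**3   ->   d = d_ref * (v_ref / V)**(1/3)
    v_ref is the strength measured at the calibration distance d_ref (placeholders)."""
    return d_ref * (v_ref / v_received) ** (1.0 / 3.0)

print(distance_from_field_strength(0.125))  # 0.125 = (1/2)**3 -> about 2.0 m from the antenna
```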
  • The distance between each antenna 7 and the worker is obtained as follows: the distance R1 is identified from the received electric field strength V1 of the first antenna 7-1, the distance R2 is similarly identified from the received electric field strength V2 of the second antenna 7-2, and the distance R3 is identified from the received electric field strength V3 of the third antenna 7-3.
  • The position of the wireless tag 8 lies on a circle of radius R1 centered on the position (x1, y1) of the first antenna 7-1.
  • The position of the wireless tag 8 likewise lies on a circle of radius R2 centered on the position (x2, y2) of the second antenna 7-2 and on a circle of radius R3 centered on the position (x3, y3) of the third antenna 7-3.
  • The position (x0, y0) of the wireless tag 8 is located at the intersection of the three circles described above, so the wireless position detection unit 9 can estimate the position of the wireless tag 8.
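One common way to compute the intersection of the three circles is to subtract the circle equations pairwise and solve the resulting linear system, as sketched below; the antenna layout in the example is illustrative only.

```python
def estimate_tag_position(antennas, distances):
    """Estimate the 2D tag position (x0, y0) from three antenna positions and the
    distances R1, R2, R3 derived from the received field strengths.
    antennas:  [(x1, y1), (x2, y2), (x3, y3)]
    distances: [R1, R2, R3]"""
    (x1, y1), (x2, y2), (x3, y3) = antennas
    r1, r2, r3 = distances
    # Subtracting the first circle equation from the other two yields two linear equations
    # a*x + b*y = e and c*x + d*y = f in the unknown tag position (x, y).
    a, b = 2 * (x2 - x1), 2 * (y2 - y1)
    c, d = 2 * (x3 - x1), 2 * (y3 - y1)
    e = r1**2 - r2**2 - x1**2 + x2**2 - y1**2 + y2**2
    f = r1**2 - r3**2 - x1**2 + x3**2 - y1**2 + y3**2
    det = a * d - b * c
    if abs(det) < 1e-9:
        raise ValueError("antennas must not be collinear")
    x0 = (e * d - b * f) / det
    y0 = (a * f - e * c) / det
    return x0, y0

# Tag actually at (1, 1); antennas placed at corners of the work space (illustrative layout).
print(estimate_tag_position([(0, 0), (4, 0), (0, 4)],
                            [2**0.5, 10**0.5, 10**0.5]))  # -> approximately (1.0, 1.0)
```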
  • Based on the worker's position obtained from the wireless position detection unit 9, the arm control unit 5 calculates whether the worker is present in the work space of the robot arm 1 and the position of the worker relative to the robot arm 1.
  • The proximity sensors 2 are then selected from the worker's position and the work procedure of the robot arm 1, the approach or departure of the worker is detected, the safety of the worker is ensured, and a decrease in the work efficiency of the robot arm 1 due to the approach of the worker is prevented.
  • The wireless tag 8 may be assigned a type ID that identifies the worker.
  • The type ID indicates, for example, the worker's skill level, years of service, gender, or role such as maintenance personnel or general personnel, and the combination of the safety distance between the robot arm 1 and the worker and the controlled operation speed can be changed according to the type ID. Thus, at the same time as the position of the wireless tag 8 worn by the worker is detected, the worker can be identified from the type of the wireless tag 8, and the controlled operation speed of the robot arm 1 can be changed accordingly. The work efficiency of the robot arm 1 can therefore be improved while ensuring safety appropriate to the type of worker and the degree of approach of the worker engaged in the work.
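A minimal sketch of how a type ID could map to a safety distance and controlled operation speed combination; the profile names and numerical values are illustrative assumptions, and any real values would come from a risk assessment rather than from the patent text.

```python
# Illustrative type-ID profiles only (assumed names and values).
TYPE_ID_PROFILES = {
    "maintenance": {"safety_distance_m": 0.3, "max_arm_speed_mm_s": 250},
    "skilled":     {"safety_distance_m": 0.5, "max_arm_speed_mm_s": 400},
    "general":     {"safety_distance_m": 1.0, "max_arm_speed_mm_s": 150},
}

def arm_limits_for_tag(type_id):
    """Look up the safety-distance / operation-speed combination for the worker's
    wireless-tag type ID, falling back to the most conservative profile."""
    return TYPE_ID_PROFILES.get(type_id, TYPE_ID_PROFILES["general"])

print(arm_limits_for_tag("skilled"))   # -> {'safety_distance_m': 0.5, 'max_arm_speed_mm_s': 400}
```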

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Manipulator (AREA)

Abstract

The present invention relates to a collaborative robot system capable of performing work operations or movements faster than before, and to a control method therefor. The collaborative robot system, which is provided with a robot arm (1) and in which a worker can approach said robot arm, is provided with multiple proximity sensors (2) that are installed on the robot arm and are intended to detect the worker approaching the robot arm, and an arm control unit (5) for selecting, from among the multiple proximity sensors, the proximity sensor to be used for detecting the worker in accordance with a work procedure of the robot arm.
PCT/JP2017/037221 2017-01-13 2017-10-13 Système de robot collaboratif et son procédé de commande WO2018131237A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2018517660A JP6479264B2 (ja) 2017-01-13 2017-10-13 協働ロボットシステム、及びその制御方法

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017-003887 2017-01-13
JP2017003887 2017-01-13

Publications (1)

Publication Number Publication Date
WO2018131237A1 true WO2018131237A1 (fr) 2018-07-19

Family

ID=62840306

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/037221 WO2018131237A1 (fr) 2017-01-13 2017-10-13 Système de robot collaboratif et son procédé de commande

Country Status (2)

Country Link
JP (1) JP6479264B2 (fr)
WO (1) WO2018131237A1 (fr)

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102010063214A1 (de) * 2010-12-16 2012-06-21 Robert Bosch Gmbh Sicherungseinrichtung für eine Handhabungsvorrichtung, insbesondere einen Industrieroboter, sowie Verfahren zum Betreiben der Sicherungseinrichtung
WO2014207299A1 (fr) * 2013-06-25 2014-12-31 Tekno-Ants Oy Procédé et système de guidage servant à l'utilisation d'un robot
JP6771888B2 (ja) * 2015-01-29 2020-10-21 キヤノン株式会社 ロボット装置、制御方法、物品の製造方法、プログラム及び記録媒体
JP6455310B2 (ja) * 2015-05-18 2019-01-23 本田技研工業株式会社 動作推定装置、ロボット、及び動作推定方法

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS63109996A (ja) * 1986-05-15 1988-05-14 三菱電機株式会社 ロボット作業域監視装置
JP2006043792A (ja) * 2004-08-02 2006-02-16 Yaskawa Electric Corp 衝突防止機能付ロボット
JP2007203519A (ja) * 2006-01-31 2007-08-16 Yushin Precision Equipment Co Ltd 成形品取出機
JP2010055250A (ja) * 2008-08-27 2010-03-11 Toyota Motor Corp 移動体の位置情報取得システム及び取得方法
JP2010120139A (ja) * 2008-11-21 2010-06-03 New Industry Research Organization 産業用ロボットの安全制御装置
JP2011073079A (ja) * 2009-09-29 2011-04-14 Daihen Corp 動物体の監視装置
JP2011125975A (ja) * 2009-12-18 2011-06-30 Denso Wave Inc ロボットの干渉回避装置
US20120022689A1 (en) * 2010-07-23 2012-01-26 Agile Planet, Inc. System and Method for Robot Safety and Collision Avoidance
JP2013082071A (ja) * 2013-02-12 2013-05-09 Toyota Motor East Japan Inc 作業支援システム

Cited By (30)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11585082B2 (en) * 2017-07-31 2023-02-21 Germán Becerril Hernández Automated system for robotised construction and construction method
JP2020055045A (ja) * 2018-09-28 2020-04-09 学校法人福岡大学 ロボットセンサ
JP7153322B2 (ja) 2018-09-28 2022-10-14 学校法人福岡大学 ロボットセンサ
JP2021054660A (ja) * 2018-10-30 2021-04-08 株式会社Mujin 自動パッケージ登録メカニズムを備えたロボットシステム、および、その動作方法
JP7466150B2 (ja) 2018-10-30 2024-04-12 株式会社Mujin 自動パッケージ登録メカニズムを備えたロボットシステム、および、その動作方法
CN111251291A (zh) * 2018-11-30 2020-06-09 发那科株式会社 机器人的监控系统以及机器人系统
JP2020082318A (ja) * 2018-11-30 2020-06-04 ファナック株式会社 ロボットの監視システムおよびロボットシステム
US11325259B2 (en) 2018-11-30 2022-05-10 Fanuc Corporation Monitor system for robot and robot system
JP2020093373A (ja) * 2018-12-14 2020-06-18 オムロン株式会社 ロボット干渉判定装置、ロボット干渉判定方法、ロボット制御装置、およびロボット制御システム
WO2020144852A1 (fr) * 2019-01-11 2020-07-16 株式会社Fuji Dispositif de commande, appareil de travail de pièce, système de travail de pièce et procédé de commande
JPWO2020144852A1 (ja) * 2019-01-11 2021-09-27 株式会社Fuji 制御装置、ワーク作業装置、ワーク作業システム及び制御方法
JP7145237B2 (ja) 2019-01-11 2022-09-30 株式会社Fuji 制御装置、ワーク作業装置、ワーク作業システム及び制御方法
JP2020146826A (ja) * 2019-03-15 2020-09-17 株式会社デンソーウェーブ ロボット及び人の位置検出システム
JP2020192635A (ja) * 2019-05-28 2020-12-03 オムロン株式会社 安全監視システム、安全監視制御装置、および安全監視方法
JP7226101B2 (ja) 2019-05-28 2023-02-21 オムロン株式会社 安全監視システム、安全監視制御装置、および安全監視方法
US11281906B2 (en) 2019-06-21 2022-03-22 Fanuc Corporation Monitor device provided with camera for capturing motion image of motion of robot device
CN112297061A (zh) * 2019-07-30 2021-02-02 精工爱普生株式会社 检测方法及机器人
CN112297061B (zh) * 2019-07-30 2023-12-19 精工爱普生株式会社 检测方法及机器人
JP7398780B2 (ja) 2019-09-30 2023-12-15 Johnan株式会社 監視システム、監視方法およびプログラム
JP2021053741A (ja) * 2019-09-30 2021-04-08 Johnan株式会社 監視システム、監視方法およびプログラム
WO2021065879A1 (fr) * 2019-09-30 2021-04-08 Johnan株式会社 Système de surveillance, procédé de surveillance et programme
WO2021065881A1 (fr) * 2019-09-30 2021-04-08 Johnan株式会社 Dispositif de commande, procédé de commande et programme
JP7362107B2 (ja) 2019-09-30 2023-10-17 Johnan株式会社 制御装置、制御方法およびプログラム
JP2021053743A (ja) * 2019-09-30 2021-04-08 Johnan株式会社 制御装置、制御方法およびプログラム
WO2021193914A1 (fr) * 2020-03-27 2021-09-30 三井化学株式会社 Robot, système de robot et programme de commande de robot
JPWO2021193914A1 (fr) * 2020-03-27 2021-09-30
JP7371878B2 (ja) 2020-03-27 2023-10-31 三井化学株式会社 ロボット、ロボットシステム、およびロボット制御プログラム
WO2021215390A1 (fr) * 2020-04-24 2021-10-28 住友理工株式会社 Dispositif capacitif de détection de proximité
JP7460435B2 (ja) 2020-04-24 2024-04-02 住友理工株式会社 静電容量型近接検出装置
EP3909727A1 (fr) * 2020-05-12 2021-11-17 Soremartec S.A. Dispositif de commande

Also Published As

Publication number Publication date
JPWO2018131237A1 (ja) 2019-01-17
JP6479264B2 (ja) 2019-03-06

Similar Documents

Publication Publication Date Title
JP6479264B2 (ja) 協働ロボットシステム、及びその制御方法
US10546167B2 (en) System and method of operating a manufacturing cell
US10452939B2 (en) Monitoring system, monitoring device, and monitoring method
EP3718699B1 (fr) Détection de proximité dans des environnements d'assemblage comportant des machines
US10864637B2 (en) Protective-field adjustment of a manipulator system
JP4822926B2 (ja) 無線発信機の三次元位置を推定する方法、プログラムおよびシステム
US20230099779A1 (en) Metrology system
US9030674B2 (en) Method and apparatus for secure control of a robot
CN113156364A (zh) 安全系统和方法
CN110856932A (zh) 干涉回避装置以及机器人系统
KR101921113B1 (ko) 탐지용 하이브리드 가시광 rfid 태그 및 이에 사용되는 로봇시스템
Fetzner et al. A 3D representation of obstacles in the robots reachable area considering occlusions
JP2020093373A (ja) ロボット干渉判定装置、ロボット干渉判定方法、ロボット制御装置、およびロボット制御システム
BR102020005643A2 (pt) Detecção de proximidade em ambientes de montagem tendo maquinário
JP2021011000A (ja) 協調制御装置およびロボットシステム
Araiza-lllan et al. Dynamic regions to enhance safety in human-robot interactions
JP7272221B2 (ja) 情報処理装置、ロボット、および情報処理システム
JP2020192635A (ja) 安全監視システム、安全監視制御装置、および安全監視方法
EP3718700B1 (fr) Détection de proximité dans des environnements d'assemblage dotés de machines
US20240012383A1 (en) Clustering and detection system and method for safety monitoring in a collaborative workspace
US20230196495A1 (en) System and method for verifying positional and spatial information using depth sensors
US20230256606A1 (en) Robot System with Object Detecting Sensors
CN116673949A (zh) 一种机器人安全控制方法、装置、电子设备和介质
WO2023158598A1 (fr) Système de robot avec capteurs de détection d'objet

Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 2018517660

Country of ref document: JP

Kind code of ref document: A

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17891518

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17891518

Country of ref document: EP

Kind code of ref document: A1