WO2017141577A1 - Impact prediction device, impact prediction system, control device, impact prediction method, and impact prediction program - Google Patents

Impact prediction device, impact prediction system, control device, impact prediction method, and impact prediction program Download PDF

Info

Publication number
WO2017141577A1
Authority
WO
WIPO (PCT)
Prior art keywords
impact
control
subject
information
unit
Prior art date
Application number
PCT/JP2017/000575
Other languages
French (fr)
Japanese (ja)
Inventor
小也香 内藤
嘉一 森
一希 笠井
Original Assignee
オムロン株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by オムロン株式会社
Publication of WO2017141577A1 publication Critical patent/WO2017141577A1/en

Links

Images

Classifications

    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B25 - HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J - MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J19/00 - Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J19/02 - Sensing devices
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B25 - HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J - MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J19/00 - Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J19/06 - Safety devices

Definitions

  • The present invention relates to an impact prediction device that predicts the impact that a control target imparts to a subject through contact, a control device that controls the control target based on the prediction result of such an impact prediction device, an impact prediction system equipped with such an impact prediction device, an impact prediction method using such an impact prediction device, and an impact prediction program for realizing such an impact prediction device.
  • A method for measuring the impact when a machine collides is disclosed in, for example, Patent Document 1.
  • A method for controlling a robot after a collision is disclosed in, for example, Patent Document 2.
  • However, Patent Document 1 and Patent Document 2 do not disclose a technique concerning a collision between a machine such as an industrial robot and a person.
  • In addition, with the conventional approach of detecting the degree of impact after a collision, sufficient safety must be ensured when it is applied to an area where there is a risk of collision between a machine and a person.
  • The present invention has been made in view of such circumstances, and its main object is to provide an impact prediction device capable of predicting the impact before a machine and a person come into contact, by calculating the degree of impact when the machine contacts the person based on the person's motion, the machine's motion, and the machine's specifications.
  • Another object of the present invention is to provide a control device that controls a controlled object based on the impact prediction result of the impact prediction device according to the present invention.
  • Another object of the present invention is to provide an impact prediction system including the impact prediction device according to the present invention.
  • A further object of the present invention is to provide an impact prediction method using the impact prediction device according to the present invention.
  • A further object of the present invention is to provide an impact prediction program for realizing the impact prediction device according to the present invention.
  • The impact prediction device described in the present application is an impact prediction device that predicts the impact that a control target imparts to a subject through contact, and comprises: a specification acquisition unit that acquires specification information on the specifications of the control target; a subject motion acquisition unit that acquires subject motion information indicating the subject's motion based on measurement results for a measurement target part of the subject's body; a control target motion acquisition unit that acquires control target motion information indicating the motion of the control target; and an impact degree calculation unit that calculates, based on the acquired subject motion information, the acquired control target motion information, and the acquired specification information, an impact degree indicating the degree of impact that the control target imparts to the subject when the control target contacts the subject.
  • In the impact prediction device, the subject motion acquisition unit acquires subject motion information for a plurality of subjects, and the impact degree calculation unit identifies the subject who will be contacted.
  • In the impact prediction device, the specification acquisition unit acquires, as specification information, at least one of shape information, material information, and motion information about the parts constituting the control target.
  • In the impact prediction device, the subject motion acquisition unit acquires subject motion information calculated based on at least one of position information and posture information of a measurement target part of the subject's body.
  • At least one of the position information and posture information used by the subject motion acquisition unit is information calculated based on at least one of the velocity, acceleration, angular velocity, angular acceleration, pressure, and magnetism of the measurement target part of the subject's body.
  • The calculated subject motion information acquired by the subject motion acquisition unit is the velocity or acceleration of a calculation target part of the subject.
  • The impact prediction device further comprises a control target control unit that outputs a control command for controlling the control target based on the impact degree calculated by the impact degree calculation unit.
  • The control target control unit outputs a control command that causes the control target to continue its operation, stop its operation, reduce its operation speed, perform a contact avoidance operation, or perform an operation that reduces the impact degree.
  • The control device described in the present application is a control device that controls a control target, and comprises: an input unit that receives, from the impact prediction device, an input of the impact degree calculated by the impact degree calculation unit; and a control target control unit that outputs a control command for controlling the control target based on the impact degree received by the input unit.
  • The control target control unit outputs a control command that causes the control target to continue its operation, stop its operation, reduce its operation speed, perform a contact avoidance operation, or perform an operation that reduces the impact degree.
  • The impact prediction system described in the present application comprises a mounting device that can be worn by a subject, a control target that operates based on control, and the impact prediction device. The mounting device includes a measurement unit that measures a measurement target part of the subject's body, and the control target includes an output unit that outputs information about its motion. The subject motion acquisition unit of the impact prediction device acquires subject motion information based on the measurement results of the mounting device's measurement unit, and the control target motion acquisition unit of the impact prediction device acquires control target motion information based on the motion information output from the control target.
  • The impact prediction method described in the present application is an impact prediction method for predicting the impact that a control target imparts to a subject through contact, and includes the steps of: a specification acquisition unit acquiring specification information on the specifications of the control target; a subject motion acquisition unit acquiring subject motion information indicating the subject's motion based on measurement results for a measurement target part of the subject's body; a control target motion acquisition unit acquiring control target motion information indicating the motion of the control target; and an impact degree calculation unit calculating, based on the subject motion information acquired by the subject motion acquisition unit, the control target motion information acquired by the control target motion acquisition unit, and the specification information acquired by the specification acquisition unit, an impact degree indicating the degree of impact that the control target imparts to the subject when the control target contacts the subject.
  • The impact prediction program described in the present application is an impact prediction program for causing a computer to predict the impact that a control target imparts to a subject through contact, and causes the computer to execute the steps of: acquiring specification information on the specifications of the control target; acquiring subject motion information indicating the subject's motion based on measurement results for a measurement target part of the subject's body; acquiring control target motion information indicating the motion of the control target; and calculating, based on the acquired subject motion information, the acquired control target motion information, and the acquired specification information, an impact degree indicating the degree of impact that the control target imparts to the subject when the control target contacts the subject.
  • The impact prediction device, control device, impact prediction system, impact prediction method, and impact prediction program described in this application can calculate, based on the subject's motion, the control target's motion, and the control target's specification information, an impact degree indicating the degree of impact that the control target imparts to the subject when the control target contacts the subject.
  • The present invention calculates an impact degree indicating the degree of impact that a control target imparts to a subject when the control target contacts the subject, based on the subject's motion derived from measurement results for a measurement target part of the subject's body, the control target's motion, and specification information on the control target's specifications.
  • As a result, the impact degree can be predicted before the worker contacts the control target, which makes it possible to control the control target according to the impact degree and thus improves safety when the worker and the control target work in cooperation.
  • FIG. 1 is an explanatory diagram conceptually illustrating an example of an impact prediction system described in the present application.
  • The impact prediction system described in the present application is applied to systems such as FA (Factory Automation) systems in which a work robot (hereinafter, robot) 2 that operates according to predetermined control commands serves as the control target and a worker (subject) works in cooperation with the robot 2.
  • The worker wears mounting devices 3, each including various inertial sensors such as an acceleration sensor and a gyro sensor, on measurement target parts of the body, for example the head, upper arm, forearm, chest, abdomen, thigh, and lower leg.
  • the mounting apparatus 3 including various inertial sensors measures a measurement target portion of the subject's body and outputs various measurement information indicating the measurement results.
  • As a sensor attached to the mounting device 3, a sensor such as a magnetic sensor or a pressure sensor can be used in addition to an inertial sensor such as an acceleration sensor or a gyro sensor.
  • each worker who is a target of impact prediction mounts these mounting devices 3.
  • In FIG. 1, the parts where the inertial sensors are located are indicated by white circles on the worker wearing the mounting devices 3.
  • the impact prediction system includes an impact prediction device 1 that acquires various types of information output from the mounting device 3 and the robot 2 and predicts an impact when the operator and the robot 2 come into contact with each other.
  • the impact prediction apparatus 1 is configured using a computer such as a control computer that controls the robot 2, for example.
  • The impact prediction device 1 is communicably connected to the mounting devices 3 and the robot 2, directly or indirectly, via a communication network such as a local area network (LAN) using a wireless or wired communication method, and exchanges various information and signals with them.
  • the robot 2 and the impact prediction device 1 are described as different devices, but the impact prediction device 1 may be a device incorporated in the robot 2. In that case, a circuit or a program that functions as the impact prediction device 1 may be incorporated into a part of the control circuit that controls the operation of the robot 2. Furthermore, the robot 2 may be plural, and in that case, the impact prediction device 1 may be incorporated in each robot 2.
  • FIG. 2 is a block diagram illustrating an example of a hardware configuration of the impact prediction device 1 and other various devices included in the impact prediction system described in the present application.
  • the impact prediction apparatus 1 includes a control unit 10 and a recording unit 11, and further includes a measurement information acquisition unit 12, a robot input / output unit 13, and an information input unit 14 as interfaces with other devices.
  • The control unit 10 is configured using a processor such as a CPU (Central Processing Unit) and a memory such as a register, and controls the entire device by executing various instructions.
  • The recording unit 11 includes nonvolatile memory such as a ROM (Read Only Memory) or EPROM (Erasable Programmable Read Only Memory), volatile memory such as a RAM (Random Access Memory), and recording media such as a hard disk drive or semiconductor memory, and records data such as various programs and information. An impact prediction program PG that causes a computer such as a control computer to function as the impact prediction device 1 according to the present invention is recorded in the recording area of the recording unit 11.
  • A part of the recording area of the recording unit 11 is used as databases such as a human body model recording unit 11a, which records human body models schematically representing the body shape of each worker, and a robot specification recording unit 11b, which records specification information indicating the specifications of the robot 2.
  • the human body model is a numerical model schematically representing the body of each worker using numerical values such as lengths of various parts such as the upper arm, forearm, thigh, and lower leg.
  • a human body model relating to each worker is recorded in association with worker specifying information (worker ID) for specifying the worker.
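  • As a minimal illustration only (the segment names, field layout, and numeric values below are assumptions for this sketch, not a recording format defined by the present application), such a human body model record keyed by worker specifying information could look like this:

```python
from dataclasses import dataclass, field
from typing import Dict

@dataclass
class HumanBodyModel:
    """Schematic numerical model of a worker's body (segment lengths in meters)."""
    worker_id: str
    segment_lengths_m: Dict[str, float] = field(default_factory=dict)

# Hypothetical stand-in for the human body model recording unit 11a: worker ID -> model.
human_body_model_recording_unit: Dict[str, HumanBodyModel] = {}

def record_human_body_model(model: HumanBodyModel) -> None:
    """Record a human body model in association with the worker specifying information."""
    human_body_model_recording_unit[model.worker_id] = model

record_human_body_model(HumanBodyModel(
    worker_id="worker-001",
    segment_lengths_m={"upper_arm": 0.30, "forearm": 0.26, "thigh": 0.45, "lower_leg": 0.42},
))
```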
  • A recording device such as a server computer for recording various information may be connected to the impact prediction device 1, and a part of the recording area of that connected recording device may be used as databases such as the human body model recording unit 11a and the robot specification recording unit 11b. That is, the databases such as the human body model recording unit 11a and the robot specification recording unit 11b can be designed in various forms, as long as the control unit 10 of the impact prediction device 1 can access them to record and read data.
  • FIG. 3 is an explanatory diagram conceptually showing an example of recorded contents of the robot specification recording unit 11b provided in the impact prediction apparatus 1 described in the present application.
  • In the robot specification recording unit 11b, information related to the specifications of the robot 2 is recorded for each part of the robot 2, indicated as "A", "B", and "C" in the figure.
  • As the specification information on the specifications of the robot 2, various information is recorded, such as information on the shape of the part (e.g., link length), information on the material of the part (e.g., weight and coefficient of restitution), and information on the motion of the part (e.g., movable range and maximum speed).
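  • As a rough sketch, assuming illustrative field names and placeholder values (parts "A", "B", and "C" follow the labeling in the figure, but the numbers are not actual robot data), the per-part specification record described above might be organized as follows:

```python
from dataclasses import dataclass
from typing import Dict, Tuple

@dataclass
class PartSpecification:
    """Specification information recorded per part of the robot 2 (illustrative fields)."""
    link_length_m: float                   # shape-related information
    weight_kg: float                       # material-related information
    restitution_coefficient: float         # material-related information
    movable_range_deg: Tuple[float, float] # motion-related information
    max_speed_mps: float                   # motion-related information

# Hypothetical stand-in for the contents of the robot specification recording unit 11b.
robot_specification_recording_unit: Dict[str, PartSpecification] = {
    "A": PartSpecification(0.40, 12.0, 0.3, (-170.0, 170.0), 1.5),
    "B": PartSpecification(0.35, 8.0, 0.3, (-120.0, 120.0), 2.0),
    "C": PartSpecification(0.20, 3.5, 0.4, (-360.0, 360.0), 2.5),
}
```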
  • the measurement information acquisition unit 12 is an interface that acquires various types of information such as measurement information indicating the results of measurement from each mounting device 3.
  • the robot input / output unit 13 is an interface that acquires various information from the robot 2 and outputs various commands.
  • The information input unit 14 is an interface for communicating with various devices used to input various information, such as information indicating a human body model and specification information on the specifications of the robot 2. Note that these interfaces do not necessarily have to exist as separate devices; they can be shared as appropriate, and a single device may also be provided with a plurality of interfaces for inputting and outputting different information.
  • The information input unit 14 can be designed as appropriate, for example to accept input of various information such as human body models and specification information from an information input device such as a tablet computer, or to accept such input from portable recording media such as various semiconductor memories.
  • A computer such as a control computer functions as the impact prediction device 1 by reading various programs recorded in the recording unit 11, for example the impact prediction program PG, and executing, under the control of the control unit 10, various steps such as acquisition of specification information, acquisition of subject motion information, acquisition of control target motion information, and calculation of the impact degree.
  • The robot 2 includes various components such as a control unit that controls the entire device, an operation unit such as an arm unit that performs work, a sensor unit that measures its motion, such as position and posture, and an input/output unit that communicates with the impact prediction device 1.
  • The mounting device 3 includes various components such as a measurement unit using sensors, such as an acceleration sensor and a gyro sensor, that detect information related to motion, and an output unit.
  • various sensors such as a magnetic sensor and a pressure sensor can be used in addition to an acceleration sensor and a gyro sensor.
  • The mounting device 3 measures physical quantities such as velocity, angular velocity, acceleration, angular acceleration, pressure, and magnetism at the measurement target part, and outputs the measured results as measurement information, such as raw data indicating the measurement results regarding the worker's motion.
  • The impact prediction device 1 acquires, through the measurement information acquisition unit 12, measurement information indicating physical quantities such as velocity, angular velocity, acceleration, angular acceleration, pressure, and magnetism, and obtains state information such as position information and posture information calculated based on the acquired measurement information.
  • the position information can be calculated by integrating the acceleration measured by the acceleration sensor twice.
  • The calculation of state information, such as position information and posture information, from measurement information indicating physical quantities such as velocity, angular velocity, acceleration, angular acceleration, pressure, and magnetism may be performed by either the mounting device 3 or the impact prediction device 1.
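  • For example, the double integration of acceleration mentioned above can be sketched as a simple Euler scheme over fixed-period samples; a real implementation would additionally compensate for gravity, sensor bias, and drift:

```python
from typing import List, Tuple

def integrate_acceleration(acc_samples: List[float], dt: float,
                           v0: float = 0.0, p0: float = 0.0) -> Tuple[List[float], List[float]]:
    """Integrate acceleration twice (Euler method) to obtain velocity and position per axis."""
    velocities, positions = [], []
    v, p = v0, p0
    for a in acc_samples:
        v += a * dt          # first integration: acceleration -> velocity
        p += v * dt          # second integration: velocity -> position
        velocities.append(v)
        positions.append(p)
    return velocities, positions

# Usage: accelerometer samples along one axis of a measurement target part at 100 Hz.
vel, pos = integrate_acceleration([0.0, 0.5, 1.0, 0.5, 0.0], dt=0.01)
```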
  • FIG. 4 is a functional block diagram showing an example of the functional configuration of the impact prediction device 1 and other various devices provided in the impact prediction system described in the present application.
  • By executing various programs such as the impact prediction program PG, the impact prediction device 1 realizes, under the control of the control unit 10, the functions of calculation units that perform various calculations: a human body state acquisition unit 10a, a human body model calculation unit 10b, a human body model acquisition unit 10c, a human body motion calculation unit 10d, a human body motion acquisition unit (subject motion acquisition unit) 10e, a robot motion calculation unit 10f, a robot motion acquisition unit (control target motion acquisition unit) 10g, an approach part determination unit 10h, a robot specification acquisition unit 10i, an impact degree calculation unit 10j, and a robot control unit (control target control unit) 10k.
  • various arithmetic units that realize these various functions can be implemented as dedicated circuits using semiconductor chips such as LSI (Large Scale Integration) and VLSI (Very Large Scale Integration).
  • The human body state acquisition unit 10a calculates and acquires state information, such as position information and posture information of the measurement target parts, using the measurement information indicating physical quantities such as velocity, angular velocity, acceleration, angular acceleration, pressure, and magnetism that the measurement information acquisition unit 12 acquires from the mounting devices 3.
  • When state information such as position information and posture information is output from the mounting device 3 and acquired by the measurement information acquisition unit 12, the human body state acquisition unit 10a of the control unit 10 uses the acquired state information as it is as information for the subsequent calculations.
  • the human body model calculation unit 10b performs a calculation for calculating a human body model that schematically represents the shape of the worker's body based on various state information such as position information and posture information acquired by the human body state acquisition unit 10a.
  • The calculation of the human body model from the various state information based on measurement by the mounting devices 3 worn by the worker can be performed using various existing techniques. For example, it is possible to apply the method described in Kanesugi Hiroshi and Shibasaki Ryosuke, "Measurement and Analysis of Human Motion Using Wearable Sensors", Japan Photogrammetric Society, 2005 Annual Scientific Papers, pp. 199-202, June 2005.
  • the human body model acquisition unit 10c executes a process of acquiring the human body model recorded in the human body model recording unit 11a.
  • The human body motion calculation unit 10d performs a calculation of the motion of calculation target parts of the worker's body, such as the hands and feet, based on the state information acquired by the human body state acquisition unit 10a and the human body model acquired by the human body model acquisition unit 10c from the human body model recording unit 11a.
  • the operation of the calculation target part is calculated as, for example, human body motion information (subject action information) indicating operations such as the position of the calculation target part, the motion direction, the predicted position after a predetermined time, and the predicted motion direction.
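  • The "predicted position after a predetermined time" could, under a simple constant-velocity assumption (an assumption made here only for illustration; the present application does not fix a particular extrapolation model), be computed as in the following sketch:

```python
from typing import Tuple

Vector3 = Tuple[float, float, float]

def predict_position(position: Vector3, velocity: Vector3, horizon_s: float) -> Vector3:
    """Predict where a calculation target part (e.g. a hand) will be after horizon_s seconds,
    assuming it keeps moving at its current velocity."""
    return tuple(p + v * horizon_s for p, v in zip(position, velocity))

# Usage: a hand at (0.8, 0.2, 1.1) m moving at 0.4 m/s along x, predicted 0.5 s ahead.
predicted = predict_position((0.8, 0.2, 1.1), (0.4, 0.0, 0.0), horizon_s=0.5)
```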
  • the human body motion acquisition unit 10e executes a process of acquiring the human body motion information calculated by the human body motion calculation unit 10d.
  • the robot motion calculation unit 10f communicates with the robot 2 via the robot input / output unit 13 that functions as the robot state acquisition unit 13a. Then, the robot operation calculation unit 10f receives input of robot state information indicating the position, posture, and the like of each part from the robot 2, and executes calculation for calculating the operation of the robot 2 based on the received input.
  • the robot motion acquisition unit 10g executes a process of acquiring robot motion information (control target motion information) indicating the motion of the robot 2 calculated by the robot motion calculation unit 10f.
  • The approach part determination unit 10h determines the approach situation for each part of the worker and each part of the robot 2, based on the worker's motion, which is the calculation result of the human body motion calculation unit 10d, and the motion of the robot 2, which is the calculation result of the robot motion calculation unit 10f.
  • The determination of the approach situation consists of identifying the worker, the part of the worker, and the part of the robot 2 that are approaching or are predicted to approach each other, and deriving conditions such as the relative speed between the approaching parts, the prediction of contact, and the relative speed between the parts that are predicted to come into contact.
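  • A minimal sketch of such an approach determination between one worker part and one robot part is shown below; the constant-velocity extrapolation and the distance threshold are illustrative assumptions:

```python
import math
from typing import Tuple

Vector3 = Tuple[float, float, float]

def relative_speed(v_worker: Vector3, v_robot: Vector3) -> float:
    """Magnitude of the relative velocity between an approaching worker part and robot part."""
    return math.sqrt(sum((vr - vw) ** 2 for vw, vr in zip(v_worker, v_robot)))

def contact_predicted(p_worker: Vector3, v_worker: Vector3,
                      p_robot: Vector3, v_robot: Vector3,
                      horizon_s: float = 0.5, threshold_m: float = 0.05) -> bool:
    """Predict contact if the extrapolated positions come within threshold_m of each other."""
    def advance(p, v):
        return [pi + vi * horizon_s for pi, vi in zip(p, v)]
    pw, pr = advance(p_worker, v_worker), advance(p_robot, v_robot)
    distance = math.sqrt(sum((a - b) ** 2 for a, b in zip(pw, pr)))
    return distance <= threshold_m
```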
  • the robot specification acquisition unit 10i executes a process of acquiring specification information recorded in the robot specification recording unit 11b.
  • The impact degree calculation unit 10j performs a calculation that derives, as an impact degree, the impact that the robot 2 would give to the worker through contact, based on the approach situation that is the calculation result of the approach part determination unit 10h and the specification information of each part of the robot 2 recorded in the robot specification recording unit 11b.
  • The impact degree is calculated based on items such as the pressure, force, maximum transmission energy, and kinetic energy applied to the worker when the robot 2 contacts the worker. Therefore, for example, the higher the relative speed at the time of contact between the robot 2 and the worker, and the greater the weight of the robot 2's part, the greater the calculated impact degree.
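  • As one hedged example of such a calculation, the impact degree could be scored from the kinetic energy of the robot part at the relative contact speed, using the part weight and coefficient of restitution from the specification information; the formula and scaling below are an illustrative assumption, not the calculation prescribed by the present application:

```python
def impact_degree(relative_speed_mps: float, part_weight_kg: float,
                  restitution_coefficient: float = 0.3) -> float:
    """Illustrative impact degree: kinetic energy of the robot part at the relative speed,
    scaled by (1 + e) since a more elastic contact transfers a larger impulse."""
    kinetic_energy_j = 0.5 * part_weight_kg * relative_speed_mps ** 2
    return kinetic_energy_j * (1.0 + restitution_coefficient)

# Usage: an 8.0 kg part approaching a worker at a relative speed of 1.2 m/s.
degree = impact_degree(relative_speed_mps=1.2, part_weight_kg=8.0)
```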
  • The robot control unit 10k executes a calculation that derives a method for controlling the robot 2 based on the impact degree calculated by the impact degree calculation unit 10j, and outputs, as the calculation result, a control command indicating the control method for the robot 2.
  • the output of the control command is executed as an output to the robot 2 via the robot input / output unit 13 that functions as the robot control command output unit 13b.
  • The method for controlling the robot 2 is derived, for example, by comparing the calculated impact degree with predetermined thresholds set in advance in multiple stages. For example, if the impact degree is less than a predetermined first threshold, it is determined that contact would pose no problem, and control is performed so that the operation of the robot 2 continues.
  • If the impact degree is equal to or greater than the first threshold, the robot 2 is controlled to perform an operation that reduces the impact degree. Further, if the impact degree is equal to or greater than a second threshold, control is performed so that the operation of the robot 2 is stopped or its motion trajectory is corrected so as not to make contact.
  • The derived control method is not limited to these examples and is set appropriately according to the calculated impact degree. That is, the robot control unit 10k derives a control method such as continuing the operation, stopping the operation, decelerating the operation speed, performing a contact avoidance operation, or performing an impact-reducing operation, and outputs a control command for executing the derived control method.
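  • A minimal sketch of the staged threshold comparison described above is shown below; the threshold values and command names are placeholders, and actual values would be set according to the robot and the required safety level:

```python
def derive_control_command(impact_degree: float,
                           first_threshold: float = 10.0,
                           second_threshold: float = 50.0) -> str:
    """Map the calculated impact degree to a control method in multiple stages."""
    if impact_degree < first_threshold:
        return "CONTINUE_OPERATION"          # contact would pose no problem
    if impact_degree < second_threshold:
        return "REDUCE_IMPACT"               # e.g. decelerate or soften the motion
    return "STOP_OR_AVOID_CONTACT"           # stop, or correct the trajectory to avoid contact

# Usage: derive a control command for an impact degree of 23.5 (arbitrary units).
command = derive_control_command(23.5)
```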
  • the information input unit 14 functions as a human body model input unit 14a that receives input of the human body model of each worker from the human body model input device 4 that is an external device, and records the received human body model in the human body model recording unit 11a.
  • The information input unit 14 also functions as a robot specification input unit 14b that receives input of the specification information of the robot 2 from the robot specification input device 5, which is an external device, and records the received specification information in the robot specification recording unit 11b.
  • FIG. 5 is a flowchart showing an example of the robot specification recording process of the impact prediction apparatus 1 provided in the impact prediction system described in the present application.
  • the robot specification recording process is a process of accepting input of specification information from the robot specification input device 5 that records specification information and recording it in the robot specification recording unit 11b.
  • the control unit 10 included in the impact prediction apparatus 1 executes a robot specification recording process by executing various programs such as the impact prediction program PG.
  • the control unit 10 of the impact prediction apparatus 1 accepts input of specification information from the robot specification input device 5 through the robot specification input unit 14b (S101), and records the received specification information in the robot specification recording unit 11b (S102).
  • specification information such as information on the shape of the part of the robot 2, information on the material, and information on the operation is recorded for each part.
  • specification information is recorded for each robot 2 and each part.
  • FIG. 6 is a flowchart showing an example of the first human body model recording process of the impact prediction device 1 provided in the impact prediction system described in the present application.
  • The first human body model recording process is a process in which a worker wearing the mounting devices 3 performs a motion, information based on the motion is acquired, and a human body model calculated from the acquired information is recorded in the human body model recording unit 11a.
  • the control unit 10 included in the impact prediction apparatus 1 executes the first human body model recording process by executing various programs such as the impact prediction program PG.
  • the control unit 10 of the impact prediction apparatus 1 acquires measurement information as a result of measuring the operator's operation by the measurement information acquisition unit 12 from the mounting apparatus 3 worn by the worker (S201).
  • The acquisition of measurement information in step S201 means that the worker wearing the various inertial sensors as the mounting devices 3 performs a predetermined reference motion, and the measurement information acquisition unit 12 acquires measurement information indicating at least one of physical quantities such as the velocity, angular velocity, acceleration, angular acceleration, pressure, and magnetism of the measurement target parts measured while the worker performs the reference motion.
  • The control unit 10 uses the human body state acquisition unit 10a to calculate and acquire state information, such as position information and posture information of the measurement target parts, based on the measurement information acquired by the measurement information acquisition unit 12 (S202).
  • Step S202 is a process of calculating at least one of the position information and posture information of the subject as state information based on the measurement information obtained by measuring the motion of the subject acquired by the measurement information acquisition unit 12.
  • The control unit 10 then calculates, through the human body model calculation unit 10b, a human body model that schematically represents the shape of the worker's body based on the acquired state information (S203).
  • The control unit 10 records the calculated human body model in the human body model recording unit 11a in association with worker specifying information that identifies the worker who performed the motion (S204).
  • FIG. 7 is a flowchart showing an example of the second human body model recording process of the impact prediction device 1 provided in the impact prediction system described in the present application.
  • the second human body model recording process is a process of receiving an input of a human body model from the human body model input device 4 that records a human body model calculated in advance and recording it in the human body model recording unit 11a.
  • the control unit 10 provided in the impact prediction apparatus 1 executes the second human body model recording process by executing various programs such as the impact prediction program PG.
  • The control unit 10 of the impact prediction device 1 receives input of a human body model and the corresponding worker specifying information from the human body model input device 4 through the human body model input unit 14a (S301). The control unit 10 then associates the received human body model with the worker specifying information and records it in the human body model recording unit 11a (S302).
  • For recording a human body model, either the first human body model recording process or the second human body model recording process may be used, and another method may also be used.
  • FIG. 8 is a flowchart showing an example of the impact prediction process of the impact prediction device 1 provided in the impact prediction system described in the present application.
  • the impact prediction process is a process for predicting the impact that the control target gives to the subject by contact.
  • the control unit 10 included in the impact prediction apparatus 1 executes impact prediction processing by executing various programs such as the impact prediction program PG.
  • The control unit 10 of the impact prediction device 1 executes the process of calculating the worker's motion state and the process of calculating the motion state of the robot 2 as parallel processes. That is, as one process, the control unit 10 acquires, through the measurement information acquisition unit 12, measurement information indicating the results of the mounting devices 3 measuring the worker's motion (S401), and calculates and acquires, through the human body state acquisition unit 10a, human body state information indicating the state of the human body, such as position information and posture information (S402). The control unit 10 further acquires a human body model from the human body model recording unit 11a through the human body model acquisition unit 10c (S403). The control unit 10 then calculates, through the human body motion calculation unit 10d, the motion of the calculation target parts of the worker (human body) based on the acquired human body state information and human body model (S404), and acquires, through the human body motion acquisition unit 10e, human body motion information indicating the motion of the calculation target parts (S405). This calculation of the motion state of the calculation target parts is performed for each worker.
  • As the other process, the control unit 10 acquires, through the robot state acquisition unit 13a, robot state information indicating the robot state, such as the position and posture of each part, from the robot 2 (S406). The control unit 10 then calculates the motion of the robot 2 based on the acquired robot state information through the robot motion calculation unit 10f (S407), and acquires robot motion information indicating the motion of the robot 2 through the robot motion acquisition unit 10g (S408).
  • The control unit 10 then determines, through the approach part determination unit 10h, the approach situation for each part of each worker and each part of the robot 2 based on the human body motion information and the robot motion information, and determines whether the worker and the robot 2 will come into contact (S409).
  • If it is determined in step S409 that the worker and the robot 2 will come into contact (S409: YES), the control unit 10 acquires the specification information of each part of the robot 2 from the robot specification recording unit 11b through the robot specification acquisition unit 10i (S410), and calculates the impact degree between the worker and the robot 2 based on the approach situation and the specification information through the impact degree calculation unit 10j (S411). In step S411, the impact degree is calculated based on the approach situation of each part of each worker and each part of the robot 2, and the specification information of each part of the robot 2 recorded in the robot specification recording unit 11b.
  • The calculation of the impact degree is performed as identification of the worker who is predicted to be contacted, identification of the contact parts of the worker and the robot 2, and derivation of an impact degree that quantifies the impact the robot 2 would give to the subject through the contact. That is, in step S411, the impact that the robot 2 would give to the worker through contact is predicted as the calculation of the impact degree. If it is determined in step S409 that the worker and the robot 2 will not come into contact (S409: NO), the control unit 10 executes again, in parallel, the process of calculating the worker's motion state and the process of calculating the motion state of the robot 2.
  • After calculating the impact degree, the control unit 10 performs arbitrary control of the robot 2 based on the impact degree through the robot control unit 10k (S412).
  • The arbitrary control in step S412 means that control predetermined in advance is executed based on the impact degree calculated in step S411, such as causing the robot 2 to continue its operation, stop its operation, decelerate its operation speed, perform a contact avoidance operation, or perform an impact-reducing operation.
  • the robot control unit 10k derives a control method for arbitrary control based on the degree of impact, and outputs a control command for executing the derived control method to the robot 2 via the robot control command output unit 13b.
  • The control unit 10 then determines whether to end the impact prediction process (S413).
  • If it is determined in step S413 that the impact prediction process is to be ended (S413: YES), the control unit 10 ends the impact prediction process. If it is determined that the impact prediction process is not to be ended (S413: NO), the control unit 10 executes again, in parallel, the process of calculating the worker's motion state and the process of calculating the motion state of the robot 2.
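  • Pulling steps S401 to S413 together, the overall impact prediction process could be sketched as the following loop; every function here is a deliberately trivial stand-in (hypothetical names, random dummy values) for the units 10a to 10k and the interfaces 12 and 13 described above, and the two motion-state calculations that the embodiment runs as parallel processes are shown sequentially for brevity:

```python
import random

# Trivial, hypothetical stand-ins for the calculation units of FIG. 8; a real system
# would replace these with the units 10a-10k and the interfaces 12/13.
def acquire_worker_motion(worker_id: str) -> dict:                            # S401-S405
    return {"worker": worker_id, "speed_mps": random.uniform(0.0, 1.5), "part": "hand"}

def acquire_robot_motion() -> dict:                                           # S406-S408
    return {"speed_mps": random.uniform(0.0, 2.0), "part": "B", "weight_kg": 8.0}

def contact_predicted(worker_motion: dict, robot_motion: dict) -> bool:       # S409
    return random.random() < 0.1

def calculate_impact_degree(worker_motion: dict, robot_motion: dict) -> float:  # S410-S411
    # Crude placeholder: kinetic energy of the robot part at an approximate relative speed.
    relative_speed = worker_motion["speed_mps"] + robot_motion["speed_mps"]
    return 0.5 * robot_motion["weight_kg"] * relative_speed ** 2

def control_robot(impact_degree: float) -> None:                              # S412
    print("control command derived for impact degree", impact_degree)

def impact_prediction_process(worker_ids, cycles: int = 100) -> None:
    """Illustrative main loop corresponding to S401-S413 of FIG. 8."""
    for _ in range(cycles):                     # S413: end after a fixed number of cycles
        robot_motion = acquire_robot_motion()
        for worker_id in worker_ids:
            worker_motion = acquire_worker_motion(worker_id)
            if contact_predicted(worker_motion, robot_motion):
                control_robot(calculate_impact_degree(worker_motion, robot_motion))

impact_prediction_process(["worker-001", "worker-002"])
```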
  • As described above, the impact prediction device 1 predicts the impact that the control target gives to the subject through contact, based on the subject's motion, the motion of the robot 2, and the specifications of the robot 2. As a result, the impact can be predicted before the worker contacts the robot 2, so that arbitrary control according to the impact becomes possible, such as continuing the operation of the robot 2, performing an impact-mitigating operation, stopping, performing a contact avoidance operation, or collecting information. Therefore, the impact prediction device 1 can improve both safety and productivity, because the operation of the robot 2 does not need to be stopped more than necessary while safety is maintained.
  • In the above embodiment, the robot motion information is acquired by calculating the motion of the robot 2 based on state information acquired from the robot 2; however, the present invention is not limited to this, and can be developed in various forms, such as calculating the motion of the robot 2 based on a preprogrammed control command.
  • In the above embodiment, a scalar quantity is used as the impact degree obtained by quantifying the impact; however, the present invention is not limited to this, and can be developed in various forms, such as using a vector quantity as the impact degree.
  • In the above embodiment, the impact prediction device 1 performs various calculations using calculation units such as the human body model calculation unit 10b, the human body motion calculation unit 10d, and the robot motion calculation unit 10f; however, the invention is not limited to this. That is, the invention can be developed in various forms, such as providing some of these calculation functions in another device, with the impact prediction device 1 acquiring the calculation results from that device and the impact degree calculation unit 10j calculating the impact degree.
  • Although the above embodiment shows a form in which the impact prediction device 1 controls the robot 2, the present invention is not limited to this, and can also be developed in various forms, such as providing a device that controls the robot 2 separately from the impact prediction device 1 that predicts the impact.
  • FIG. 9 is a functional block diagram showing an example of the functional configuration of various devices in another system configuration example of the impact prediction system described in the present application.
  • the control device 6 is formed using, for example, a device such as a control computer that controls the robot 2, and is connected to the impact prediction device 1 and the robot 2.
  • the impact prediction device 1 includes a control device output unit 15 as a communication interface with the control device 6.
  • the control device 6 includes various components such as an input unit 60 serving as a communication interface with the impact prediction device 1 and a robot control unit (control target control unit) 61 that controls the robot 2.
  • the impact prediction device 1 outputs the impact level predicted by the impact level calculation unit 10j to the control device 6.
  • The control device 6 receives the input of the impact degree at the input unit 60, and the robot control unit 61 determines the control of the robot 2 based on the impact degree. The control device 6 then outputs, from the robot control unit 61 to the robot 2, a control command for arbitrarily controlling the robot 2 based on the impact degree.
  • The control command output to the robot 2 is a command that causes the robot 2 to execute predetermined control, such as continuing its operation, stopping its operation, decelerating its operation speed, performing a contact avoidance operation, or performing an impact-reducing operation. That is, the robot control unit 61 of the control device 6 realizes the same function as the robot control unit 10k of the impact prediction device 1.
  • the impact prediction system described in the present application can be deployed in various system configurations.

Abstract

Provided are an impact prediction device, control device, impact prediction system, impact prediction method, and impact prediction program with which it is possible to ensure safety in an area in which there is a risk of contact between an object to be controlled and a worker when an object to be controlled such as an industrial robot and a human subject such as a worker perform work in cooperation. This impact prediction device (1) is provided with a specifications recording unit that records specification information relating to the specifications of the object to be controlled. The impact prediction device (1) calculates a movement of the human subject on the basis of a measurement result for a part to be measured in the body of the human subject (S401-S405), and calculates an operation of the object to be controlled (S406-S408). Then, on the basis of the movement of the human subject, the operation of the object to be controlled, and the specification information recorded in the specifications recording unit, the impact prediction device (1) calculates an impact degree, which represents the degree of the impact of the object to be controlled on the human subject when the object to be controlled makes contact with the human subject (S411), and controls the object to be controlled on the basis of the calculation result (S412).

Description

衝撃予測装置、衝撃予測システム、制御装置、衝撃予測方法及び衝撃予測プログラムImpact prediction device, impact prediction system, control device, impact prediction method, and impact prediction program
 本発明は、制御対象が接触により対象者に与える衝撃を予測する衝撃予測装置、そのような衝撃予測装置の予測結果に基づいて制御対象を制御する制御装置、そのような衝撃予測装置を備える衝撃予測システム、そのような衝撃予測装置を用いた衝撃予測方法、及びそのような衝撃予測装置を実現するための衝撃予測プログラムに関する。 The present invention relates to an impact prediction device that predicts an impact given to a subject by contact of a control object, a control device that controls the control object based on a prediction result of such an impact prediction device, and an impact equipped with such an impact prediction device The present invention relates to a prediction system, an impact prediction method using such an impact prediction device, and an impact prediction program for realizing such an impact prediction device.
 工場等の様々な作業現場においては、産業用ロボット等の機械と人とが協調して作業を行っている。機械と人との協調作業において、人が機械に衝突する際の衝撃度を検知することは、安全上重要な問題である。従来の機械においては、機械の変位、過トルク、過電流等の状態を衝突後に検知することにより、実際の衝突を検知している。このような手法では、人体に負荷がかかった状態になって初めて衝突を検知することになるため、人体に痛みが生じる等、危険が伴う恐れがある。 In various work sites, such as factories, machines such as industrial robots and people work together. In a cooperative work between a machine and a person, it is an important safety issue to detect the degree of impact when a person collides with the machine. In a conventional machine, an actual collision is detected by detecting the state of the machine such as displacement, overtorque, and overcurrent after the collision. In such a method, since a collision is detected only when a load is applied to the human body, there is a risk of danger such as pain in the human body.
 機械が衝突した場合の衝撃を測定する方法は、例えば特許文献1に開示されている。また、衝突後にロボットを制御する方法は、例えば特許文献2に開示されている。 A method for measuring an impact when a machine collides is disclosed in Patent Document 1, for example. A method for controlling the robot after the collision is disclosed in, for example, Patent Document 2.
 ところで、従来は、定格出力が80Wを超える産業用ロボットは、物理的な安全柵で囲い、作業者と隔離する必要があった。このような80W規制が2013年12月に緩和され、一定の条件を満たせば80Wを超える産業用ロボットでも安全柵無しで作業者と協調することが可能となった。 By the way, conventionally, industrial robots with a rated output exceeding 80 W had to be enclosed with a physical safety fence and isolated from the workers. Such 80W regulations were relaxed in December 2013, and if certain conditions were met, industrial robots exceeding 80W could cooperate with workers without safety fences.
特開2009-101457号公報JP 2009-101457 A 特開2007-26528号公報JP 2007-26528 A
 80W規制の緩和に伴い、産業用ロボットと人とが協調する上で、安全性の確保が、より重要な課題となっている。なお、安全性の確保が重要な課題であることは、80W以下の産業用ロボットについても同様である。 With the relaxation of the 80W regulation, ensuring safety is a more important issue when industrial robots and people collaborate. Note that ensuring safety is also an important issue for industrial robots of 80 W or less.
 しかしながら、特許文献1及び特許文献2は、産業用ロボット等の機械と人との衝突についての技術を開示するものではない。また、従来のように、衝突後に衝撃度を検知する方法では、機械と人との衝突の恐れがある区域に適用する際には十分な安全性を確保しなければならない。 However, Patent Document 1 and Patent Document 2 do not disclose a technique regarding a collision between a machine such as an industrial robot and a person. In addition, in the conventional method of detecting the degree of impact after a collision, sufficient safety must be ensured when applied to an area where there is a risk of collision between a machine and a person.
 本発明は斯かる事情に鑑みてなされたものであり、人の動作及び機械の動作、並びに機械の仕様に基づいて、機械が人に接触した際の衝撃度を演算することにより、機械と人とが接触する前に衝撃を予測することが可能な衝撃予測装置の提供を主たる目的とする。 The present invention has been made in view of such circumstances, and by calculating the degree of impact when a machine comes into contact with a person based on the operation of the person, the operation of the machine, and the specifications of the machine, the machine and the person It is a main object to provide an impact prediction device capable of predicting an impact before it comes into contact.
 また、本発明は、本発明に係る衝撃予測装置の衝撃の予測結果に基づいて制御対象を制御する制御装置の提供を他の目的とする。 Another object of the present invention is to provide a control device that controls a controlled object based on the impact prediction result of the impact prediction device according to the present invention.
 また、本発明は、本発明に係る衝撃予測装置を備える衝撃予測システムの提供を他の目的とする。 Another object of the present invention is to provide an impact prediction system including the impact prediction device according to the present invention.
 また、本発明は、本発明に係る衝撃予測装置を用いた衝撃予測方法の提供を更に他の目的とする。 Further, the present invention has another object to provide an impact prediction method using the impact prediction apparatus according to the present invention.
 また、本発明は、本発明に係る衝撃予測装置を実現する衝撃予測プログラムの提供を更に他の目的とする。 Further, the present invention has another object to provide an impact prediction program for realizing the impact prediction apparatus according to the present invention.
 上記課題を解決するために、本願記載の衝撃予測装置は、制御対象が接触により対象者に与える衝撃を予測する衝撃予測装置であって、前記制御対象の仕様に関する仕様情報を取得する仕様取得部と、前記対象者の身体の測定対象部位の測定結果に基づいて前記対象者の動作を示す対象者動作情報を取得する対象者動作取得部と、前記制御対象の動作を示す制御対象動作情報を取得する制御対象動作取得部と、前記対象者動作取得部が取得した対象者動作情報及び前記制御対象動作取得部が取得した制御対象動作情報、並びに前記仕様取得部が取得した仕様情報に基づいて、前記制御対象が前記対象者に接触した場合に前記制御対象が前記対象者に与える衝撃の程度を示す衝撃度を演算する衝撃度演算部とを備えることを特徴とする。 In order to solve the above-described problem, the impact prediction device described in the present application is an impact prediction device that predicts an impact that a control target gives to a subject by contact, and a specification acquisition unit that acquires specification information about the specification of the control target A target person action acquisition unit for acquiring target person action information indicating the action of the target person based on a measurement result of a measurement target part of the subject person's body, and control target action information indicating the action of the control object Based on the control target action acquisition unit to be acquired, the target person action information acquired by the target person action acquisition unit, the control target action information acquired by the control target action acquisition unit, and the specification information acquired by the specification acquisition unit And an impact degree calculation unit that calculates an impact degree indicating the degree of impact that the control object gives to the subject when the control subject comes into contact with the subject.
 また、前記衝撃予測装置において、前記対象者動作取得部は、複数の前記対象者の対象者動作情報を取得し、前記衝撃度演算部は、接触する前記対象者を特定することを特徴とする。 Further, in the impact prediction apparatus, the subject motion acquisition unit acquires subject motion information of a plurality of the subjects, and the impact degree computation unit specifies the subject to be contacted. .
 また、前記衝撃予測装置において、前記仕様取得部は、仕様情報として、前記制御対象を構成する部位についての形状に関する情報、材質に関する情報、及び動作に関する情報のうちの少なくとも一を取得することを特徴とする。 Further, in the impact prediction device, the specification acquisition unit acquires at least one of information on a shape, information on a material, and information on an operation regarding a part constituting the control target as specification information. And
 また、前記衝撃予測装置において、前記対象者動作取得部は、前記対象者の身体の測定対象部位の位置情報及び姿勢情報の少なくとも一に基づいて、演算された対象者動作情報を取得することを特徴とする。 Further, in the impact prediction device, the subject motion acquisition unit acquires the calculated subject motion information based on at least one of position information and posture information of a measurement target part of the subject's body. Features.
 また、前記衝撃予測装置において、前記対象者動作取得部の取得に係る位置情報及び姿勢情報の少なくとも一は、前記対象者の身体の測定対象部位の速度、加速度、角速度、角加速度、圧力及び磁気のうちの少なくとも一に基づいて演算された情報であることを特徴とする。 Further, in the impact prediction apparatus, at least one of the position information and posture information related to the acquisition of the subject motion acquisition unit is the velocity, acceleration, angular velocity, angular acceleration, pressure, and magnetism of the measurement target part of the subject's body. It is the information calculated based on at least one of these.
 また、前記衝撃予測装置において、前記対象者動作取得部が取得する演算された対象者動作情報は、前記対象者の演算対象部位の速度又は加速度であることを特徴とする。 Further, in the impact prediction device, the calculated subject motion information acquired by the subject motion acquisition unit is a speed or acceleration of a calculation target part of the subject.
 また、前記衝撃予測装置において、前記衝撃度演算部が演算した衝撃度に基づいて、前記制御対象を制御する制御命令を出力する制御対象制御部を備えることを特徴とする。 The impact prediction apparatus further includes a control target control unit that outputs a control command for controlling the control target based on the impact level calculated by the impact level calculation unit.
 また、前記衝撃予測装置において、前記制御対象制御部は、動作の継続、動作の停止、動作速度の減速、接触回避動作の実施、又は衝撃度の低減動作の実施をさせる制御命令を出力することを特徴とする。 Further, in the impact prediction device, the control target control unit outputs a control command for continuing the operation, stopping the operation, reducing the operation speed, performing the contact avoiding operation, or performing the operation for reducing the impact level. It is characterized by.
 更に、本願記載の制御装置は、制御対象を制御する制御装置であって、前記衝撃予測装置から、前記衝撃度演算部が演算した衝撃度の入力を受け付ける入力部と、前記入力部が受け付けた衝撃度に基づいて、前記制御対象を制御する制御命令を出力する制御対象制御部とを備えることを特徴とする。 Furthermore, the control device described in the present application is a control device that controls a control target, and an input unit that receives an input of an impact level calculated by the impact level calculation unit from the impact prediction device, and the input unit receives And a control target control unit that outputs a control command for controlling the control target based on the degree of impact.
 また、前記制御装置において、前記制御対象制御部は、動作の継続、動作の停止、動作速度の減速、接触回避動作の実施、又は衝撃度の低減動作の実施をさせる制御命令を出力することを特徴とする。 Further, in the control device, the control target control unit outputs a control command for causing the operation to continue, the operation to be stopped, the operation speed to be reduced, the contact avoidance operation to be performed, or the impact degree reduction operation to be performed. Features.
 Furthermore, the impact prediction system described in the present application includes a wearing device wearable by a subject, a control target that operates based on control, and the impact prediction device. The wearing device includes a measurement unit that measures a measurement target part of the subject's body, and the control target includes an output unit that outputs information relating to its motion. The subject motion acquisition unit of the impact prediction device acquires subject motion information based on the measurement results of the measurement unit of the wearing device, and the control target motion acquisition unit of the impact prediction device acquires control target motion information based on the motion-related information output from the control target.
 Furthermore, the impact prediction method described in the present application is a method for predicting an impact given to a subject by contact of a control target, and includes a step in which a specification acquisition unit acquires specification information relating to specifications of the control target, a step in which a subject motion acquisition unit acquires subject motion information indicating a motion of the subject based on measurement results of a measurement target part of the subject's body, a step in which a control target motion acquisition unit acquires control target motion information indicating a motion of the control target, and a step in which an impact degree calculation unit calculates, based on the acquired subject motion information, the acquired control target motion information, and the acquired specification information, an impact degree indicating the degree of impact that the control target gives to the subject when the control target comes into contact with the subject.
 Furthermore, the impact prediction program described in the present application causes a computer to predict an impact given to a subject by contact of a control target, and causes the computer to execute a step of acquiring specification information relating to specifications of the control target, a step of acquiring subject motion information indicating a motion of the subject based on measurement results of a measurement target part of the subject's body, a step of acquiring control target motion information indicating a motion of the control target, and a step of calculating, based on the acquired subject motion information, the acquired control target motion information, and the acquired specification information, an impact degree indicating the degree of impact that the control target gives to the subject when the control target comes into contact with the subject.
 The impact prediction device, control device, impact prediction system, impact prediction method, and impact prediction program described in the present application can calculate, based on the motion of the subject, the motion of the control target, and the specification information of the control target, an impact degree indicating the degree of impact that the control target gives to the subject when the control target comes into contact with the subject.
 The present invention calculates, based on the motion of the subject derived from measurement results of measurement target parts of the subject's body, the motion of the control target, and specification information relating to the specifications of the control target, an impact degree indicating the degree of impact that the control target would give to the subject if contact occurs. Since the impact degree can thus be predicted before a worker comes into contact with the control target, countermeasures such as controlling the control target according to the impact degree become possible, which has the excellent effect of improving the safety of collaborative work between the worker and the control target.
FIG. 1 is an explanatory diagram conceptually showing an example of the impact prediction system described in the present application.
FIG. 2 is a block diagram showing an example of the hardware configuration of the impact prediction device and other devices included in the impact prediction system described in the present application.
FIG. 3 is an explanatory diagram conceptually showing an example of the contents recorded in the robot specification recording unit of the impact prediction device described in the present application.
FIG. 4 is a functional block diagram showing an example of the functional configuration of the impact prediction device and other devices included in the impact prediction system described in the present application.
FIG. 5 is a flowchart showing an example of the robot specification recording process of the impact prediction device included in the impact prediction system described in the present application.
FIG. 6 is a flowchart showing an example of the first human body model recording process of the impact prediction device included in the impact prediction system described in the present application.
FIG. 7 is a flowchart showing an example of the second human body model recording process of the impact prediction device included in the impact prediction system described in the present application.
FIG. 8 is a flowchart showing an example of the impact prediction process of the impact prediction device included in the impact prediction system described in the present application.
FIG. 9 is a functional block diagram showing an example of the functional configuration of the devices in another system configuration example of the impact prediction system described in the present application.
 Hereinafter, embodiments of the present invention will be described with reference to the drawings. The following embodiments are examples embodying the present invention and are not intended to limit its technical scope.
 <System configuration>
 First, an outline of the impact prediction system described in the present application will be given. FIG. 1 is an explanatory diagram conceptually showing an example of the impact prediction system. The system includes, as a control target, a work robot (hereinafter, robot) 2 that works according to predetermined control commands, and is applied to systems such as FA (Factory Automation) systems in which a worker (subject) works in cooperation with the robot 2. The worker wears a wearing device 3 provided with various inertial sensors, such as acceleration sensors and gyro sensors, on various measurement target parts of the body, for example the head, upper arms, forearms, chest, abdomen, thighs, and lower legs. The wearing device 3 measures these measurement target parts of the subject's body and outputs measurement information indicating the measurement results. In addition to inertial sensors such as acceleration sensors and gyro sensors, sensors such as magnetic sensors and pressure sensors can be attached to the wearing device 3. When a plurality of workers work together, each worker who is a target of impact prediction wears such a wearing device 3. In FIG. 1, for ease of understanding, the positions of the inertial sensors on a worker wearing the wearing device 3 are indicated by white circles.
 The impact prediction system further includes an impact prediction device 1 that acquires the various kinds of information output from the wearing devices 3 and the robot 2 and predicts the impact that would occur if a worker and the robot 2 come into contact. The impact prediction device 1 is configured using a computer such as a control computer that controls the robot 2. The impact prediction device 1 is communicably connected to the wearing devices 3 and the robot 2 by wireless or wired communication, either directly or indirectly via a communication network such as an on-premises LAN (Local Area Network), and exchanges various kinds of information and signals with them.
 For convenience of explanation, the robot 2 and the impact prediction device 1 are described here as separate devices, but the impact prediction device 1 may be built into the robot 2. In that case, a circuit or program functioning as the impact prediction device 1 may be incorporated into part of the control circuit that controls the operation of the robot 2. Furthermore, there may be a plurality of robots 2, in which case the impact prediction device 1 may be incorporated into each robot 2.
 <Device configuration>
 Next, configuration examples of the devices included in the impact prediction system described in the present application will be described. FIG. 2 is a block diagram showing an example of the hardware configuration of the impact prediction device 1 and the other devices included in the system. The impact prediction device 1 includes a control unit 10 and a recording unit 11, and further includes, as interfaces to other devices, a measurement information acquisition unit 12, a robot input/output unit 13, and an information input unit 14.
 The control unit 10 is configured using a processor such as a CPU (Central Processing Unit) and memory such as registers, and by executing various commands it controls the entire device and outputs control commands to the robot 2.
 The recording unit 11 includes non-volatile memory such as ROM (Read Only Memory) and EPROM (Erasable Programmable Read Only Memory), volatile memory such as RAM (Random Access Memory), and recording media such as a hard disk drive or semiconductor memory, and records data such as various programs and information. The recording area of the recording unit 11 stores an impact prediction program PG that causes a computer such as a control computer to function as the impact prediction device 1 according to the present invention.
 Furthermore, part of the recording area of the recording unit 11 is used as databases such as a human body model recording unit 11a, which records human body models schematically representing the body shape of each worker, and a robot specification recording unit 11b, which records specification information indicating the specifications of the robot 2. A human body model is a numerical model that schematically represents a worker's body using numerical values such as the lengths of parts including the upper arms, forearms, thighs, and lower legs. In the human body model recording unit 11a, the human body model of each worker is recorded in association with worker identification information (worker ID) that identifies that worker. Instead of using part of the recording area of the recording unit 11 of the impact prediction device 1 as the human body model recording unit 11a, a recording device such as a server computer may be connected to the impact prediction device 1, and part of the recording area of that recording device may be used as the human body model recording unit 11a, the robot specification recording unit 11b, and other databases. In other words, databases such as the human body model recording unit 11a and the robot specification recording unit 11b can be designed in various forms, as long as the control unit 10 of the impact prediction device 1 can access them for recording and reading.
 FIG. 3 is an explanatory diagram conceptually showing an example of the contents recorded in the robot specification recording unit 11b of the impact prediction device 1 described in the present application. In the robot specification recording unit 11b, information on the specifications of the robot 2 is recorded for each part of the robot 2, shown as "A", "B", and "C" in the figure. The recorded specification information includes information on the shape of each part such as link length, information on the material of each part such as weight and coefficient of restitution, and information on its motion such as range of motion and maximum speed.
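 As a non-authoritative illustration of how such per-part specification records might be organised, the following Python sketch defines a simple record holding shape-, material-, and motion-related items for each part of the robot 2. The field names and example values are assumptions introduced here for explanation only and do not appear in the original specification.

    from dataclasses import dataclass

    @dataclass
    class PartSpec:
        link_length_m: float         # shape-related information (link length)
        mass_kg: float               # material-related information (weight)
        restitution: float           # material-related information (coefficient of restitution)
        range_of_motion_deg: float   # motion-related information (movable range)
        max_speed_mps: float         # motion-related information (maximum speed)

    # Hypothetical entries for the parts "A", "B" and "C" of robot 2 shown in FIG. 3
    robot_spec_record = {
        "A": PartSpec(0.40, 12.0, 0.30, 170.0, 1.5),
        "B": PartSpec(0.35, 8.0, 0.30, 150.0, 2.0),
        "C": PartSpec(0.20, 3.5, 0.40, 120.0, 2.5),
    }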
 Returning to the block diagram of FIG. 2, the measurement information acquisition unit 12 is an interface that acquires various kinds of information, such as measurement information indicating measurement results, from each wearing device 3. The robot input/output unit 13 is an interface that acquires various kinds of information from the robot 2 and outputs various commands to it. The information input unit 14 is an interface for communicating with devices used to input various kinds of information, such as information representing human body models and specification information on the robot 2. These interfaces do not necessarily have to exist as separate devices; they may be shared as appropriate, and a plurality of devices may be provided for inputting and outputting different information to and from a single apparatus. For example, when information is exchanged with other devices via a communication network, a single device connected to the network is sufficient; alternatively, both a device that acquires position information, posture information, and other information from the robot 2 and a device that outputs commands to the robot 2 may be provided. Furthermore, the information input unit 14 may receive input of information such as human body models and specification information from an information input device such as a tablet computer, or from a portable recording medium such as a semiconductor memory; the design can be chosen as appropriate.
 A computer such as a control computer functions as the impact prediction device 1 by reading the various programs recorded in the recording unit 11, for example the impact prediction program PG, and executing, under the control of the control unit 10, the various steps contained in the read impact prediction program PG, such as acquiring specification information, acquiring subject motion information, acquiring control target motion information, and calculating the impact degree.
 The robot 2 includes various components such as a control unit that controls the entire robot, operating units such as arm units that perform work, a sensor unit that measures operating states such as position and posture, and an input/output unit that communicates with the impact prediction device 1.
 The wearing device 3 includes various components such as a measurement unit using sensors that detect motion-related information, for example acceleration sensors and gyro sensors, and an output unit. In addition to acceleration sensors and gyro sensors, various sensors such as magnetic sensors and pressure sensors can be attached to the wearing device 3. The wearing device 3 measures physical quantities such as velocity, angular velocity, acceleration, angular acceleration, pressure, and magnetism with the measurement unit, and outputs the measured results from the output unit to the impact prediction device 1 as measurement information, for example raw data representing the measurement of the worker's motion. The impact prediction device 1 acquires this measurement information with the measurement information acquisition unit 12 and obtains state information, such as position information and posture information, calculated based on the acquired measurement information. For example, position information can be calculated by integrating the acceleration measured by an acceleration sensor twice. The calculation of state information such as position information and posture information from measurement information indicating physical quantities such as velocity, angular velocity, acceleration, angular acceleration, pressure, and magnetism may be performed by either the wearing device 3 or the impact prediction device 1.
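 As a minimal sketch of the double integration mentioned above, the following Python code integrates sampled acceleration twice to obtain a position estimate along one axis; the sampling period, the simple rectangular integration scheme, and the variable names are assumptions made for illustration and are not taken from the original text.

    def integrate_acceleration(accel_samples, dt, v0=0.0, p0=0.0):
        # accel_samples: acceleration values [m/s^2] sampled every dt seconds
        # v0, p0: initial velocity [m/s] and initial position [m]
        velocity, position = v0, p0
        positions = []
        for a in accel_samples:
            velocity += a * dt         # first integration: acceleration -> velocity
            position += velocity * dt  # second integration: velocity -> position
            positions.append(position)
        return positions

    # Hypothetical usage with acceleration samples from the wearing device 3
    print(integrate_acceleration([0.0, 1.0, 1.0, 0.0], dt=0.01))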
 FIG. 4 is a functional block diagram showing an example of the functional configuration of the impact prediction device 1 and the other devices included in the impact prediction system described in the present application. By executing various programs such as the impact prediction program PG, the impact prediction device 1 realizes, under the control of the control unit 10, functions as calculation units that perform various calculations: a human body state acquisition unit 10a, a human body model calculation unit 10b, a human body model acquisition unit 10c, a human body motion calculation unit 10d, a human body motion acquisition unit (subject motion acquisition unit) 10e, a robot motion calculation unit 10f, a robot motion acquisition unit (control target motion acquisition unit) 10g, an approach part determination unit 10h, a robot specification acquisition unit 10i, an impact degree calculation unit 10j, and a robot control unit (control target control unit) 10k. The calculation units realizing these functions may also be implemented as dedicated circuits using semiconductor chips such as LSI (Large Scale Integration) or VLSI (Very Large Scale Integration) chips.
 The human body state acquisition unit 10a calculates and acquires state information, such as position information and posture information of the measurement target parts, based on the measurement information indicating physical quantities such as velocity, angular velocity, acceleration, angular acceleration, pressure, and magnetism that the measurement information acquisition unit 12 acquires from the wearing devices 3. When the wearing device 3 outputs state information such as position information and posture information and the measurement information acquisition unit 12 acquires that state information, the human body state acquisition unit 10a of the control unit 10 uses the acquired state information as-is in the subsequent calculations.
 The human body model calculation unit 10b calculates a human body model that schematically represents the shape of the worker's body, based on the state information such as position information and posture information acquired by the human body state acquisition unit 10a. The calculation of a human body model from state information based on measurements from the wearing device 3 worn by a worker can be performed using various existing techniques; for example, the method described in "Kanesugi Hiroshi, Shibasaki Ryosuke, 'Measurement and Analysis of Human Motion Using Wearable Sensors', Japan Photogrammetric Society, Proceedings of the 2005 Annual Academic Conference, pp. 199-202, 2005.06" can be applied.
 The human body model acquisition unit 10c executes processing that acquires the human body model recorded in the human body model recording unit 11a.
 The human body motion calculation unit 10d calculates the motion of calculation target parts of the worker's body, such as the hands and feet, based on the state information acquired by the human body state acquisition unit 10a and the human body model acquired by the human body model acquisition unit 10c from the human body model recording unit 11a. The motion of a calculation target part is calculated as human body motion information (subject motion information) indicating, for example, the position of the part, its direction of motion, its predicted position after a predetermined time, and its predicted direction of motion.
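 The following Python sketch illustrates one simple way such human body motion information could be formed, predicting the position of a calculation target part a predetermined time ahead by linear extrapolation; the constant-velocity assumption, the prediction horizon, and the function name are introduced here only as an example and are not part of the original description.

    import numpy as np

    def part_motion_info(position, velocity, horizon_s=0.5):
        # position, velocity: 3-D vectors of a calculation target part (e.g. a hand)
        # horizon_s: predetermined prediction time [s]
        position = np.asarray(position, dtype=float)
        velocity = np.asarray(velocity, dtype=float)
        speed = np.linalg.norm(velocity)
        direction = velocity / speed if speed > 0.0 else np.zeros(3)
        return {
            "position": position,
            "direction": direction,
            "predicted_position": position + velocity * horizon_s,
            "predicted_direction": direction,  # unchanged under the constant-velocity assumption
        }

    # Hypothetical usage for a hand moving along the x axis
    info = part_motion_info([0.2, 0.0, 1.1], [0.4, 0.0, 0.0])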
 The human body motion acquisition unit 10e executes processing that acquires the human body motion information calculated by the human body motion calculation unit 10d.
 The robot motion calculation unit 10f communicates with the robot 2 via the robot input/output unit 13, which functions as a robot state acquisition unit 13a. The robot motion calculation unit 10f receives from the robot 2 robot state information indicating the position, posture, and other states of each part, and calculates the motion of the robot 2 based on the received input.
 The robot motion acquisition unit 10g executes processing that acquires robot motion information (control target motion information) indicating the motion of the robot 2 calculated by the robot motion calculation unit 10f.
 The approach part determination unit 10h determines the approach status of each part of each worker and each part of the robot 2, based on the worker's motion calculated by the human body motion calculation unit 10d and the motion of the robot 2 calculated by the robot motion calculation unit 10f. Determining the approach status means deriving conditions such as which worker, which part of that worker, and which part of the robot 2 are approaching or are predicted to approach each other, the relative speed between the approaching parts, whether contact is predicted, and the relative speed between the parts at the time of contact.
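 As a hedged illustration of this kind of approach determination, the Python sketch below finds the worker part and robot part whose predicted gap is smallest, computes their relative speed, and flags a predicted contact when that gap falls below a threshold. The data layout, the prediction horizon, and the threshold value are assumptions for illustration only.

    import numpy as np

    def judge_approach(worker_parts, robot_parts, horizon_s=0.5, contact_gap_m=0.05):
        # worker_parts / robot_parts: {part name: (position, velocity)} with 3-D vectors
        best = None
        for w_name, (w_pos, w_vel) in worker_parts.items():
            for r_name, (r_pos, r_vel) in robot_parts.items():
                w_pos, w_vel = np.asarray(w_pos, float), np.asarray(w_vel, float)
                r_pos, r_vel = np.asarray(r_pos, float), np.asarray(r_vel, float)
                # gap predicted a short time ahead, assuming constant velocities
                gap_later = np.linalg.norm((w_pos + w_vel * horizon_s)
                                           - (r_pos + r_vel * horizon_s))
                rel_speed = np.linalg.norm(w_vel - r_vel)
                if best is None or gap_later < best["predicted_gap_m"]:
                    best = {"worker_part": w_name, "robot_part": r_name,
                            "relative_speed_mps": rel_speed,
                            "predicted_gap_m": gap_later,
                            "contact_predicted": gap_later < contact_gap_m}
        return best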
 The robot specification acquisition unit 10i executes processing that acquires the specification information recorded in the robot specification recording unit 11b.
 The impact degree calculation unit 10j calculates, as an impact degree, the impact that the robot 2 would give to the subject by contact, based on the approach status determined by the approach part determination unit 10h and the specification information of each part of the robot 2 recorded in the robot specification recording unit 11b. The impact degree is calculated based on items such as the pressure, force, maximum transferred energy, and kinetic energy that would be imparted to the worker when the robot 2 contacts the worker. Accordingly, for example, the higher the relative speed at the time of contact between the robot 2 and the worker, and the heavier the contacting part of the robot 2, the larger the calculated impact degree.
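 One simple, non-authoritative way to obtain such a scalar impact degree is sketched below in Python, using the kinetic energy of the approaching robot part relative to the worker, so that a higher relative speed or a heavier part yields a larger value, matching the qualitative behaviour described above. The exact formula used by the device is not given in the text, so this is an assumed example only.

    def impact_degree(relative_speed_mps, part_mass_kg):
        # Scalar impact degree based on relative kinetic energy [J]
        return 0.5 * part_mass_kg * relative_speed_mps ** 2

    # Hypothetical usage combining the approach result with the part specification
    degree = impact_degree(relative_speed_mps=1.2, part_mass_kg=8.0)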
 The robot control unit 10k derives a method of controlling the robot 2 based on the impact degree calculated by the impact degree calculation unit 10j, and outputs a control command indicating the derived control method. The control command is output to the robot 2 via the robot input/output unit 13, which functions as a robot control command output unit 13b. The control method for the robot 2 is derived, for example, by comparing the calculated impact degree with predetermined thresholds set in advance in multiple stages. For example, if the impact degree is less than a predetermined first threshold, it is judged that contact would cause no problem, and the robot 2 is controlled so that it continues its operation. If the impact degree is equal to or greater than the first threshold and less than a predetermined second threshold, the robot 2 is controlled to perform an operation that reduces the impact degree. Furthermore, if the impact degree is equal to or greater than the second threshold, the robot 2 is controlled to stop its operation or to modify its motion trajectory so as to avoid contact. The derived control methods are not limited to these examples and are set as appropriate according to the calculated impact degree. In other words, the robot control unit 10k derives a control method such as continuing the operation, stopping the operation, reducing the operation speed, performing a contact avoidance operation, or performing an impact reduction operation, and outputs a control command that causes the derived control method to be carried out.
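 The two-threshold decision described above can be summarised by the following Python sketch; the threshold values and the command names are placeholders chosen for illustration and are not defined in the original text.

    FIRST_THRESHOLD = 5.0    # assumed value below which contact causes no problem
    SECOND_THRESHOLD = 20.0  # assumed value above which contact must be avoided

    def derive_control_command(impact_degree):
        # Map the calculated impact degree to a control method for robot 2
        if impact_degree < FIRST_THRESHOLD:
            return "continue_operation"   # contact would cause no problem
        if impact_degree < SECOND_THRESHOLD:
            return "reduce_impact"        # e.g. decelerate to soften the contact
        return "stop_or_avoid"            # stop, or modify the trajectory to avoid contact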
 In this way, the device not only determines whether the robot 2 and a worker will come into contact, but also, even when contact is possible, performs control that does not stop the operation of the robot 2 if the impact degree is low. As a result, the impact prediction device 1 described in the present application does not need to stop the robot 2 more often than necessary while still maintaining safety, so both safety and productivity can be improved.
 The information input unit 14 functions as a human body model input unit 14a that receives input of each worker's human body model from a human body model input device 4, which is an external device, and records the received human body model in the human body model recording unit 11a. The information input unit 14 also functions as a robot specification input unit 14b that receives input of the specification information of the robot 2 from a robot specification input device 5, which is an external device, and records the received specification information in the robot specification recording unit 11b.
 <Processing configuration>
 Various processes of the impact prediction device 1 included in the impact prediction system configured as described above will now be described. FIG. 5 is a flowchart showing an example of the robot specification recording process of the impact prediction device 1. The robot specification recording process receives input of specification information from the robot specification input device 5, which holds the specification information, and records it in the robot specification recording unit 11b.
 The control unit 10 of the impact prediction device 1 executes the robot specification recording process by executing various programs such as the impact prediction program PG. The control unit 10 receives input of specification information from the robot specification input device 5 via the robot specification input unit 14b (S101) and records the received specification information in the robot specification recording unit 11b (S102). In step S102, specification information such as information on the shape, material, and motion of each part of the robot 2 is recorded for each part. When a plurality of robots 2 are used, specification information is recorded for each robot 2 and each part.
 The robot specification recording process is executed in this way.
 FIG. 6 is a flowchart showing an example of the first human body model recording process of the impact prediction device 1 included in the impact prediction system described in the present application. In the first human body model recording process, a worker wearing the wearing device 3 performs a motion, information based on that motion is acquired, and a human body model calculated from the acquired information is recorded in the human body model recording unit 11a.
 The control unit 10 of the impact prediction device 1 executes the first human body model recording process by executing various programs such as the impact prediction program PG. The control unit 10 acquires, via the measurement information acquisition unit 12, measurement information obtained by measuring the worker's motion with the wearing device 3 worn by the worker (S201). In the acquisition of measurement information in step S201, the worker wearing inertial sensors as the wearing device 3 performs a predetermined reference motion, and the measurement information acquisition unit 12 acquires measurement information indicating at least one of the physical quantities, such as velocity, angular velocity, acceleration, angular acceleration, pressure, and magnetism, measured at the measurement target parts as a result of the reference motion.
 The control unit 10, using the human body state acquisition unit 10a, calculates and acquires state information, such as position information and posture information of the measurement target parts, based on the measurement information acquired by the measurement information acquisition unit 12 (S202). Step S202 is processing that calculates at least one of the position information and the posture information of the subject as state information, based on the measurement information obtained by measuring the subject's motion.
 The control unit 10 then calculates, by means of the human body model calculation unit 10b, a human body model that schematically represents the shape of the worker's body based on the acquired state information (S203).
 The control unit 10 records the calculated human body model in the human body model recording unit 11a in association with worker identification information that identifies the worker who performed the motion (S204).
 The first human body model recording process is executed in this way.
 FIG. 7 is a flowchart showing an example of the second human body model recording process of the impact prediction device 1 included in the impact prediction system described in the present application. The second human body model recording process receives input of a human body model from the human body model input device 4, which holds human body models calculated in advance, and records it in the human body model recording unit 11a.
 The control unit 10 of the impact prediction device 1 executes the second human body model recording process by executing various programs such as the impact prediction program PG. The control unit 10 receives input of a human body model and the corresponding worker identification information from the human body model input device 4 via the human body model input unit 14a (S301). The control unit 10 then records the received human body model in the human body model recording unit 11a in association with the worker identification information (S302).
 The second human body model recording process is executed in this way.
 Either the first human body model recording process or the second human body model recording process may be used as the process for recording human body models in the human body model recording unit 11a, and other methods may also be used.
 FIG. 8 is a flowchart showing an example of the impact prediction process of the impact prediction device 1 included in the impact prediction system described in the present application. The impact prediction process predicts the impact that the control target would give to a subject by contact. The control unit 10 of the impact prediction device 1 executes the impact prediction process by executing various programs such as the impact prediction program PG. The control unit 10 executes, as parallel processing, a process that calculates the operating state of the worker and a process that calculates the operating state of the robot 2. In the first process, the control unit 10 acquires, via the measurement information acquisition unit 12, measurement information indicating the results of the wearing device 3 measuring the worker's motion (S401), and, using the human body state acquisition unit 10a, calculates and acquires human body state information such as position information and posture information based on the acquired measurement information (S402). The control unit 10 then acquires a human body model from the human body model recording unit 11a via the human body model acquisition unit 10c (S403). Furthermore, the control unit 10 calculates the motion of the calculation target parts of the worker (human body) based on the acquired human body state information and human body model, using the human body motion calculation unit 10d (S404), and acquires human body motion information indicating the motion of the calculation target parts, using the human body motion acquisition unit 10e (S405). This calculation of the motion of the calculation target parts is performed for each worker.
 In the other process, the control unit 10 acquires, via the robot state acquisition unit 13a, robot state information indicating the state of the robot 2, such as the position and posture of each part (S406). The control unit 10 then calculates the motion of the robot 2 based on the acquired robot state information, using the robot motion calculation unit 10f (S407), and acquires robot motion information indicating the motion of the robot 2, using the robot motion acquisition unit 10g (S408).
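 As a rough sketch of how these two branches could be executed side by side, the following Python code runs the worker-side processing (S401 to S405) and the robot-side processing (S406 to S408) concurrently before the combined judgement; the function names are placeholders standing in for the processing described above, not an actual API of the device.

    from concurrent.futures import ThreadPoolExecutor

    def acquire_worker_motion():
        # S401-S405: measurement info -> human body state -> human body model -> motion info
        ...

    def acquire_robot_motion():
        # S406-S408: robot state info -> robot motion calculation -> robot motion info
        ...

    def one_prediction_cycle():
        with ThreadPoolExecutor(max_workers=2) as pool:
            worker_future = pool.submit(acquire_worker_motion)
            robot_future = pool.submit(acquire_robot_motion)
            human_motion = worker_future.result()
            robot_motion = robot_future.result()
        # both results are then passed to the approach determination (S409)
        return human_motion, robot_motion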
 After the parallel processing, the control unit 10 uses the approach part determination unit 10h to determine, based on the human body motion information and the robot motion information, the approach status of each part of each worker and each part of the robot 2, and determines whether a worker and the robot 2 will come into contact (S409).
 If it is determined in step S409 that a worker and the robot 2 will come into contact (S409: YES), the control unit 10 acquires the specification information of each part of the robot 2 from the robot specification recording unit 11b via the robot specification acquisition unit 10i (S410), and calculates the impact degree between the worker and the robot 2 based on the approach status and the specification information, using the impact degree calculation unit 10j (S411). In step S411, the impact degree is calculated based on the approach status of each part of each worker and each part of the robot 2, and the specification information of each part of the robot 2 recorded in the robot specification recording unit 11b. The calculation identifies the worker who is predicted to be contacted and the contacting parts of the worker and the robot 2, and derives a numerical impact degree quantifying the impact that the robot 2 would give to the subject by contact. In other words, in step S411, the impact that the robot 2 would give to the worker by contact is predicted as the impact degree. If it is determined in step S409 that the worker and the robot 2 will not come into contact (S409: NO), the control unit 10 executes the parallel processing of the worker-state calculation process and the robot-state calculation process again.
 After calculating the impact degree, the control unit 10 performs arbitrary control of the robot 2 based on the impact degree, using the robot control unit 10k (S412). The arbitrary control in step S412 means performing predetermined control set in advance, such as continuing the operation of the robot 2, stopping the operation, reducing the operation speed, performing a contact avoidance operation, or performing an impact reduction operation, based on the impact degree calculated in step S411. The robot control unit 10k derives a control method constituting the arbitrary control based on the impact degree, and outputs a control command for carrying out the derived control method to the robot 2 via the robot control command output unit 13b.
 The control unit 10 then determines whether to end the impact prediction process (S413). For example, when a worker finishes work and performs a predetermined operation to end the impact prediction process, the control unit 10 that receives the operation determines that the process should be ended.
 If it is determined in step S413 that the impact prediction process should be ended (S413: YES), the control unit 10 ends the impact prediction process. If it is determined in step S413 that the impact prediction process should not be ended (S413: NO), the control unit 10 executes the parallel processing of the worker-state calculation process and the robot-state calculation process again.
 The impact prediction process is executed in this way. The impact prediction device 1 predicts the impact that the control target would give to a subject by contact, based on the motion of the subject, the motion of the robot 2, and the specifications of the robot 2. Since the impact can thus be predicted before a worker comes into contact with the robot 2, arbitrary control according to the impact becomes possible, such as continuing the operation of the robot 2, performing an impact-mitigating operation, stopping, performing a contact avoidance operation, or collecting information. Accordingly, the impact prediction device 1 does not need to stop the operation of the robot 2 more often than necessary while still maintaining safety, so both safety and productivity can be improved.
 The present invention is not limited to the embodiments described above and can be implemented in various other forms. The embodiments described above are therefore merely examples in all respects and should not be interpreted restrictively. The scope of the present invention is defined by the claims and is not bound by the text of the specification. Furthermore, all modifications and changes belonging to the scope of equivalents of the claims fall within the scope of the present invention.
 For example, in the embodiments described above, the motion information of the robot is acquired when calculating the robot's motion, but the present invention is not limited to this; the motion of the robot 2 may instead be calculated based on preprogrammed control commands, among various other possible forms.
 Also, for example, in the embodiments described above, a scalar quantity is used as the impact degree quantifying the impact, but the present invention is not limited to this; a vector quantity may be used as the impact degree, among various other possible forms.
 In the embodiments described above, the impact prediction device 1 performs the various calculations using calculation units such as the human body model calculation unit 10b, the human body motion calculation unit 10d, and the robot motion calculation unit 10f, but the present invention is not limited to this. Some of these calculation functions may be provided in another device, and the impact prediction device 1 may acquire the calculation results from that device and perform the impact degree calculation with the impact degree calculation unit 10j, among various other possible forms.
 In the embodiments described above, the impact prediction device 1 controls the robot 2, but the present invention is not limited to this; a device that controls the robot 2 may be provided separately from the impact prediction device 1 that predicts the impact, among various other possible forms.
 <Other system configuration examples>
 A system configuration example in which a control device 6 that controls the robot 2 is provided separately from the impact prediction device 1 will now be described. For components similar to those of the above-described system configuration example without the control device 6, refer to the description of that system; their description is omitted here.
 FIG. 9 is a functional block diagram showing an example of the functional configuration of the devices in another system configuration example of the impact prediction system described in the present application. The control device 6 is formed using, for example, a device such as a control computer that controls the robot 2, and is connected to the impact prediction device 1 and the robot 2.
 The impact prediction device 1 includes a control device output unit 15 as a communication interface with the control device 6. The control device 6 includes components such as an input unit 60 serving as a communication interface with the impact prediction device 1 and a robot control unit (control target control unit) 61 that controls the robot 2.
 The impact prediction device 1 outputs the impact degree predicted by the impact degree calculation unit 10j to the control device 6. The control device 6 receives the input of the impact degree at the input unit 60, and the robot control unit 61 decides how to control the robot 2 based on the impact degree. The control device 6 then outputs, from the robot control unit 61 to the robot 2, a control command that arbitrarily controls the robot 2 based on the impact degree. The control command output to the robot 2 is a command that causes the robot 2 to perform predetermined control set in advance, such as continuing the operation, stopping the operation, reducing the operation speed, performing a contact avoidance operation, or performing an impact reduction operation. In other words, the robot control unit 61 of the control device 6 realizes the same function as the robot control unit 10k of the impact prediction device 1.
 As described above, the impact prediction system described in the present application can be deployed in various system configurations.
DESCRIPTION OF SYMBOLS
1    Impact prediction device
10   Control unit
10a  Human body state acquisition unit
10b  Human body model calculation unit
10c  Human body model acquisition unit
10d  Human body motion calculation unit
10e  Human body motion acquisition unit (subject motion acquisition unit)
10f  Robot motion calculation unit
10g  Robot motion acquisition unit (control target motion acquisition unit)
10h  Approach part determination unit
10i  Robot specification acquisition unit (specification acquisition unit)
10j  Impact degree calculation unit
10k  Robot control unit (control target control unit)
11   Recording unit
11a  Human body model recording unit
11b  Robot specification recording unit
12   Measurement information acquisition unit
13   Robot input/output unit
13a  Robot state acquisition unit
13b  Robot control command output unit
14   Information input unit
14a  Human body model input unit
14b  Robot specification input unit
15   Control device output unit
2    Robot (control target)
3    Wearing device
4    Human body model input device
5    Robot specification input device
6    Control device
60   Input unit
61   Robot control unit (control target control unit)
PG   Impact prediction program

Claims (13)

  1.  An impact prediction device that predicts an impact given to a subject by contact of a control target, comprising:
     a specification acquisition unit that acquires specification information relating to specifications of the control target;
     a subject motion acquisition unit that acquires subject motion information indicating a motion of the subject based on a measurement result of a measurement target part of the subject's body;
     a control target motion acquisition unit that acquires control target motion information indicating a motion of the control target; and
     an impact degree calculation unit that calculates, based on the subject motion information acquired by the subject motion acquisition unit, the control target motion information acquired by the control target motion acquisition unit, and the specification information acquired by the specification acquisition unit, an impact degree indicating a degree of impact that the control target gives to the subject when the control target comes into contact with the subject.
  2.  The impact prediction device according to claim 1, wherein
     the subject motion acquisition unit acquires subject motion information of a plurality of subjects, and
     the impact degree calculation unit identifies the subject to be contacted.
  3.  The impact prediction device according to claim 1 or claim 2, wherein
     the specification acquisition unit acquires, as the specification information, at least one of information on a shape, information on a material, and information on a motion of a part constituting the control target.
  4.  The impact prediction device according to any one of claims 1 to 3, wherein
      the subject motion acquisition unit acquires subject motion information calculated based on at least one of position information and posture information of the measurement target part of the subject's body.
  5.  The impact prediction device according to claim 4, wherein
      at least one of the position information and the posture information relating to the acquisition by the subject motion acquisition unit is information calculated based on at least one of velocity, acceleration, angular velocity, angular acceleration, pressure, and magnetism of the measurement target part of the subject's body.
  6.  The impact prediction device according to claim 4 or 5, wherein
      the calculated subject motion information acquired by the subject motion acquisition unit is a velocity or an acceleration of a calculation target part of the subject.
  7.  The impact prediction device according to any one of claims 1 to 6, further comprising
      a control target control unit that outputs a control command for controlling the control target based on the impact degree calculated by the impact degree calculation unit.
  8.  The impact prediction device according to claim 7, wherein
      the control target control unit outputs a control command that causes the control target to continue its motion, stop its motion, reduce its motion speed, perform a contact avoidance motion, or perform an impact reduction motion.
  9.  A control device that controls a control target, the control device comprising:
      an input unit that receives, from the impact prediction device according to any one of claims 1 to 6, an input of the impact degree calculated by the impact degree calculation unit; and
      a control target control unit that outputs a control command for controlling the control target based on the impact degree received by the input unit.
  10.  The control device according to claim 9, wherein
      the control target control unit outputs a control command that causes the control target to continue its motion, stop its motion, reduce its motion speed, perform a contact avoidance motion, or perform an impact reduction motion.
  11.  An impact prediction system comprising:
      a wearable device that can be worn by a subject;
      a control target that operates based on control; and
      the impact prediction device according to any one of claims 1 to 8, wherein
      the wearable device includes a measurement unit that measures a measurement target part of the subject's body,
      the control target includes an output unit that outputs information on its motion,
      the subject motion acquisition unit of the impact prediction device acquires subject motion information based on a measurement result of the measurement unit of the wearable device, and
      the control target motion acquisition unit of the impact prediction device acquires control target motion information based on the information on the motion output from the control target.
  12.  An impact prediction method for predicting an impact given to a subject by contact with a control target, the method comprising:
      a step in which a specification acquisition unit acquires specification information on a specification of the control target;
      a step in which a subject motion acquisition unit acquires subject motion information indicating a motion of the subject based on a measurement result of a measurement target part of the subject's body;
      a step in which a control target motion acquisition unit acquires control target motion information indicating a motion of the control target; and
      a step in which an impact degree calculation unit calculates, based on the subject motion information acquired by the subject motion acquisition unit, the control target motion information acquired by the control target motion acquisition unit, and the specification information acquired by the specification acquisition unit, an impact degree indicating a degree of impact that the control target gives to the subject when the control target comes into contact with the subject.
  13.  An impact prediction program that causes a computer to predict an impact given to a subject by contact with a control target, the program causing the computer to execute:
      a step of acquiring specification information on a specification of the control target;
      a step of acquiring subject motion information indicating a motion of the subject based on a measurement result of a measurement target part of the subject's body;
      a step of acquiring control target motion information indicating a motion of the control target; and
      a step of calculating, based on the acquired subject motion information, the acquired control target motion information, and the acquired specification information, an impact degree indicating a degree of impact that the control target gives to the subject when the control target comes into contact with the subject.
PCT/JP2017/000575 2016-02-15 2017-01-11 Impact prediction device, impact prediction system, control device, impact prediction method, and impact prediction program WO2017141577A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2016025964A JP2017144492A (en) 2016-02-15 2016-02-15 Impact prediction device, impact prediction system, control device, impact prediction method and impact prediction program
JP2016-025964 2016-02-15

Publications (1)

Publication Number Publication Date
WO2017141577A1

Family

ID=59625775

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/000575 WO2017141577A1 (en) 2016-02-15 2017-01-11 Impact prediction device, impact prediction system, control device, impact prediction method, and impact prediction program

Country Status (2)

Country Link
JP (1) JP2017144492A (en)
WO (1) WO2017141577A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2022141177A (en) * 2021-03-15 2022-09-29 オムロン株式会社 wearable equipment

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH08164807A (en) * 1994-12-13 1996-06-25 Hitachi Ltd Collision avoiding device
JP2010188504A (en) * 2009-02-20 2010-09-02 Yaskawa Electric Corp Robot control device, and robot
JP2011073079A (en) * 2009-09-29 2011-04-14 Daihen Corp Monitoring device of moving body
US20150239124A1 (en) * 2012-10-08 2015-08-27 Deutsches Zentrum Für Luftund Raumfahrt E.V. Method for controlling a robot device, robot device and computer program product
US20150015818A1 (en) * 2013-07-12 2015-01-15 Samsung Electronics Co., Ltd. Method of switching guest-host dual frequency liquid crystal by using back flow
US20150131896A1 (en) * 2013-11-11 2015-05-14 Industrial Technology Research Institute Safety monitoring system for human-machine symbiosis and method using the same
JP2015227813A (en) * 2014-05-30 2015-12-17 アニマ株式会社 Sensor module position acquiring method and device, and operation measurement method and device

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102017221305A1 (en) * 2017-11-23 2019-05-23 Robert Bosch Gmbh Method for operating a collaborative robot
CN115389077A (en) * 2022-08-26 2022-11-25 法奥意威(苏州)机器人系统有限公司 Collision detection method and device, control equipment and readable storage medium
CN115389077B (en) * 2022-08-26 2024-04-12 法奥意威(苏州)机器人系统有限公司 Collision detection method, collision detection device, control apparatus, and readable storage medium

Also Published As

Publication number Publication date
JP2017144492A (en) 2017-08-24

Similar Documents

Publication Publication Date Title
JP6481635B2 (en) Contact determination device, control device, contact determination system, contact determination method, and contact determination program
JP6555149B2 (en) Arithmetic apparatus, arithmetic method and arithmetic program
US11548153B2 (en) Robot comprising safety system ensuring stopping time and distance
US20200276680A1 (en) High-precision kickback detection for power tools
WO2017141577A1 (en) Impact prediction device, impact prediction system, control device, impact prediction method, and impact prediction program
JP6748145B2 (en) Robot system
US20210362338A1 (en) Method of improving safety of robot and method of evaluating safety of robot
KR20180048620A (en) CONTROL DEVICE, SYSTEM, CONTROL METHOD, AND PROGRAM
EP3418835B1 (en) Work region estimation device, control device, control system, work region estimation method, and program
US11669084B2 (en) Controller and control system
WO2017141573A1 (en) Calculation device, calculation method, and calculation program
US10635080B2 (en) Work region estimation device, control device, control system, work region estimation method, and non-transitory computer-readable recording medium
WO2023037443A1 (en) Robot control device, learning device, and inference device
CN115174663A (en) Personnel safety monitoring method and device based on block chain

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17752841

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17752841

Country of ref document: EP

Kind code of ref document: A1