WO2020241797A1 - Control device, control system, mechanical device system, and control method

Control device, control system, mechanical device system, and control method

Info

Publication number
WO2020241797A1
Authority
WO
WIPO (PCT)
Prior art keywords
unit
command
data
information
mechanical device
Prior art date
Application number
PCT/JP2020/021236
Other languages
English (en)
Japanese (ja)
Inventor
Shogo Hasegawa (省吾 長谷川)
Tetsuya Yoshida (吉田 哲也)
Original Assignee
Kawasaki Heavy Industries, Ltd. (川崎重工業株式会社)
Priority date
Filing date
Publication date
Application filed by Kawasaki Heavy Industries, Ltd. (川崎重工業株式会社)
Publication of WO2020241797A1

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00 Controls for manipulators
    • B25J13/02 Hand grip control means
    • B25J3/00 Manipulators of master-slave type, i.e. both controlling unit and controlled unit perform corresponding spatial movements

Definitions

  • the present disclosure relates to a control device, a control system, a mechanical device system, and a control method.
  • Patent Document 1 discloses a robot camera control device that controls a robot camera using a neural network.
  • The robot camera control device includes a robot camera, a subject detection device that detects the position of the subject, an operating device for the robot camera, and a learning control device that has a neural network and controls the imaging operation of the robot camera.
  • The robot camera captures an image of the subject according to the operation of the operating device, and outputs state data indicating the state of the imaging operation to the learning control device.
  • the learning control device causes the neural network to learn the state data using the position data of the subject detected by the subject detection device.
  • the learning control device uses the output of the neural network obtained by inputting the position data of the subject to control the robot camera.
  • An object of the present disclosure is to provide a control device, a control system, a mechanical device system, and a control method that enable skill inheritance using a machine learning model.
  • The control device according to one aspect of the present disclosure is a control device for a mechanical device, and includes: an operation control unit that operates the mechanical device in accordance with operation information output from an operating device for operating the mechanical device; a machine learning model that takes first operation information indicating the operation of the mechanical device as input data and outputs a command for an operation of the mechanical device corresponding to the first operation information; and an auxiliary unit that outputs an auxiliary command for assisting the operation on the operating device, based on the difference between the operation of the mechanical device and the operation corresponding to the command output by the machine learning model.
  • The operating device outputs the operation information based on second operation information indicating the motion of the operating device.
  • The control system according to one aspect of the present disclosure includes the control device according to one aspect of the present disclosure and the operating device for operating the mechanical device.
  • The mechanical device system according to one aspect of the present disclosure includes the control device according to one aspect of the present disclosure, the mechanical device, and the operating device for operating the mechanical device.
  • In the control method according to one aspect of the present disclosure, the mechanical device is operated according to operation information output from an operating device for operating the mechanical device; first operation information indicating the operation of the mechanical device is input as input data to a machine learning model, which outputs a command for an operation of the mechanical device corresponding to the first operation information; an auxiliary command for assisting the operation on the operating device is output based on the difference between the operation of the mechanical device and the operation corresponding to the command output by the machine learning model; and the operating device outputs the operation information based on second operation information indicating the motion of the operating device.
  • FIG. 1 is a functional block diagram showing an example of the configuration of the mechanical device system according to the embodiment.
  • FIG. 2 is a diagram showing an example of a machine learning model.
  • FIG. 3 is a diagram showing another example of the machine learning model.
  • FIG. 4 is a functional block diagram showing an example of the configuration of the calculation unit according to the embodiment.
  • FIG. 5 is a flowchart showing an example of the operation of the mechanical device system according to the embodiment in the automatic operation mode.
  • FIG. 6 is a flowchart showing an example of the operation of the mechanical device system according to the embodiment in the training mode of the manual operation mode.
  • FIG. 7 is a functional block diagram showing an example of the configuration of the mechanical device system according to the first modification.
  • FIG. 8 is a side view showing an example of the configuration of the robot according to the first modification.
  • FIG. 9 is a diagram showing an example of the appearance of the operating device according to the first modification.
  • FIG. 10 is a functional block diagram showing an example of the configuration of the operation device according to the first modification.
  • FIG. 11 is a functional block diagram showing an example of the configuration of the calculation unit according to the first modification.
  • FIG. 12 is a functional block diagram showing an example of the configuration of the mechanical device system according to the second modification.
  • FIG. 13 is a functional block diagram showing an example of the configuration of the calculation unit according to the second modification.
  • FIG. 14 is a diagram showing an auxiliary example of the operation based on the auxiliary command.
  • FIG. 1 is a functional block diagram showing an example of the configuration of the mechanical device system 1 according to the embodiment.
  • the solid arrow indicates the flow of commands, data, information, and the like for operating the mechanical device 10 in the automatic operation mode.
  • the broken line arrow indicates the flow of commands, data, information, etc. for operating the mechanical device 10 in the training mode of the manual operation mode.
  • The dash-dot arrow indicates the flow of commands, data, information, etc. used by the calculation unit 36 for learning.
  • the mechanical device system 1 includes a mechanical device 10, an operating device 20, a control device 30, an operation information detection device 50, and an output device 60.
  • the mechanical device 10 includes an action unit 11 that applies an action to an object to be processed, and an operation unit 12 that moves the action unit 11 so as to perform the action.
  • the operation device 20 and the control device 30 constitute a control system 100 for controlling the mechanical device 10.
  • the operation device 20 is a device for operating the mechanical device 10, and outputs operation information, which is information input to the operation device 20, to the control device 30.
  • the control device 30 controls the overall operation of the mechanical device 10.
  • The operation information detection device 50 detects operation information indicating the operations of the action unit 11 and the operation unit 12 of the mechanical device 10 and outputs the operation information to the control device 30.
  • The operation information detection device 50 may detect, as the operation information, information such as the position and orientation of the action unit 11, the force applied by the action unit 11 to the object, an image of the object, and the vibration, impact, light, sound, temperature, humidity, barometric pressure, and the like at the action unit 11.
  • the control device 30 outputs operation information to the operation device 20 and the output device 60 for feedback and presentation of the operation state.
  • the output device 60 converts the operation information into information such as visual and auditory information and presents it to the operator of the operation device 20.
  • an image pickup device such as a camera may be arranged at a position away from the mechanical device 10, and the control device 30 may output the image captured by the image pickup device to the output device 60.
  • Such an output device 60 can present the state of the mechanical device 10 to the operator.
  • Examples of the output device 60 are a liquid crystal display and an organic or inorganic EL (electroluminescence) display, but the output device 60 is not limited to these.
  • the output device 60 may include a speaker that emits sound.
  • the mechanical device system 1 can cause the mechanical device 10 to execute the operation in the manual operation mode and the operation in the automatic operation mode.
  • The manual operation mode and the automatic operation mode in the present embodiment do not include a teaching operation (also referred to as "teaching") for teaching the mechanical device 10 an operation such as a work task.
  • In the manual operation mode, the mechanical device 10 executes an operation that follows the operation input to the operating device 20 by the operator, that is, an operation that traces that input.
  • The mechanical device 10 is thus manually operated by the operator.
  • The manual operation mode includes a training mode and a non-training mode. In the training mode, the mechanical device system 1 assists the operator in operating the operating device 20.
  • In the training mode, the operator receives assistance from the operating device 20 or the like while operating the operating device 20 to cause the mechanical device 10 to perform a predetermined operation.
  • In the non-training mode, the mechanical device system 1 does not assist the operator in operating the operating device 20.
  • The operator can cause the mechanical device 10 to perform an arbitrary operation, and operates the operating device 20 without receiving operation assistance.
  • In the automatic operation mode, the mechanical device 10 executes an operation according to a predetermined operation set in advance.
  • The mechanical device 10 performs automatic operation, automatically executing the predetermined operation according to a control program.
  • The predetermined operation may be an individual operation such as a horizontal movement, a vertical movement, or a rotation, or may be a composite operation in which a series of individual operations are combined according to an execution order.
  • An individual operation may include a single motion or two or more motions. Examples of composite operations include work such as holding and moving an object with the action unit 11, cutting an object with the action unit 11, joining two or more objects with the action unit 11, and excavating with the action unit 11.
  • the mechanical device system 1 can accept the modification of the operation of the action unit 11 and the operation unit 12 using the operation device 20 during the automatic operation.
  • the mechanical device system 1 modifies the operations of the action unit 11 and the operation unit 12 by adding a correction operation corresponding to the operation input to the operation device 20.
  • The automatic operation mode may include a combination of automatic operation and manual operation in which a part of the composite operation is operated manually.
  • the mechanical device 10 may be a device operated by power.
  • Examples of the mechanical device 10 include construction machinery, tunnel excavators, cranes, cargo handling vehicles, robots for various purposes such as industrial use, and the like.
  • For example, when the mechanical device 10 is a backhoe, which is a construction machine, the shovel of the backhoe corresponds to the action unit 11 and the arm corresponds to the operation unit 12.
  • In this case, the control device 30 controls the hydraulic devices and the like that operate the arm.
  • When the mechanical device 10 is a tunnel excavator, the excavating blade of the tunnel excavator corresponds to the action unit 11 and the device that operates the excavating blade corresponds to the operation unit 12.
  • In this case, the control device 30 controls the operation of that device and the like.
  • When the mechanical device 10 is a cargo handling vehicle, the loading portion or grip portion of the cargo handling equipment, such as a fork of the cargo handling vehicle, corresponds to the action unit 11, and the cargo handling equipment and the drive device of the transport carriage correspond to the operation unit 12.
  • In this case, the control device 30 controls the operation of the cargo handling equipment, the drive device of the transport carriage, and the like.
  • When the mechanical device 10 is an industrial robot, the robot arm of the robot corresponds to the operation unit 12 and the end effector at the tip of the robot arm corresponds to the action unit 11.
  • In this case, the control device 30 controls the operation of the robot arm, the drive device of the end effector, and the like.
  • The type of power may be any type; examples are electric motors, internal combustion engines, steam, hydraulic pressure, pneumatic pressure, and the like.
  • The type of control may be any type; examples are electrical control, hydraulic control, pneumatic control, and the like.
  • The operating device 20 converts the operator's input into a corresponding signal and outputs it to the control device 30 as operation information.
  • the operating device 20 is not fixed to another object such as the mechanical device 10, and is configured to be movable in an arbitrary direction in the three-dimensional space.
  • the operating device 20 may be configured to be movable in any direction on a two-dimensional plane or a one-dimensional straight line.
  • the operating device 20 is configured so that it can be gripped by the operator's hand.
  • the operation device 20 is configured to communicate with the control device 30 via wired communication or wireless communication. Any type of communication may be used regardless of the type of wired communication or wireless communication.
  • The operating device 20 may be, for example, a device having a configuration similar to that of a general-purpose device such as a home video game console controller, a remote controller, or a smartphone, or it may be a dedicated device.
  • The dedicated device may be a device corresponding to the function of the end effector.
  • For example, the operating device 20 may be a gun-shaped device.
  • the operating device 20 includes an inertial measurement unit (IMU: Inertial Measurement Unit) (not shown).
  • The inertial measurement unit includes a 3-axis acceleration sensor and a 3-axis angular velocity sensor, and the operating device 20 outputs, to the control device 30, operation information based on the measurement data of acceleration and angular velocity in the three axial directions measured by the inertial measurement unit.
  • The operating device 20 may also output the measurement data itself to the control device 30. From the measurement data of acceleration and angular velocity in the three axial directions, various information indicating the motion and acting force of the operating device 20, such as position, posture, movement, moving speed, acceleration, and force, can be detected. Such an operating device 20 outputs operation information based on information indicating its own motion, as sketched below.
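  • As a rough illustration of this computation (not taken from the patent), the following Python sketch integrates 3-axis acceleration and angular-velocity samples into crude velocity and orientation estimates; all names and the integration scheme are assumptions.

```python
# Minimal sketch (hypothetical, not from the patent): dead-reckoning
# estimation of the operating device's motion from 3-axis IMU samples.
from dataclasses import dataclass

@dataclass
class ImuSample:
    accel: tuple          # (ax, ay, az) in m/s^2, device frame
    gyro: tuple           # (wx, wy, wz) in rad/s, device frame
    dt: float             # seconds since the previous sample

def integrate(samples):
    """Integrate acceleration into velocity and angular velocity into
    crude per-axis orientation angles."""
    velocity = [0.0, 0.0, 0.0]
    angles = [0.0, 0.0, 0.0]
    for s in samples:
        for k in range(3):
            velocity[k] += s.accel[k] * s.dt   # v <- v + a * dt
            angles[k] += s.gyro[k] * s.dt      # theta <- theta + w * dt
    return velocity, angles
```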
  • the operation device 20 includes a haptics device (not shown) that gives feedback of the operating state of the mechanical device 10 that operates according to the operation information to the operator as a tactile sensation.
  • the haptics device receives the operation information of the mechanical device 10 from the operation information detection device 50 via the control device 30, and gives feedback of the operation state of the mechanical device 10 based on the operation information to the operator as a tactile sense.
  • the haptics device is an example of a sensory device.
  • The operation information includes operation data.
  • The operation data includes at least one of force data representing the force applied by the action unit 11 of the mechanical device 10 to the object, that is, the force acting on the work environment, and position data representing the position of the action unit 11 during operation.
  • In the present embodiment, the operation data includes both.
  • the force data may be time series data including the magnitude of the force and the time when the force is generated in association with each other.
  • the position data may be time series data including the position information and the time of the position in association with each other.
  • the motion data including the force data and the position data may be time series data including the magnitude of the force, the time when the force is generated, the position information, and the time at the position.
  • The position of the action unit 11 may include not only the position of the action unit 11 in three-dimensional space but also the posture of the action unit 11 in three-dimensional space.
  • In the present specification and the claims, the term "position" is used to include both a position in three-dimensional space and at least a posture in three-dimensional space.
  • The reason why the operation information includes the operation data as essential information is that the control device 30 controls the operation of the mechanical device 10 using at least one of the "force" that the action unit 11 applies to the work environment and the "position" of the action unit 11 during operation.
  • the "operation command” in the present embodiment is a force command that indicates a target value or correction value (correction value) of this "force", and a target value or correction value (correction value) of this "position”. It includes at least one of a position command which is a command to instruct.
  • As information other than the operation data, the operation information may include image data of the object on which the action unit 11 acts, as well as vibration data, impact data, optical data, sound data, temperature data, humidity data, and pressure data such as barometric pressure generated at the action unit 11. At least the operation data of the operation information is sent to the operating device 20. One such time-series record is sketched below.
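  • As a concrete picture of the operation data described above, the following sketch pairs force data and position data with detection times in one time-series record; the field names are hypothetical.

```python
# Minimal sketch (field names are hypothetical): one record of the
# time-series operation data Pd, pairing force and position with a time.
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class OperationSample:
    t: float                              # detection time [s]
    force: Tuple[float, float, float]     # force applied to the object [N]
    position: Tuple[float, float, float]  # position of action unit 11 [m]
    posture: Tuple[float, float, float]   # posture of action unit 11 [rad]

# The time-series data Pd_0 to Pd_u is then simply an ordered list:
Pd: List[OperationSample] = []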
  • haptics devices include actuators, controllers, drivers, and the like.
  • the actuator is exemplified by an eccentric motor, a linear resonance actuator, a piezo, or the like, and gives an operator a sense of tactile force.
  • the controller controls the actuator via a driver, and may have a configuration similar to that of the control device 30 illustrated later.
  • the driver constitutes the interface between the actuator and the controller.
  • the detailed configuration of the haptics device is disclosed in Japanese Patent No. 4111278, Japanese Patent Application Laid-Open No. 2019-60835, and the like, and is known, and therefore detailed description thereof will be omitted.
  • The haptics device can give the operator a tactile sensation even while the operator is holding the operating device 20 in the air; an example of such a tactile sensation is the sensation of the operator pushing an object with their own hand.
  • the control device 30 shown in FIG. 1 is composed of, for example, an arithmetic unit having a processor, a memory, and the like.
  • the memory is composed of a semiconductor memory such as a volatile memory and a non-volatile memory, and a storage device such as a hard disk (HDD: Hard Disc Drive) and an SSD (Solid State Drive).
  • The function of the arithmetic unit may be realized by a computer system (not shown) consisting of a processor such as a CPU (Central Processing Unit), a volatile memory such as a RAM (Random Access Memory), and a non-volatile memory such as a ROM (Read-Only Memory).
  • Some or all of the functions of the arithmetic unit may be realized by the CPU using the RAM as a work area to execute a program recorded in the ROM.
  • Some or all of the functions of the arithmetic unit may be realized by the above computer system, by a dedicated hardware circuit such as an electronic circuit or an integrated circuit, or by a combination of the computer system and the hardware circuit.
  • the control device 30 may execute each process by centralized control by a single arithmetic unit, or may execute each process by distributed control by the cooperation of a plurality of arithmetic units.
  • The control device 30 may be composed of a computer device such as a computer or a personal computer, for example.
  • The control device 30 may also be composed of, for example, a microcontroller, an MPU (Micro Processing Unit), an LSI (Large Scale Integration: large-scale integrated circuit), a system LSI, a PLC (Programmable Logic Controller), a logic circuit, or the like.
  • The plurality of functions of the control device 30 may each be realized by an individual chip, or may be realized by a single chip that includes some or all of them. Each circuit may be a general-purpose circuit or a dedicated circuit.
  • An FPGA (Field Programmable Gate Array), a reconfigurable processor in which the connections and/or settings of circuit cells inside the LSI can be reconfigured, or an ASIC (Application Specific Integrated Circuit), in which circuits having a plurality of functions are integrated into one circuit for a specific application, may be used.
  • The control device 30 includes, as functional components, an operation determination unit 31, an operation command unit 32, a correction command unit 33, a drive command unit 34, a correction information detection unit 35, a calculation unit 36, an operation information processing unit 37, a difference detection unit 38, an auxiliary information processing unit 39, a first storage unit 40, a second storage unit 41, and a third storage unit 42.
  • The operation determination unit 31, the operation command unit 32, the correction command unit 33, the drive command unit 34, the correction information detection unit 35, the calculation unit 36, the operation information processing unit 37, the difference detection unit 38, and the auxiliary information processing unit 39 are functional blocks realized by the processor of the arithmetic unit.
  • the first storage unit 40, the second storage unit 41, and the third storage unit 42 are functional blocks realized by the storage device of the arithmetic unit.
  • the difference detection unit 38 and the auxiliary information processing unit 39 constitute an auxiliary unit.
  • the operation determination unit 31 determines a predetermined operation to be executed by the mechanical device 10, and acquires operation information (hereinafter, also referred to as “determined operation information”) of the predetermined operation.
  • the operation determination unit 31 receives a command of a predetermined operation to be executed by the mechanical device 10 via the operation device 20 or another input device of the mechanical device system 1.
  • the operation determination unit 31 extracts the operation information corresponding to the received predetermined operation from the third storage unit 42 as the determination operation information.
  • the predetermined operation to be executed by the mechanical device 10 may be an individual operation or a combined operation.
  • the operation determination unit 31 outputs the determination operation information to the operation command unit 32, but may further output the content of the predetermined operation to the calculation unit 36.
  • In the training mode of the manual operation mode, the operation determination unit 31 outputs the content of the predetermined operation to the calculation unit 36, but it does not need to acquire the determination operation information.
  • The operation determination unit 31 does not function in the non-training mode.
  • the third storage unit 42 stores a predetermined operation that can be executed by the mechanical device 10 in association with the operation information of the predetermined operation.
  • the operation information of the predetermined operation is preset and stored in the third storage unit 42.
  • operation information for each individual operation may be set.
  • The operation information of each individual operation may be set by presetting target values of the force and the position of the action unit 11.
  • the operation information of each individual operation may be set by using the operation information obtained as a result of operating the mechanical device 10 via the operation device 20 in the manual operation mode.
  • the operation information of each individual operation may be set by using the operation information obtained as a result of actually operating the mechanical device 10 in the automatic operation mode.
  • The operation command unit 32 generates an operation command (hereinafter also referred to as an "execution operation command") for causing the mechanical device 10 to execute an operation.
  • In the manual operation mode, the operation command unit 32 receives the operation information from the operating device 20 and generates an execution operation command corresponding to the operation information.
  • Specifically, the operation command unit 32 generates an operation command (hereinafter also referred to as an "operation operation command") for causing the action unit 11 to perform the operation corresponding to the operation information, and outputs the operation operation command to the drive command unit 34 as the execution operation command.
  • In the automatic operation mode, the operation command unit 32 uses the determination operation information determined by the operation determination unit 31 to generate an execution operation command for causing the mechanical device 10 to execute the operation corresponding to the determination operation information, and outputs it to the correction command unit 33. The operation command unit 32 is also configured to receive output data from the calculation unit 36. This output data is a command (hereinafter also referred to as an "execution operation correction command") that the calculation unit 36 outputs when the operation information of the mechanical device 10 is input as input data.
  • The execution operation correction command is an operation command.
  • When the operation command unit 32 receives an execution operation correction command from the calculation unit 36, it generates the execution operation command by correcting the operation command for executing the determination operation information (hereinafter also referred to as the "determination operation command") using the execution operation correction command. In doing so, the operation command unit 32 adds the execution operation correction command corresponding to the determination operation command, or replaces the determination operation command with the corresponding execution operation correction command. When the operation command unit 32 does not receive an execution operation correction command, it uses the determination operation command as the execution operation command.
  • The execution operation correction command corresponding to a determination operation command is the output data of the calculation unit 36 when the operation information of the mechanical device 10 immediately before executing the operation of that determination operation command is used as input data.
  • the correction command unit 33 functions in the automatic operation mode.
  • The correction command unit 33 generates a correction operation command, which is a corrected operation command, by correcting the execution operation command received from the operation command unit 32 according to the operation information output from the operating device 20 during the automatic operation mode, and outputs it to the drive command unit 34.
  • When an input is made to the operating device 20, the correction command unit 33 generates an operation operation command according to the operation information corresponding to the input.
  • The correction command unit 33 then generates the correction operation command by adding the execution operation command and the operation operation command.
  • The correction operation command is thus an operation command that reflects the operation information. When there is no input to the operating device 20, the correction command unit 33 uses the execution operation command as the correction operation command. This addition is sketched below.
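  • Since, as noted further below, operation commands can be added to and subtracted from each other, the correction can be pictured as an element-wise sum of command vectors. A minimal sketch, assuming commands are plain numeric vectors:

```python
from typing import List, Optional

# Minimal sketch (assumes commands are plain numeric vectors): the correction
# operation command Pf is the element-wise sum of the execution operation
# command Pe and the operation operation command Po from the operator input.
def correction_command(pe: List[float], po: Optional[List[float]]) -> List[float]:
    if po is None:           # no input on the operating device 20:
        return list(pe)      # the execution operation command is used as-is
    return [e + o for e, o in zip(pe, po)]
```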
  • the drive command unit 34 controls the operation of the mechanical device 10 according to the execution operation command received from the operation command unit 32 or the correction operation command received from the correction command unit 33.
  • the drive command unit 34 controls the operation of each drive device of the mechanical device 10 so that the action unit 11 performs the operation corresponding to the operation command.
  • the drive command unit 34 generates drive data including a command value for driving the drive device in order to execute the above operation, and outputs the drive data to each drive device.
  • In the present embodiment, the operation command unit 32 and the drive command unit 34 constitute an operation control unit, and the correction command unit 33 and the drive command unit 34 constitute a correction control unit.
  • Note that operation commands can be added to and subtracted from each other, and operation commands and operation data can be added to and subtracted from each other.
  • The operation information processing unit 37 receives the operation information of the mechanical device 10 from the operation information detection device 50. In the automatic operation mode and the training mode, the operation information processing unit 37 outputs the operation information to the calculation unit 36, the operating device 20, and the output device 60; in the non-training mode, it outputs the operation information to the operating device 20 and the output device 60. However, the destinations are not limited to these.
  • the motion information processing unit 37 is an example of a processing unit.
  • the correction information detection unit 35 functions in the automatic operation mode.
  • The correction information detection unit 35 detects correction information indicating the correction made by the correction command unit 33 and stores it in the second storage unit 41. Specifically, when the correction command unit 33 corrects the execution operation command, the correction information detection unit 35 detects the correction operation command generated by the correction command unit 33 as the correction information. When the correction command unit 33 does not correct the execution operation command, the correction information detection unit 35 detects the uncorrected execution operation command as the correction information.
  • The correction information detection unit 35 may associate the correction operation command or the execution operation command with the issuance time, which is the time when the operation command is issued, and generate time-series data of the operation command. In this case, the correction information detection unit 35 may associate the target value of the "force" and the target value of the "position" included in the operation command with the issuance time to generate time-series data similar to the operation data.
  • The correction information detection unit 35 may also detect an operation operation command as the correction information. For example, the correction information detection unit 35 may detect, as the correction information, the operation operation command used for the correction when the execution operation command is corrected, and may generate a detection result indicating that there is no correction information when the execution operation command is not corrected.
  • the first storage unit 40 stores operation information indicating the operation of the mechanical device 10. Specifically, the first storage unit 40 stores the operation information of the mechanical device 10 received from the operation information detection device 50. In the first storage unit 40, the operation information and the time when the operation information is detected by the operation information detection device 50 are stored in association with each other.
  • the second storage unit 41 stores correction information indicating the correction made by the correction command unit 33. Specifically, the second storage unit 41 stores the correction information received from the correction information detection unit 35. In the second storage unit 41, the correction information and the issuance time of the operation command corresponding to the correction information are stored in association with each other.
  • the calculation unit 36 includes a machine learning model 36a for machine learning.
  • the machine learning model 36a improves the accuracy of the output data with respect to the input data by learning using the training data.
  • Examples of such machine learning models include neural networks such as those used in deep learning, random forests, genetic programming, regression models, tree models, Bayesian models, time-series models, clustering models, and ensemble learning models.
  • the machine learning model 36a of the present embodiment is a neural network.
  • the calculation unit 36 causes the machine learning model 36a to perform machine learning.
  • The machine learning model 36a performs machine learning using the operation information of the mechanical device 10 and the correction information corresponding to that operation information.
  • After machine learning, the machine learning model 36a uses the operation information of the mechanical device 10 as input data and outputs a command corresponding to that operation information as output data.
  • In the present embodiment, the output data is an execution operation correction command.
  • In machine learning, the operation information of the mechanical device 10 may be used as input data, and the correction information executed in the state of that operation information may be used as teacher data.
  • the weighting of the connection between the nodes in the neural network which will be described later, is adjusted so that the output data with respect to the input data matches the teacher data.
  • the machine learning model 36a after such weighting adjustment can output an execution operation correction command to be executed in the state of the operation information.
  • A neural network is an information processing model modeled on the neural system of the brain.
  • the neural network is composed of a plurality of node layers including an input layer and an output layer.
  • the node layer contains one or more nodes.
  • the machine learning model 36a may be configured by a neural network as shown in FIG.
  • FIG. 2 is a diagram showing an example of the machine learning model 36a.
  • When the neural network is composed of an input layer, an intermediate layer, and an output layer, the neural network sequentially performs output processing on the information input to the nodes of the input layer, from the input layer to the intermediate layer and then from the intermediate layer to the output layer, and outputs a result that corresponds to the input information.
  • Each node of one layer is connected to each node of the next layer, and each connection between nodes is weighted.
  • The information from the nodes of one layer is given the weights of the connections between nodes and output to the nodes of the next layer. This weighted propagation is sketched below.
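  • As a concrete picture of the weighted layer-to-layer propagation just described, here is a minimal forward pass in Python with NumPy; the layer sizes and the tanh activation are illustrative choices, not specified by the patent.

```python
import numpy as np

# Minimal sketch of a three-layer network: input -> intermediate -> output.
rng = np.random.default_rng(0)
W1 = rng.normal(size=(8, 6))   # weights on connections input(6) -> intermediate(8)
W2 = rng.normal(size=(4, 8))   # weights on connections intermediate(8) -> output(4)

def forward(x):
    h = np.tanh(W1 @ x)        # weighted sums into the intermediate layer
    return W2 @ h              # weighted sums into the output layer

y = forward(np.zeros(6))       # output result for the input information
```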
  • The machine learning model 36a may be composed of a recurrent neural network as shown in FIG. 3.
  • FIG. 3 is a diagram showing another example of the machine learning model 36a.
  • the recurrent neural network handles time series information.
  • The input data of the recurrent neural network includes the data at the current time t and the output data of the intermediate layer of the recurrent neural network at the time t-1 preceding time t.
  • The recurrent neural network thus has a network structure that takes time-series information into account. Because such a recurrent neural network produces output that takes the behavior over time into account, the accuracy of the output data can be improved. A minimal sketch of this recurrence follows.
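  • The recurrence described here, in which the data at the current time t is combined with the intermediate layer's output at time t-1, can be sketched as a hand-rolled recurrent cell; the dimensions are illustrative assumptions.

```python
import numpy as np

# Minimal sketch of a recurrent cell: the intermediate layer's output at
# time t-1 is fed back alongside the input data at the current time t.
rng = np.random.default_rng(0)
Wx = rng.normal(size=(8, 6))   # input(6) -> intermediate(8)
Wh = rng.normal(size=(8, 8))   # previous intermediate output -> intermediate
Wo = rng.normal(size=(4, 8))   # intermediate(8) -> output(4)

def step(x_t, h_prev):
    h_t = np.tanh(Wx @ x_t + Wh @ h_prev)  # combine current data and memory
    return Wo @ h_t, h_t                   # output, new intermediate output

h = np.zeros(8)
for x_t in np.zeros((10, 6)):              # a time series of 10 samples
    y_t, h = step(x_t, h)
```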
  • FIG. 4 is a functional block diagram showing an example of the configuration of the calculation unit 36 according to the embodiment.
  • the calculation unit 36 includes a machine learning model 36a, a data generation unit 36b, a data input unit 36c, and a learning evaluation unit 36d.
  • the configuration of the machine learning model 36a is as described above. Since the machine learning model 36a handles time series data as follows, it is preferably a recurrent neural network.
  • each command and each data is acquired at a predetermined sampling interval while the mechanical device 10 executes a predetermined operation once.
  • The correction information detection unit 35 acquires, at the sampling interval, the time-series data Pm_0, Pm_1, Pm_2, ..., Pm_u (hereinafter abbreviated as Pm_0 to Pm_u) of the correction operation command Pm as the correction information.
  • The operation information detection device 50 acquires, at the sampling interval, the time-series data Pd_0, Pd_1, Pd_2, ..., Pd_u (hereinafter abbreviated as Pd_0 to Pd_u) of the operation data Pd of the mechanical device 10.
  • Time-series data whose subscript numbers are the same are data acquired at sampling times that can be regarded as the same or substantially the same.
  • The time-series data Pd_i is the time-series data of the operation data Pd of the mechanical device 10 obtained when the time-series data Pm_i of the correction operation command Pm is executed.
  • Time-series data having the same subscript numbers are thus time-series data corresponding to each other.
  • The data generation unit 36b generates the time-series data pd_0 to pd_u of the learning data pd from the time-series data Pd_0 to Pd_u of the operation data Pd stored in the first storage unit 40. The data generation unit 36b also generates the time-series data pn_0 to pn_u of the teacher data pn from the time-series data Pm_0 to Pm_u of the correction operation command Pm stored in the second storage unit 41. The data generation unit 36b outputs the generated time-series data to the data input unit 36c.
  • The data input unit 36c sequentially inputs the time-series data pd_0 to pd_u of the learning data pd to the neurons of the input layer of the neural network of the machine learning model 36a.
  • For the time-series data pd_i at a sampling time t_i, the neural network predicts and outputs the execution operation correction command Pn_{i+1} at the next sampling time t_{i+1}.
  • The learning evaluation unit 36d searches the time-series data pn_0 to pn_u of the teacher data pn based on the execution operation correction command Pn_{i+1} and extracts the time-series data pn_{i+1} at sampling time t_{i+1}. The learning evaluation unit 36d then adjusts the weights between the neurons of the neural network by backward calculation so that the execution operation correction command Pn_{i+1} matches the time-series data pn_{i+1}, or so that the error between them is minimized. The data input unit 36c and the learning evaluation unit 36d optimize the weights between neurons by performing the above processing on all of the time-series data pd_0 to pd_u of the learning data pd. This procedure is sketched below.
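  • Taken together, the roles of the data generation unit 36b, the data input unit 36c, and the learning evaluation unit 36d resemble an ordinary supervised training loop. A hedged sketch in PyTorch follows; the patent names no framework, and the network shape, optimizer, and loss function are assumptions.

```python
import torch
from torch import nn

# Hedged sketch of the learning phase: learning data pd (operation data) as
# input, teacher data pn (correction commands) as targets. The network shape,
# optimizer, and loss function are illustrative assumptions.
model = nn.Sequential(nn.Linear(6, 32), nn.Tanh(), nn.Linear(32, 6))
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

def train(pd: torch.Tensor, pn: torch.Tensor) -> None:
    """pd[i] is the learning data at t_i; pn[i + 1] is the teacher command."""
    for i in range(len(pd) - 1):
        pred = model(pd[i])              # predicted correction command Pn_{i+1}
        loss = loss_fn(pred, pn[i + 1])  # compare with teacher data pn_{i+1}
        optimizer.zero_grad()
        loss.backward()                  # the "backward calculation"
        optimizer.step()                 # adjust weights between neurons
```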
  • the calculation unit 36 functions in the automatic operation mode and the training mode.
  • The operation information detection device 50 detects the operation data Pd_i at the current sampling time t_i and outputs it to the calculation unit 36 via the operation information processing unit 37.
  • The data input unit 36c inputs the operation data Pd_i to the neural network of the machine learning model 36a. The neural network takes the operation data Pd_i as input data and outputs the execution operation correction command Pn_{i+1} at the next sampling time t_{i+1} as output data.
  • In the automatic operation mode, the calculation unit 36 outputs the execution operation correction command Pn_{i+1} output by the neural network to the operation command unit 32.
  • The operation command unit 32 generates an execution operation command that reflects the execution operation correction command Pn_{i+1}.
  • In this way, by using a neural network, the machine learning model 36a takes the operation data Pd_i at sampling time t_i as input data and outputs the execution operation correction command Pn_{i+1} at sampling time t_{i+1}, so that an execution operation command reflecting the execution operation correction command Pn_{i+1} is output.
  • In the training mode, the calculation unit 36 outputs the execution operation correction command Pn_{i+1} output by the neural network to the difference detection unit 38.
  • The neural network of the machine learning model 36a may be configured to take as input data, in addition to the operation data Pd_i at sampling time t_i, the operation data Pd_{i-1} to Pd_{i-n} at the preceding sampling times t_{i-1} to t_{i-n} (where n is a predetermined natural number); see the sketch following this list.
  • In machine learning, the data input unit 36c inputs the time-series data pd_i and pd_{i-1} to pd_{i-n} of the learning data pd at sampling time t_i to the neural network, and the neural network outputs the execution operation correction command Pn_{i+1} at the next sampling time t_{i+1}.
  • The learning evaluation unit 36d adjusts the weights between the neurons of the neural network based on the execution operation correction command Pn_{i+1} and the time-series data pn_{i+1} of the teacher data pn.
  • When data is input and output, the neural network of the machine learning model 36a takes as input data, for sampling time t_i, the operation data Pd_i and Pd_{i-1} to Pd_{i-n} at sampling times t_i and t_{i-1} to t_{i-n}, and outputs the execution operation correction command Pn_{i+1} at sampling time t_{i+1}.
  • Such a neural network can improve its learning efficiency and learning accuracy.
  • Such a neural network predicts the next motion of the action unit 11 of the mechanical device 10 based not only on the operation data at the present moment but also on a series of operation data from before it, which enables accurate prediction.
  • The neural network described above is constructed for each type of composite operation that can be executed by the mechanical device 10, with one neural network configured to correspond to one type of composite operation. Alternatively, one neural network may be configured to support a plurality of types of composite operations.
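  • The windowed variant described above simply stacks the current sample with the n preceding samples before it enters the network. A minimal sketch of building such an input vector (the value of n and the flat layout are assumptions):

```python
import numpy as np

# Minimal sketch: form the network input at time t_i from the operation data
# Pd_i together with the n preceding samples Pd_{i-1} to Pd_{i-n}.
def windowed_input(Pd, i, n):
    """Pd has shape (num_samples, dim); assumes i >= n. Returns a flat
    vector of length (n + 1) * dim, oldest sample first."""
    window = Pd[i - n : i + 1]          # samples at t_{i-n} ... t_i
    return window.reshape(-1)

x = windowed_input(np.zeros((100, 6)), i=50, n=4)   # shape: (5 * 6,) = (30,)
```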
  • the difference detection unit 38 functions in the training mode.
  • the difference detection unit 38 detects the difference between the operation of the mechanical device 10 controlled by the operation command unit 32 and the operation of the mechanical device 10 corresponding to the command output by the calculation unit 36.
  • the difference detection unit 38 detects the difference between the execution operation command received from the operation command unit 32 and the execution operation correction command received from the calculation unit 36.
  • the execution operation command and the execution operation correction command for which the difference is detected are operation commands corresponding to each other.
  • The execution operation correction command corresponding to an execution operation command is the output data of the calculation unit 36 when the operation information of the mechanical device 10 immediately before executing the operation of that execution operation command is used as input data. For example, the execution operation command at sampling time t_i corresponds to the execution operation correction command Pn_i that is output using, as input data, the operation data Pd_{i-1} at the preceding sampling time t_{i-1}.
  • the auxiliary information processing unit 39 functions in the training mode.
  • the auxiliary information processing unit 39 generates an auxiliary command for assisting the operation in the operation device 20 based on the difference detected by the difference detection unit 38, and outputs the auxiliary command to the operation device 20.
  • the auxiliary information processing unit 39 generates an auxiliary command when the difference is equal to or more than the threshold value, and does not generate an auxiliary command when the difference is less than the threshold value.
  • the auxiliary command is a command for generating a stimulus that makes the operator perceive that the difference is equal to or more than the threshold value. Examples of stimuli are light, sound, vibration, electrical stimuli, temperature, and tactile sensation using haptics devices.
  • the auxiliary command may be a command to generate a stimulus to the operating device 20, or may be a command to generate a stimulus to a device around the operator. In this embodiment, the auxiliary command causes the operating device 20 to generate a stimulus.
  • For example, the auxiliary command may be a command that causes the haptics device to generate a tactile sensation whose intensity corresponds to the magnitude of the difference.
  • The auxiliary command may also be a command that generates a tactile force sensation that guides the operation of the operating device 20 in a direction that reduces the difference.
  • For example, when the amount of movement of the operating device 20 in a certain direction is excessive and the difference is equal to or more than the threshold value, the auxiliary command may be a command that generates a tactile sensation of pushing the operator's hand in the direction opposite to that direction. A combined sketch of this behavior follows below.
  • the operator can receive guidance in the operation direction, that is, receive operation assistance by receiving the above-mentioned tactile force sense in the hand holding the operation device 20.
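  • Combining the difference detection and the threshold rule above, the behavior of the auxiliary unit could be sketched as follows; the threshold value, the gain, and the send_haptic_command interface are all hypothetical.

```python
import numpy as np

# Hedged sketch of the auxiliary unit (difference detection unit 38 plus
# auxiliary information processing unit 39). Threshold, gain, and the
# send_haptic_command callback are hypothetical, not from the patent.
THRESHOLD = 0.05
GAIN = 1.0

def auxiliary_step(pe, pn, send_haptic_command):
    """pe: executed operation command; pn: command output by the model."""
    diff = np.asarray(pn) - np.asarray(pe)
    magnitude = float(np.linalg.norm(diff))
    if magnitude < THRESHOLD:
        return                          # difference below threshold: no assist
    # Guide the operator in the direction that reduces the difference, with
    # tactile strength scaled to the magnitude of the difference.
    direction = diff / magnitude
    send_haptic_command(GAIN * magnitude * direction)
```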
  • FIG. 5 is a flowchart showing an example of the operation of the mechanical device system 1 according to the embodiment in the automatic operation mode. Further, FIG. 5 shows an example in which the mechanical device system 1 causes the mechanical device 10 to perform a predetermined operation for one cycle. In this example, the mechanical device system 1 will be described as having the mechanical device 10 automatically perform all predetermined operations.
  • the operator inputs a command to execute a predetermined operation in the automatic operation mode to the mechanical device system 1, and the control device 30 receives the command (step S101).
  • the operator may input via the operating device 20 or may input via another input device included in the mechanical device system 1.
  • In this example, the predetermined operation is a composite operation.
  • the operation determination unit 31 of the control device 30 acquires the operation information corresponding to the predetermined operation (step S102).
  • the operation determination unit 31 extracts operation information corresponding to each individual operation included in a predetermined operation from the third storage unit 42, and sequentially outputs the operation information to the operation command unit 32. Further, the operation determination unit 31 outputs the content of the predetermined operation to the calculation unit 36.
  • The operation command unit 32 determines whether or not there is incomplete operation information among the operation information corresponding to the individual operations included in the predetermined operation, that is, whether or not there is an incomplete individual operation (step S103).
  • the operation command unit 32 proceeds to step S104 when there is incomplete operation information (Yes in step S103), and ends a series of processes when there is no incomplete operation information (No in step S103).
  • In step S104, the calculation unit 36 acquires the operation information of the action unit 11 of the mechanical device 10 and the like, specifically, the operation data Pd_i included in the operation information.
  • The operation data Pd_i is the operation data at time t_i; at the start of the process, it is the initial value Pd_0 at time t_0.
  • the calculation unit 36 may request the operation information from the operation information processing unit 37 of the control device 30.
  • the motion information processing unit 37 may request the motion information detection device 50 to detect the motion information and acquire the detection result of the motion information detection device 50.
  • the calculation unit 36 may receive operation information from the operation information processing unit 37 in step S112, which will be described later, and acquire operation data from the operation information.
  • Alternatively, the calculation unit 36 may acquire the operation data that has been stored in the first storage unit 40 of the control device 30 in step S112.
  • Next, the calculation unit 36 inputs the operation data Pd_i to the neural network of the machine learning model 36a corresponding to the predetermined operation to generate the execution operation correction command Pm_{i+1}, and outputs the execution operation correction command Pm_{i+1} to the operation command unit 32 (step S105).
  • the operation command unit 32 generates an execution operation command for causing the mechanical device 10 to execute the operation by using the operation information corresponding to the predetermined operation, and outputs the execution operation command to the correction command unit 33 (step S106).
  • Specifically, the operation command unit 32 generates the determination operation command Ps_{i+1}, which is an operation command for executing the operation information corresponding to the individual operation to be executed earliest among the unfinished individual operations included in the predetermined operation.
  • The operation command unit 32 then generates the execution operation command Pe_{i+1} based on the determination operation command Ps_{i+1} and the execution operation correction command Pm_{i+1}.
  • The determination operation command Ps_{i+1} and the execution operation correction command Pm_{i+1} are commands corresponding to time t_{i+1}.
  • the correction command unit 33 determines whether or not there is a correction input that is an input for correcting the operation of the mechanical device 10 from the operation device 20 (step S107).
  • The correction command unit 33 proceeds to step S108 when there is a correction input (Yes in step S107), and proceeds to step S109 when there is no correction input (No in step S107).
  • In step S108, the correction command unit 33 corrects the execution operation command Pe_{i+1} of the operation command unit 32 according to the operation information output from the operating device 20, and outputs the result to the drive command unit 34.
  • Specifically, the correction command unit 33 generates the correction operation command Pf_{i+1} by adding the operation operation command Po_{i+1}, which causes the action unit 11 to perform the operation corresponding to the operation information, to the execution operation command Pe_{i+1} of the operation command unit 32.
  • In step S109, the correction command unit 33 outputs the execution operation command Pe_{i+1} of the operation command unit 32 to the drive command unit 34.
  • In step S110, the correction information detection unit 35 detects the correction information and stores it in the second storage unit 41.
  • Specifically, when the execution operation command Pe_{i+1} has been corrected, the correction information detection unit 35 detects the correction operation command Pf_{i+1} as the correction information. When the execution operation command Pe_{i+1} has not been corrected, the correction information detection unit 35 detects the uncorrected execution operation command Pe_{i+1} as the correction information.
  • Next, the drive command unit 34 generates drive data, which includes command values for driving each drive device of the mechanical device 10 so that the action unit 11 performs the operation corresponding to the correction operation command Pf_{i+1} or the execution operation command Pe_{i+1}, and outputs the drive data to each drive device. That is, the drive command unit 34 drives the mechanical device 10 so that it performs the operation corresponding to the command (step S111).
  • Next, the operation information detection device 50 detects the operation data Pd_{i+1} as the operation information of the operating mechanical device 10, and it is stored in the first storage unit 40 (step S112).
  • The operation information detection device 50 outputs the detected operation information, that is, the detected operation data Pd_{i+1}, to the first storage unit 40 and the operation information processing unit 37.
  • The operation information processing unit 37 outputs the detected operation information to the calculation unit 36, the operating device 20, and the output device 60. The process then returns to step S103.
  • the operation device 20 provides the operator with a tactile force sense corresponding to the force data and the position data of the operation data included in the detection operation information.
  • the tactile force sense can indicate the operating state of the acting unit 11.
  • For example, when the action unit 11 presses an object, the operating device 20 gives the operator's hand holding it a tactile sensation as if the operator were pushing the object themselves.
  • When the action unit 11 is pulling or lifting an object, the operating device 20 gives the operator a tactile sensation of pulling, so that the operator can experience that state.
  • The operating device 20 also gives a tactile sensation of surface texture so that the operator can experience the roughness of the surface of the object with which the action unit 11 is in contact.
  • Similarly, the operating device 20 gives a tactile sensation of pressure so that the operator can experience the hardness or softness of the surface of the object with which the action unit 11 is in contact.
  • the output device 60 visually and / or audibly indicates to the operator the position and posture of the action unit 11 with respect to the object based on the position data of the operation data included in the detection operation information.
  • In steps S103 to S112 described above, processing related to the operation to be executed at sampling time t_{i+1} is performed; in the next iteration of steps S103 to S112, processing related to the operation to be executed at the next sampling time t_{i+2} is performed, and the operation information to be processed is updated accordingly. A loop of this form is sketched below.
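  • The per-sample loop of steps S103 to S112 can be restated as the following pseudocode-style sketch; every method name is a hypothetical stand-in for the corresponding unit described above, not an API from the patent.

```python
# Hedged restatement of the automatic-mode loop (steps S103 to S112); all
# method names are hypothetical stand-ins for the units described above.
def automatic_mode_cycle(units):
    while units.has_unfinished_individual_operation():        # step S103
        pd_i = units.detect_operation_data()                  # step S104
        pm_next = units.model_correction_command(pd_i)        # step S105
        pe_next = units.generate_execution_command(pm_next)   # step S106
        if units.has_correction_input():                      # step S107
            cmd = units.add_operator_command(pe_next)         # step S108
        else:
            cmd = pe_next                                     # step S109
        units.store_correction_info(cmd)                      # step S110
        units.drive(cmd)                                      # step S111
        units.store_detected_operation_data()                 # step S112
```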
  • As described above, the control device 30 generates the execution operation command by correcting the determination operation command with the execution operation correction command of the calculation unit 36 at each of the sampling times t_0 to t_u, but this is not limiting.
  • the control device 30 may generate the execution operation command as described above at the timing when the individual operations included in the predetermined operation change.
  • control device 30 may perform machine learning of the calculation unit 36 at any time.
  • the control device 30 may cause the calculation unit 36 to perform machine learning using the data accumulated in each predetermined work each time the predetermined work by the mechanical device 10 is completed.
• The control device 30 may cause the calculation unit 36 to perform machine learning using the data accumulated over a predetermined number of the predetermined works, every time that predetermined number of works of the mechanical device 10 is completed.
  • the control device 30 causes the calculation unit 36 to use the data accumulated in the predetermined work in the predetermined period for each predetermined period such as a predetermined number of days, a predetermined number of weeks, and a predetermined number of months. Machine learning may be performed.
  • FIG. 6 is a flowchart showing an example of the operation of the mechanical device system 1 according to the embodiment in the training mode of the manual operation mode. Further, FIG. 6 shows an example in which the operator causes the mechanical device 10 to perform a predetermined operation for one cycle by using the operating device 20. In this example, the mechanical device system 1 will be described as having the mechanical device 10 perform all predetermined operations according to the operation of the operator.
  • the operator inputs a command to execute a predetermined operation in the training mode of the manual operation mode to the mechanical device system 1, and the control device 30 receives the command (step S201).
  • the operation determination unit 31 of the control device 30 outputs the content of the predetermined operation to the calculation unit 36.
• The calculation unit 36 acquires the operation information of the acting unit 11 and the like of the mechanical device 10, specifically, the operation data Pd i included in the operation information (step S202).
  • the calculation unit 36 may request the operation information from the operation information detection device 50 via the operation information processing unit 37 of the control device 30 and acquire the detection result of the operation information detection device 50.
• In step S203, the calculation unit 36 determines whether or not the predetermined operation is incomplete. If it is not completed (Yes in step S203), the calculation unit 36 proceeds to step S204; if it is completed (No in step S203), the calculation unit 36 ends the series of processes.
• In step S204, the calculation unit 36 inputs the operation data Pd i into the neural network of the machine learning model 36a corresponding to the predetermined operation to generate the execution operation correction command Pm i+1, and outputs the execution operation correction command Pm i+1 to the difference detection unit 38.
• Next, the operation command unit 32 receives the operation information input to the operating device 20 by the operator (step S205). Further, the operation command unit 32 generates the operation operation command Po i+1 from the operation information as an execution operation command based on the operation information (step S206). The operation operation command Po i+1 is the operation command generated from the operation information input to the operating device 20 at time t i+1, and is a command for operating the mechanical device 10 at time t i+1. Further, the operation command unit 32 outputs the operation operation command Po i+1 to the difference detection unit 38 and the drive command unit 34.
• The difference detection unit 38 detects the difference D i+1 between the execution operation correction command Pm i+1 and the operation operation command (execution operation command) Po i+1 (step S207).
• The auxiliary information processing unit 39 determines whether or not the difference D i+1 is equal to or greater than the threshold Tr, that is, whether D i+1 ≥ Tr (step S208).
• When D i+1 ≥ Tr (Yes in step S208), the auxiliary information processing unit 39 proceeds to step S209; when D i+1 < Tr (No in step S208), it proceeds to step S211.
• In step S209, the auxiliary information processing unit 39 generates an auxiliary command for generating a stimulus in the operating device 20 and outputs the auxiliary command to the operating device 20.
• The auxiliary command is a command indicating that there is a large difference between the output result of the calculation unit 36 using machine learning and the operation result of the operator, that is, that the operation of the operator deviates from the ideal operation.
  • the auxiliary information processing unit 39 may generate an auxiliary command that generates a stimulus corresponding to the magnitude of the difference Di + 1 .
  • the operating device 20 receives the auxiliary command from the auxiliary information processing unit 39 and gives the operator a stimulus corresponding to the auxiliary command (step S210).
  • the operator who perceives the stimulus can recognize that his / her operation is inappropriate. Further, the operator can easily bring his / her own operation closer to the ideal operation by receiving the stimulus corresponding to the magnitude of the difference Di + 1 .
• The drive command unit 34 generates drive data for driving each drive device of the mechanical device 10 so that the acting unit 11 performs the operation corresponding to the operation operation command (execution operation command) Po i+1 received from the operation command unit 32 in step S206, and outputs the drive data to each drive device. That is, the drive command unit 34 drives the mechanical device 10 so as to perform the operation corresponding to the command (step S211). Further, the drive command unit 34 returns to the process of step S202.
• In steps S202 to S211 described above, processing related to the operation executed at the sampling time t i+1 is performed; in the next iteration of steps S202 to S211, processing related to the operation executed at the next sampling time t i+2 is performed. The sketch below illustrates one such step.
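• The threshold comparison and stimulus generation of steps S204 to S211 can be pictured with the following hedged Python sketch; `model`, `haptics`, `drive`, and the intensity scaling are assumptions, not the disclosed implementation.

```python
import numpy as np

def training_step(model, pd_i, po_next, haptics, drive, threshold: float):
    """One training-mode sampling step (steps S204 to S211); all names
    are illustrative assumptions."""
    # Execution operation correction command Pm_{i+1} predicted by the
    # machine learning model from the operation data Pd_i.
    pm_next = np.asarray(model.predict(pd_i))
    # Operation operation command Po_{i+1} generated from operator input.
    po_next = np.asarray(po_next)

    # Difference detection unit 38: D_{i+1} between the two commands.
    difference = np.linalg.norm(pm_next - po_next)

    # Auxiliary information processing unit 39: when the operator's
    # command deviates from the model's "ideal" by Tr or more, emit a
    # stimulus whose intensity grows with the difference (scaling is an
    # arbitrary illustrative choice).
    if difference >= threshold:
        haptics.vibrate(intensity=min(1.0, difference / (10.0 * threshold)))

    # In the training mode the robot follows the operator's command.
    drive.execute(po_next)
    return difference
```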
• The control device 30 generates a stimulus in the operating device 20 at the timing of each sampling time t i based on the difference between the execution operation correction command of the calculation unit 36 and the operation operation command (execution operation command) of the operation command unit 32, but is not limited to this.
  • the control device 30 may generate the above-mentioned command at the timing when the individual operations included in the predetermined operations change.
  • the drive command unit 34 generates drive data for each drive device of the mechanical device 10 and outputs the drive data to each drive device so that the action unit 11 performs an operation corresponding to the operation operation command received from the operation command unit 32.
  • the mechanical device 10 operates according to the operation information input to the operation device 20.
• That is, when the control device 30 receives an input to the operating device 20, the control device 30 operates the mechanical device 10 according to the input operation information.
• As described above, the control device 30 of the mechanical device 10 includes the operation command unit 32 and the drive command unit 34 as an operation control unit that operates the mechanical device 10 according to the operation information output from the operating device 20 for operating the mechanical device 10.
• It also includes the calculation unit 36 including the machine learning model 36a, which uses the first operation information indicating the operation of the mechanical device 10 as input data and uses a command for the operation of the mechanical device 10 corresponding to the first operation information as output data.
• It further includes the difference detection unit 38 and the auxiliary information processing unit 39 as auxiliary units that output an auxiliary command for assisting the operation of the operating device 20 based on the difference between the operation of the mechanical device 10 controlled by the operation command unit 32 and the operation of the mechanical device 10 corresponding to the command output by the calculation unit 36. Further, the operating device 20 outputs the operation information based on the second operation information indicating the operation of the operating device 20.
  • the machine learning model 36a of the calculation unit 36 outputs a command for the operation of the machine device 10 corresponding to the first operation information indicating the operation of the machine device 10.
  • the auxiliary information processing unit 39 outputs an auxiliary command for assisting the operation of the operation device 20 based on the difference between the operation command and the operation of the mechanical device 10.
• The operating device 20 may include a perceptual device that gives a perceptual stimulus to the operator holding the operating device 20, and the auxiliary information processing unit 39 may output, to the operating device 20, an auxiliary command that causes the perceptual device to generate a perceptual stimulus.
  • the sensory device may be a haptics device that gives a tactile stimulus to the operator.
  • the control device 30 gives the operator a perceptual stimulus to assist the operation in the operating device 20. The operator can recognize the assistance for the operation through the experience. Therefore, the operator can easily and surely recognize the assistance. Furthermore, the stimulus using the haptics device can make the operator recognize the contents of various assists.
• The control device 30 may include the operation information processing unit 37 as a processing unit that outputs the first operation information indicating the operation of the mechanical device 10 to the operating device 20, and the haptics device as the perceptual device may give the operator a perceptual stimulus corresponding to the operating state of the mechanical device 10 based on the first operation information. According to the above configuration, the operator can operate the operating device 20 while experiencing the operation of the mechanical device 10. Therefore, the operator can appropriately operate the mechanical device 10 using the operating device 20.
  • the operation device 20 may include an inertial measurement unit and output operation information based on the measurement data of the inertial measurement unit as the second operation information. According to the above configuration, the operation device 20 outputs operation information based on the second operation information indicating the operation of the operation device 20. Since the operation information is information based on the measurement data of the inertial measurement unit, the operation of the operation device 20 can be accurately shown. Therefore, the accuracy of the operation information is improved, and the operation content of the operation device 20 is reflected in the operation of the mechanical device 10 with high accuracy.
  • the operating device 20 may be configured to be movable in an arbitrary direction in a three-dimensional space. According to the above configuration, the operating device 20 can cause the mechanical device 10 to perform various operations.
  • the first operation information indicating the operation of the mechanical device 10 may include force data representing the force applied by the mechanical device 10 to the object.
  • the calculation unit 36 outputs a command for the operation of the mechanical device 10 in consideration of the force applied by the mechanical device 10 to the object. Therefore, the auxiliary information processing unit 39 can output an auxiliary command in consideration of the state of force of the acting unit 11 of the mechanical device 10. Such an auxiliary command can assist the operation in the operating device 20 with high accuracy.
  • the first operation information indicating the operation of the mechanical device 10 may include position data indicating the position of the mechanical device 10.
  • the calculation unit 36 outputs a command for the operation of the mechanical device 10 in consideration of the position of the mechanical device 10. Therefore, the auxiliary information processing unit 39 can output an auxiliary command in consideration of the position of the working unit 11 or the like of the mechanical device 10. Such an auxiliary command can assist the operation in the operating device 20 with high accuracy.
  • the machine learning model 36a may be configured by a neural network. According to the above configuration, the neural network enables flexible and highly accurate processing. Therefore, the machine learning model 36a can output highly accurate output data for various input data.
  • the first operation information indicating the operation of the mechanical device 10 may include the current operation and the past operation of the mechanical device 10.
  • the first operation information indicates time series information of the operation of the mechanical device 10.
  • the machine learning model 36a uses such time series information as input data. Therefore, the machine learning model 36a outputs a command for the operation of the machine device 10 in consideration of the behavior of the machine device 10 over time. Therefore, the output accuracy of the machine learning model 36a is improved.
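• A minimal sketch of how such time-series input data might be assembled is shown below, assuming a fixed window length n and flat numeric operation data; both assumptions are illustrative and are not specified by the disclosure.

```python
from collections import deque

class TimeSeriesInput:
    """Sliding window of operation data Pd_{i-n}, ..., Pd_i used as model
    input. The window length n is an illustrative assumption."""

    def __init__(self, n: int, dim: int):
        # Pre-fill with zeros so the window always holds n + 1 samples.
        self.window = deque([[0.0] * dim for _ in range(n + 1)],
                            maxlen=n + 1)

    def push(self, pd_i):
        # Append the newest operation data; the oldest falls out.
        self.window.append(list(pd_i))

    def as_input(self):
        # Flatten [Pd_{i-n}, ..., Pd_i] into one input vector for the
        # neural network of the machine learning model.
        return [x for pd in self.window for x in pd]
```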
• The control device 30 may include the correction command unit 33 and the drive command unit 34 as a correction control unit that corrects the operation of the mechanical device 10 according to the operation information output from the operating device 20 during automatic operation of the mechanical device 10, the first storage unit 40 that stores the first operation information of the mechanical device 10, and the second storage unit 41 that stores the correction information indicating the correction made by the correction command unit 33.
  • the calculation unit 36 may cause the machine learning model 36a to perform machine learning using the first operation information and the correction information corresponding to the first operation information.
  • the operation command unit 32 controls the operation of the mechanical device 10 based on the command of the calculation unit 36 in the automatic operation, and controls the operation of the mechanical device 10 according to the operation information of the operation device 20 in the manual operation of the mechanical device 10.
  • the auxiliary information processing unit 39 may output an auxiliary command in the manual operation.
• The calculation unit 36 causes the machine learning model 36a to perform machine learning using, as learning data, the first operation information indicating the operation of the mechanical device 10 and the correction information indicating the correction of the operation of the mechanical device 10 performed using the operating device 20.
• That is, the calculation unit 36 executes machine learning by itself. Since the calculation unit 36 causes the machine learning model 36a to perform machine learning using the control results of the mechanical device 10, the learning accuracy of the machine learning model 36a can be improved as the mechanical device 10 is operated. Further, since the learning data is generated by the operator correcting the operation of the mechanical device 10 via the operating device 20, its generation is simple. Further, the correction of the operation of the mechanical device 10 is appropriate because it is performed by the operator who has confirmed the operation of the mechanical device 10.
• The machine learning model 36a that performs machine learning using such learning data can, in a short period of time, achieve an output accuracy at which it outputs commands corresponding to the operation of the mechanical device 10 that the operator regards as ideal. Therefore, the time required for machine learning can be shortened.
  • control system 100 includes the control device 30 according to the embodiment and the operation device 20. According to the above configuration, the same effect as that of the control device 30 according to the embodiment can be obtained.
  • the mechanical device system 1 includes a control device 30, a mechanical device 10, and an operating device 20 according to the embodiment. According to the above configuration, the same effect as that of the control device 30 according to the embodiment can be obtained.
• Modification 1: The mechanical device system 1A according to Modification 1 of the embodiment will be described.
  • the mechanical device system 1A includes the robot 10A as a mechanical device, and controls the operation of the robot 10A by using not only the motion data but also the image of the processing target of the robot 10A.
  • the present modification will be described focusing on the points different from the embodiment, and the description of the same points as the embodiment will be omitted as appropriate.
  • FIG. 7 is a functional block diagram showing an example of the configuration of the mechanical device system 1A according to the modified example 1.
  • the mechanical device system 1A according to the present modification further includes an imaging device 70 as compared with the mechanical device system 1 according to the embodiment.
  • the mechanical device system 1A includes a robot 10A as a mechanical device 10 and a control device 30A as a control device 30.
  • the robot 10A includes an end effector 11A and a robot arm 12A, the end effector 11A corresponds to an action unit 11, and the robot arm 12A corresponds to an operation unit 12. Details of the robot 10A will be described later.
  • the image pickup device 70 images an object to be processed by the robot 10A.
• Examples of the image pickup device 70 are a digital camera, a digital video camera, and the like.
  • the image pickup device 70 is arranged on the end effector 11A or the robot arm 12A or the like, but may be arranged at a position away from the end effector 11A or the robot arm 12A.
  • the image pickup device 70 outputs a signal of the captured image to the control device 30A.
  • the image pickup apparatus 70 may output an image signal to the output apparatus 60.
  • the operator can confirm the processing state of the object by the end effector 11A via the output device 60. Then, the operator can control the operation of the end effector 11A using the operating device 20 while checking the processing state of the object.
  • FIG. 8 is a side view showing an example of the configuration of the robot 10A according to the first modification.
  • the base portion of the robot arm 12A of the robot 10A is attached and fixed to the base 13, and the end effector 11A is detachably attached to the tip portion of the robot arm 12A.
  • the end effector 11A is configured so that various actions corresponding to the object such as gripping, suction, lifting, or scooping can be applied to the object.
  • the end effector 11A is configured to grip the object W, and the robot 10A performs an operation of assembling the object W gripped by the end effector 11A to the assembly object T.
• The work of the robot 10A is not limited to assembly and may be any work. Examples of the work of the robot 10A are sorting, assembling, painting, welding, joining, chipping, polishing, sealing, semiconductor manufacturing, and medical practices such as drug preparation and surgery.
• The robot arm 12A includes the links 12Aa to 12Af arranged in order from the base toward the tip, the joints JT1 to JT6 that sequentially connect the links 12Aa to 12Af, and the arm drive devices M1 to M6 that rotationally drive the joints JT1 to JT6, respectively.
  • the operation of the arm drive devices M1 to M6 is controlled by the control device 30A.
  • each of the arm driving devices M1 to M6 uses electric power as a power source and has a servomotor as an electric motor for driving them.
  • the number of joints of the robot arm 12A is not limited to 6, but may be 7 or more, or 1 or more and 5 or less.
  • the link 12Aa is attached to the mounting surface 13a of the base 13, and the end effector 11A is attached to the tip of the link 12Af.
  • a mechanical interface is provided at the tip of the link 12Af.
  • the end effector 11A is attached to the mechanical interface via the force sensor 14.
• An example of the force sensor 14 is a force sense sensor. The configuration of the force sense sensor is not particularly limited, and it may be composed of, for example, a three-axis acceleration sensor.
  • the force sensor 14 detects the force exerted by the end effector 11A on the object as a reaction force received from the object.
  • the force detected by the force sensor 14 is converted into force data by an appropriate signal processing means (not shown).
  • This signal processing means is provided in, for example, the force sensor 14 or the control device 30A. In the present specification, for convenience, it is expressed that the force sensor 14 detects force data.
  • the joint JT1 rotatably connects the base 13 and the base end portion of the link 12Aa around an axis in the vertical direction perpendicular to the mounting surface 13a.
  • the joint JT2 rotatably connects the tip end of the link 12Aa and the base end of the link 12Ab around a horizontal axis parallel to the mounting surface 13a.
  • the joint JT3 rotatably connects the tip end of the link 12Ab and the base end of the link 12Ac around an axis in a direction parallel to the mounting surface 13a.
• the joint JT4 rotatably connects the tip end of the link 12Ac and the base end of the link 12Ad about an axis in the longitudinal direction of the link 12Ac.
  • the joint JT5 rotatably connects the tip end of the link 12Ad and the base end of the link 12Ae about an axis in a direction orthogonal to the longitudinal direction of the link 12Ad.
  • the joint JT6 connects the tip end of the link 12Ae and the base end of the link 12Af in a twistable and rotatable manner with respect to the link 12Ae.
• Each of the arm drive devices M1 to M6 may include a servomotor (not shown), a rotation sensor (not shown) such as an encoder that detects the rotation amount of the rotor of the servomotor, and a current sensor (not shown) that detects the drive current of the servomotor.
  • Each of the arm drive devices M1 to M6 operates the servomotor according to a command or the like output from the control device 30A, and outputs the detection values of the rotation sensor and the current sensor to the control device 30A.
• The control device 30A detects the rotation amount, rotation speed, current value, and the like of each servomotor based on the detection values of its rotation sensor and current sensor.
  • control device 30A can stop each servomotor at an arbitrary rotation position, rotate it at an arbitrary rotation speed, and operate it at an arbitrary rotation torque. Therefore, the control device 30A can operate the robot arm 12A in various and precise manners.
  • the operation information calculation unit 43 of the control device 30A calculates the three-dimensional position and orientation of the end effector 11A as position data by integrating the rotation amounts of all the servomotors of the arm drive devices M1 to M6. Further, the data detected by the force sensor 14 is the force data. The position data and the force data are operation data of the robot 10A.
• The rotation sensors of the arm drive devices M1 to M6 and the force sensor 14 constitute the operation information detection device 50. Further, the detection signal of the current sensor of each of the arm drive devices M1 to M6 is used by the control device 30A for feedback control so that the current of each servomotor of the arm drive devices M1 to M6 becomes a current value according to the current command.
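• As a hedged illustration of how position data can be computed from the joint rotation amounts, the following sketch chains homogeneous transforms under an assumed Denavit-Hartenberg convention; the patent does not specify the kinematic formulation, so the parameterization here is purely illustrative.

```python
import numpy as np

def joint_transform(theta, d, a, alpha):
    """Homogeneous transform for one joint under an assumed DH
    convention (illustrative; not specified by the disclosure)."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([[ct, -st * ca,  st * sa, a * ct],
                     [st,  ct * ca, -ct * sa, a * st],
                     [0.0,      sa,       ca,      d],
                     [0.0,     0.0,      0.0,    1.0]])

def end_effector_pose(joint_angles, dh_params):
    """Position data of the end effector from the six joint rotation
    amounts (rotation sensor values converted to joint angles)."""
    t = np.eye(4)
    for theta, (d, a, alpha) in zip(joint_angles, dh_params):
        t = t @ joint_transform(theta, d, a, alpha)
    # 4x4 pose: rotation block = orientation, last column = position.
    return t
```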
  • the robot 10A is configured as a vertical articulated robot, but is not limited thereto.
  • FIG. 9 is a diagram showing an example of the appearance of the operating device 20 according to the modified example 1.
  • FIG. 10 is a functional block diagram showing an example of the configuration of the operation device 20 according to the first modification.
  • the operating device 20 includes a housing 20a that can be grasped by a human hand. Further, the operation device 20 includes an input device 21 in the housing 20a. In FIG. 9, the input device 21 is a button switch, but the present invention is not limited to this. Further, the operation device 20 includes an inertial measurement unit 22, a haptics device 23, an operation control device 24, and a communication device 25 (not shown) inside the housing 20a.
  • the control device 30A performs bilateral control on the robot 10A by using the operation device 20.
  • the components of the operating device 20 will be described with reference to FIG.
  • the haptics device 23 is as described in the embodiment.
  • the communication device 25 connects the operation device 20 and the control device 30A via wired communication or wireless communication.
  • the communication device 25 may include a communication circuit.
  • the format of wired communication and wireless communication may be any format.
  • the input device 21 receives input of commands and information by the operator, and transmits the input commands and information to the control device 30A via the operation control device 24 and the communication device 25.
  • Such an input device 21 may accept physical input, voice input, image input, and the like.
  • the input device 21 may include devices such as a slide switch, a button switch, a key, a lever, a touch panel, a microphone, and a camera.
  • the commands and information input to the input device 21 may indicate the operation mode selection and execution command of the robot 10A, the operation selection and execution command of the end effector 11A, and the like.
  • the inertial measurement unit 22 includes a 3-axis acceleration sensor and a 3-axis angular velocity sensor, and detects acceleration and angular velocity in the 3-axis direction of the operating device 20.
• The measurement data of the acceleration and angular velocity in the three axial directions detected by the inertial measurement unit 22 are converted by the operation control device 24 into various information indicating the operation and acting force of the operating device 20, such as position, posture, movement, movement speed, acceleration, and force, and the converted information is transmitted to the control device 30A via the communication device 25 as the operation information of the operating device 20.
  • the measurement data of the acceleration and the angular velocity in the three axial directions may be transmitted to the control device 30A, and the control device 30A may perform an operation to convert the data.
  • the information converted from the measurement data of the inertial measurement unit 22 may indicate the position, posture, movement, moving speed, acceleration, acting force, and the like of the end effector 11A.
  • the inertial measurement unit 22 may include a geomagnetic sensor, a temperature sensor, and the like.
  • the measurement data of the acceleration and the angular velocity in the three-axis direction may be corrected by using the measurement data of the geomagnetic sensor, the temperature sensor, and the like.
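• One naive way to picture this conversion is the dead-reckoning sketch below; it assumes bias-corrected, world-frame measurements and small-angle updates, none of which the disclosure specifies.

```python
import numpy as np

def integrate_imu(accel, gyro, state, dt):
    """Convert one inertial measurement unit sample (3-axis acceleration
    and angular velocity) into operation information (position, posture,
    velocity). A naive dead-reckoning sketch under the assumptions noted
    in the lead-in."""
    position, velocity, orientation = state
    orientation = orientation + np.asarray(gyro) * dt   # small-angle update
    velocity = velocity + np.asarray(accel) * dt
    position = position + velocity * dt
    return position, velocity, orientation
```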
  • the operation control device 24 controls the overall operation of the operation device 20.
  • the operation control device 24 may have a configuration similar to the configuration exemplified in the embodiment for the control device 30.
  • the operation control device 24 receives a signal from the input device 21, converts the signal into information indicating the corresponding operation, and transmits the signal to the control device 30A.
• The operation control device 24 converts the measurement data of the inertial measurement unit 22 and transmits the converted data to the control device 30A, or transmits the measurement data of the inertial measurement unit 22 to the control device 30A as it is.
• The operation control device 24 receives the operation information of the robot 10A from the control device 30A, converts the operation data and the like included in the operation information into data suitable for input to the haptics device 23, and outputs the converted data to the haptics device 23.
• The configuration of the control device 30A will be described. As shown in FIG. 7, the control device 30A according to the present modification further includes the operation information calculation unit 43, the image processing unit 44, and the fourth storage unit 45 as compared with the control device 30 according to the embodiment, and includes a calculation unit 36A instead of the calculation unit 36.
  • the operation information calculation unit 43 converts the data received from the operation information detection device 50 into operation data and outputs the data to the operation information processing unit 37 and the first storage unit 40.
• The operation information detection device 50 outputs, to the operation information calculation unit 43, the data of the rotation amount and current value of each servomotor detected using the rotation sensors of the arm drive devices M1 to M6 of the robot 10A, and the force data detected using the force sensor 14.
  • the operation information calculation unit 43 calculates the position data indicating the three-dimensional position and orientation of the end effector 11A by integrating the rotation amounts of all the servomotors of the arm drive devices M1 to M6.
  • the operation information calculation unit 43 generates and outputs operation data including force data and position data at the same detection time in association with the detection time.
  • the image processing unit 44 receives image data indicating an image captured by the image pickup device 70, and performs image processing on the image data.
  • the image processing unit 44 extracts the object and the end effector 11A included in the image by image processing, and generates processed image data which is the image data of the image including only the object and the end effector 11A.
  • the image processing unit 44 stores the processed image data in the fourth storage unit 45 in association with the imaging time.
  • the image processing unit 44 may also store the image data before processing in the fourth storage unit 45.
  • the method for extracting the image of the object and the end effector 11A from the image may be any known method.
  • the image processing unit 44 may extract an image of the object and the end effector 11A by using an image matching method such as feature-based or region-based.
• In the case of the feature-based method, the image processing unit 44 may extract feature points such as edges and corners from the unprocessed image and calculate the feature amounts of those feature points. The image processing unit 44 may then extract the images of the object and the end effector 11A from the unprocessed image by matching the unprocessed image against templates of the images of the object and the end effector 11A based on the feature amounts.
• In the case of the region-based method, the image processing unit 44 may specify each region in the unprocessed image based on edges, texture, and the like, and may then extract the images of the object and the end effector 11A from the unprocessed image by matching the unprocessed image against the templates of the images of the object and the end effector 11A based on the specified regions.
  • the template of the image of the object and the end effector 11A may be stored in advance in the fourth storage unit 45.
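• A possible realization of this template-based extraction is sketched below with OpenCV's normalized cross-correlation matcher; the score threshold and the masking strategy are illustrative assumptions, not the disclosed method.

```python
import cv2
import numpy as np

def extract_regions(image_bgr, templates, score_threshold=0.7):
    """Locate the object and the end effector 11A in the captured image
    by matching against pre-stored grayscale templates, and keep only
    the matched regions (the 'processed image data'). The threshold and
    matching method are illustrative assumptions."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    mask = np.zeros_like(gray)
    for template in templates:  # e.g. object template, end effector template
        result = cv2.matchTemplate(gray, template, cv2.TM_CCOEFF_NORMED)
        _, score, _, top_left = cv2.minMaxLoc(result)
        if score >= score_threshold:
            h, w = template.shape[:2]
            mask[top_left[1]:top_left[1] + h,
                 top_left[0]:top_left[0] + w] = 255
    # Black out everything except the matched regions.
    return cv2.bitwise_and(image_bgr, image_bgr, mask=mask)
```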
  • the fourth storage unit 45 is realized by a storage device in the same manner as the first storage unit 40 and the like.
  • the fourth storage unit 45 stores the image data captured by the image pickup apparatus 70, the processed image data processed by the image processing unit 44, the template of the image of the object and the end effector 11A, and the like.
  • the calculation unit 36A performs machine learning using the operation information of the robot 10A and the correction information corresponding to the operation information. Further, the calculation unit 36A uses the operation information of the robot 10A as input data, and uses the command corresponding to the operation information as output data.
  • the motion information of the robot 10A includes the motion data of the end effector 11A of the robot 10A and the image data of the object imaged by the imaging device 70.
  • the image data shows the operation information of the end effector 11A such as the positional relationship between the end effector 11A and the object and the processing status of the object by the end effector 11A.
  • the image data is the processed image data processed by the image processing unit 44, but may be the image data before processing. By using the processed image data, it is possible to improve the output accuracy of the neural network.
  • the operation data of the end effector 11A and the image data of the object are used as input data, and the correction information executed at the time of detecting these data is used as the teacher data.
  • the neural network accepts the input of the operation data of the end effector 11A and the image data of the object, and outputs an execution operation correction command for causing the robot 10A to execute next.
  • FIG. 11 is a diagram showing an example of the configuration of the calculation unit 36A according to the first modification.
  • the calculation unit 36A includes a machine learning model 36Aa, a data generation unit 36b, a data input unit 36c, and a learning evaluation unit 36d.
  • each command and each data are acquired at a predetermined sampling interval while the robot 10A executes a predetermined operation once.
• The correction information detection unit 35 acquires the time-series data Pm0 to Pmu of the correction operation command Pm at the sampling interval.
• The operation information detection device 50 acquires the detection data of the end effector 11A at the sampling interval, and the operation information calculation unit 43 performs calculations on the detection data to obtain the time-series data Pd0 to Pdu of the operation data Pd of the end effector 11A.
• The image pickup device 70 acquires image data in which the object is captured at the sampling interval, and the image processing unit 44 performs image processing on the image data to obtain the time-series data Ip0 to Ipu of the processed image data Ip.
• The data generation unit 36b generates the time-series data Ld0 to Ldu of the learning data Ld by using the time-series data Pd0 to Pdu of the operation data Pd in the first storage unit 40 and the time-series data Ip0 to Ipu of the processed image data Ip in the fourth storage unit 45.
• Each time-series datum Ld i is generated using the time-series data Pd i and Ip i.
• The data generation unit 36b also generates the time-series data pn0 to pnu of the teacher data pn from the time-series data Pm0 to Pmu of the correction operation command Pm in the second storage unit 41.
• The data input unit 36c sequentially inputs the time-series data Ld0 to Ldu of the learning data Ld to the neurons of the input layer of the neural network of the machine learning model 36Aa.
• The neural network receives the input of the time-series data Ld i of the learning data Ld at sampling time t i, predicts the execution operation correction command Pn i+1 at the next sampling time t i+1, and outputs it.
  • the learning evaluation unit 36d adjusts the weights between the neurons of the neural network based on the time series data pn i + 1 of the teacher data pn at the sampling time ti + 1 and the execution operation correction command Pn i + 1 .
  • the data input unit 36c and the learning evaluation unit 36d perform the above processing for all of the time series data Ld 0 to Ld u .
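• The learning pass over the time-series data could look like the following sketch, written against PyTorch as an assumed framework; the network shape, MSE loss, and Adam optimizer are illustrative choices, not the disclosed method.

```python
import torch
import torch.nn as nn

def train_one_work_cycle(model: nn.Module, ld_series, pn_series, lr=1e-3):
    """One machine-learning pass over a predetermined work cycle: the
    learning data Ld_i (operation data Pd_i plus processed image
    features Ip_i) predicts Pn_{i+1}, which the learning evaluation unit
    compares against the teacher data pn_{i+1}."""
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for i in range(len(ld_series) - 1):
        ld_i = torch.as_tensor(ld_series[i], dtype=torch.float32)
        pn_next = torch.as_tensor(pn_series[i + 1], dtype=torch.float32)
        optimizer.zero_grad()
        prediction = model(ld_i)              # predicted Pn_{i+1}
        loss = loss_fn(prediction, pn_next)   # learning evaluation unit 36d
        loss.backward()                       # adjust inter-neuron weights
        optimizer.step()
```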
• The operation information calculation unit 43 generates the operation data Pd i from the detection data of the operation information detection device 50 at the current sampling time t i, and outputs it to the calculation unit 36A.
• The image processing unit 44 generates the processed image data Ip i from the image data captured by the image pickup device 70 at the sampling time t i, and outputs it to the calculation unit 36A.
• The data input unit 36c inputs the operation data Pd i and the processed image data Ip i to the neural network of the machine learning model 36Aa. The neural network uses the operation data Pd i and the processed image data Ip i as input data, and outputs, as output data, the execution operation correction command Pn i+1 at the next sampling time t i+1 to the operation command unit 32.
  • the operation command unit 32 generates an execution operation command that reflects the execution operation correction command Pni + 1 .
  • the machine learning model 36Aa performs the above processing. In such processing by the machine learning model 36Aa, the processing state of the object is taken into consideration.
• The neural network of the machine learning model 36Aa may be configured such that, as input data, the operation data Pd i and the processed image data Ip i at the sampling time t i, as well as the operation data Pd i-1 to Pd i-n and the processed image data Ip i-1 to Ip i-n at the past sampling times t i-1 to t i-n, are input. Further, since the other configurations and operations of the mechanical device system 1A according to Modification 1 are the same as those of the embodiment, detailed description thereof is omitted.
• As described above, the machine learning model 36Aa performs machine learning using the operation information of the robot 10A and the correction information corresponding to the operation information, and uses the operation information of the robot 10A as input data and the command corresponding to the operation information as output data.
  • the operation information of the robot 10A includes the operation data of the end effector 11A of the robot 10A and the image data of the object imaged by the imaging device 70.
• Therefore, the machine learning model 36Aa can output a command that reflects not only the operating state of the end effector 11A but also the state of the object to be processed recognized from the image, that is, the processing state.
• For example, when the robot 10A performs work such as painting, welding, chipping, polishing, or sealing, the work performance changes depending on the state of the portion to be processed in the object.
• The machine learning model 36Aa can output a command suitable for the state of that portion by using an image including the portion as input data. Therefore, the output accuracy of the machine learning model 36Aa is improved.
  • the machine learning model 36Aa that handles motion information including image data may be used in any mechanical device other than a robot.
• Modification 2: The mechanical device system 1B according to Modification 2 of the embodiment will be described.
• The mechanical device system 1B differs from the embodiment and Modification 1 in that it does not train the machine learning model by itself but performs processing using a given machine learning model.
  • the present modification will be described focusing on the points different from the embodiment and the modification 1, and the description of the same points as the embodiment and the modification 1 will be omitted as appropriate.
  • FIG. 12 is a functional block diagram showing an example of the configuration of the mechanical device system 1B according to the modified example 2.
• The mechanical device system 1B according to the present modification includes the control device 30B as the control device 30, as compared with the mechanical device system 1 according to the embodiment. Further, the control device 30B does not include the correction command unit 33, the correction information detection unit 35, the first storage unit 40, and the second storage unit 41 of the control device 30 according to the embodiment, and includes a calculation unit 36B instead of the calculation unit 36.
• The configurations of the operation determination unit 31, the operation command unit 32, the drive command unit 34, the operation information processing unit 37, the difference detection unit 38, the auxiliary information processing unit 39, and the third storage unit 42 of the control device 30B are the same as in the embodiment.
  • the operation command unit 32 outputs an execution operation command to the drive command unit 34 in all modes, and outputs an execution operation command to the difference detection unit 38 in the training mode.
  • FIG. 13 is a diagram showing an example of the configuration of the calculation unit 36B according to the modification 2.
  • the calculation unit 36B includes a machine learning model 36Ba and a data input unit 36c.
  • the calculation unit 36B executes data input / output functions in the automatic operation mode and the training mode.
  • the data input unit 36c inputs the operation data of the machine device 10 detected by the operation information detection device 50 and the operation information processing unit 37 into the neural network of the machine learning model 36Ba.
  • the machine learning model 36Ba has the same configuration as the machine learning model 36a of the embodiment, but is a machine learning model generated through machine learning outside the machine device system 1B and given to the arithmetic unit 36B.
• The machine learning model 36Ba may be generated through machine learning using not only the learning data collected in the one mechanical device system 1B but also the learning data collected in other mechanical device systems. Since such a machine learning model 36Ba uses a large amount of learning data for machine learning, it can output output data corresponding to various input data.
• The machine learning model 36Ba generated in advance is held in a storage unit (not shown) of the calculation unit 36B, and when a new machine learning model is generated, the held machine learning model may be updated to the new one.
• As described above, in the mechanical device system 1B, the machine learning model 36Ba may be a machine learning model given from the outside of the mechanical device system 1B.
• Such a machine learning model 36Ba can be generated by machine learning using learning data collected not only by the mechanical device system 1B but also by other systems and the like. Since the machine learning model 36Ba can output output data corresponding to various input data, its flexibility can be improved.
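• A minimal sketch of such a calculation unit holding an externally trained model, assuming PyTorch-style weight files saved with `torch.save(model.state_dict(), path)`; the class name, file paths, and update policy are hypothetical.

```python
import torch

class GivenModelUnit:
    """Calculation unit 36B sketch: holds a machine learning model that
    was trained outside the system and only runs inference."""

    def __init__(self, network: torch.nn.Module, weights_path: str):
        self.network = network
        self.network.load_state_dict(torch.load(weights_path))
        self.network.eval()  # inference only; no machine learning here

    def predict(self, operation_data):
        with torch.no_grad():
            x = torch.as_tensor(operation_data, dtype=torch.float32)
            return self.network(x)

    def update(self, new_weights_path: str):
        # Replace the held model when a newly generated one is provided.
        self.network.load_state_dict(torch.load(new_weights_path))
```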
  • the configuration of the mechanical device system 1B according to the present modification may be applied to the modification 1.
  • the auxiliary information processing unit 39 of the control device gives the operator a perceptual stimulus via the operation device 20 in order to assist the operation in the operation device 20.
  • the auxiliary information processing unit 39 may output an auxiliary command to a device other than the operation device 20 to generate a perceptual stimulus to the operator.
  • the auxiliary information processing unit 39 may be configured to output an auxiliary command for displaying an image assisting the operation in the operation device 20 to an image output device such as an output device 60.
• For example, the auxiliary command may be a command to superimpose and display, on the screen S of the output device 60, the indicators Ad1 and Ad2 representing the operation to be performed by the operator on the image of the robot 10A captured by the camera.
• The indicators Ad1 and Ad2 may include an arrow indicating the moving direction of the operating device 20 and a numerical value indicating the moving order. The operator can thereby receive assistance through vision.
• Similarly, the auxiliary information processing unit 39 may be configured to output an auxiliary command for outputting a voice assisting the operation of the operating device 20 to a voice output device such as the output device 60. The operator can thereby receive assistance through hearing.
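• A sketch of such a visual assist overlay using OpenCV drawing primitives is shown below; the colors, layout, and the `moves` structure are illustrative assumptions, not the disclosed presentation.

```python
import cv2

def overlay_assist(frame, moves):
    """Superimpose indicators such as Ad1 and Ad2 on the camera image of
    the robot: an arrow for the moving direction of the operating device
    and a numeral for the moving order.

    frame: BGR camera image of the robot.
    moves: list of (start_xy, end_xy) pixel coordinates, in order.
    """
    for order, (start, end) in enumerate(moves, start=1):
        # Arrow showing the direction in which to move the device.
        cv2.arrowedLine(frame, start, end, (0, 255, 0), 2)
        # Numeral showing the moving order.
        cv2.putText(frame, str(order), start,
                    cv2.FONT_HERSHEY_SIMPLEX, 1.0, (0, 255, 0), 2)
    return frame
```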
  • the operation device 20 includes the haptics device 23 for giving a perceptual stimulus to the operator, but the present invention is not limited to this.
  • the operating device 20 may include any device that stimulates the operator's perception.
  • the operating device 20 may be configured to provide the operator with at least one of tactile, warm, visual and auditory stimuli.
• As a device that gives a tactile stimulus through deformation such as expansion and contraction or through vibration, the operating device 20 may include, for example, a device that expands and contracts by air pressure or hydraulic pressure, a device that generates vibration such as a piezoelectric element, and the like.
  • the operating device 20 may be provided with, for example, a heater or the like, which stimulates the sense of warmth by generating heat or the like.
• As a device that gives a visual stimulus by emitting or blinking light, the operating device 20 may include, for example, a light source such as an LED (Light Emitting Diode).
  • the operating device 20 may provide an auditory stimulus by pronunciation or the like, and may include, for example, a speaker or the like.
• The control device is configured to execute the training mode in the manual operation mode, but may be configured to execute the training mode also in the automatic operation mode. For example, when a predetermined operation executed in the automatic operation mode includes an operation by automatic operation and an operation by manual operation, the control device may be configured to execute either the training mode or the non-training mode for the operation by manual operation.
  • the control device modifies the operations of the mechanical device 10 and the robot 10A according to the operation information output from one operation device 20 in the automatic operation mode, but the present invention is not limited to this.
  • the control device may modify the operations of the mechanical device 10 and the robot 10A according to the operation information output from the two or more operation devices 20.
• For example, a priority may be set for each of the two or more operating devices 20, and the control device may determine, according to the priorities, which of the operation information output from the two or more operating devices 20 to adopt for the correction.
• Alternatively, the control device may execute processing such as addition, subtraction, averaging, or other statistical processing on the operation information output from the two or more operating devices 20, and adopt the processed operation information for the correction.
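• One way to picture these two policies is the sketch below; the priority rule and the use of a simple mean are illustrative assumptions, not the disclosed selection logic.

```python
import numpy as np

def merge_operation_info(infos, priorities=None):
    """Combine operation information from two or more operating devices:
    either adopt the highest-priority device's information or apply
    statistical processing (here: the mean) across all of them.

    infos: list of numeric operation-information vectors, one per device.
    priorities: optional list of priority values, one per device.
    """
    if priorities is not None:
        # Adopt the information of the device with the highest priority.
        best = max(range(len(infos)), key=lambda k: priorities[k])
        return np.asarray(infos[best])
    # Otherwise average the operation information of all devices.
    return np.mean([np.asarray(v) for v in infos], axis=0)
```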
• In the embodiment and the modifications, the information used by the machine learning model for machine learning is the information acquired in the automatic operation mode, specifically, at least the operation data among the operation data serving as the operation information of the mechanical device 10 and the robot 10A and the image data of the object, together with the correction operation command serving as the correction information; however, the present invention is not limited to this.
  • the machine learning model may use the information acquired in the manual operation mode for machine learning.
• Such information may include, for example, the execution operation command based on the operation information of the operating device 20, and at least the operation data among the operation data serving as the operation information of the mechanical device 10 and the robot 10A operated according to the execution operation command and the image data of the object.
• In this case, the machine learning model also machine-learns the results of the operation of the mechanical device 10 and the robot 10A by the operator, so that its output can be brought close to that of a human operation.
  • the robot 10A was an industrial robot, but any robot may be used.
  • the robot 10A may be a service robot, a humanoid robot, or the like.
  • Service robots are robots used in various service industries such as nursing care, medical care, cleaning, security, guidance, rescue, cooking, and product provision.
• In the above description, the robot 10A was a vertical articulated robot, but the robot 10A is not limited to this, and may be configured, for example, as a horizontal articulated robot, a polar coordinate robot, a cylindrical coordinate robot, a rectangular coordinate robot, or another type of robot.
  • the technique of the present disclosure may be a control method.
• In this control method, the mechanical device is operated according to the operation information output from the operating device for operating the mechanical device; the first operation information indicating the operation of the mechanical device is input as input data to the machine learning model, and the machine learning model outputs a command for the operation of the mechanical device corresponding to the first operation information; an auxiliary command for assisting the operation of the operating device is output based on the difference between the operation of the mechanical device and the operation of the mechanical device corresponding to the command output by the machine learning model; and the operating device outputs the operation information based on the second operation information indicating the operation of the operating device.
  • Such a control method may be realized by a circuit such as a CPU and an LSI, an IC card, a single module, or the like.
  • the technique of the present disclosure may be a program for executing the above control method, or may be a non-temporary computer-readable recording medium in which the above program is recorded. Needless to say, the above program can be distributed via a transmission medium such as the Internet.
  • the numbers such as the ordinal number and the quantity used above are all examples for concretely explaining the technology of the present disclosure, and the present disclosure is not limited to the illustrated numbers.
  • the connection relationship between the components is illustrated for the purpose of specifically explaining the technique of the present disclosure, and the connection relationship for realizing the function of the present disclosure is not limited thereto.
• the division of blocks in the functional block diagram is an example; a plurality of blocks may be realized as one block, one block may be divided into a plurality of blocks, and/or some functions may be transferred to another block.
  • a single piece of hardware or software may process the functions of a plurality of blocks having similar functions in parallel or in a time division manner.

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Manipulator (AREA)

Abstract

A control device (30) of a mechanical device (10) comprises: an operation control unit (32) that controls the operation of the mechanical device according to operation information output from an operating device (20) for operating the mechanical device; a calculation unit (36) including a machine learning model (36a) that uses first operation information indicating the operation of the mechanical device as input data and uses an operation command for the operation of the mechanical device corresponding to the first operation information as output data; and auxiliary units (38, 39) that, based on a difference between the operation of the mechanical device controlled by the operation control unit and the operation of the mechanical device corresponding to the command output by the calculation unit, output an auxiliary command for assisting the operation in the operating device. The operating device outputs the operation information based on second operation information indicating the operation of the operating device.
PCT/JP2020/021236 2019-05-28 2020-05-28 Dispositif de commande, système de commande, système de dispositif machine, et procédé de commande WO2020241797A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019099549A JP7333197B2 (ja) 2019-05-28 2019-05-28 制御システム、機械装置システム及び制御方法
JP2019-099549 2019-05-28

Publications (1)

Publication Number Publication Date
WO2020241797A1 true WO2020241797A1 (fr) 2020-12-03

Family

ID=73547743

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/021236 WO2020241797A1 (fr) 2019-05-28 2020-05-28 Dispositif de commande, système de commande, système de dispositif machine, et procédé de commande

Country Status (2)

Country Link
JP (1) JP7333197B2 (fr)
WO (1) WO2020241797A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2023038776A (ja) * 2021-09-07 2023-03-17 オムロン株式会社 指令値生成装置、方法、及びプログラム

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2014050950A (ja) * 2012-09-06 2014-03-20 Fanuc Robotics America Inc 触覚教示ペンダント
WO2017033356A1 (fr) * 2015-08-25 2017-03-02 川崎重工業株式会社 Système robotique
WO2017220605A1 (fr) * 2016-06-23 2017-12-28 Kuka Roboter Gmbh Ensemble appareil de commande de robot portatif muni d'un capteur de position de commande de base
WO2018049447A1 (fr) * 2016-09-14 2018-03-22 Keba Ag Dispositif de commande et procédé de commande pour machines industrielles à entraînement de déplacement commandé
JP2018062016A (ja) * 2016-10-11 2018-04-19 ファナック株式会社 人の行動を学習してロボットを制御する制御装置、ロボットシステムおよび生産システム
WO2019044766A1 (fr) * 2017-08-31 2019-03-07 川崎重工業株式会社 Système de robot et son procédé de fonctionnement

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0631982U (ja) * 1992-09-28 1994-04-26 株式会社明電舎 ロボットの制御装置
JP2002192489A (ja) * 2000-12-26 2002-07-10 Idec Izumi Corp 手持ち操作装置及びリモートコントロール機械
DE102012010721A1 (de) * 2012-05-30 2013-12-05 Robert Bosch Gmbh Maschinensteuervorrichtung und -steuerverfahren zur Steuern einer Bewegung einer Maschine sowie Bedienteilsteuereinrichtung für tragbares Bedienteil
US9811066B1 (en) * 2015-12-14 2017-11-07 X Development Llc Throttle functionality of haptic controller
US10440240B2 (en) * 2016-09-30 2019-10-08 Sony Interactive Entertainment Inc. Systems and methods for reducing an effect of occlusion of a tracker by people

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2014050950A (ja) * 2012-09-06 2014-03-20 Fanuc Robotics America Inc 触覚教示ペンダント
WO2017033356A1 (fr) * 2015-08-25 2017-03-02 川崎重工業株式会社 Système robotique
WO2017033364A1 (fr) * 2015-08-25 2017-03-02 川崎重工業株式会社 Système de robot
WO2017220605A1 (fr) * 2016-06-23 2017-12-28 Kuka Roboter Gmbh Ensemble appareil de commande de robot portatif muni d'un capteur de position de commande de base
WO2018049447A1 (fr) * 2016-09-14 2018-03-22 Keba Ag Dispositif de commande et procédé de commande pour machines industrielles à entraînement de déplacement commandé
JP2018062016A (ja) * 2016-10-11 2018-04-19 ファナック株式会社 人の行動を学習してロボットを制御する制御装置、ロボットシステムおよび生産システム
WO2019044766A1 (fr) * 2017-08-31 2019-03-07 川崎重工業株式会社 Système de robot et son procédé de fonctionnement

Also Published As

Publication number Publication date
JP7333197B2 (ja) 2023-08-24
JP2020192641A (ja) 2020-12-03

Similar Documents

Publication Publication Date Title
WO2020241796A1 (fr) Dispositif de commande, système de commande, système de dispositif machine et procédé de commande
WO2021024586A1 (fr) Dispositif de commande, système de commande, système de robot et procédé de commande
CN111432990B (zh) 技能传承机械装置
JP6826532B2 (ja) 遠隔操作ロボットシステム
JP6956081B2 (ja) ロボットシステム及びロボットシステムをバックドライブする方法
JP6940879B2 (ja) ロボット制御システム、機械制御システム、ロボット制御方法、機械制御方法、およびコンピュータプログラム
TWI805545B (zh) 用於藉由示範來程式化機器人之方法和電腦程式產品
US20170249561A1 (en) Robot learning via human-demonstration of tasks with force and position objectives
CN103128729B (zh) 机器人装置和控制该机器人装置的方法
US20220388160A1 (en) Control device, control system, robot system, and control method
WO2010136961A1 (fr) Dispositif et procédé de commande d'un robot
JP6811465B2 (ja) 学習装置、学習方法、学習プログラム、自動制御装置、自動制御方法および自動制御プログラム
CN110709211B (zh) 机器人系统和机器人系统的控制方法
CN113412178A (zh) 机器人控制装置、机器人系统以及机器人控制方法
WO2020241797A1 (fr) Dispositif de commande, système de commande, système de dispositif machine, et procédé de commande
Jadeja et al. Design and development of 5-DOF robotic arm manipulators
JP3749883B2 (ja) 遠隔操作方法及び装置
CN114585321B (zh) 手术系统以及控制方法
JP2020082313A (ja) ロボット制御装置、学習装置、及びロボット制御システム
Crammond et al. Commanding an anthropomorphic robotic hand with motion capture data
WO2022209924A1 (fr) Dispositif de commande d'opération à distance de robot, système de commande d'opération à distance de robot, procédé de commande d'opération à distance de robot, et programme
JP7247571B2 (ja) 制御装置、及び学習装置
JP2022155623A (ja) ロボット遠隔操作制御装置、ロボット遠隔操作制御システム、ロボット遠隔操作制御方法、およびプログラム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20813138

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20813138

Country of ref document: EP

Kind code of ref document: A1