WO2020202427A1 - Control unit - Google Patents

Control unit

Info

Publication number
WO2020202427A1 (PCT/JP2019/014442)
Authority
WO
WIPO (PCT)
Prior art keywords
unit
control unit
control
cooperative
external environment
Prior art date
Application number
PCT/JP2019/014442
Other languages
English (en)
Japanese (ja)
Inventor
啓二 西村
村松 啓且
Original Assignee
ヤマハ発動機株式会社
Priority date
Filing date
Publication date
Application filed by ヤマハ発動機株式会社
Priority to PCT/JP2019/014442 (WO2020202427A1)
Priority to PCT/JP2020/014546 (WO2020203968A1)
Publication of WO2020202427A1

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 5/00: Manipulators mounted on wheels or on carriages

Description

  • the present invention relates to a control unit used in an automatic operation device (Autonomous Operation Machine).
  • Patent Document 1 discloses an automatic guided vehicle with a robot arm.
  • the transport vehicle is a self-propelled vehicle.
  • the automatic guided vehicle includes an AGV (Automatic Guided Vehicle), an AGV control unit, a robot arm, and a robot controller.
  • the AGV control unit controls the movement of the AGV.
  • the robot controller controls the robot arm.
  • the robot controller notifies the AGV control unit of a signal indicating that the robot arm is connected to the power source and is ready to work. This restricts the movement of the AGV.
  • the detachment control unit of the robot controller notifies the AGV control unit of a signal indicating that the work is completed. As a result, the movement of the AGV starts. In this way, the robot controller controls the movement of the AGV.
  • a control unit used in the automatic operation device is required to increase the degree of freedom of combination with the control devices of other work machines and to improve its versatility.
  • An object of the present invention is to provide a control unit capable of increasing the degree of freedom of combination with other devices and improving versatility.
  • the present inventor conducted a study in view of the above-mentioned problems and obtained the following findings.
  • there may be a demand for the above-mentioned automatic operation device to perform work other than the work performed by the robot arm.
  • the types of work required of a work machine include, for example, relatively simple work in which the machine operates simply according to the position of the autonomous vehicle.
  • among work machines that perform such work, there are, for example, work machines configured not to output a traveling signal to the autonomous driving vehicle. In this case, it is not easy to link the work machine with the autonomous driving vehicle. As a result, the types of work machines that can be combined with autonomous vehicles are limited.
  • if the automatic operation device operates based on commands and the work machine is a type of device that does not output commands, it is conceivable to configure the automatic operation device so that it outputs commands to the work machine. If this can be achieved, the degree of freedom of combination can be increased and the versatility can be improved.
  • the control unit according to the present invention has the following configuration.
  • (1) the control unit includes: an external environment information connector for inputting external environment data indicating a detection result from an external environment sensing unit; an operation control connector for outputting an operation control signal for controlling the operation of an actuator; and an external communication connection unit configured to be communicably connected, by wire or wirelessly, to a cooperative control unit that controls a cooperative operation device providing a physical or non-physical output.
  • the cooperative control unit is either a master unit or a slave unit. The master unit transmits to the control unit an operation command for controlling the automatic operation device, and the slave unit receives from the control unit an operation command for controlling the cooperative operation device.
  • the control unit further includes an operation control device. When the master unit is connected to the external communication connection unit, the operation control device generates the operation control signal at least based on the operation command received from the master unit and transmits the operation control signal to the actuator via the operation control connector.
  • when the slave unit is connected to the external communication connection unit, the operation control device generates the operation control signal based on the processing result of the external environment data, generates an operation command for controlling the cooperative operation device, and transmits that operation command to the slave unit via the external communication connection unit.
  • the operation control device controls the actuator of the automatic operation device at least based on the operation command from the master unit.
  • the operation control device generates an operation control signal at least based on the operation command, and outputs the operation control signal to the actuator.
  • the operation control device may generate the operation control signal based only on the operation command, or based on the operation command and the external environment data. In this way, the actuator of the automatic operation device is controlled based on the operation command of the master unit, so the operation of the automatic operation device and the operation of the cooperative operation device can be precisely linked.
  • when the slave unit is connected, the operation control device transmits an operation command generated based on the processing result of the external environment data to the cooperative control unit. Therefore, the cooperative operation device is controlled based on the external environment of the automatic operation device, and the operation of the automatic operation device and the operation of the cooperative operation device can be precisely linked. In this way, regardless of whether the cooperative control unit connected to the control unit used in the automatic operation device is a master unit or a slave unit, the operation of the automatic operation device and the operation of the cooperative operation device can be precisely linked. Therefore, the versatility of the control unit for controlling the automatic operation device can be improved.
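  • As a rough illustration of this master/slave switching (the class and method names below are hypothetical and not taken from the patent), the behaviour of the operation control device can be sketched as follows:

```python
from enum import Enum, auto

class UnitType(Enum):
    MASTER = auto()   # the cooperative control unit sends operation commands to us
    SLAVE = auto()    # the cooperative control unit receives operation commands from us

class OperationControlDevice:
    """Minimal sketch of the operation control device; names are illustrative only."""

    def __init__(self, actuator, comm_link, peer_type: UnitType):
        self.actuator = actuator    # reached through the operation control connector
        self.comm_link = comm_link  # the external communication connection unit
        self.peer_type = peer_type  # master or slave, determined at connection time

    def step(self, external_env_data):
        env_result = self.process_environment(external_env_data)
        if self.peer_type is UnitType.MASTER:
            # Generate the operation control signal at least from the received command.
            command = self.comm_link.receive_command()
            signal = self.make_control_signal(command, env_result)
        else:
            # Generate both the control signal and the command to the slave
            # from the processing result of the external environment data.
            signal = self.make_control_signal(None, env_result)
            self.comm_link.send_command({"target": env_result})
        self.actuator.apply(signal)

    def process_environment(self, data):
        raise NotImplementedError  # recognition of the external environment

    def make_control_signal(self, command, env_result):
        raise NotImplementedError  # generation of the operation control signal
```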
  • the control unit can adopt the following configuration.
  • in the control unit of (1), the external environment sensing unit is a camera.
  • the external environment information connector inputs image data output from the camera as the external environment data.
  • in this control unit, by using the image data from the camera, it is possible to operate while recognizing a complicated external environment.
  • the control unit can cause the cooperative control unit to operate with high accuracy based on the image data from the camera.
  • the control unit can operate the actuator with high accuracy based on the operation command from the cooperative control unit and the image data from the camera. Therefore, while improving the versatility of the control unit, it is possible to make the connected cooperative control unit operate with high accuracy.
  • the control unit can adopt the following configuration.
  • in the control unit of (1), the actuator mounted on the automatic operation device is a traveling device for causing the automatic operation device to travel.
  • when the master unit is connected to the external communication connection unit, the operation control device generates an operation control signal for instructing the travel route of the automatic operation device based on the operation command received from the master unit and the processing result of the external environment data. When the slave unit is connected to the external communication connection unit, the operation control device generates an operation control signal for instructing the travel route of the automatic operation device based on the processing result of the external environment data, generates an operation command for controlling the cooperative operation device, and transmits the operation command to the slave unit via the external communication connection unit.
  • the automatic operation device functions as an automatic traveling vehicle.
  • when the cooperative control unit is a slave unit, the autonomous vehicle can make the cooperative control unit operate with high accuracy while traveling based on external environment data such as image data from a camera.
  • when the cooperative control unit is the master unit, the autonomous vehicle can travel with high accuracy based on the operation command from the cooperative control unit and external environment data such as image data from the camera. Therefore, it is possible to make the cooperative control unit operate with high accuracy while improving the versatility of the autonomous driving vehicle.
  • the control unit can adopt the following configuration.
  • when the operation control device detects a connection with the cooperative control unit via the external communication connection unit, it receives identification information for identifying the cooperative control unit via the external communication connection unit and, based on the received identification information, determines whether the cooperative control unit is a master unit or a slave unit.
  • the operation control device can thus control the operation of the actuator according to the type of the cooperative control unit immediately after being connected to it. Therefore, it is possible to improve the operational stability of the operation control device while improving the versatility of the control unit.
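  • As a minimal sketch of this connection-time discrimination (the table contents and function names are illustrative assumptions, not taken from the patent):

```python
# Hypothetical identification table: identification information -> unit type.
IDENTIFICATION_TABLE = {
    "PICKER_CTRL_A": "master",   # e.g. a work device whose controller issues travel commands
    "SPRAYER_CTRL_B": "slave",   # e.g. a device that only accepts operation commands
}

def determine_unit_type(comm_link) -> str:
    """Read the peer's identification information and classify it as master or slave."""
    ident = comm_link.read_identification()      # received via the external communication connection unit
    # Collated against the table stored in advance; treating unknown units as
    # slaves is an illustrative default, not something the patent specifies here.
    return IDENTIFICATION_TABLE.get(ident, "slave")
```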
  • the control unit can adopt the following configuration.
  • in the control unit of (1), the operation control device determines that the cooperative control unit is the master unit at the time when an operation command is input from the cooperative control unit connected via the external communication connection unit, and determines that the cooperative control unit is the slave unit until the operation command is input.
  • in this configuration, the operation control device changes how it controls the actuator in response to the input of an operation command from the cooperative control unit. Therefore, there is a high degree of flexibility in dealing with cooperative control units whose roles change, and the versatility of the control unit is further improved.
  • the control unit can adopt the following configuration.
  • in the control unit of (1), the external communication connection unit includes a plurality of connectors corresponding to a plurality of different types of transmission formats.
  • in this configuration, the cooperative control unit to be connected can be selected from candidates among a plurality of units having various functions. Therefore, the versatility of the control unit can be further improved.
  • the terminology used herein is for the purpose of defining only specific embodiments and is not intended to limit the invention.
  • the term “and / or” includes any or all combinations of one or more related enumerated components.
  • the use of the terms "including," "comprising," or "having" and variations thereof specifies the presence of the described features, steps, operations, elements, and/or components, but does not preclude the presence or addition of one or more other features, steps, operations, elements, components, and/or groups thereof.
  • the terms "attached," "connected," "coupled," and/or their equivalents are used broadly and include both direct and indirect attachment, connection, and coupling.
  • "connected" and "coupled" are not limited to physical or mechanical connections or couplings, but can include direct or indirect electrical connections or couplings.
  • unless expressly defined herein, all terms used herein, including technical and scientific terms, have the same meaning as commonly understood by those skilled in the art to which the present invention belongs. Terms such as those defined in commonly used dictionaries should be construed to have a meaning consistent with the relevant technology and with the context of the present disclosure, and are not to be interpreted in an ideal or overly formal sense. It should also be understood that a plurality of techniques and processes are disclosed in the description of the present invention. Each of these has its own benefit, and each can be used in conjunction with one or more, or in some cases all, of the other disclosed techniques.
  • the detection of the external environment is to detect the external environment of the automatically operating device.
  • the external environment is an external condition that determines the operation of an automatically operating device.
  • the external environment is the environment around the automatically operating device.
  • the external environment data as a result of being detected by the external environment sensing unit is, for example, an image taken of the outside of the automatically operating device.
  • the external environment sensing unit in this case is a camera.
  • the external environment data and the external environment sensing unit are not particularly limited.
  • the external environment data is, for example, the distance to an external object of the automatic operation device.
  • the external environment sensing unit in this case is, for example, a distance sensor.
  • the distance sensor is, for example, a sonar (SONAR: sound navigation and ranging) device that uses ultrasonic waves.
  • the distance sensor is, for example, a distance measuring device using a laser.
  • the external environment data is, for example, a laser scan image of an object outside the automatic operation device.
  • the external environment sensing unit in this case is, for example, a LIDAR (Laser Imaging Detection and Ranging) device.
  • the external environment data is, for example, position information within the work area of the automatic operation device.
  • the external environment sensing unit in this case is, for example, GNSS (Global Navigation Satellite System).
  • the autonomous driving device is, for example, an autonomous driving vehicle.
  • the automatic operation device is not particularly limited, and may be, for example, an automatic work robot.
  • the cooperative operation device that cooperates with the automatic operation device is, for example, an automatic work robot mounted on an automatic driving vehicle.
  • the cooperative operation device is not limited to this, and for example, an autonomous driving vehicle equipped with an automatic work robot may be used.
  • the actuator is, for example, a motor.
  • the actuator causes the automatic operation device to perform physical output.
  • the actuator provided in the cooperative operation device causes the cooperative operation device to perform physical output.
  • the actuator may be an electromagnetic solenoid.
  • the actuator is controlled by the motion control device.
  • the actuator may be directly controlled by the motion control device.
  • the actuator may also be indirectly controlled by the motion control device via a control means different from the motion control device. In this case, the motion control device controls the control means, and the control means controls the actuator.
  • the control unit is, for example, a navigation device for controlling the automatic driving of an autonomous driving vehicle.
  • the control unit is not particularly limited, and may be configured to control, for example, a robot that does not travel.
  • the control unit used in the automatic operation device is mounted on the automatic operation device, for example.
  • the control unit is not particularly limited, and may be installed at a position away from the automatically operating device and may be communicably connected to the peripheral device via each connector.
  • the external communication connection part is, for example, a connector to which an electric wire is connected.
  • the external communication connection unit may instead be a wireless communication device connected by wireless communication. That is, the communicable connection is, for example, a state of being directly and electrically connected via a connector and an electric wire.
  • the communicable connection is not particularly limited, and may be, for example, a state in which communication is possible via a wireless communication device arranged at physically separated positions. Further, the communicable connection may be in a state in which communication is possible via a central communication relay device capable of communicating with a large number of devices.
  • the cooperative operation device is, for example, a device connected to an automatic operation device.
  • the cooperative operation device is, for example, an automatic work robot connected to an automatic driving vehicle as an automatic operation device.
  • the cooperative operation device is not particularly limited, and may be, for example, a second automatic driving vehicle that travels at a position away from the automatic driving vehicle as the automatic driving device.
  • the cooperative control unit is, for example, a navigation device for controlling the automatic driving of the autonomous driving vehicle as a cooperative operation device.
  • a plurality of autonomous driving vehicles can travel in cooperation with each other.
  • the physical output is an action that involves the physical movement of at least some of the members.
  • the physical output of the cooperative operation device is, for example, the movement of the entire cooperative operation device. However, the physical output is not particularly limited, and for example, only a part of the cooperative operation device may be physically moved.
  • the physical output is, for example, the operation of a robot arm when the cooperative operation device includes the robot arm.
  • the physical output is, for example, the operation of a valve when the cooperative operation device includes a valve that opens and closes a flow path.
  • the physical output is, for example, the rotation of a fan when the cooperative operation device includes a blower fan.
  • Non-physical output is an action that does not involve physical movement.
  • the non-physical output is, for example, the output of information.
  • the non-physical output is, for example, the output of captured image data when the cooperative operating device is equipped with a camera.
  • the non-physical output is, for example, a display when the cooperative operating device is equipped with an image display device.
  • the specific method for determining whether either the master unit or the slave unit is connected as the cooperative control unit is not particularly limited.
  • the operation control device can, for example, read the identification information of the cooperative control unit and collate it against an identification table stored in advance to discriminate between the master unit and the slave unit.
  • alternatively, without determining the type at the time of connection, the operation control device may determine that the cooperative control unit is the master unit when an operation command is received from it. If the cooperative control unit can switch its own mode between master unit and slave unit, the operation control device may detect each mode switch of the cooperative control unit and switch the operation mode of the operation control device according to the detection result.
  • the plurality of transmission formats supported by the plurality of connectors are, for example, CAN (Controller Area Network) bus and Ethernet (registered trademark).
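  • As a small sketch of how a control unit could hide the physically different connectors behind one interface (the classes below are an assumption for illustration; the patent does not describe software structure for this):

```python
from abc import ABC, abstractmethod

class TransmissionLink(ABC):
    """Common view of the external communication connection unit, whichever connector is used."""
    @abstractmethod
    def send(self, data: bytes) -> None: ...
    @abstractmethod
    def receive(self) -> bytes: ...

class CanBusLink(TransmissionLink):
    """Placeholder for the connector carrying CAN bus frames."""
    def send(self, data: bytes) -> None: ...
    def receive(self) -> bytes: ...

class EthernetLink(TransmissionLink):
    """Placeholder for the connector carrying Ethernet traffic."""
    def send(self, data: bytes) -> None: ...
    def receive(self) -> bytes: ...

def open_link(connector: str) -> TransmissionLink:
    """Pick the link matching the connector the cooperative control unit is plugged into."""
    return {"can": CanBusLink, "ethernet": EthernetLink}[connector]()
```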
  • According to the present invention, it is possible to provide a control unit capable of increasing the degree of freedom of combination with other devices and improving versatility.
  • FIG. 1 is a block diagram showing a configuration of an automatic operation system including a control unit according to the first embodiment of the present invention.
  • the automatic operation system S is a system capable of operating automatically regardless of the operation of an operator H, that is, a person.
  • the automatic operation system S detects the external environment of the automatic operation system S by itself. Then, the automatic operation system S recognizes the content of the detection result, and controls the operation of the automatic operation system S based on the recognition result.
  • the automatic operation system S also has a function of operating according to the operation of the operator H.
  • the automatic operation system S operates in response to the operation of the remote control device 3.
  • the remote control device 3 is a device for remotely controlling the automatic operation system S.
  • the remote control device 3 can communicate with the automatic operation system S by wireless communication.
  • the remote control device 3 transmits operation information to the automatic operation system S.
  • the remote control device 3 receives an image from the automatic operation system S and displays the image.
  • the automatic operation system S can start or stop the automatic operation in response to the operation of starting or stopping the automatic operation with respect to the remote control device 3.
  • the automatic operation system S can select one pattern from a plurality of automatic operation patterns according to the operation of the selection operation for the remote control device 3.
  • the automatic operation system S can perform sequential operations in response to sequential operations on the remote control device 3. Sequential operations are operations typified by, for example, forward, backward, and stop.
  • the automatic operation system S includes an automatic operation device 1 and a cooperative operation device 2.
  • the automatic operation device 1 shown in FIG. 1 is a system capable of performing automatic operation.
  • the automatic operation device 1 detects the external environment. Then, the automatic operation device 1 recognizes the detection result and controls the operation based on the recognition result.
  • the automatic operation device 1 also has a function of operating according to the operation of the operator H. The operation according to the operation is the same as the operation for the automatic operation system S described above.
  • the cooperative operation device 2 is also a system capable of performing automatic operation.
  • the cooperative operation device 2 detects the external environment. Then, the cooperative operation device 2 recognizes the detection result and controls the operation based on the recognition result.
  • the cooperative operation device 2 also has a function of operating according to the operation of the operator H.
  • as the cooperative operation device 2, it is also possible to adopt a configuration that does not detect the external environment by itself, but instead operates by receiving the detection result of the external environment obtained by the automatic operation device 1 or an instruction from the automatic operation device 1.
  • in the following, a configuration in which the cooperative operation device 2 detects the external environment by itself will be described.
  • the cooperative operation device 2 shown in FIG. 1 operates in cooperation with the automatic operation device 1. That is, the automatic operation device 1 and the cooperative operation device 2 cooperate with each other to complete the work expected by the operator H.
  • the cooperative operation device 2 can cooperate by performing the same type of operation as the automatic operation device 1, for example. As an example of this, there is a case where the automatic operation device 1 and the cooperative operation device 2 are a pair of robot arms. Further, for example, the cooperative operation device 2 and the automatic operation device 1 can complete the desired operation by performing different types of operations from each other. That is, the cooperative operation device 2 and the automatic operation device 1 can cooperate with each other. As an example of this, there is a case where the automatic operation device 1 is an automatic traveling vehicle and the cooperative operation device 2 is a robot arm mounted on the automatic traveling vehicle.
  • the automatic operation device 1 and the cooperative operation device 2 shown in FIG. 1 are mechanically connected. However, the automatic operation device 1 and the cooperative operation device 2 may be separated from each other. As an example of this, there is a case where the automatic operation device 1 and the cooperative operation device 2 are two vehicles that travel in a common area or share a plurality of adjacent areas.
  • the linked operation device 2 shown in FIG. 1 is communicably connected to the automatic operation device 1.
  • the automatic operation device 1 and the cooperative operation device 2 shown in FIG. 1 are electrically connected to each other by electric wires.
  • however, the automatic operation device 1 and the cooperative operation device 2 may instead communicate wirelessly, for example.
  • the automatic operation device 1 includes a control unit 10, an external environment sensing unit 11, an operation unit 12, a remote communication device 13, and a power supply unit 14.
  • the external environment sensing unit 11 detects the external environment of the automatic operation device 1.
  • the external environment sensing unit 11 outputs external environment data indicating the detection result.
  • the external environment sensing unit 11 is, for example, a camera that photographs the external environment of the automatically operating device 1.
  • the camera as the external environment sensing unit 11 outputs external image data indicating the shooting result.
  • the control unit 10 can operate the actuator 121 while recognizing the complicated external environment.
  • the operation unit 12 is controlled based on the external environment.
  • the operation unit 12 includes an actuator 121.
  • the actuator 121 is mechanically operated by electric control to drive a device mounted on the automatic operation device 1 or the automatic operation device 1 itself.
  • the control unit 10 is connected to the external environment sensing unit 11, the operation unit 12, and the remote communication device 13.
  • the control unit 10 controls the actuator 121 of the operation unit 12 based on the external environment detected by the external environment sensing unit 11. More specifically, the control unit 10 recognizes the contents of the external environment by processing the external environment data output from the external environment sensing unit 11. The control unit 10 determines the control content based on the recognized content. Then, the control unit 10 controls the actuator 121 based on the determined control content.
  • the control unit 10 may receive a control command corresponding to the operation of the remote control device 3 from the remote communication device 13 and control the actuator 121 based on the control command.
  • the internal configuration of the control unit 10 will be described later.
  • the power supply unit 14 supplies electric power to the control unit 10, the external environment sensing unit 11, the operating unit 12, and the remote communication device 13.
  • the power supply unit 14 has a battery (not shown).
  • the power supply unit 14 supplies the electric power stored in the battery to each unit.
  • the power supply unit 14 supplies electric power based on the control of the control unit 10.
  • the power source used by the power supply unit 14 is not limited to the battery, and various power sources can be used.
  • for example, an engine generator can be used; it includes an engine that operates on liquid fuel and a generator that is driven by the engine to generate electricity.
  • the power supply unit 14 shown in FIG. 1 also supplies electric power to the cooperative operation device 2.
  • the linked operation device 2 may be provided with a power supply independent of, for example, the automatic operation device 1.
  • the remote communication device 13 is communicably connected to the remote control device 3.
  • the remote communication device 13 is communicably connected to the remote control device 3 by wireless communication.
  • the remote communication device 13 relays communication data between the remote control device 3 and the control unit 10.
  • the remote communication device 13 outputs a control command output from the remote control device 3 to the control unit 10 in response to the operation of the remote control device 3.
  • the control command from the remote control device 3 is supplied to the control unit 10.
  • the remote communication device 13 supplies data based on the data output from the external environment sensing unit 11 to the remote control device 3.
  • the remote communication device 13 transmits data based on the external image data output from the camera to the remote control device 3.
  • the remote control device 3 transmits an image command to the remote communication device 13.
  • the image command is a command for designating the content of the image and the amount of image data transmitted to the remote control device 3.
  • the remote communication device 13 sends an image command to the control unit 10.
  • the data exchange between the remote communication device 13 and the control unit 10 described above is the same as the data exchange between the remote communication device 13 and the cooperative control unit 20.
  • the cooperative operation device 2 includes a cooperative control unit 20, a cooperative sensing unit 21, and a cooperative operation unit 22.
  • the roles of the cooperative control unit 20, the cooperative sensing unit 21, and the cooperative operation unit 22 in the cooperative operation device 2 are the same as the roles of the control unit 10, the external environment sensing unit 11, and the operation unit 12 in the automatic operation device 1 described above. However, the type of environment detected by the cooperative sensing unit 21, the details of the determinations made by the cooperative control unit 20, and the output of the cooperative operation unit 22 differ depending on the function of the cooperative operation device 2.
  • the cooperation control unit 20 controls the cooperation operation unit 22 that performs physical output.
  • the physical output of the cooperative operation unit 22 is, for example, the operation of the actuator 221.
  • the cooperative control unit 20 corresponds to an example of the control unit referred to in the present invention, like the control unit 10.
  • as an example of the function of the automatic operation device 1 shown in FIG. 1, an automatic traveling vehicle can be mentioned.
  • the control unit 10 is, for example, an automatic navigation unit.
  • the actuator 121 of the operation unit 12 is a traveling device for traveling the automatic operation device 1.
  • the traveling device is, for example, a motor.
  • the control unit 10 recognizes the content of the image of the traveling region taken by the camera as the external environment sensing unit 11, determines the traveling route based on the recognition result, and instructs the operation unit 12 of the determined traveling route. That is, the control unit 10 controls the operation unit 12 based on the determined travel path.
  • the automatically operating device 1 automatically travels.
  • An example of the function of the cooperative operation device 2 shown in FIG. 1 is a work device mounted on an autonomous vehicle.
  • An example of working equipment is a fruit-picking device that picks fruits on a farm.
  • the cooperative control unit 20 recognizes the content of images of fruit trees photographed by the camera serving as the cooperative sensing unit 21, determines the state and position of the fruit based on the recognition result, and transmits a travel command (travel position command) to the automatic operation device 1 based on the determined position of the fruit.
  • the travel command is an example of an operation command.
  • the actuator 221 of the fruit-picking device as the cooperative operation unit 22 is controlled based on the determined position of the fruit.
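  • A hedged sketch of this behaviour (all names and the threshold value are illustrative assumptions; the patent does not describe an implementation) might look like the following, with the cooperative control unit 20 acting as a master unit:

```python
REACH_THRESHOLD_M = 0.5  # illustrative reach of the picking arm, in metres

def fruit_picking_step(camera, picker_actuator, control_unit_link):
    """One control cycle of a hypothetical fruit-picking cooperative control unit."""
    image = camera.capture()           # cooperative sensing unit 21
    fruits = recognize_fruits(image)   # state and position of each ripe fruit
    if not fruits:
        return
    target = min(fruits, key=lambda f: f.distance)
    # Travel command (an operation command) sent to the control unit 10 of the vehicle.
    control_unit_link.send_command({"type": "travel_position",
                                    "position": target.approach_point})
    # When the fruit is within reach, drive the picking actuator 221 toward it.
    if target.distance < REACH_THRESHOLD_M:
        picker_actuator.move_to(target.position)

def recognize_fruits(image):
    raise NotImplementedError  # image recognition is outside the scope of this sketch
```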
  • the cooperative operation device 2 mounted on the automatic operation device 1 can be replaced with a cooperative operation device having a function different from that of the cooperative operation device 2 shown in FIG.
  • FIG. 2 is a block diagram showing the configuration of the control unit 10 shown in FIG.
  • the control unit 10 is a unit covered with one housing, and is incorporated in the automatic operation device 1 (see FIG. 1).
  • the control unit 10 is electrically connected to each part of the automatic operation device 1.
  • the control unit 10 is a commercially available unit.
  • the control unit 10 is of a mass-produced type.
  • the control unit 10 includes an external environment information connector 110, an operation control connector 130, an external communication connection unit 140, and an operation control device 160. Further, the control unit 10 includes a remote data connector 150.
  • the external environment information connector 110 is electrically connected to the external environment sensing unit 11 shown in FIG. External environment data indicating the detection result is input to the control unit 10 from the external environment sensing unit 11 via the external environment information connector 110.
  • the external environment sensing unit 11 is, for example, a camera
  • the external environment information connector 110 functions as an external image connector.
  • the external environment information connector 110 will also be referred to as an external image connector 110.
  • the operation control connector 130 is electrically connected to the operation unit 12 shown in FIG.
  • An operation control signal for controlling the operation of the actuator 121 is output from the control unit 10 to the operation unit 12 via the operation control connector 130.
  • the external communication connection unit 140 is connected to the cooperation control unit 20 shown in FIG.
  • the external communication connection unit 140 in the example shown in FIG. 1 is an external communication connector that is electrically connected to the cooperation control unit 20.
  • the external communication connection unit 140 is also referred to as an external communication connector 140.
  • the external communication connector 140 physically includes a plurality of connectors corresponding to a plurality of types of transmission formats.
  • the types of transmission formats are, for example, Controller Area Network (CAN) (registered trademark) and Ethernet (registered trademark).
  • the external communication connector 140 is an example of an external communication connection unit that is communicably connected to the cooperation control unit 20.
  • As the external communication connection unit for example, a configuration in which a wireless communication device is used instead of the external communication connector 140 can be adopted.
  • the remote data connector 150 is electrically connected to the remote communication device 13 shown in FIG.
  • a control command signal is input to the control unit 10 from the remote control device 3 (see FIG. 1) via the remote communication device 13 and the remote data connector 150. That is, in response to a request being input from the remote control device 3 to the remote communication device 13, the control command signal is output from the remote communication device 13 and input to the control unit 10 via the remote data connector 150. Further, a signal indicating the external environment and the state of the control unit 10 is output from the control unit 10 to the remote communication device 13 via the remote data connector 150. This signal is supplied from the remote communication device 13 to the remote control device 3.
  • the motion control device 160 controls the actuator 121 of the motion unit 12 based on the external environment detected by the external environment sensing unit 11 shown in FIG. More specifically, the motion control device 160 processes the external environment data output from the external environment sensing unit 11. The motion control device 160 controls the actuator 121 based on the processing result of the external environment data. Further, the motion control device 160 receives a control command corresponding to the operation of the remote control device 3 from the remote communication device 13, and controls the motion unit 12 based on the control command. Further, the operation control device 160 communicates with the cooperation control unit 20 connected via the external communication connector 140. As described above, various devices can be selected as the collaborative operation device 2 that can be combined with the automatic operation device 1.
  • the master unit is a cooperative control unit 20 configured to transmit an operation command to the operation control device 160.
  • the slave unit is a cooperative control unit 20 configured to receive an operation command from the operation control device 160. Either the master unit or the slave unit is connected to the external communication connector 140.
  • the operation control device 160 switches the operation mode according to the type of the cooperation control unit 20.
  • when the master unit is connected to the external communication connector 140, the operation control device 160 generates an operation control signal based on the operation command received from the master unit. The operation control device 160 outputs the generated operation control signal to the actuator 121 of the operation unit 12 via the operation control connector 130.
  • when the slave unit is connected to the external communication connector 140, the operation control device 160 generates an operation command for controlling the cooperative operation device 2 based on the processing result of the external environment data input from the external environment sensing unit 11, and transmits the generated operation command to the slave unit via the external communication connector 140.
  • the versatility of the control unit 10 can be improved.
  • the motion control device 160 includes an automatic control circuit 170 and a monitoring circuit 180.
  • the automatic control circuit 170 and the monitoring circuit 180 are provided in the housing of the control unit 10.
  • the automatic control circuit 170 carries out basic control processing in the motion control device 160. More specifically, the automatic control circuit 170 controls the actuator 121 based on the external environment signal from the external environment sensing unit 11. More specifically, the automatic control circuit 170 outputs an operation control signal based on an external environment signal by executing a software process. The automatic control circuit 170 also outputs a status index signal by executing a software process.
  • the automatic control circuit 170 includes a Graphics Processing Unit (GPU) 171.
  • the GPU 171 is a multi-core processor capable of parallel processing.
  • the GPU 171 includes 100 or more arithmetic cores that can operate in parallel.
  • the GPU 171 executes SIMD (single instruction, multiple data streams) operations on 100 or more arithmetic cores.
  • the automatic control circuit 170 includes a non-volatile memory 172, a RAM 173, a control input / output (control IO) 174, and a CPU 175.
  • the non-volatile memory 172 is, for example, a mask ROM, a flash memory, or an EEPROM.
  • the CPU 175 is a Central Processing Unit. The CPU 175 controls the entire automatic control circuit 170.
  • the GPU 171 and the CPU 175 share and execute the control of the automatic control circuit 170. More specifically, the CPU 175 causes the GPU 171 to perform some of the functions of the automatic control circuit 170. The functions executed by the GPU 171 will be described later.
  • the non-volatile memory 172 stores a program executed by the CPU 175 and the GPU 171.
  • the CPU 175 sequentially reads and executes the programs stored in the non-volatile memory 172. As a result, control by the automatic control circuit 170 is executed. Further, the program of the GPU 171 stored in the non-volatile memory 172 is read by the CPU 175 and supplied to the GPU 171.
  • the RAM 173 holds the result of the processing by the CPU 175 and the result of the processing by the GPU 171.
  • the CPU 175 and the GPU 171 read / write data to / from the RAM 173.
  • the RAM 173 stores data input to the CPU 175 and the GPU 171, data indicating the processing status, and data indicating the operation control signal output from the automatic control circuit 170 as a result of the processing.
  • the data input to the GPU 171 is, for example, data representing an external environment signal.
  • the data indicating the processing status is, for example, one of the parameters indicating the operating status of the automatic control circuit 170.
  • the control IO 174 relays signals input / output to the CPU 175 and the GPU 171.
  • the CPU 175 and the GPU 171 output a status index signal indicating the operating status of the control model via the control IO 174.
  • the status indicator signal is, for example, a pulse indicating the period and time during which the process of processing the control model is executed.
  • the status index signal is one of the parameters indicating the operating status of the automatic control circuit 170.
  • the stored contents of the RAM 173 can be read out to the FPGA 181 of the monitoring circuit 180 via the control IO 174.
  • the CPU 175 supplies the program stored in the non-volatile memory 172 to the GPU 171. Further, the CPU 175 outputs a command to execute the program to the GPU 171.
  • the automatic control circuit 170 is configured with the control model 171a constructed by machine learning.
  • the control model 171a is a model showing the relationship between the external environment detected by the external environment sensing unit 11 and the control of the operation unit 12 to be controlled.
  • the control model 171a is a machine learning model using a neural network.
  • a machine learning model can be obtained, for example, by constructing a model that shows the relationship between external image data and an object that can exist on a traveling path, and optimizing the model.
  • the control model 171a is the result obtained by optimizing the weighting parameters of the model with reference to, for example, actual external image data and data in which objects on the traveling path are associated with that image data.
  • the work of optimizing the control model 171a with reference to such data is performed not inside the automatic control circuit 170 or the control unit 10, but outside the automatic operation system S.
  • a program that builds the control model as a result of the optimization is stored in the non-volatile memory 172.
  • the method of optimizing the control model 171a is not limited to this.
  • the automatic control circuit 170 itself can optimize the control model 171a that it configures based on the external image data actually obtained.
  • since the GPU 171 can execute SIMD operations on 100 or more arithmetic cores, it can execute the processing of the control model 171a, which involves iterative operations on large matrices, at high speed.
  • the CPU 175 determines the operation of the control unit 10 based on the information of the object obtained as a result of applying the data of the external environment to the machine learning model.
  • the CPU 175 controls the operation unit 12 based on the result of the determination. More specifically, the CPU 175 outputs a command to the operation unit 12 via, for example, the control IO 174 and the communication IF 183 of the monitoring circuit 180. Further, the CPU 175 transmits a command to the cooperative control unit 20 based on the result of the determination.
  • the CPU 175 transmits data to the remote communication device 13.
  • the division of control between the CPU 175 and the GPU 171 and the input / output of the model executed by the GPU 171 are not limited to those described above.
  • the machine learning model may be a model that directly shows, for example, the relationship between the external image data and the optimum traveling path or the motion trajectory of the arm or the like.
  • the CPU 175 controls the operation unit 12 based on the travel path or operation locus output as the processing result of the GPU 171.
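  • As a rough sketch of this division of labour between the GPU-hosted control model and the CPU-side decision logic (a PyTorch-style model and the function names are assumptions purely for illustration; the patent does not specify a framework):

```python
import torch

def perceive(control_model: torch.nn.Module, image: torch.Tensor):
    """Run the machine-learned control model (171a) on the GPU (171)."""
    with torch.no_grad():
        return control_model(image.to("cuda"))  # e.g. objects detected on the travel path

def decide_and_act(detections, operation_unit):
    """CPU-side (175) step: derive a travel path and command the operation unit (12)."""
    path = plan_route(detections)  # planning logic; not specified in the patent
    operation_unit.send(path)      # becomes the operation control signal on connector 130

def plan_route(detections):
    raise NotImplementedError      # placeholder for the route-determination step
```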
  • the monitoring circuit 180 constitutes the control unit 10 integrally with the automatic control circuit 170.
  • the monitoring circuit 180 monitors the operation of the automatic control circuit 170 constituting the control model 171a.
  • the monitoring circuit 180 includes a field programmable gate array (FPGA) 181 and a non-volatile memory 182. Further, the monitoring circuit 180 includes a communication interface (communication IF) 183, a relay 184, and a memory 185 for a program.
  • the FPGA 181 has a reprogrammable logic circuit.
  • the non-volatile memory 182 stores the connection information of the logic circuit for monitoring constructed by the FPGA 181.
  • the FPGA 181 reads the connection information from the non-volatile memory 182 in the initialization process after the power is turned on or after the reset.
  • the FPGA 181 constructs a logic circuit based on connection information. After constructing the logic circuit, the FPGA 181 starts the processing by the logic circuit.
  • in the monitoring circuit 180, the monitoring conditions can be changed by changing the connection information, which is software stored in the non-volatile memory 182, and thereby the hardware constructed on the basis of that connection information.
  • the GPU 171 described above, the CPU 175, or the processor 181p described later will sequentially read the stored programs by accessing the non-volatile memory 172 or the memory 185 for the program during execution after initialization.
  • the FPGA 181 reads out the non-volatile memory 182 only once at the time of initialization to form the logic circuit of the monitoring circuit 180.
  • the FPGA 181 completes reading the non-volatile memory 182 before starting execution of the process. Therefore, the logic circuit for monitoring can operate with high reliability.
  • the non-volatile memory 182 containing the information of the logic circuit for monitoring may be physically divided into a plurality of parts.
  • the first device of the non-volatile memory 182 composed of a plurality of memory devices stores information on a higher-level logic circuit that monitors the monitoring target and a reference information loader.
  • in the second device, the numerical ranges of the monitoring targets and the logic of the signal combinations that are determined to be abnormal are stored as reference information.
  • the reference information loader starts execution, reads (loads) the reference information from the second device, and fills in the numerical ranges and the combination logic in the logic circuit. This completes the construction of the logic circuit in the FPGA 181. That is, the reading of the information is executed in a plurality of steps.
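  • A rough software analogue of this two-step construction (the structure and names are assumptions for illustration; in the patent this happens inside the FPGA, not in software like this):

```python
def build_monitoring_logic(first_device, second_device):
    """Two-step construction of the monitoring logic described for the FPGA 181."""
    # Step 1: the first memory device supplies the higher-level monitoring logic
    # together with the reference-information loader.
    logic = first_device.read_connection_info()
    # Step 2: the loader pulls the concrete reference information from the second device.
    reference = second_device.read_reference_info()
    logic.bind(ranges=reference["ranges"],
               abnormal_combinations=reference["combinations"])
    return logic
```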
  • a configuration in which the non-volatile memory 182 is used as three or more devices can also be adopted.
  • the FPGA 181 may have a fixed logic circuit other than the reprogrammable logic circuit.
  • the FPGA 181 has a processor 181p and a memory as logic circuits.
  • the processor 181p executes processing while sequentially reading the programs stored in the memory 185, for example. This allows for more advanced processing.
  • the memory 185 read by the processor 181p is non-volatile. However, unlike the non-volatile memory 182 for the FPGA 181, the memory 185 stores not the connection information but the programs that are sequentially read by the processor. By dividing the memory according to the application, the reliability of the logic circuit composed of the FPGA 181 is improved.
  • the communication IF183 is an interface for the FPGA 181 and the automatic control circuit 170 to communicate with the operation unit 12.
  • the communication IF 183 provides, for example, a physical interface for communicating with the operating unit 12.
  • the physical interface is, for example, CAN.
  • the automatic control circuit 170 outputs an operation control signal via the communication IF 183.
  • the relay 184 cuts off the power supply of the power supply unit 14 (see FIG. 1) to the operating unit 12. More specifically, the relay 184 transmits a supply signal for supplying electric power to the power supply unit 14 by energizing under the control of the FPGA 181. When the energization of the relay 184 is stopped by the control of the FPGA 181, the transmission of the supply signal is stopped. As a result, the power supply from the power supply unit 14 is cut off. By shutting off the power supply, the operation can be reliably stopped.
  • the supply signal from the relay 184 can pass through a relay (not shown) provided in each part of the automatic operation device 1 and the cooperative operation device 2 outside the monitoring circuit 180. As a result, the power supply is immediately cut off by some cutoff control. Therefore, the operation can be reliably stopped.
  • the logic circuit composed of FPGA181 of the monitoring circuit 180 detects an abnormality in the operation of the automatic control circuit 170 by rule-based logic.
  • the monitoring circuit 180 inspects at least one signal selected from the group consisting of the external environment signal, the operation control signal, and the status index signal of the automatic control circuit 170.
  • the monitoring circuit 180 monitors all of the external environment signal, the operation control signal, and the status index signal, for example.
  • the monitoring circuit 180 is configured to monitor whether or not the external environment signal, the operation control signal, and the status index signal satisfy the conditions set to be associated with each of them.
  • the automatic control circuit 170 stores the external environment data representing the external environment signal output from the external environment sensing unit 11 in the RAM 173.
  • the monitoring circuit 180 reads a part of the external environment data of the RAM 173 and monitors whether or not the external environment signal is within the corresponding normal condition range.
  • the monitoring circuit 180 also monitors the interval at which the external environment signal is input, the value of the data indicated by the external environment signal, and the amount of change in the data indicated by the external environment signal in the automatic control circuit 170.
  • the monitoring circuit 180 reads data indicating the processing status and data indicating the operation control signal as a result of the processing from the RAM 173. The monitoring circuit 180 monitors whether or not the read result is within the corresponding normal condition range. Further, the monitoring circuit 180 monitors whether or not the status index signal output from the control IO 174 of the automatic control circuit 170 is within the corresponding normal condition range. The monitoring circuit 180 monitors whether or not the operation control signal output from the automatic control circuit 170 via the communication IF 183 is within the corresponding normal condition range. Further, the monitoring circuit 180 monitors whether or not the cycle and time for executing the process for processing the control model are within the corresponding normal condition range. The monitoring circuit 180 also monitors the voltage of the power supply supplied to the FPGA 181. As a result, the monitoring circuit 180 can also monitor the abnormality of the power supply unit 14.
  • the monitoring circuit 180 monitors whether or not the parameter to be monitored is included in the range defined so as to be associated with the parameter.
  • the monitoring circuit 180 does not limit the output of the automatic control circuit 170. That is, when the parameter to be monitored is within the range defined by the rule, the automatic control circuit 170 is not restricted by the rule. Therefore, it is possible to control with a high degree of freedom by the model of the automatic control circuit 170 while maintaining reliability.
  • when the monitoring circuit 180 detects an abnormality of a parameter, the portion of the controlled object whose operation is stopped is changed according to the type of the abnormal parameter.
  • the non-volatile memory 182 records conditions defined so as to be associated with parameters.
  • the FPGA 181 of the monitoring circuit 180 implements a processing function for inspecting whether or not the parameters satisfy the conditions.
  • the FPGA 181 of the monitoring circuit 180 is configured to determine the result of monitoring by the logical product (AND), the logical sum (OR), or a combination thereof of signals representing the detected abnormalities. The logical product, logical sum, or combination thereof to be used is recorded in the non-volatile memory 182.
  • the monitoring circuit 180 is configured with a logic circuit to select the type of stop described above according to the degree of abnormality of the parameter. Further, the monitoring circuit 180 generates a stop signal for stopping the operation of the actuator 121 at a predetermined deceleration depending on the type of abnormal parameter. For example, if the anomalous parameter relates to a range of command speeds, the monitoring circuit 180 will generate a stop signal to stop at a predetermined deceleration. In this case, the impact caused by the stop of the actuator 121 is suppressed. When the actuator 121 is a traveling device, the influence of the sudden stop on the mounted equipment is suppressed.
  • the monitoring circuit 180 generates a stop signal for stopping the operation of the actuator 121 instead of the operation control signal output by the automatic control circuit 170, depending on the type of the abnormal parameter. For example, if the combination of commands is different from the combination of the assumed range, a stop signal is generated. In this case, the actuator 121 is stopped in a short time, and the shock to the mounted device is suppressed to some extent. Further, the monitoring circuit 180 generates a signal for operating the brake included in the operation unit 12 instead of the operation control signal output by the automatic control circuit 170, depending on the type of abnormal parameter. For example, when an abnormality is detected in the process itself in the automatic control circuit 170, a signal for operating the brake is generated. In this case, since the actuator 121 can be stopped in the shortest time, the influence of the abnormal state can be minimized.
  • when the monitoring circuit 180 detects an abnormality, it cuts off the power supply to the controlled object according to the type and number of the abnormal parameters.
  • the monitoring circuit 180 shuts off the power supply by operating the relay 184, for example, when an abnormality of a plurality of parameters is detected.
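  • A simplified, rule-based sketch of these checks and graded responses (the parameter names and numeric ranges are illustrative assumptions; in the patent the rules live in the FPGA's logic circuit, not in software like this):

```python
# Illustrative normal-condition ranges for the monitored parameters.
NORMAL_RANGES = {
    "env_signal_interval_ms": (0, 120),    # interval at which external environment data arrives
    "commanded_speed_mps":    (0.0, 2.0),  # speed carried by the operation control signal
    "model_cycle_ms":         (0, 50),     # period of control-model processing (status index signal)
}

def find_abnormal(parameters: dict) -> list:
    """Return the names of monitored parameters that fall outside their normal range."""
    return [name for name, (lo, hi) in NORMAL_RANGES.items()
            if name in parameters and not lo <= parameters[name] <= hi]

def respond(abnormal: list, outputs) -> None:
    """Graded response, loosely following the behaviour described for the monitoring circuit."""
    if not abnormal:
        return                            # normal: do not restrict the automatic control circuit
    if len(abnormal) > 1:
        outputs.open_relay()              # cut off the power supply via the relay 184
    elif "commanded_speed_mps" in abnormal:
        outputs.stop_with_deceleration()  # stop signal for a predetermined deceleration
    else:
        outputs.apply_brake()             # operate the brake to stop in the shortest time
```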
  • the monitoring circuit 180 controls the automatic control circuit 170 so that the image of the camera is forcibly displayed on the remote control device 3 when an abnormality is detected. As a result, the operator can immediately perform the corresponding maneuver.
  • the output by the monitoring circuit 180 is not limited to the above combination.
  • the monitoring circuit 180 detects an abnormality of at least one parameter related to at least one type of signal.
  • a configuration in which the monitoring circuit 180 prohibits the output of the operation control signal by the automatic control circuit 170, instead of outputting the operation command, can also be adopted. In this case, the monitoring circuit 180 stops the operation of the communication IF 183 that outputs the signal from the automatic control circuit 170. As a result, a situation in which an abnormal operation control signal is continuously output is suppressed.
  • the logical combination constructed by the monitoring circuit 180 thus implements an appropriate stop according to the degree of the parameter abnormality, as in the sketch below.
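The following is a minimal, illustrative sketch of how such a selection could be expressed in software. It is not the patent's implementation (the monitoring circuit 180 is described as a logic circuit built in the FPGA 181); the parameter labels, the mapping rules, and the stop-type names are assumptions introduced only to make the relationship between abnormality and stop behavior concrete.

```python
# Illustrative sketch only: maps detected abnormal parameters to a stop type,
# mirroring the behavior described for the monitoring circuit 180.
# The parameter labels and the mapping rules below are assumptions.
from enum import Enum

class StopType(Enum):
    NONE = 0        # all monitored parameters within the rule-defined range
    DECELERATE = 1  # stop at a predetermined deceleration
    IMMEDIATE = 2   # replace the operation control signal with a stop signal
    BRAKE = 3       # operate the brake of the operation unit
    POWER_OFF = 4   # operate the relay and cut off the power supply

def select_stop(abnormal_params):
    """Return the stop type for a set of abnormal-parameter labels."""
    if not abnormal_params:
        return StopType.NONE
    if len(abnormal_params) > 1:          # several abnormal parameters: cut power
        return StopType.POWER_OFF
    (param,) = abnormal_params
    if param == "command_speed_out_of_range":
        return StopType.DECELERATE        # suppress the shock of a sudden stop
    if param == "command_combination_out_of_range":
        return StopType.IMMEDIATE         # stop in a short time
    if param == "control_process_fault":
        return StopType.BRAKE             # stop in the shortest time
    return StopType.IMMEDIATE             # default for unclassified abnormalities

# Example: a single out-of-range speed command leads to a decelerated stop.
assert select_stop({"command_speed_out_of_range"}) is StopType.DECELERATE
```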
  • the basic hardware structure of the control unit 10 described above is also applied to the cooperative control unit 20. However, when the output content based on the abnormality detection result of the cooperative sensing unit 21 differs from that of the control unit 10, a part of the hardware and software differs from the control unit 10 accordingly.
  • the control unit 10 having the above-described configuration operates in cooperation with the cooperative control unit 20.
  • the control unit 10 operates in cooperation with the cooperative control unit 20 regardless of whether the master unit or the slave unit is connected as the cooperative control unit 20. Subsequently, the details of the cooperation operation with the cooperation control unit 20 will be described.
  • FIG. 3 is a flowchart illustrating a cooperative operation among the operations of the control unit shown in FIG.
  • the operation control device 160 executes the determination of the type of the cooperation control unit 20 and the cooperation operation according to the type.
  • the type determination and the cooperative operation according to the type are mainly carried out by the automatic control circuit 170 shown in FIG.
  • a configuration in which the type of the cooperative control unit 20 is determined by the processor 181p provided in the monitoring circuit 180 can also be adopted.
  • the cooperative operation will be described as the operation of the operation control device 160.
  • the operation control device 160 first determines whether or not it is connected to the cooperative control unit 20 (S11). For example, the operation control device 160 determines whether or not it is possible to communicate with the cooperative control unit 20 via the external communication connection unit 140. When the cooperative control unit 20 is connected, the cooperative control unit 20 can communicate with the operation control device 160 of the control unit 10. In this case, the operation control device 160 determines that it is connected to the cooperative control unit 20.
  • when communication with the cooperative control unit 20 is possible (Yes in S11), the operation control device 160 identifies the type of the cooperative control unit 20 (S12). For example, the operation control device 160 reads identification information that identifies the cooperative control unit 20 from the cooperative control unit 20 via the external communication connection unit 140.
  • the operation control device 160 then determines the type of the cooperative control unit 20 (S13).
  • the operation control device 160 determines whether the cooperative control unit 20 is a master unit or a slave unit by referring, for example, to a database in which identification information and the type of cooperative control unit are associated with each other.
  • when the cooperative control unit 20 is not a master unit (No in S13), the cooperative control unit 20 is a slave unit. In this case, the control unit 10 operates as a master unit.
  • the operation control device 160 recognizes the content of the external environment data by processing the external environment data (S14). More specifically, for example, when the automatic operation device 1 is an automatic traveling vehicle, the operation control device 160 performs a process for recognizing the content of image data representing an external image based on the constructed control model.
  • the operation control device 160 determines the operation based on the recognition result of the content of the external environment data (S15). More specifically, for example, the operation control device 160 grasps the current position of the automatic operation device 1 and determines the optimum travel route based on the recognition of the content of the image data.
  • the operation control device 160 generates an operation control signal based on the processing result of the external environment data (S16).
  • the operation control device 160 outputs the generated operation control signal to the actuator 121 of the operation unit 12 via the operation control connector 130. More specifically, for example, the operation control device 160 generates an operation control signal including travel and steering commands based on the determined travel route, and outputs the operation control signal to the operation unit 12.
  • the operation unit 12 operates the actuator 121.
  • the automatic operation device 1 operates based on the external environment.
  • the operation control device 160 transmits an operation command for controlling the cooperative operation device 2 based on the processing result of the external environment data (S17). More specifically, for example, the operation control device 160 generates an operation command for the cooperative control unit 20 to perform an operation according to the position of the automatic operation device 1 on the travel route. The operation control device 160 transmits the operation command to the cooperative control unit 20 via the external communication connection unit 140. As a result, the control unit 10 operates in cooperation with the cooperative control unit 20 operating as a slave unit.
  • when the cooperative control unit 20 is a master unit (Yes in S13), the control unit 10 operates as a slave unit. In this case, the operation control device 160 recognizes the content of the external environment data by processing the external environment data (S21). Further, the operation control device 160 determines the operation based on the recognition result of the content of the external environment data (S22). These operations are the same as in steps S14 and S15 described above.
  • the operation control device 160 receives an operation command from the cooperative control unit 20 which is a master unit (S23).
  • the cooperative control unit 20 transmits an operation command to the control unit 10 via the external communication connection unit 140. More specifically, for example, the cooperative control unit 20 transmits an operation command indicating whether the automatic operation device 1 moves forward or backward according to the position of the work target to the operation control device 160 of the control unit 10.
  • the operation control device 160 transmits an operation control signal for controlling the cooperative operation device 2 based on the operation command received via the external communication connection unit 140 (S24).
  • the operation control device 160 generates an operation control signal based on the external environment data recognized in step S22 and the operation command. More specifically, for example, the operation control device 160 generates an operation control signal so as to move forward or backward along the determined travel route, and outputs the operation control signal to the operation unit 12.
  • the operation unit 12 operates the actuator 121.
  • the automatic operation device 1 operates based on the operation command of the cooperation control unit 20.
  • the control unit 10 operates in cooperation with the cooperative control unit 20 that operates as a master unit.
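As a compact illustration of this flow, the following sketch restates steps S11 to S17 and S21 to S24 in ordinary Python. It is not the patent's interface: the function arguments and the string placeholders for recognition results and commands are assumptions introduced purely to show how the master/slave branch changes what the operation control device outputs.

```python
# Illustrative sketch of the cooperative operation of FIG. 3 (S11-S17, S21-S24).
# The callables passed in stand for the connectors; all names are assumptions.
def cooperate(peer_type, external_environment_data,
              receive_command, send_command, output_control_signal):
    """peer_type: 'master', 'slave', or None when no cooperative unit is connected."""
    if peer_type is None:                              # S11: not connected
        return
    recognition = f"recognized({external_environment_data})"   # S14 / S21
    plan = f"plan_from({recognition})"                          # S15 / S22
    if peer_type == "slave":
        # Control unit 10 acts as the master unit.
        output_control_signal(plan)                    # S16: drive actuator 121
        send_command(f"command_for({plan})")           # S17: command the slave unit
    else:
        # Cooperative control unit 20 is the master unit; control unit 10 follows.
        command = receive_command()                    # S23
        output_control_signal((plan, command))         # S24: own recognition + command

# Usage with trivial stand-ins for the external communication connection unit
# and the operation control connector:
cooperate("slave", "camera_image",
          receive_command=lambda: None,
          send_command=print,
          output_control_signal=print)
```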
  • FIG. 4 is a block diagram showing a first application example of the control unit shown in FIG.
  • An application example shown in FIG. 4 is a case where the automatic operation device 1 is an automatic traveling vehicle and the cooperation control unit 20 is a master unit. That is, the master unit is connected to the external communication connector 140.
  • the cooperative operation device 2 is an autonomous operation robot mounted on an autonomous vehicle.
  • the cooperation control unit 20 recognizes the position of the work target of the robot based on the image of the cooperation operation device camera (robot camera) as the cooperation sensing unit 21.
  • the cooperative control unit 20 transmits an operation command including forward / backward movement to the operation control device 160 of the control unit 10 according to the position of the work target.
  • the operation control device 160 generates an operation control signal based on the operation command received from the cooperative control unit 20.
  • the operation control device 160 outputs the generated operation control signal to the actuator 121 of the operation unit 12 via the operation control connector 130.
  • the automatic operation device 1 and the cooperative operation device 2 can cooperate with each other to perform precise work.
  • FIG. 5 is a block diagram showing a second application example of the control unit shown in FIG.
  • the application example shown in FIG. 5 is a case where the automatic operation device 1 is an automatic traveling vehicle and the cooperation control unit 20 is a slave unit. That is, the slave unit is connected to the external communication connector 140.
  • the cooperative operation device 2' is a simple work device mounted on the automatic operation device 1.
  • the cooperative operation device 2' is, for example, a sprayer that sprays a chemical or the like toward a work target.
  • the operation control device 160 generates an operation command for controlling the cooperative operation device 2' based on the processing result of the image data of the external photographing camera as the external environment sensing unit 11.
  • for example, the operation control device 160 generates an operation command for causing the cooperative operation device 2' to start or stop a work operation based on the position of the autonomous driving vehicle obtained by processing the image data of the external photographing camera, and transmits the operation command to the cooperative control unit 20.
  • as a result, the cooperative operation device 2' can operate appropriately according to the traveling of the automatic traveling vehicle as the automatic operation device 1.
  • regardless of whether a master unit or a slave unit is connected to the external communication connector 140 of the control unit 10 as the cooperative control unit 20, the control unit 10 can perform precise work in cooperation with it. Therefore, the versatility of the control unit 10 can be improved.
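To make the slave-unit case concrete, the following sketch shows one way an operation command for such a simple work device could be derived from the vehicle position. The spray zones, positions, and command strings are hypothetical values chosen only for illustration; the patent does not specify them.

```python
# Illustrative sketch: the control unit 10, acting as the master unit, tells the
# mounted sprayer (cooperative operation device 2') to start or stop working
# according to the vehicle position recognized from the external camera image.
# The zone boundaries and command names below are assumptions.
SPRAY_ZONES = [(10.0, 25.0), (40.0, 55.0)]   # hypothetical positions (m) along the travel route

def sprayer_command(position_m):
    """Return the operation command to send to the slave cooperative control unit."""
    inside = any(start <= position_m <= end for start, end in SPRAY_ZONES)
    return "start_work" if inside else "stop_work"

print(sprayer_command(12.0))   # start_work: inside the first spray zone
print(sprayer_command(30.0))   # stop_work: between zones
```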
  • the automatic operation system S operates in response to the operation of the remote control device 3 by the operator H.
  • the remote control device 3 displays one or both of an image based on the image data of the external photographing camera 11 and an image based on the image data of the cooperative operation device camera 21.
  • FIG. 6 is a block diagram showing an image flow in the automatic operation system shown in FIG.
  • the solid arrow in FIG. 6 shows the flow of the image when the automatic operation system S is remotely controlled.
  • the broken line arrow in FIG. 6 indicates the flow of the image request signal from the remote control device 3.
  • an external photographing camera is shown as an example of the external environment sensing unit 11.
  • a cooperative operation device camera is shown as the cooperative sensing unit 21.
  • the external environment sensing unit 11 will also be referred to as the external photographing camera 11.
  • the cooperative sensing unit 21 will also be referred to as the cooperative operation device camera 21.
  • the automatic operation device 1 is equipped with an external photographing camera 11 for photographing the external environment, as well as an actuator 221. Further, the cooperative operation device 2 is provided with a cooperative operation device camera 21 for photographing the external environment.
  • the control unit 10 of the automatic operation device 1 is connected to one remote communication device 13.
  • the control unit 10 of the automatic operation device 1 and the cooperation control unit 20 of the cooperation operation device 2 are connected to one remote communication device 13. That is, the control unit 10 and the cooperative control unit 20 jointly use one remote communication device 13.
  • the control unit 10 includes an external image connector 110, an operation control connector 130, a remote data connector 150, and an operation control device 160 (see FIG. 2).
  • the external image connector 110 (external environment information connector 110) is a connector for inputting external image data indicating a shooting result from the external shooting camera 11 to the control unit 10.
  • the operation control connector 130 is a connector for the control unit 10 to output an operation control signal for controlling the operation of the actuator 121.
  • the remote data connector 150 is a connector for inputting / outputting data to / from one remote communication device 13 communicably connected to the remote control device 3.
  • the operation control device 160 (see FIG. 2) of the control unit 10 outputs monitor image data, based on the external image data input via the external image connector 110, from the remote data connector 150 to the remote control device 3 via the one remote communication device 13.
  • the operation control device 160 outputs monitor image data based on an image command signal from the outside of the control unit 10.
  • the cooperative control unit 20 also performs the same processing as the control unit 10 on the image data of the cooperative operation device camera 21.
  • the operation of the operation control device 160 of the control unit 10 will be described as a representative example.
  • FIG. 7 is a flowchart illustrating control of an image in the operation control device 160 of the control unit 10 shown in FIG.
  • the operation control device 160 determines whether or not an image command signal has been received from the outside (S31).
  • the image command signal is transmitted from the remote communication device 13 via the remote data connector 150.
  • the image command signal may be transmitted from the cooperation control unit 20 via the external communication connector 140.
  • the operation control device 160 makes this determination for image command signals received through either connector.
  • the operation control device 160 determines whether the content of the image command signal indicates the start of image transmission (S32).
  • when the start of image transmission is indicated (Yes in S32), the operation control device 160 outputs the monitor image data to the remote control device 3 (S33).
  • the monitor image data is data obtained by processing the external image data input via the external image connector 110.
  • the operation control device 160 generates monitor image data having a smaller amount of data than the external image data by performing image compression processing on the external image data.
  • external image data that has not been substantially processed may be used as the monitor image data.
  • the operation control device 160 outputs monitor image data from the remote data connector 150 to one remote communication device 13.
  • the monitor image data is transmitted to the remote control device 3 via the remote communication device 13.
  • the operation control device 160 determines whether the content of the image command signal indicates that the image transmission is stopped (S34). When the image transmission is stopped (Yes in S34), the operation control device 160 stops the output of the monitor image data (S35). As a result, the transmission of the monitor image data to the remote control device 3 is stopped.
  • the image command signal indicating that the image transmission is stopped may be output from the cooperation control unit 20.
  • the cooperative control unit 20 that has received a command to transmit only the image of the cooperative operation device camera 21 from the remote control device 3 sends an image command signal indicating that the image transmission is stopped to the control unit 10. By stopping the transmission of data based on the image data of the external photographing camera 11, the remote control device 3 can display the image that the operator H wants to pay attention to during maneuvering while reducing the amount of data to be transmitted.
  • the operation control device 160 determines whether the content of the image command signal indicates frame thinning (S36). In the case of frame thinning (Yes in S36), the operation control device 160 performs frame thinning processing on the monitor image data (S37). As a result, the amount of monitor image data transmitted to the remote control device 3 is reduced.
  • the operation control device 160 determines whether the content of the image command signal indicates the image compression rate (S38). In the case of the image compression rate (Yes in S38), the operation control device 160 changes the compression rate of the image compression process (S39). As a result, the amount of monitor image data transmitted to the remote control device 3 is reduced.
  • the operation control device 160 determines whether the content of the image command signal indicates a cutout of a part of the image (S41). In the case of cutting out the area (Yes in S41), the operation control device 160 performs the area cutting process on the monitor image data (S42). That is, the operation control device 160 extracts an image of the part of the area designated by the image command from the images captured by the external photographing camera 11 to generate the monitor image data. More specifically, for example, when the range captured by the external photographing camera 11 is wide, only an image of the part of the traveling direction necessary for maneuvering is transmitted and displayed. As a result, the amount of monitor image data transmitted to the remote control device 3 is reduced.
  • the same applies to a configuration in which a plurality of external photographing cameras 11 are connected to the control unit 10 and the operation control device 160 processes the images of the plurality of areas photographed by the plurality of external photographing cameras 11. In the case of cutting out a region (Yes in S41), the operation control device 160 uses only the image of the designated region as the monitor image data. In this case as well, the amount of monitor image data transmitted to the remote control device 3 is reduced.
  • when it is determined in step S31 that no image command signal has been received (No in S31), the operation control device 160 stops the output of the monitor image data (S45).
  • the operation control device 160 determines whether or not an abnormal state has occurred in the control unit 10 (S46).
  • the abnormal state of the control unit 10 is detected by, for example, a logic circuit composed of FPGA 181 of the monitoring circuit 180.
  • the operation control device 160 determines that an abnormal state has occurred, for example, when any of the parameters to be monitored is not within the range defined by the rule.
  • when an abnormal state has occurred (Yes in S46), the operation control device 160 outputs the monitor image data to the remote control device 3 (S47).
  • the monitor image data is transmitted to the remote control device 3 via the remote communication device 13.
  • the operator H can recognize the occurrence of the abnormal state at an early stage by the display screen of the remote control device 3.
  • the operator H can recognize the cause of the abnormal state at an early stage from the display screen.
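The dispatch described with reference to FIG. 7 can be summarized in the minimal sketch below. The command names, the dictionary layout, and the frame-thinning and cropping placeholders are assumptions made only to show how each command type changes what is sent; the patent leaves the concrete encoding of the image command signal open.

```python
# Illustrative sketch of the image-command handling of FIG. 7 (S31-S47).
# The command encoding and the image operations are assumptions.
def handle_image_command(command, frame, frame_index=0, abnormal=False):
    """Return the monitor image data to send to the remote control device, or None."""
    if command is None:                               # S31 No: no image command received
        # S45-S47: output stops, unless an abnormal state forces transmission.
        return frame if abnormal else None
    kind = command.get("kind")
    if kind == "start":                               # S32/S33: start sending monitor image data
        return frame
    if kind == "stop":                                # S34/S35: stop sending
        return None
    if kind == "thin_frames":                         # S36/S37: send only every N-th frame
        keep_every = command.get("keep_every", 2)
        return frame if frame_index % keep_every == 0 else None
    if kind == "compress":                            # S38/S39: change the compression rate
        return {"data": frame, "quality": command.get("quality", 50)}
    if kind == "crop":                                # S41/S42: cut out only the designated region
        return {"data": frame, "region": command.get("region")}
    return frame

# Example: during an abnormal state the image is sent even without a command.
print(handle_image_command(None, "frame_001", abnormal=True))
```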
  • FIG. 8 is a block diagram showing a third application example of the control unit shown in FIG.
  • the cooperative operation device 4 of the automatic operation system S is not connected to the automatic operation device 1.
  • the cooperative operation device 4 operates away from the automatic operation device 1.
  • the cooperative operation device 4 is, for example, an automatic traveling vehicle having substantially the same configuration as the automatic operation device 1.
  • the cooperative control unit 40 of the cooperative operation device 4 operates as either a master unit or a slave unit.
  • the external communication connection unit 140 included in the control unit 10 of the automatic operation device 1 is a wireless communication device.
  • the cooperative operation device 4 includes an external communication connection unit 440 that performs wireless communication with the external communication connection unit 140.
  • the control unit 10 of the automatic operation device 1 communicates with the cooperation control unit 40 of the cooperation operation device 4 via wireless communication.
  • the cooperation control unit 40 controls the cooperation operation unit 42.
  • when the cooperative control unit 40 is the master unit, the control unit 10 of the automatic operation device 1 outputs an operation control signal based on the operation command from the cooperative control unit 40. For example, the control unit 10 travels in response to an operation command from the cooperative control unit 40, or operates following the cooperative operation device 4. In this case, the control unit 10 can, for example, photograph the cooperative operation device 4 with the external photographing camera 11 and travel following the cooperative operation device 4 based on the image data of the external photographing camera 11.
  • when the cooperative control unit 40 is a slave unit, the control unit 10 generates an operation command for controlling the cooperative operation device 4 based on the image data of the external photographing camera 11. The control unit 10 transmits the operation command to the cooperative control unit 40.
  • the cooperative operation device 4 operates following the automatic operation device 1.
  • FIG. 9 is a flowchart illustrating the cooperative operation of the control unit 10 according to the second embodiment of the present invention.
  • the operation of the control unit shown in FIG. 9 is different from the operation described with reference to FIG. 3 in steps S51 and S53.
  • in step S51, the operation control device 160 (see FIG. 2) of the control unit 10 determines whether or not an operation command has been received from the cooperative control unit 20.
  • when the operation control device 160 receives an operation command from the cooperative control unit 20 (Yes in S51), it determines that the cooperative control unit 20 is the master unit. That is, the control unit 10 operates as a slave unit. In this case, the operation control device 160 executes the processes of steps S21, S22, S53, and S24. In step S53, the operation control device 160 executes the process based on the operation command received in S51.
  • when no operation command has been received from the cooperative control unit 20 (No in S51), the operation control device 160 determines that the cooperative control unit 20 is a slave unit. That is, the control unit 10 operates as a master unit. In this case, the operation control device 160 carries out the processes of steps S14 to S17.
  • the control unit 10 of the present embodiment thus determines that the cooperative control unit 20 is the master unit when an operation command is input from the cooperative control unit 20.
  • the control unit 10 determines that the cooperative control unit 20 is a slave unit until an operation command is input, as in the sketch below.
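A minimal sketch of this implicit determination follows, assuming a simple received-command queue as the interface; the queue and the role strings are illustrative stand-ins, not part of the patent.

```python
# Illustrative sketch of the second embodiment (FIG. 9, S51/S53): the role of the
# cooperative control unit is inferred from whether an operation command arrived,
# instead of being read as identification information.
from collections import deque

def peer_role(received_commands: deque) -> str:
    """Return the role attributed to the cooperative control unit."""
    # S51 Yes: a command was received -> the peer is the master and the control
    # unit operates as a slave (steps S21, S22, S53, S24).
    # S51 No: the peer is treated as a slave until a command is input (S14-S17).
    return "master" if received_commands else "slave"

print(peer_role(deque()))                  # slave
print(peer_role(deque(["move_forward"])))  # master
```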
  • FIG. 10 is a block diagram showing a configuration of an automatic operation system including a control unit according to a third embodiment of the present invention.
  • the automatic operation device 1 in the present embodiment is different from the first embodiment in that it has a hub 13a. Further, the automatic operation device 1 in the present embodiment is connected to the cooperation control unit 20 via the hub 13a instead of directly. Since other points in this embodiment are the same as those in the first embodiment, each part is designated by the same reference numerals as those in the first embodiment.
  • the control unit 10 and the cooperation control unit 20 in FIG. 10 are connected to one remote communication device 13 via the hub 13a.
  • the hub 13a relays data between the control unit 10, the cooperative control unit 20, and the remote communication device 13.
  • the control unit 10 and the cooperative control unit 20 connected to the hub 13a transmit data in a common transmission format.
  • the automatic operation device 1 in FIG. 10 communicates with the cooperation control unit 20 via the hub 13a.
  • the hub 13a in the present embodiment provides, as an independent component, the data mixing function that the remote communication device 13 had in the first embodiment. That is, the hub 13a has a part of the functions of the remote communication device 13. Therefore, in this embodiment as well, it can be said that the control unit 10 of the automatic operation device 1 and the cooperative control unit 20 of the cooperative operation device 2 are connected to one remote communication device 13.
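The relaying role of the hub 13a can be pictured with the short sketch below. The JSON-style common transmission format and the source labels are assumptions chosen only to illustrate how data from both units can be mixed onto the single remote communication device 13.

```python
# Illustrative sketch of the hub 13a mixing data from the control unit 10 and the
# cooperative control unit 20 into one stream for the single remote communication
# device 13, using an assumed common transmission format.
import json

def relay(messages):
    """Serialize (source, payload) pairs into a common format for the remote device."""
    return [json.dumps({"source": source, "payload": payload}) for source, payload in messages]

print(relay([("control_unit_10", "monitor_image"),
             ("cooperative_control_unit_20", "robot_camera_image")]))
```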

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Manipulator (AREA)

Abstract

The present invention addresses the problem of providing a control unit capable of increasing the degree of freedom of combination with other equipment and improving versatility. A control unit is provided with: an external environment information connector; an operation control connector; an external communication connection unit; and an operation control device. When a master unit is connected to the external communication connection unit, the operation control device generates the operation control signal based on an operation command received from the master unit and outputs the operation control signal to the actuator via the operation control connector. When a slave unit is connected to the external communication connection unit, the operation control device generates an operation command for controlling the cooperative operation device based on the processing result of the external environment data and outputs the operation command to the slave unit via the external communication connection unit.
PCT/JP2019/014442 2019-04-01 2019-04-01 Unité de commande WO2020202427A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/JP2019/014442 WO2020202427A1 (fr) 2019-04-01 2019-04-01 Unité de commande
PCT/JP2020/014546 WO2020203968A1 (fr) 2019-04-01 2020-03-30 Unité de commande

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2019/014442 WO2020202427A1 (fr) 2019-04-01 2019-04-01 Unité de commande

Publications (1)

Publication Number Publication Date
WO2020202427A1 true WO2020202427A1 (fr) 2020-10-08

Family

ID=72666736

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/014442 WO2020202427A1 (fr) 2019-04-01 2019-04-01 Unité de commande

Country Status (1)

Country Link
WO (1) WO2020202427A1 (fr)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005262378A (ja) * 2004-03-18 2005-09-29 Oki Electric Ind Co Ltd 自律ロボットおよびその制御方法
US20150306764A1 (en) * 2012-07-27 2015-10-29 Engineering Services Inc. Modular mobile robot
JP2017030093A (ja) * 2015-07-31 2017-02-09 株式会社東芝 複数ロボットの協調移動システム及び方法

Similar Documents

Publication Publication Date Title
JP6820981B2 (ja) 自動運転システム、車両制御方法及び装置
JP7029910B2 (ja) 情報処理装置、情報処理方法及びプログラム
AU2017261540B2 (en) Command for underground
US20200141755A1 (en) Navigation processing method, apparatus, and control device
JP6605241B2 (ja) 遠隔操縦システム
TWI488015B (zh) 操縱用通信裝置、被操縱體用通信裝置及操縱用通信系統
EP3944209B1 (fr) Système de commande à distance de rattrapage pour machine
WO2020202426A1 (fr) Unité de commande et système de fonctionnement automatique
AU2017261609B2 (en) Command for underground
WO2020202427A1 (fr) Unité de commande
AU2018201213B2 (en) Command for underground
EP3991533A1 (fr) Véhicule de travaux, procédé de détection d'obstacle et programme de détection d'obstacle
WO2020203968A1 (fr) Unité de commande
WO2020202428A1 (fr) Unité de commande
WO2022153669A1 (fr) Système de coordination distribué et procédé d'exécution de tâches
JP7264395B2 (ja) 作業車両の遠隔制御システム、遠隔操作装置および遠隔制御方法
JP7433189B2 (ja) 作業車監視システム
US20220297821A1 (en) Control device, control method, unmanned aircraft, information processing device, information processing method, and program
JP2016224743A (ja) 情報処理装置、環境情報収集方法および環境情報収集用プログラム
JP2022151102A (ja) 作業車両の遠隔制御システム、遠隔操作装置および遠隔制御方法
US11586225B2 (en) Mobile device, mobile body control system, mobile body control method, and program
JP6815279B2 (ja) 作業走行機能診断装置
JP7055189B2 (ja) 作業走行機能診断装置
CN110587610A (zh) 基于5g云调度系统的农场用独立悬架移动机器人控制系统
WO2018168537A1 (fr) Appareil cible d'apprentissage et procédé de fonctionnement

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19923379

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19923379

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP