WO2020049765A1 - Mobile manipulator, mobile robot, server, and mobile manipulator system


Info

Publication number
WO2020049765A1
Authority
WO
WIPO (PCT)
Prior art keywords
imaging
result information
imaging result
robot arm
mobile
Prior art date
Application number
PCT/JP2019/009017
Other languages
French (fr)
Japanese (ja)
Inventor
達也 古賀
聡庸 金井
Original Assignee
OMRON Corporation (オムロン株式会社)
Application filed by OMRON Corporation
Publication of WO2020049765A1

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00: Controls for manipulators
    • B25J13/08: Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
    • B25J5/00: Manipulators mounted on wheels or on carriages

Definitions

  • the present disclosure relates to a mobile manipulator, a mobile robot, a server, and a mobile manipulator system for transporting a work.
  • Manipulators that take out and transport a workpiece with a robot arm have been known. Further, in order to expand the working range of such a manipulator, there is increasing demand for a mobile manipulator in which a manipulator is mounted on an automatic guided vehicle or the like and can transport the workpiece while moving.
  • The manipulator disclosed in Patent Literature 1 includes a three-dimensional measurement unit that detects the arrangement state of works placed on a stocker; while the robot arm performs a removal operation to take out a work from one stocker, the three-dimensional measurement unit detects the arrangement state of works placed on another stocker.
  • This manipulator detects the arrangement state of the works in parallel with the work removal operation, thereby shortening the time required for a series of work removal steps in which works are sequentially taken out from a plurality of stockers.
  • However, the above-described related art cannot detect the arrangement state of works placed on a stocker distant from the manipulator. Therefore, when the manipulator moves, the arrangement state of the works must be newly detected, and during that time the work removal operation by the manipulator must be stopped. Consequently, as the number of movements of the manipulator increases, so does the time spent detecting the arrangement state of the works.
  • An object of one embodiment of the present disclosure is to realize a mobile manipulator that can reduce a work transfer time by performing a work removal operation and a detection of a work arrangement state in parallel.
  • A mobile manipulator according to one aspect of the present disclosure is a mobile manipulator mounted on an automatic guided vehicle, and includes: a robot arm that performs a gripping operation on an object; an imaging unit provided on the robot arm; and a control unit that transmits imaging result information based on an imaging result of the imaging unit to an external device by communication, receives from the external device the imaging result information stored in the external device, and includes a robot arm control unit that controls the operation of the robot arm based on the imaging result information.
  • a mobile manipulator that can reduce the work transfer time can be realized by performing the work removal operation and the detection of the arrangement state of the work in parallel.
  • FIG. 1 is a diagram schematically illustrating a mobile robot and a mobile manipulator system according to a first embodiment.
  • FIG. 2 is a block diagram illustrating a main configuration of the mobile robot according to the first embodiment.
  • FIG. 3 is a diagram schematically illustrating an example of the operation of the mobile robot according to the first embodiment. FIG. 4 is a diagram schematically illustrating an example of the operation of a mobile robot according to a reference example.
  • FIG. 5 is a flowchart illustrating an example of a flow of a process of a control unit according to the first embodiment.
  • FIG. 6 is a block diagram illustrating a main configuration of a mobile robot according to a second embodiment. FIG. 7 is a flowchart illustrating an example of a flow of a process of a control unit according to the second embodiment.
  • FIG. 1 schematically illustrates an example of a mobile robot 1 and a mobile manipulator system according to the present embodiment.
  • FIG. 2 is a block diagram illustrating a main configuration of the mobile robot according to the present embodiment.
  • 3 and 4 are diagrams schematically showing an example of the operation of the mobile robots 1 and 1a according to the present embodiment or the mobile robots 101 and 101a according to the reference example of the present embodiment.
  • the mobile robot 1 is a device for transporting a work (object) 2 and is configured to be movable.
  • the mobile robot 1 includes a manipulator (mobile manipulator) 10 and an automatic guided vehicle 50.
  • the manipulator 10 includes a robot arm 11 and a housing 15.
  • the robot arm 11 is a member that performs a gripping operation on the work 2.
  • One end of the robot arm 11 includes a work holding unit 13 for holding the work 2 and a camera (imaging unit) 14.
  • the housing unit 15 includes a control device 20 and a first communication unit 30 described below.
  • the mobile robot 1a also includes a manipulator 10a and an automatic guided vehicle, like the mobile robot 1.
  • the manipulator 10a includes a workpiece gripper at one end of a robot arm and a camera 14a.
  • the control device 20 includes a control unit 21 and a first storage unit 40.
  • the control unit 21 includes an image analysis unit 22, a robot arm control unit 23, an automatic guided vehicle control unit 24, and a camera control unit 25.
  • the image analysis unit 22 performs image analysis on the captured image data (image capture result) captured by the camera 14.
  • the robot arm control unit 23 controls the operation of the robot arm 11 based on the received imaging result information.
  • The first communication unit 30 transmits, as imaging result information, the captured image data captured by the camera 14 or the result of analyzing the captured image data by the image analysis unit 22 to the server (external device) 60. In addition, the first communication unit 30 receives the imaging result information stored in the server 60.
  • the mobile robot 1 mutually communicates with the other mobile robots 1 a and the like and the server 60 by the first communication unit 30. Thereby, a mobile manipulator system including the plurality of mobile robots 1 and the server 60 is formed.
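The record shared between the mobile robots and the server can be sketched as a small data structure. The Python sketch below is a minimal illustration with hypothetical field names; the disclosure itself only specifies that imaging result information is either captured image data or work position information derived from it.

```python
from dataclasses import dataclass, field
import time

@dataclass
class ImagingResultInfo:
    """Imaging result information shared between mobile robots via the server 60.

    Depending on the embodiment this carries analyzed work position
    information or raw captured image data. All field names here are
    illustrative assumptions, not terms from the disclosure.
    """
    stocker_id: str      # which stocker 3 the result describes
    work_positions: list # e.g. (x, y, z) of each detected work 2
    generated_by: str    # which mobile robot produced the information
    timestamp: float = field(default_factory=time.time)

# A robot would hand such a record to its first communication unit 30;
# the server 60 would keep the newest record per stocker.
info = ImagingResultInfo("stocker-3", [(0.10, 0.25, 0.0), (0.30, 0.25, 0.0)], "robot-1")
```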
  • As shown in FIG. 3, at a timing (period) t1, the mobile robot 1 moves to a position in front of a stocker 3 on which a plurality of works (objects) 2 to be transported are placed.
  • Next, the mobile robot 1 captures an image of the work 2 with the camera 14, and the image analysis unit 22 generates work position information (position information of the object) from the obtained captured image data ("image analysis" in FIG. 3).
  • the captured image data or the work position information is shared with another mobile robot 1a as imaging result information.
  • The mobile robots 101 and 101a according to the reference example shown in FIG. 4 differ from the mobile robots 1 and 1a according to the present embodiment in that the imaging result information cannot be shared with another mobile robot.
  • the mobile robot 1 recognizes the arrangement state of the work 2 based on the work position information and grips the work 2. Thereafter, the mobile robot 1 captures an image of the stocker 3. The mobile robot 1 generates new workpiece position information from captured image data while transporting the workpiece 2. The captured image data or the work position information is shared with another mobile robot 1a as imaging result information.
  • the mobile robot 1 similarly carries the work 2 and analyzes the image.
  • the work position information generated by the mobile robot 1 is shared with the mobile robot 1a as needed.
  • The mobile robot 1a moves to a position in front of the stocker 3 at which the mobile robot 1 has been carrying out transport. From timing t7, the mobile robot 1a transports the work 2 in place of the mobile robot 1.
  • the mobile robot 1a receives the imaging result information generated by the mobile robot 1 at timing t6 before performing the gripping operation of the work 2. Thereby, the mobile robot 1a can recognize the arrangement state of the work 2 in the stocker 3. Therefore, the mobile robot 1a can immediately start transporting the work 2 without needing to image the stocker 3 and generate work position information.
  • the mobile robot 1a conveys the work 2 a predetermined number of times, captures an image of the stocker 3, analyzes the image, and shares the imaging result information.
  • In the reference example, the work 2 is transported from timing t11 to timing t16 in the same manner as by the mobile robots 1 and 1a.
  • the mobile robot 101 does not have a function of sharing the generated imaging result information with another mobile robot 101a. Therefore, the mobile robot 101a itself takes an image of the stocker 3 at timing t17 and performs image analysis. At timing t17, the mobile robot 101a cannot transfer the work 2 and must stop its operation.
  • In other words, the processing at the timing t16 and the processing at the timing t17 in FIG. 4 can be completed by the timing t6 in FIG. 3.
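The time saving behind this comparison is simple to quantify. The sketch below uses assumed, illustrative durations (the disclosure gives no concrete figures): each move after which a robot can reuse shared imaging result information skips one imaging-plus-analysis cycle.

```python
# Illustrative arithmetic for the time saved by sharing imaging result
# information. The durations below are assumptions, not values from the
# disclosure.
imaging_s = 2.0   # time to image the stocker (assumed)
analysis_s = 3.0  # time to analyze the captured image (assumed)
moves = 4         # number of times robots change work points

# Each move where shared information is available skips one cycle.
saved_s = moves * (imaging_s + analysis_s)
```

The saving grows linearly with the number of moves, which matches the statement that the shortening effect increases as the mobile robot moves more often.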
  • the robot arm 11 includes a joint 12 formed to be rotatable and bendable in an arbitrary direction.
  • the robot arm 11 may include one or more joints 12.
  • The other end of the robot arm 11 is supported by the housing 15 so as to be rotatable and bendable.
  • The work holding unit 13 includes a vacuum pad capable of vacuum-sucking the work 2.
  • However, the work holding unit 13 is not limited to this, and may include, for example, a pair of claws that sandwich and grip the work 2.
  • the camera 14 includes a solid-state imaging device and a lens that can three-dimensionally recognize an imaging target.
  • As the solid-state imaging device, for example, a CCD (Charge Coupled Device) image sensor or a CMOS (Complementary Metal-Oxide Semiconductor) image sensor may be used. According to such a configuration, the spatial position of the work 2 to be transported can be accurately grasped.
  • the camera 14 is not limited to such an example, and may include, for example, a solid-state imaging device and a lens that can only capture a two-dimensional image.
  • the camera 14 may be attached to any position of the mobile robot 1 instead of one end of the robot arm 11.
  • the camera 14 may be attached to the housing 15 or the automatic guided vehicle 50. According to such a configuration, while the robot arm 11 is moving, the camera 14 can capture an image of the workpiece 2 or the like with high accuracy in a stationary state.
  • the control device 20 includes a control unit 21 and a first storage unit 40.
  • the control unit 21 includes a CPU (Central Processing Unit), a RAM (Random Access Memory), a ROM (Read Only Memory), and controls each component.
  • The first storage unit 40 is, for example, an auxiliary storage device such as a hard disk drive or a solid state drive, and stores various programs executed by the control unit 21, image data shared with other manipulators, imaging result information, and the like.
  • the image analysis unit 22 acquires captured image data captured by the camera 14 and analyzes the captured image data. The image analysis unit 22 first determines whether or not the workpiece 2 to be transported exists in the captured image data. If it is determined that the work 2 exists, the image analysis unit 22 generates work position information. According to such a configuration, the mobile robot 1 can generate work position information from captured image data captured by itself.
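As a rough illustration of the two steps described above (presence check, then generation of work position information), the following Python sketch scans a toy 2D brightness grid. A real implementation would use a 3D vision pipeline as described for the camera 14; every name and the threshold here are assumptions made purely for illustration.

```python
# Minimal stand-in for the image analysis unit 22: scan captured image
# data for works and emit work position information. A "work" here is
# any pixel at or above a brightness threshold (assumed).

def analyze_captured_image(image, threshold=128):
    """Return work position information as (row, col) pairs for each
    detected work, or an empty list when no work 2 exists in the image."""
    positions = []
    for r, row in enumerate(image):
        for c, value in enumerate(row):
            if value >= threshold:
                positions.append((r, c))
    return positions

image = [
    [0,   0, 200, 0],
    [0, 190,   0, 0],
    [0,   0,   0, 0],
]
work_positions = analyze_captured_image(image)  # two works detected
```

An empty result corresponds to the case where no work 2 to be transported exists in the captured image data.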
  • the imaging result information generated by the image analysis unit 22 may be output to the first communication unit 30 and the first storage unit 40.
  • the imaging result information may be acquired by the robot arm control unit 23 and the automatic guided vehicle control unit 24.
  • the robot arm control unit 23 controls the operation of the robot arm 11. For example, based on the imaging result information or the like, the operation of the robot arm 11 is controlled to grip and transport the work 2, and the position and direction of the camera 14 provided in the robot arm 11 are adjusted.
  • the automatic guided vehicle control unit 24 controls the operation of the automatic guided vehicle 50 and moves the mobile robot 1 to a predetermined point. Further, the automatic guided vehicle control unit 24 may control the operation of the automatic guided vehicle 50 based on the imaging result information or the like, and finely adjust the position of the mobile robot 1. Further, the mobile robot 1 is moved to a position in front of the stocker 3 on which the work 2 to be transported is placed, based on the position information of the stocker 3 transmitted from the information management control unit 62 provided in the server 60 described later.
  • the camera control unit 25 controls the imaging operation of the camera 14.
  • the first communication unit 30 performs communication with the server 60 and the like.
  • the first communication unit 30 may perform wired communication using a cable or the like, or may perform wireless communication. Further, the first communication unit 30 may communicate with the first communication unit of another mobile robot 1a.
  • the automatic guided vehicle 50 includes a wheel 51.
  • the operation of the wheel 51 is controlled by the automatic guided vehicle control unit 24, so that the automatic guided vehicle 50 can move to an arbitrary point. Therefore, the manipulator 10 placed on the automatic guided vehicle 50 can transport the work 2 to a distant position that is not reachable by the robot arm 11. Further, the point where the manipulator 10 works can be changed as appropriate.
  • The means by which the manipulator 10 is made movable as the mobile robot 1 is not limited to the automatic guided vehicle 50. For example, it may be an unmanned aerial vehicle (drone) capable of flight, or a cart that is moved manually.
  • the server 60 includes a second communication unit 61, an information management control unit 62, and a second storage unit 63.
  • The information management control unit 62 controls writing to the second storage unit 63 in accordance with changes in the state of the mobile robot 1 (such as updating a database as the mobile robot 1 moves), and controls communication via the second communication unit 61 (such as transmitting information as the mobile robot 1 moves).
  • the server 60 receives, for example, the imaging result information generated by the mobile robot 1 by the second communication unit 61 and stores the imaging result information in the second storage unit 63.
  • the imaging result information is transmitted to another mobile robot 1a or the like as needed. Thereby, the imaging result information can be shared among the plurality of mobile robots 1.
  • the operation of the mobile robot 1 can be smoothly controlled.
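The server's store-and-forward role described above can be sketched as a latest-record-per-stocker store. The class and method names below are illustrative assumptions; a plain dictionary stands in for the second storage unit 63.

```python
# Sketch of the server 60's role: keep the newest imaging result
# information per stocker and hand it to any robot that asks.

class ImagingResultStore:
    def __init__(self):
        self._latest = {}  # stocker id -> (timestamp, info)

    def put(self, stocker_id, timestamp, info):
        """Store info only if it is newer than what is already held."""
        current = self._latest.get(stocker_id)
        if current is None or timestamp > current[0]:
            self._latest[stocker_id] = (timestamp, info)

    def get(self, stocker_id):
        """Return the latest info for a stocker, or None if nothing is stored."""
        entry = self._latest.get(stocker_id)
        return entry[1] if entry else None

store = ImagingResultStore()
store.put("stocker-3", 1.0, [(0, 2), (1, 1)])  # from mobile robot 1
store.put("stocker-3", 2.0, [(1, 1)])          # newer: one work was removed
latest = store.get("stocker-3")
```

Keeping only the newest record per stocker is one simple way to realize the "latest work position information" behavior described for the server.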
  • FIG. 5 is a flowchart illustrating an example of a processing flow of the control unit 21 according to the present embodiment. An example of a processing flow of the control unit 21 will be described with reference to FIG.
  • the automatic guided vehicle controller 24 moves the mobile robot 1 to a position (designated point) in front of the stocker 3 on which the work 2 to be transferred is placed (S1).
  • Next, the first communication unit 30 specifies the position and the like of the stocker 3 containing the work 2 to be transported, and requests from the server 60 the imaging result information (second imaging result information) on the stocker 3 generated by another manipulator 10a.
  • the first communication unit 30 or the robot arm control unit 23 determines whether the current imaging result information of the stocker 3 can be received from the server 60 (S2).
  • the imaging result information transmitted and received between the first communication unit 30 and the server 60 is work position information generated from the captured image data by the image analysis unit 22.
  • If the current imaging result information cannot be received (NO in S2), the robot arm control unit 23 moves the robot arm 11 to a position for imaging the stocker 3 containing the work 2 (S3).
  • The received work position information may also be information that cannot be used, for example, when the target work 2 does not exist at the position indicated by the received work position information. Such a situation can occur when the arrangement of the works 2 on the stocker 3 has been changed after the most recent work position information for the stocker 3 was generated. In such a case, the robot arm control unit 23 determines that the received work position information cannot be used.
  • the camera control unit 25 causes the camera 14 to image the stocker 3 (S4).
  • the image analysis unit 22 performs an image analysis process on the captured image data (S5), and generates work position information in the stocker 3 (S6).
  • the manipulator 10 can generate the work position information by itself even when appropriate work position information for detecting the position of the work 2 cannot be received.
  • Next, based on the work position information, the robot arm control unit 23 operates the robot arm 11 to perform the gripping operation of the work 2 (S7).
  • the robot arm control unit 23 moves the robot arm 11 to a position for imaging the stocker 3 including the work 2 (S8). Then, the camera control unit 25 causes the camera 14 to image the stocker 3 (S9).
  • the arrangement state of another work 2 after the work 2 is gripped by the robot arm 11 can be imaged. Therefore, it is possible to generate optimum work position information for the next gripping operation.
  • the robot arm control unit 23 starts the transfer of the work 2.
  • the image analysis unit 22 starts an image analysis process of the captured image data (S10), and generates work position information (first imaging result information) in the stocker 3 (S11).
  • the work position information is output to the first communication unit 30 and transmitted from the first communication unit 30 to the server 60 (S12).
  • the work position information may also be output to the first storage unit 40 and stored in the first storage unit 40.
  • the transfer of the work 2 is completed (S13).
  • the transfer of the work 2 may be completed substantially simultaneously with the completion of the transmission or before the completion of the transmission of the work position information.
  • the control unit 21 determines whether or not the mobile robot 1 moves to another point based on the transfer plan information of the work 2 transmitted from the information management control unit 62 provided in the server 60 (S14). If the mobile robot 1 does not move and continues the transfer operation at the same position (NO in S14), the process returns to step S2, and a series of processing is performed again.
  • The work position information received in step S2 may be the work position information that the manipulator itself generated in step S11 and stored in the first storage unit 40, or may be the work position information that the manipulator itself transmitted in step S12, received anew from the server 60.
  • When a plurality of mobile robots 1 transport the works 2 on the same stocker 3, the latest work position information for the stocker 3 may be received from the server 60.
  • If the mobile robot 1 moves (YES in S14), the automatic guided vehicle control unit 24 moves the mobile robot 1 to a position (another point) in front of the stocker 3 on which the work 2 to be transported next is placed (S1), and the series of processing is performed again.
  • When the mobile robot 1 moves to a work point and another mobile robot 1a has been working at that work point, the mobile robot 1 receives and uses the work position information generated by the mobile robot 1a. Therefore, after moving, the mobile robot 1 can recognize the position of the work 2 at the new work point based on the received work position information, and can immediately start transferring the work 2 without imaging the work 2 and generating work position information by itself.
  • the transport work time can be reduced by the time required for imaging the work 2 and generating the work position information.
  • the effect of shortening the transfer operation time increases as the movement of the mobile robot 1 increases.
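The first embodiment's flow (S1 to S14) can be condensed into a single transfer cycle, sketched below. The callable parameters stand in for the camera control unit 25, image analysis unit 22, robot arm control unit 23, and first communication unit 30; all names and the stub behaviors are assumptions, not the disclosed implementation.

```python
# One transfer cycle of the first embodiment, S2 through S13 condensed.

def transfer_cycle(receive_info, capture, analyze, grip, transport, send_info):
    # S2: try to receive current work position information from the server.
    positions = receive_info()
    # S3-S6: if nothing usable was received, image and analyze by ourselves.
    if not positions:
        positions = analyze(capture())
    # S7: grip a work at a known position.
    grip(positions[0])
    # S8-S9: image the stocker again after gripping, so the shared
    # information no longer includes the work just removed.
    image = capture()
    # S10-S13: transport while analyzing, then share the new information.
    transport()
    new_positions = analyze(image)
    send_info(new_positions)
    return new_positions

sent = []
result = transfer_cycle(
    receive_info=lambda: [(0, 2), (1, 1)],  # server already had info
    capture=lambda: "image-after-grip",
    analyze=lambda img: [(1, 1)],           # one work remains (stubbed)
    grip=lambda pos: None,
    transport=lambda: None,
    send_info=sent.append,
)
```

Because analysis and sharing happen during transport, the next robot (or the next iteration) starts from up-to-date work position information without a dedicated imaging stop.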
  • FIG. 6 is a block diagram illustrating a main configuration of the mobile robot 1 according to the present embodiment. A configuration example of the mobile robot 1 according to the present embodiment will be described with reference to FIG.
  • the mobile robot 1 according to the present embodiment is different from the mobile robot 1 according to the first embodiment in that the control unit 21 does not include the image analysis unit 22 but includes an imaging result information processing unit 22a instead.
  • The imaging result information processing unit 22a outputs the image data captured by the camera 14 to the first communication unit 30, and outputs the work position information transmitted from the server 60 to the robot arm control unit 23 and the automatic guided vehicle control unit 24.
  • In the present embodiment, the mobile robot 1 does not analyze the captured image data by itself. Instead, the captured image data captured by the camera 14 is transmitted to the server 60 as-is, without being subjected to image analysis.
  • the image analysis unit 64 included in the server 60 analyzes the received captured image data and generates work position information including the position information of the work 2.
  • the work position information generated by the server 60 is transmitted to the mobile robot 1 from the second communication unit 61 provided in the server 60. Note that the captured image data received by the server 60 and the work position information generated by the server 60 are stored in the second storage unit 63.
  • In the present embodiment, the imaging result information transmitted from the mobile robot 1 to the server 60 is the captured image data itself, and the imaging result information transmitted from the server 60 to the mobile robot 1 is the work position information generated from the captured image data.
  • According to such a configuration, the mobile robot 1 can obtain the imaging result information including the position information of the work 2 without analyzing the captured image data by itself. Therefore, the load on the control unit 21 can be reduced.
  • FIG. 7 is a flowchart illustrating an example of the flow of processing of the control unit 21 according to the present embodiment. An example of a processing flow of the control unit 21 will be described with reference to FIG.
  • the automatic guided vehicle control unit 24 moves the mobile robot 1 to a position (designated point) in front of the stocker 3 on which the work 2 to be transferred is placed (S21).
  • the first communication unit 30 of the manipulator 10 specifies the position of the stocker 3 including the work 2 to be transported, and requests the server 60 for work position information (S22).
  • The information management control unit 62 of the server 60 acquires, from the second storage unit 63, the work position information on the stocker 3 generated from the captured image data of the manipulator 10 (first imaging result information) or the work position information on the stocker 3 generated from the captured image data of another manipulator 10a (second imaging result information), and outputs it to the second communication unit 61.
  • the second communication unit 61 transmits the work position information to the first communication unit 30 of the manipulator 10 (S41).
  • the first communication unit 30 of the manipulator 10 receives the work position information (S23). Next, the first communication unit 30 or the robot arm control unit 23 determines whether the current work position information of the stocker 3 has been received from the server 60 (S24).
  • If the current work position information has not been received (NO in S24), the robot arm control unit 23 moves the robot arm 11 to a position for imaging the stocker 3 containing the work 2 (S25).
  • the camera control unit 25 causes the camera 14 to image the stocker 3 (S26).
  • the captured image data is transmitted to the server 60 by the first communication unit 30 (S27).
  • the server 60 receives the captured image data (S42), and performs image analysis processing by the image analysis unit 64 (S43). Thereby, the work position information based on the captured image data received by the server 60 is generated (S44).
  • Thus, the control unit 21 does not need to include the image analysis unit 22, and the load on the control unit 21 can be reduced.
  • In this way, when the manipulator 10 cannot receive appropriate work position information for detecting the position of the work 2, the manipulator 10 captures an image of the stocker 3 on which the work 2 is placed and transmits it to the server 60, thereby obtaining the work position information.
  • Next, the robot arm control unit 23 controls the operation of the robot arm 11 based on the received work position information to grip the work 2 (S28).
  • Thereafter, the robot arm control unit 23 moves the robot arm 11 to a position for imaging the stocker 3 containing the work 2 (S29). Then, the camera control unit 25 causes the camera 14 to image the stocker 3 (S30).
  • the robot arm control unit 23 starts the transfer of the work 2. Further, during the transport, the first communication unit 30 transmits the captured image data to the server 60 (S31). At this time, the captured image data may also be output to the first storage unit 40 and stored in the first storage unit 40.
  • the transport of the work 2 is completed (S32).
  • the control unit 21 determines whether or not the mobile robot 1 moves to another point based on the transfer plan information of the work 2 transmitted by the information management control unit 62 provided in the server 60 (S33). When the mobile robot 1 does not move and continues the transport operation at the same point (NO in S33), the process returns to step S22, and a series of processes is performed again.
  • If the mobile robot 1 moves (YES in S33), the automatic guided vehicle control unit 24 moves the mobile robot 1 to a position (another point) in front of the stocker 3 on which the work 2 to be transported next is placed (S21), and the series of processing is performed again.
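The second embodiment's division of labor, in which the robot transmits the captured image data as-is (S27, S31) and the server analyzes it and returns work position information (S42 to S44), can be sketched as follows. The function names and the threshold-scan analysis are illustrative assumptions only.

```python
# Server-side stand-in for the image analysis unit 64 of the server 60:
# the same kind of threshold scan used earlier, now run remotely.

def server_analyze(image, threshold=128):
    """Generate work position information from received captured image data."""
    return [(r, c)
            for r, row in enumerate(image)
            for c, v in enumerate(row)
            if v >= threshold]

def robot_request_positions(image, server=server_analyze):
    """The robot's side: transmit captured image data as-is and receive
    back work position information instead of analyzing locally."""
    return server(image)

positions = robot_request_positions([[0, 200], [0, 0]])
```

Moving the analysis behind a request boundary like this is what lets the control unit 21 omit the image analysis unit 22 and reduces its load.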
  • The control blocks of the control unit 21 may be realized by a logic circuit (hardware) formed on an integrated circuit (IC chip) or the like, or may be realized by software.
  • the control unit 21 includes a computer that executes instructions of a program that is software for realizing each function.
  • the computer includes, for example, one or more processors and a computer-readable recording medium storing the program. Then, in the computer, the object of the present disclosure is achieved by the processor reading the program from the recording medium and executing the program.
  • the processor for example, a CPU (Central Processing Unit) can be used.
  • Examples of the recording medium include "non-transitory tangible media" such as a ROM (Read Only Memory), tapes, disks, cards, semiconductor memories, and programmable logic circuits. Further, a RAM (Random Access Memory) into which the above program is loaded may be further provided.
  • the program may be supplied to the computer via an arbitrary transmission medium (a communication network, a broadcast wave, or the like) capable of transmitting the program.
  • one embodiment of the present disclosure can also be realized in the form of a data signal embedded in a carrier wave, in which the program is embodied by electronic transmission.
  • A mobile manipulator according to one aspect of the present disclosure is a mobile manipulator mounted on an automatic guided vehicle, and includes: a robot arm that performs a gripping operation on an object; an imaging unit provided on the robot arm; and a control unit that transmits imaging result information based on an imaging result of the imaging unit to an external device by communication, receives from the external device the imaging result information stored in the external device, and includes a robot arm control unit that controls the operation of the robot arm based on the imaging result information.
  • According to the above configuration, the imaging result information can be shared among a plurality of mobile manipulators. Therefore, when a mobile manipulator moves to a work point at which another mobile manipulator has been working, the imaging result information generated by the other mobile manipulator can be received and used. The mobile manipulator therefore does not need to image the object and generate the imaging result information by itself in order to detect the position of the object.
  • The imaging result information may be position information of the object obtained as an analysis result of the captured image data (imaging result) captured by the imaging unit, or may be the captured image data itself.
  • the external device may be a server described below or another mobile manipulator.
  • a specific mobile manipulator may have a server function, and the imaging result information may be mutually transmitted and received between a plurality of mobile manipulators.
  • the control unit may include an image analysis unit that analyzes image data captured by the imaging unit and calculates position information of the object as the imaging result information. According to the above configuration, the mobile manipulator can calculate the position information of the target object from the captured image data and control the gripping operation by the robot arm based on the position information.
  • the control unit may transmit image data obtained by the imaging unit to the external device as the imaging result information. According to the above configuration, it is possible to cause a server or the like to perform a process of calculating position information of a target object from captured image data. Therefore, the control unit does not need to include the image analysis unit, and the load on the control unit can be reduced.
  • the imaging unit may perform imaging after the gripping operation by the robot arm is completed. According to the configuration, it is possible to image the arrangement state of another target object after the target object is gripped by the robot arm. Therefore, optimal imaging result information for the next gripping operation can be generated and shared with another mobile manipulator.
  • the mobile manipulator that has received the imaging result information does not need to perform a process of excluding information on the previously transported object from the imaging result information. Therefore, processing for controlling the operation of the robot arm from the imaging result information can be simplified.
  • The control unit may acquire, from the external device before the robot arm performs the gripping operation, the imaging result information corresponding to the point at which the gripping operation is to be performed, recognize the position of the target object based on that information, and control the gripping operation by the robot arm accordingly.
  • According to the above configuration, the position of the target object at the point after a move can be recognized from the received imaging result information. The step of imaging the target object and generating the imaging result information by itself can therefore be skipped, and transport of the target object can be started immediately, shortening the transport work time by the time otherwise required for imaging and for generating the imaging result information.
  • If the imaging result information could not be obtained from the external device before the robot arm performs the gripping operation, the control unit may cause the imaging unit to perform imaging, acquire the position information of the target object based on the image data captured by the imaging unit, and control the gripping operation by the robot arm based on that position information.
  • If the imaging result information acquired from the external device before the robot arm performs the gripping operation is unusable, the control unit may cause the imaging unit to perform imaging, acquire the position information of the target object based on the captured image, and control the gripping operation by the robot arm based on that position information.
  • According to these configurations, even when usable imaging result information cannot be obtained from the external device, the mobile manipulator can acquire the position information of the target object by itself and control the gripping operation by the robot arm.
  • In this case, the mobile manipulator may perform the image analysis by itself, or may transmit the captured image data to a server or the like and receive the analysis result from it.
  • A mobile robot includes any of the above-described mobile manipulators and the automatic guided vehicle. According to the above configuration, the mobile manipulator can be moved by the automatic guided vehicle.
  • A server transmits and receives the imaging result information to and from any of the mobile manipulators described above.
  • A mobile manipulator system includes any one of the mobile manipulators described above and the server. According to the above configuration, the imaging result information can be shared among a plurality of mobile manipulators via the server.
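The variations above treat the imaging result information as either analyzed position information or raw image data. A minimal sketch of such a shared record in Python (the class name, field names, and (x, y, z) layout are illustrative assumptions, not part of the disclosure):

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class ImagingResultInfo:
    # Work point (stocker) the record refers to, so a receiving manipulator
    # can request the record that matches its current work location.
    stocker_id: str
    # Either analyzed target positions (x, y, z) in some shared frame ...
    target_positions: Optional[List[Tuple[float, float, float]]] = None
    # ... or the raw captured image data itself, left for a server to analyze.
    raw_image: Optional[bytes] = None

    def is_analyzed(self) -> bool:
        """True when the record already carries position information."""
        return self.target_positions is not None
```

A manipulator receiving such a record can branch on `is_analyzed()` to decide whether the analysis still has to be performed (locally or on the server) before the gripping operation.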

Abstract

The present invention realizes a mobile manipulator capable of reducing the workpiece conveying time. This mobile manipulator, mounted on an automatic guided vehicle (50), includes: a robot arm (11); an image capturing unit (14); and a control unit (21) that transmits and receives imaging result information based on an imaging result obtained by the image capturing unit (14) to and from an external device (60), and controls the operation of the robot arm (11) on the basis of the imaging result information.

Description

Mobile manipulator, mobile robot, server, and mobile manipulator system
The present disclosure relates to a mobile manipulator, a mobile robot, a server, and a mobile manipulator system for transporting a workpiece.
Conventionally, manipulators that pick up and transport a workpiece with a robot arm are known. In addition, to expand a manipulator's working range, there is growing demand for mobile manipulators in which a manipulator is mounted on an automatic guided vehicle or the like so that it can transport workpieces while moving.
Patent Literature 1 discloses a manipulator including a three-dimensional measurement unit that detects the arrangement state of workpieces placed on a stocker, in which the three-dimensional measurement unit detects the arrangement state of workpieces placed on another stocker while the robot arm performs the removal operation for taking out a workpiece.
In this way, the manipulator detects the arrangement state of workpieces in parallel with the removal operation, thereby shortening the time required for a series of removal steps in which workpieces are sequentially taken out from a plurality of stockers.
Japanese Unexamined Patent Publication JP 2013-86184 A
However, the above prior art cannot detect the arrangement state of workpieces placed on a stocker located away from the manipulator. Therefore, when the manipulator moves, it must newly detect the arrangement state of the workpieces, and during that time the workpiece removal operation must be stopped. Consequently, the more often the manipulator moves, the more time is spent detecting the arrangement state of the workpieces.
An object of one aspect of the present disclosure is to realize a mobile manipulator that can shorten the workpiece transport time by performing the workpiece removal operation and the detection of the workpiece arrangement state in parallel.
To solve the above problem, a mobile manipulator according to one aspect of the present disclosure is a mobile manipulator mounted on an automatic guided vehicle, and includes: a robot arm that performs a gripping operation on a target object; an imaging unit provided on the robot arm; and a robot arm control unit that transmits imaging result information based on an imaging result of the imaging unit to an external device by communication, receives imaging result information stored in the external device from the external device, and controls the operation of the robot arm based on the imaging result information.
According to one aspect of the present disclosure, a mobile manipulator that can shorten the workpiece transport time can be realized by performing the workpiece removal operation and the detection of the workpiece arrangement state in parallel.
FIG. 1 is a diagram schematically illustrating a mobile robot and a mobile manipulator system according to a first embodiment.
FIG. 2 is a block diagram illustrating the main configuration of the mobile robot according to the first embodiment.
FIG. 3 is a diagram schematically illustrating an example of an operation of the mobile robot according to the first embodiment.
FIG. 4 is a diagram schematically illustrating an example of an operation of a mobile robot according to a reference example.
FIG. 5 is a flowchart illustrating an example of the flow of processing of a control unit according to the first embodiment.
FIG. 6 is a block diagram illustrating the main configuration of a mobile robot according to a second embodiment.
FIG. 7 is a flowchart illustrating an example of the flow of processing of a control unit according to the second embodiment.
[Embodiment 1]
Hereinafter, an embodiment according to one aspect of the present invention (hereinafter also referred to as "the present embodiment") will be described with reference to the drawings.
§1 Application Example
FIG. 1 schematically illustrates an example of the mobile robot 1 and the mobile manipulator system according to the present embodiment. FIG. 2 is a block diagram illustrating the main configuration of the mobile robot according to the present embodiment. FIGS. 3 and 4 are diagrams schematically illustrating examples of the operation of the mobile robots 1 and 1a according to the present embodiment and of the mobile robots 101 and 101a according to a reference example, respectively. First, an outline of an application example of the mobile robot 1 will be described with reference to FIGS. 1 to 4.
As shown in FIGS. 1 and 2, the mobile robot 1 is a device for transporting a workpiece (target object) 2 and is configured to be movable. The mobile robot 1 includes a manipulator (mobile manipulator) 10 and an automatic guided vehicle 50. The manipulator 10 includes a robot arm 11 and a housing 15.
The robot arm 11 is a member that performs a gripping operation on the workpiece 2. One end of the robot arm 11 is provided with a workpiece gripper 13 for gripping the workpiece 2 and a camera (imaging unit) 14. The housing 15 houses a control device 20 and a first communication unit 30 described below.
Like the mobile robot 1, the mobile robot 1a also includes a manipulator 10a and an automatic guided vehicle. The manipulator 10a includes a workpiece gripper and a camera 14a at one end of its robot arm.
The control device 20 includes a control unit 21 and a first storage unit 40. The control unit 21 includes an image analysis unit 22, a robot arm control unit 23, an automatic guided vehicle control unit 24, and a camera control unit 25. The image analysis unit 22 analyzes the image data (imaging result) captured by the camera 14. The robot arm control unit 23 controls the operation of the robot arm 11 based on the received imaging result information.
The first communication unit 30 transmits the image data captured by the camera 14, or the result of analyzing that image data by the image analysis unit 22, to a server (external device) 60 as imaging result information. The first communication unit 30 also receives imaging result information stored in the server 60.
In this way, the mobile robot 1 communicates with other mobile robots such as the mobile robot 1a and with the server 60 via the first communication unit 30. As a result, a mobile manipulator system including a plurality of mobile robots 1 and the server 60 is formed.
An outline of the operation of the mobile robot 1 will be described with reference to FIGS. 3 and 4. As shown in FIG. 3, at timing (period) t1, the mobile robot 1 moves to a position in front of a stocker 3 on which a plurality of workpieces (target objects) 2 to be transported are placed.
Next, at timing t2, the mobile robot 1 captures an image of the workpieces 2 with the camera 14, and the image analysis unit 22 generates workpiece position information (position information of the target objects) from the obtained image data ("image analysis" in FIG. 3). The captured image data or the workpiece position information is shared with the other mobile robot 1a as imaging result information.
The mobile robots 101 and 101a according to the reference example shown in FIG. 4 differ from the mobile robots 1 and 1a according to the present embodiment in that they cannot share the imaging result information with another mobile robot.
Next, at timing t3, the mobile robot 1 recognizes the arrangement state of the workpieces 2 based on the workpiece position information and grips a workpiece 2. The mobile robot 1 then captures an image of the stocker 3 and, while transporting the workpiece 2, generates new workpiece position information from the captured image data. The captured image data or the workpiece position information is shared with the other mobile robot 1a as imaging result information.
From timing t4 to t6, the mobile robot 1 similarly transports workpieces 2 and performs image analysis. The workpiece position information generated by the mobile robot 1 is shared with the mobile robot 1a as needed.
By the beginning of timing t7, the mobile robot 1a has moved to the front of the stocker 3 at which the mobile robot 1 was working. From timing t7, the mobile robot 1a transports the workpieces 2 in place of the mobile robot 1.
At timing t7, before performing the gripping operation on a workpiece 2, the mobile robot 1a receives the imaging result information generated by the mobile robot 1 at timing t6. This allows the mobile robot 1a to recognize the arrangement state of the workpieces 2 on the stocker 3. Therefore, the mobile robot 1a can immediately start transporting the workpieces 2 without having to image the stocker 3 and generate workpiece position information itself.
From timing t8 onward, the mobile robot 1a transports workpieces 2 a predetermined number of times while imaging the stocker 3, performing image analysis, and sharing the imaging result information.
Meanwhile, the mobile robots 101 and 101a according to the reference example transport workpieces 2 from timing t11 to t16 in the same manner as the mobile robots 1 and 1a.
However, the mobile robot 101 has no function for sharing the generated imaging result information with the other mobile robot 101a. Therefore, at timing t17, the mobile robot 101a must image the stocker 3 and perform the image analysis itself. During timing t17, the mobile robot 101a cannot transport a workpiece 2 and must stop its operation.
In contrast, the mobile robots 1 and 1a according to the present embodiment can perform the processing of timings t16 and t17 in FIG. 4 within timing t6 in FIG. 3. Thus, the mobile robots 1 and 1a shorten the transport work time by the time required for imaging the stocker 3 and analyzing the image upon moving to a new work point (the period t17).
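The time saving described above can be illustrated with simple arithmetic; the function and all timing values below are hypothetical examples, not figures from the disclosure:

```python
def total_transport_time(num_moves, picks_per_site, t_pick, t_scan, shared):
    """Total work time over num_moves relocations: each site needs
    picks_per_site gripping operations of t_pick seconds, and each
    relocation additionally costs one t_scan imaging-and-analysis step
    (the period t17) unless the imaging result information is shared."""
    scans = 0 if shared else num_moves
    return num_moves * picks_per_site * t_pick + scans * t_scan

# With sharing, the saving grows by t_scan for every relocation.
saving = (total_transport_time(3, 5, 4.0, 6.0, shared=False)
          - total_transport_time(3, 5, 4.0, 6.0, shared=True))
```

This mirrors the observation that the shortening effect grows as the mobile robot relocates more frequently.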
§2 Configuration Example
(Robot arm 11)
The robot arm 11 includes a joint 12 formed to be rotatable and bendable in an arbitrary direction. The robot arm 11 may include one joint 12 or a plurality of them. One end of the robot arm 11 is attached to the housing 15 so as to be rotatable and bendable.
The workpiece gripper 13 includes a vacuum pad capable of holding the workpiece 2 by vacuum suction. The workpiece gripper 13 is not limited to this and may include, for example, a pair of claws for pinching and gripping.
The camera 14 includes a lens and a solid-state image sensor capable of recognizing an imaging target three-dimensionally. As the solid-state image sensor, for example, a CCD (Charge Coupled Device) or CMOS (Complementary Metal-Oxide Semiconductor) image sensor may be used. With this configuration, the spatial position of the workpiece 2 to be transported can be grasped accurately. The camera 14 is not limited to this example and may instead include, for example, a lens and a solid-state image sensor capable only of capturing two-dimensional images.
The camera 14 need not be attached to one end of the robot arm 11 and may be attached at any position on the mobile robot 1, for example on the housing 15 or the automatic guided vehicle 50. With such a configuration, the camera 14 can image the workpiece 2 and its surroundings with high accuracy from a stationary position while the robot arm 11 is moving.
(Control device 20)
The control device 20 includes a control unit 21 and a first storage unit 40. The control unit 21 includes a CPU (Central Processing Unit), RAM (Random Access Memory), ROM (Read Only Memory), and the like, and controls each component. The first storage unit 40 is an auxiliary storage device such as a hard disk drive or a solid-state drive, and stores the various programs executed by the control unit 21 as well as the image data and imaging result information shared with other manipulators.
The image analysis unit 22 acquires the image data captured by the camera 14 and analyzes it. The image analysis unit 22 first determines whether the workpiece 2 to be transported is present in the captured image data. If it determines that a workpiece 2 is present, the image analysis unit 22 generates workpiece position information. With this configuration, the mobile robot 1 can generate workpiece position information from the image data it captures itself.
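The two-step flow above (judge whether a workpiece is present, then generate its position information) can be sketched on a toy 2-D intensity grid; the grid format and threshold are assumptions for illustration only, not the disclosed analysis method:

```python
def analyze_image(grid, threshold=0.5):
    """Two-step analysis: judge whether any workpiece is present, then
    return its position information (here: (row, col) cells whose
    intensity exceeds the threshold), or None when nothing is found."""
    positions = [(r, c)
                 for r, row in enumerate(grid)
                 for c, value in enumerate(row)
                 if value > threshold]
    return positions if positions else None
```

Returning None for the "no workpiece" case lets the caller distinguish an empty stocker from a successful detection.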
The imaging result information generated by the image analysis unit 22 may be output to the first communication unit 30 and the first storage unit 40, and may also be acquired by the robot arm control unit 23 and the automatic guided vehicle control unit 24.
The robot arm control unit 23 controls the operation of the robot arm 11. For example, based on the imaging result information and the like, it controls the robot arm 11 to grip and transport the workpiece 2, and adjusts the position and orientation of the camera 14 mounted on the robot arm 11.
The automatic guided vehicle control unit 24 controls the operation of the automatic guided vehicle 50 to move the mobile robot 1 to a predetermined point. The automatic guided vehicle control unit 24 may also control the automatic guided vehicle 50 based on the imaging result information and the like to finely adjust the position of the mobile robot 1. Furthermore, based on the position information of the stocker 3 and the like transmitted from an information management control unit 62 of the server 60 described below, it moves the mobile robot 1 to the front of the stocker 3 on which the workpiece 2 to be transported is placed.
The camera control unit 25 controls the imaging operation of the camera 14.
(First communication unit 30)
The first communication unit 30 communicates with the server 60 and the like. The first communication unit 30 may perform wired communication using a cable or the like, or may perform wireless communication. The first communication unit 30 may also communicate directly with the first communication unit of another mobile robot 1a.
(Automatic guided vehicle 50)
The automatic guided vehicle 50 includes wheels 51. Because the operation of the wheels 51 is controlled by the automatic guided vehicle control unit 24, the automatic guided vehicle 50 can move to an arbitrary point. Therefore, the manipulator 10 mounted on the automatic guided vehicle 50 can transport the workpiece 2 to a distant position beyond the reach of the robot arm 11, and the point at which the manipulator 10 works can be changed as appropriate.
The vehicle on which the manipulator 10 is mounted so that it can move as the mobile robot 1 is not limited to the automatic guided vehicle 50. For example, it may be an unmanned aerial vehicle (drone) capable of flight, or a manually moved cart.
(Server 60)
The server 60 includes a second communication unit 61, an information management control unit 62, and a second storage unit 63. The information management control unit 62 controls writing to the second storage unit 63 in response to state changes of the mobile robots 1 (for example, updating a database as a mobile robot 1 moves) and controls communication via the second communication unit 61 (for example, transmitting information according to the movement of a mobile robot 1).
For example, the server 60 receives the imaging result information generated by the mobile robot 1 via the second communication unit 61 and stores it in the second storage unit 63. This imaging result information is transmitted to the other mobile robot 1a and the like as needed. In this way, the imaging result information can be shared among the plurality of mobile robots 1.
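The server's sharing role described above can be sketched as a simple latest-record store; the class and method names are illustrative assumptions, not the disclosed server's interface:

```python
class ImagingResultStore:
    """Minimal sketch of the server's sharing role: keep only the latest
    imaging result information per stocker and hand it to any mobile
    robot that asks about that stocker."""

    def __init__(self):
        self._latest = {}  # stocker_id -> imaging result information

    def store(self, stocker_id, info):
        self._latest[stocker_id] = info  # newer information replaces older

    def fetch(self, stocker_id):
        return self._latest.get(stocker_id)  # None if nothing shared yet
```

Keeping only the most recent record per stocker matches the flow in which each gripping operation is followed by a fresh image of the remaining workpieces.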
By forming a mobile manipulator system including the plurality of mobile robots 1 and the server 60 in this way, the operation of the mobile robots 1 can be controlled smoothly.
§3 Operation Example
FIG. 5 is a flowchart illustrating an example of the processing flow of the control unit 21 according to the present embodiment. An example of the processing flow of the control unit 21 will be described with reference to FIG. 5.
First, the automatic guided vehicle control unit 24 moves the mobile robot 1 to the front (designated point) of the stocker 3 on which the workpiece 2 to be transported is placed (S1).
The first communication unit 30 specifies the position and the like of the stocker 3 containing the workpiece 2 to be transported, and attempts to acquire from the server 60 the imaging result information (second imaging result information) on the stocker 3 generated by another manipulator 10a. The first communication unit 30 or the robot arm control unit 23 then determines whether current imaging result information on the stocker 3 can be received from the server 60 (S2). In this operation example, the imaging result information exchanged between the first communication unit 30 and the server 60 is the workpiece position information generated from the captured image data by the image analysis unit 22.
If the workpiece position information could not be received, or if the received workpiece position information is unusable (NO in S2), the robot arm control unit 23 moves the robot arm 11 to a position for imaging the stocker 3 containing the workpiece 2 (S3).
A case in which the workpiece position information cannot be received is, for example, one in which no imaging result information covering the workpiece 2 to be transported exists.
The received workpiece position information is unusable when, for example, the target workpiece 2 is not at the position it indicates. For instance, the arrangement of the workpieces 2 on the stocker 3 may have changed after the most recent workpiece position information for the stocker 3 was generated. When the robot arm 11 fails to grip the workpiece 2 indicated by the workpiece position information, the robot arm control unit 23 determines that the received workpiece position information is unusable.
The camera control unit 25 then causes the camera 14 to image the stocker 3 (S4). The image analysis unit 22 performs image analysis processing on the captured image data (S5) and generates the workpiece position information for the stocker 3 (S6).
In this way, even when suitable workpiece position information for detecting the position of the workpiece 2 cannot be received, the manipulator 10 can generate the workpiece position information itself.
Next, based on the workpiece position information received from the server 60 (YES in S2), or based on the workpiece position information generated by the image analysis unit 22 in step S6, the robot arm control unit 23 controls the robot arm 11 to perform the gripping operation on the workpiece 2 (S7).
After the workpiece 2 has been gripped, the robot arm control unit 23 moves the robot arm 11 to a position for imaging the stocker 3 (S8), and the camera control unit 25 causes the camera 14 to image the stocker 3 (S9).
With this configuration, the arrangement state of the other workpieces 2 after the workpiece 2 has been gripped by the robot arm 11 can be imaged. Therefore, workpiece position information optimal for the next gripping operation can be generated.
Next, the robot arm control unit 23 starts transporting the workpiece 2. At substantially the same time, the image analysis unit 22 starts image analysis processing of the captured image data (S10) and generates the workpiece position information (first imaging result information) for the stocker 3 (S11). This workpiece position information is output to the first communication unit 30 and transmitted from the first communication unit 30 to the server 60 (S12). At this time, the workpiece position information may also be output to and stored in the first storage unit 40.
After the transmission of the workpiece position information is completed, the transport of the workpiece 2 is completed (S13). The transport of the workpiece 2 may instead be completed substantially simultaneously with, or before, the completion of the transmission. Next, the control unit 21 determines, based on the transport plan information for the workpieces 2 transmitted from the information management control unit 62 of the server 60 and the like, whether the mobile robot 1 is to move to another point (S14). If the mobile robot 1 does not move and continues the transport work at the same position (NO in S14), the process returns to step S2 and the series of steps is performed again.
In this case, the workpiece position information in step S2 may be obtained by reading the workpiece position information generated in step S11 from the first storage unit 40, or by receiving again from the server 60 the workpiece position information transmitted in step S12. When a plurality of mobile robots 1 transport workpieces 2 from the same stocker 3, the latest workpiece position information for the stocker 3 may be received from the server 60.
 On the other hand, when the mobile robot 1 moves to another point (YES in S14), the automatic guided vehicle control unit 24 moves the mobile robot 1 to the front of the stocker 3 on which the next work 2 to be transferred is placed (another point), and the series of processing is performed again.
 As described above, when the mobile robot 1 moves to a new work point at which another mobile robot 1a has been working, the mobile robot 1 can receive and use the work position information generated by the mobile robot 1a. The mobile robot 1 can therefore recognize, from the received work position information, the positions of the works 2 at the work point to which it has moved. Consequently, even immediately after moving to the new work point, it can begin transferring the works 2 at once, without itself imaging the works 2 and generating work position information.
 Accordingly, the transfer operation time can be shortened by the time that would otherwise be required for imaging the works 2 and generating the work position information. This shortening effect grows as the mobile robot 1 moves more frequently.
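The sharing scheme of steps S10 to S14 can be sketched as follows. This is an illustrative model only, not the patented implementation; all class and method names (`Server`, `MobileRobot`, `analyze_image`, and so on) are hypothetical stand-ins for the units described above.

```python
# Illustrative sketch of the Embodiment 1 flow (S10-S14): each robot
# analyzes its own camera image, shares the resulting work position
# information via a server, and reuses information already shared by
# another robot after arriving at a work point.
# All names here are hypothetical; the patent does not specify an API.

class Server:
    """Stores the latest work position information per stocker."""
    def __init__(self):
        self._positions = {}

    def upload(self, stocker_id, work_positions):          # S12
        self._positions[stocker_id] = work_positions

    def download(self, stocker_id):                        # S2 on re-entry
        return self._positions.get(stocker_id)


class MobileRobot:
    def __init__(self, name, server):
        self.name = name
        self.server = server

    def analyze_image(self, image):                        # S10-S11
        # Placeholder for real image analysis (image analysis unit 22).
        return [(x, y) for x, y in image["work_coords"]]

    def transfer_at(self, stocker_id, image):
        positions = self.server.download(stocker_id)
        if positions is None:
            # No shared info yet: image and analyze ourselves.
            positions = self.analyze_image(image)
        target, rest = positions[0], positions[1:]
        # ... grip `target` and start the transfer here
        # (robot arm control unit 23) ...
        self.server.upload(stocker_id, rest)  # share for other robots
        return target


server = Server()
robot_a = MobileRobot("1a", server)
robot_b = MobileRobot("1", server)

image = {"work_coords": [(10, 20), (30, 40), (50, 60)]}
first = robot_a.transfer_at("stocker-3", image)
# robot_b arrives at the same stocker: it can start immediately, using
# the positions robot_a shared, without imaging the works itself.
second = robot_b.transfer_at("stocker-3", image)
print(first, second)  # (10, 20) (30, 40)
```

Note how the information uploaded after each grip already excludes the work just removed, which is the property exploited later in the summary (imaging after the gripping operation is completed).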
 [Embodiment 2]
 Another embodiment of the present disclosure will be described below. For convenience of description, members having the same functions as those described in the above embodiment are given the same reference numerals, and their description is not repeated.
 §1 Configuration Example
 FIG. 6 is a block diagram illustrating the main configuration of the mobile robot 1 according to the present embodiment. A configuration example of the mobile robot 1 according to the present embodiment will be described with reference to FIG. 6.
 The mobile robot 1 according to the present embodiment differs from the mobile robot 1 according to Embodiment 1 in that the control unit 21 does not include the image analysis unit 22, but instead includes an imaging result information processing unit 22a.
 The imaging result information processing unit 22a outputs the image data captured by the camera 14 to the first communication unit 30, and outputs the work position information transmitted from the server 60 to the robot arm control unit 23 and the automatic guided vehicle control unit 24. Unlike the image analysis unit 22, however, it does not analyze the captured image data. The image data captured by the camera 14 is therefore transmitted to the server 60 as-is, without image analysis.
 In this case, the image analysis unit 64 of the server 60 analyzes the received captured image data and generates work position information including the position information of the works 2. The work position information generated by the server 60 is transmitted to the mobile robot 1 from the second communication unit 61 of the server 60. The captured image data received by the server 60 and the work position information generated by the server 60 are stored in the second storage unit 63.
 That is, in the present embodiment, the imaging result information transmitted from the mobile robot 1 to the server 60 is the captured image data itself, and the imaging result information transmitted from the server 60 to the mobile robot 1 is the work position information generated from that captured image data.
 In the configuration in which the control unit 21 includes the image analysis unit 22, the mobile robot 1 can analyze the captured image data itself and generate imaging result information including the position information of the works 2. Conversely, in the configuration in which the control unit 21 does not include the image analysis unit 22, the load on the control unit 21 can be reduced.
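The difference between the two configurations — local analysis (Embodiment 1) versus server-side analysis (this embodiment) — amounts to a difference in what travels as "imaging result information". A hypothetical sketch, with invented function names:

```python
# Sketch contrasting where image analysis runs in the two embodiments.
# Hypothetical names; the patent defines functional units, not this API.

def analyze(image_data):
    """Shared analysis routine (image analysis unit 22 or 64)."""
    return image_data["work_coords"]

def embodiment1_imaging_result(image_data):
    # Robot analyzes locally; the *work positions* travel to the server.
    return {"type": "work_positions", "payload": analyze(image_data)}

def embodiment2_imaging_result(image_data):
    # Robot sends the raw image; the server's image analysis unit 64
    # later produces the positions, offloading the control unit 21.
    return {"type": "captured_image", "payload": image_data}

img = {"work_coords": [(5, 5)]}
print(embodiment1_imaging_result(img)["type"])  # work_positions
print(embodiment2_imaging_result(img)["type"])  # captured_image
```

The trade-off is bandwidth versus on-board compute: raw image payloads are larger, but the robot's controller does no analysis work.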
 §2 Operation Example
 FIG. 7 is a flowchart illustrating an example of the processing flow of the control unit 21 according to the present embodiment. An example of the processing flow of the control unit 21 will be described with reference to FIG. 7.
 The automatic guided vehicle control unit 24 moves the mobile robot 1 to the front of the stocker 3 on which the work 2 to be transferred is placed (the designated point) (S21). The first communication unit 30 of the manipulator 10 specifies the position of the stocker 3 containing the work 2 to be transferred, and requests work position information from the server 60 (S22).
 The information management control unit 62 of the server 60 retrieves from the second storage unit 63 either the work position information for the stocker 3 generated from the captured image data of the manipulator 10 itself (first imaging result information), or the work position information for the stocker 3 generated from the captured image data of another manipulator 10a (second imaging result information), and outputs it to the second communication unit 61. The second communication unit 61 transmits the work position information to the first communication unit 30 of the manipulator 10 (S41).
 The first communication unit 30 of the manipulator 10 receives the work position information (S23). Next, the first communication unit 30 or the robot arm control unit 23 determines whether current work position information for the stocker 3 has been received from the server 60 (S24).
 If the work position information could not be received, or if the received work position information is unusable (NO in S24), the robot arm control unit 23 moves the robot arm 11 to a position for imaging the stocker 3 containing the work 2 (S25).
 Next, the camera control unit 25 causes the camera 14 to image the stocker 3 (S26). The captured image data is transmitted to the server 60 by the first communication unit 30 (S27).
 Next, the server 60 receives the captured image data (S42), and the image analysis unit 64 performs image analysis processing (S43). Work position information based on the captured image data received by the server 60 is thereby generated (S44).
 With such a configuration, the processing of calculating the work position information from the captured image data can be delegated to the server 60. The control unit 21 therefore does not need to include the image analysis unit 22, and the load on the control unit 21 can be reduced.
 In this way, when the manipulator 10 cannot receive work position information suitable for detecting the position of the work 2, it can obtain work position information by imaging the stocker 3 on which the work 2 is placed and transmitting the image to the server 60.
 Next, when the work position information is received from the server 60 (YES in S24), the robot arm control unit 23 controls the operation of the robot arm 11 based on the received work position information, and the gripping operation on the work 2 is performed (S28).
 After the work 2 has been gripped, the robot arm control unit 23 moves the robot arm 11 to a position for imaging the stocker 3 containing the works 2 (S29). The camera control unit 25 then causes the camera 14 to image the stocker 3 (S30).
 Next, the robot arm control unit 23 starts the transfer of the work 2. During the transfer, the first communication unit 30 transmits the captured image data to the server 60 (S31). At this time, the captured image data may also be output to and stored in the first storage unit 40.
 After, or substantially simultaneously with, the completion of the transmission of the captured image data, the transfer of the work 2 is completed (S32). Next, the control unit 21 determines whether the mobile robot 1 is to move to another point, based on the transfer plan information for the work 2 transmitted by the information management control unit 62 of the server 60 (S33). If the mobile robot 1 does not move and continues the transfer operation at the same point (NO in S33), the process returns to step S22 and the series of processing is performed again.
 On the other hand, when the mobile robot 1 moves to another point (YES in S33), the automatic guided vehicle control unit 24 moves the mobile robot 1 to the front of the stocker 3 on which the next work 2 to be transferred is placed (another point) (S21), and the series of processing is performed again.
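The manipulator-side request-and-fallback loop of FIG. 7 (steps S22 to S28, with the server-side steps S42 to S44) can be sketched as follows. The function names `server_analyze`, `acquire_positions`, and `capture_image` are hypothetical stand-ins for the communication, camera, and server-side analysis steps, not an API defined by the patent.

```python
# Sketch of the manipulator-side loop of FIG. 7 (S22-S28):
# request shared work position info; if none is usable, image the
# stocker and have the server analyze the picture (S25-S27, S42-S44).
# All function names are hypothetical.

def server_analyze(image):                       # S42-S44, on the server
    return image["work_coords"]

def acquire_positions(server_store, stocker_id, capture_image):
    positions = server_store.get(stocker_id)     # S22-S23
    if not positions:                            # S24: NO
        image = capture_image()                  # S25-S26
        positions = server_analyze(image)        # S27 + S42-S44 round trip
        server_store[stocker_id] = positions     # shared for later requests
    return positions                             # S24: YES -> grip (S28)

store = {}
capture = lambda: {"work_coords": [(7, 8), (9, 10)]}
first_call = acquire_positions(store, "stocker-3", capture)   # falls back to imaging
second_call = acquire_positions(store, "stocker-3", capture)  # served from shared info
print(first_call == second_call)  # True
```

Only the first request at a given stocker pays the imaging-and-analysis cost; subsequent requests, including those from other manipulators, are answered from the shared store.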
 [Example of Software Implementation]
 The control blocks of the control unit 21 (in particular, the image analysis unit 22) may be realized by a logic circuit (hardware) formed on an integrated circuit (IC chip) or the like, or may be realized by software.
 In the latter case, the control unit 21 includes a computer that executes the instructions of a program, that is, software for realizing each function. The computer includes, for example, one or more processors and a computer-readable recording medium storing the program. In the computer, the object of the present disclosure is achieved by the processor reading the program from the recording medium and executing it. As the processor, for example, a CPU (Central Processing Unit) can be used. As the recording medium, a "non-transitory tangible medium" can be used, such as a ROM (Read Only Memory), tape, disk, card, semiconductor memory, or programmable logic circuit. A RAM (Random Access Memory) into which the program is loaded may further be provided. The program may also be supplied to the computer via any transmission medium (such as a communication network or broadcast wave) capable of transmitting the program. One aspect of the present disclosure can also be realized in the form of a data signal embedded in a carrier wave, in which the program is embodied by electronic transmission.
 [Summary]
 A mobile manipulator according to one aspect of the present disclosure is a mobile manipulator mounted on an automatic guided vehicle, and includes: a robot arm that performs a gripping operation on an object; an imaging unit provided on the robot arm; and a robot arm control unit that transmits imaging result information based on an imaging result of the imaging unit to an external device by communication, receives from the external device imaging result information stored in the external device, and controls the operation of the robot arm based on the imaging result information.
 With the above configuration, imaging result information can be shared among a plurality of mobile manipulators. Therefore, when a mobile manipulator moves to a work point at which another mobile manipulator has been working, it can receive and use the imaging result information generated by that other mobile manipulator. The mobile manipulator thus does not need to image the object and generate imaging result information itself in order to detect the position of the object.
 The imaging result information may be position information of the object obtained by analyzing the image data captured by the imaging unit (the imaging result), or may be the captured image data itself. The external device may be the server described below, or another mobile manipulator. A specific mobile manipulator may have a server function, and the imaging result information may be transmitted and received mutually among a plurality of mobile manipulators.
 The control unit may include an image analysis unit that analyzes the image data captured by the imaging unit and calculates the position information of the object as the imaging result information. With this configuration, the mobile manipulator can itself calculate the position information of the object from the captured image data, and control the gripping operation of the robot arm based on that position information.
 The control unit may transmit the image data captured by the imaging unit to the external device as the imaging result information. With this configuration, the processing of calculating the position information of the object from the captured image data can be delegated to a server or the like. The control unit therefore does not need to include an image analysis unit, and the load on the control unit can be reduced.
 The imaging unit may perform imaging after the gripping operation by the robot arm is completed. With this configuration, the arrangement of the remaining objects after an object has been gripped by the robot arm can be imaged. Imaging result information optimal for the next gripping operation can therefore be generated and shared with other mobile manipulators.
 In other words, a mobile manipulator that receives the imaging result information does not need to exclude from it the information on the previously transferred object. The processing for controlling the robot arm operation from the imaging result information can therefore be simplified.
 Before the robot arm performs the gripping operation, the control unit may acquire from the external device the imaging result information corresponding to the point at which the gripping operation is to be performed, recognize the position of the object based on that imaging result information, and control the gripping operation by the robot arm.
 With this configuration, after the mobile manipulator has moved, the position of the object at the destination can be recognized from the received imaging result information. The manipulator can therefore skip the step of imaging the object and generating imaging result information itself, and start transferring the object immediately. The transfer operation time can thus be shortened by the time required for imaging the object and generating the imaging result information.
 When the imaging result information cannot be acquired from the external device before the robot arm performs a gripping operation, the control unit may cause the imaging unit to perform imaging, acquire position information of the object based on the image data captured by the imaging unit, and control the gripping operation by the robot arm based on that position information.
 When the imaging result information acquired from the external device before the robot arm performs a gripping operation is unusable, the control unit may cause the imaging unit to perform imaging, acquire position information of the object based on the captured image, and control the gripping operation by the robot arm based on that position information.
 With this configuration, when no imaging result information suitable for detecting the position of the object exists, the mobile manipulator can itself acquire the position information of the object and control the gripping operation by the robot arm. To obtain the position information of the object based on the captured image data, the mobile manipulator may perform image analysis itself, or may transmit the captured image data to a server or the like and receive the analysis result from that server or the like.
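The two fallback conditions described above (information absent, or present but unusable) could be combined as in the following sketch. The freshness check modeling "unusable information", and the `MAX_AGE_S` threshold, are assumptions for illustration; the patent does not specify what makes received information unusable.

```python
# Hypothetical sketch of the fallback decision in the summary above:
# received imaging result information is used only if it exists and is
# still usable (modeled here as a freshness check, which is an assumed
# criterion); otherwise the manipulator images the object itself.

MAX_AGE_S = 60.0  # assumed staleness threshold; not specified in the patent

def usable(info, now):
    return info is not None and (now - info["timestamp"]) <= MAX_AGE_S

def choose_positions(received_info, self_capture_and_analyze, now):
    if usable(received_info, now):
        return received_info["positions"]
    # Fallback: image the object and obtain positions, either by local
    # analysis or by sending the image to the server for analysis.
    return self_capture_and_analyze()

fresh = {"timestamp": 100.0, "positions": [(1, 1)]}
stale = {"timestamp": 0.0, "positions": [(2, 2)]}
local = lambda: [(3, 3)]

print(choose_positions(fresh, local, now=120.0))  # [(1, 1)]
print(choose_positions(stale, local, now=120.0))  # [(3, 3)]
print(choose_positions(None, local, now=120.0))   # [(3, 3)]
```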
 A mobile robot according to one aspect of the present disclosure includes any of the mobile manipulators described above and the automatic guided vehicle. With this configuration, the mobile manipulator can be moved by the automatic guided vehicle.
 A server according to one aspect of the present disclosure transmits and receives the imaging result information to and from any of the mobile manipulators described above.
 A mobile manipulator system according to one aspect of the present disclosure includes any of the mobile manipulators described above and the server. With this configuration, imaging result information can be shared among a plurality of mobile manipulators via the server.
 The present disclosure is not limited to the embodiments described above; various modifications are possible within the scope of the claims, and embodiments obtained by appropriately combining the technical means disclosed in the different embodiments are also included in the technical scope of the present disclosure.
 1, 1a, 101, 101a  Mobile robot
 2  Work
 10  Manipulator (mobile manipulator)
 11  Robot arm
 13  Work gripping unit
 14  Camera (imaging unit)
 22, 64  Image analysis unit
 23  Robot arm control unit
 24  Automatic guided vehicle control unit
 30, 30a  First communication unit
 40  First storage unit
 50  Automatic guided vehicle
 60  Server (external device)
 61  Second communication unit
 63  Second storage unit

Claims (10)

  1.  A mobile manipulator mounted on an automatic guided vehicle, comprising:
      a robot arm that performs a gripping operation on an object;
      an imaging unit provided on the robot arm;
      a communication unit configured to transmit first imaging result information based on an imaging result of the imaging unit to an external device by communication, and to receive from the external device second imaging result information based on an imaging result of another mobile manipulator; and
      a control unit that controls an operation of the robot arm based on the first imaging result information or the second imaging result information.
  2.  The mobile manipulator according to claim 1, further comprising an image analysis unit that analyzes the image data captured by the imaging unit and obtains position information of the object as the first imaging result information.
  3.  The mobile manipulator according to claim 1, wherein the communication unit transmits the image data captured by the imaging unit to the external device as the first imaging result information.
  4.  The mobile manipulator according to any one of claims 1 to 3, wherein the imaging unit performs imaging after the gripping operation by the robot arm is completed.
  5.  The mobile manipulator according to any one of claims 1 to 4, wherein:
      before the robot arm performs the gripping operation, the communication unit acquires from the external device the second imaging result information corresponding to a point at which the gripping operation is to be performed; and
      the control unit recognizes a position of the object based on the second imaging result information and controls the gripping operation by the robot arm.
  6.  The mobile manipulator according to any one of claims 1 to 4, wherein, when the second imaging result information cannot be acquired from the external device before the robot arm performs a gripping operation, the control unit causes the imaging unit to perform imaging and controls the gripping operation by the robot arm based on the image data captured by the imaging unit.
  7.  The mobile manipulator according to any one of claims 1 to 4, wherein, when the second imaging result information acquired from the external device before the robot arm performs a gripping operation is unusable, the control unit causes the imaging unit to perform imaging and controls the gripping operation by the robot arm based on the image data captured by the imaging unit.
  8.  A mobile robot comprising the mobile manipulator according to any one of claims 1 to 7 and the automatic guided vehicle.
  9.  A server that transmits and receives the first imaging result information and the second imaging result information to and from the mobile manipulator according to any one of claims 1 to 7.
  10.  A mobile manipulator system comprising:
       the mobile manipulator according to any one of claims 1 to 7; and
       a server that transmits and receives the first imaging result information and the second imaging result information to and from the mobile manipulator.
PCT/JP2019/009017 2018-09-07 2019-03-07 Mobile manipulator, mobile robot, server, and mobile manipulator system WO2020049765A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018-168034 2018-09-07
JP2018168034A JP7059872B2 (en) 2018-09-07 2018-09-07 Mobile manipulators, mobile robots, servers, and mobile manipulator systems

Publications (1)

Publication Number Publication Date
WO2020049765A1 true WO2020049765A1 (en) 2020-03-12

Family

ID=69723023

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/009017 WO2020049765A1 (en) 2018-09-07 2019-03-07 Mobile manipulator, mobile robot, server, and mobile manipulator system

Country Status (2)

Country Link
JP (1) JP7059872B2 (en)
WO (1) WO2020049765A1 (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH05228866A (en) * 1991-05-14 1993-09-07 Canon Inc Controller of automatic holding device in use of visual sense
WO2013150598A1 (en) * 2012-04-02 2013-10-10 株式会社安川電機 Robot system and robot control device
JP2016147330A (en) * 2015-02-10 2016-08-18 株式会社三次元メディア Control apparatus based on object recognition
JP2018504333A (en) * 2014-12-16 2018-02-15 アマゾン テクノロジーズ インコーポレイテッド Item gripping by robot in inventory system
US20180043547A1 (en) * 2016-07-28 2018-02-15 X Development Llc Collaborative Inventory Monitoring
JP2019025566A (en) * 2017-07-27 2019-02-21 株式会社日立物流 Picking robot and picking system


Also Published As

Publication number Publication date
JP7059872B2 (en) 2022-04-26
JP2020040147A (en) 2020-03-19

Similar Documents

Publication Publication Date Title
JP4226623B2 (en) Work picking device
JP4265088B2 (en) Robot apparatus and control method thereof
JP5582126B2 (en) Work take-out system, robot apparatus, and workpiece manufacturing method
JP2013132742A (en) Object gripping apparatus, control method for object gripping apparatus, and program
WO2017163973A1 (en) Unmanned movement devices, take-over method, and program
KR102249176B1 (en) Three-way communication system comprising end device, edge server controlling end device and cloud server, and operating method of the same
US20150343637A1 (en) Robot, robot system, and control method
JP2010120141A (en) Picking device for workpieces loaded in bulk and method for controlling the same
KR20130085438A (en) Image processing apparatus and image processing system
CN108778637B (en) Actuator control system, actuator control method, and recording medium
JP2016147330A (en) Control apparatus based on object recognition
US20150343634A1 (en) Robot, robot system, and control method
CN114714333B (en) System for changing tools on a gripper device
JP2006224291A (en) Robot system
JP6958517B2 (en) Manipulators and mobile robots
WO2020049765A1 (en) Mobile manipulator, mobile robot, server, and mobile manipulator system
US20210197391A1 (en) Robot control device, robot control method, and robot control non-transitory computer readable medium
JP6790861B2 (en) Inspection system, controller, inspection method, and inspection program
JP7467984B2 (en) Mobile manipulator, control method and control program for mobile manipulator
JP6238629B2 (en) Image processing method and image processing apparatus
WO2020179507A1 (en) Control device and alignment device
JP2015085457A (en) Robot, robot system, and robot control device
JP6838672B2 (en) Actuator control system, sensor device, control device, actuator control method, information processing program, and recording medium
KR20210057710A (en) Three-way communication system comprising end device, edge server controlling end device and cloud server, and operating method of the same
JP2004286699A (en) Method and detector for detecting image position

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19857794

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19857794

Country of ref document: EP

Kind code of ref document: A1