WO2020049774A1 - Manipulator and mobile robot - Google Patents

Manipulator and mobile robot Download PDF

Info

Publication number
WO2020049774A1
WO2020049774A1 (PCT/JP2019/009737)
Authority
WO
WIPO (PCT)
Prior art keywords
gripping
target object
unit
point
distance
Prior art date
Application number
PCT/JP2019/009737
Other languages
French (fr)
Japanese (ja)
Inventor
達也 古賀
聡庸 金井
昭朗 小林
小林 真司
知大 岡田
孝浩 井上
嘉典 小西
康裕 大西
Original Assignee
OMRON Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by OMRON Corporation
Publication of WO2020049774A1 publication Critical patent/WO2020049774A1/en

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00Controls for manipulators
    • B25J13/08Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J5/00Manipulators mounted on wheels or on carriages

Definitions

  • the present invention relates to a manipulator and a mobile robot.
  • Patent Literature 1 describes a robot sorting system that uses a laser sensor to obtain the distance to the top surface of each of a plurality of packages placed on a carry-in cargo and identifies the package whose top surface is at the highest position. The system acquires the outer-shape information and the destination information of the identified package with a vision sensor, calculates the shape and size of the package, and determines the destination area corresponding to it.
  • The robot sorting system then controls the operation of the robot so that the identified package is lifted by a suction pad and loaded onto the carry-out cargo corresponding to the determined destination area.
  • It is also described that the stacking position of the identified article on the second placing portion is determined based on the shape and size of the article calculated by the calculating unit, the placement state of articles on the second placing portion, and the stacking pattern stored in the pattern storage unit.
  • Patent Literature 2 describes an article transport apparatus including an article take-out unit that removes an article from a circulating conveyor while the conveyor is moving and conveys the removed article to a predetermined transfer position. Among the articles in the article arrangement information, an article whose posture matches a predetermined reference posture is preferentially selected.
  • the present invention has been made in view of the above problems, and an object of the present invention is to provide a manipulator and a mobile robot capable of automatically and efficiently taking out articles.
  • the present invention employs the following configuration in order to solve the above-described problems.
  • A manipulator according to one aspect of the present invention includes: a robot arm that performs a gripping operation on a target object; a measurement unit that measures, in a predetermined spatial region in which a plurality of target objects can be stacked while interacting under their own weights, a grippable point of each target object and the normal direction of the surface at that grippable point;
  • a gripping position specifying unit that specifies a gripping position of each target object based on the respective normal directions; a target object specifying unit that specifies the target object to be gripped based on the spatial positions of the gripping positions specified by the gripping position specifying unit;
  • and a control unit that controls the operation of the robot arm so that the target object specified by the target object specifying unit is gripped at the gripping position specified by the gripping position specifying unit.
  • FIG. 1 is a block diagram showing the main configuration of the manipulator according to the embodiment of the present invention. FIG. 2 is a diagram showing an application scene of the manipulator according to the embodiment.
  • FIG. 3 is a flowchart illustrating control in the manipulator according to the embodiment. FIG. 4 is a flowchart showing the flow of gripping position specification and target object specification according to the embodiment. FIG. 5 is a diagram showing the state transitions of the target objects when the gripping operation is performed.
  • FIG. 6 is a block diagram illustrating the main configuration of the mobile robot according to the embodiment of the present invention.
  • FIG. 1 is a block diagram showing a main part configuration of the manipulator according to the embodiment of the present invention.
  • the manipulator 50 includes at least the robot arm 30, the measuring unit 11, the gripping position specifying unit 12, the object specifying unit 13, and the robot arm control unit 20.
  • The manipulator 50 may further include an imaging unit 5 that captures the predetermined spatial region and acquires a two-dimensional or three-dimensional image, as a configuration for detecting the grippable points on the target objects and the normal directions of the surfaces at those points within the spatial region.
  • As the imaging unit 5, an imaging device such as a camera can be used.
  • the imaging unit 5 generates an image including a target in the predetermined space region by imaging the predetermined space region.
  • The measurement unit 11 is connected to the imaging unit 5 and, based on the image data obtained by the imaging unit 5, measures the grippable points on the target objects and the normal directions of the surfaces at those points in the predetermined spatial region.
  • the gripping position specifying unit 12 is connected to the measuring unit 11 and specifies a gripping position of the target object based on the measurement result obtained by the measuring unit 11.
  • the target specifying unit 13 connected to the grip position specifying unit 12 specifies an object to be gripped.
  • the robot arm control unit 20 is connected to the object specifying unit 13 and controls the operation of the robot arm 30 so that the robot arm 30 grips the selected grip target at a specific grip position.
  • the robot arm 30 has a configuration capable of holding an object.
  • the measurement unit 11, the gripping position identification unit 12, the target object identification unit 13, and the robot arm control unit 20 may be configured as functional blocks included in the control unit 10.
  • the control unit 10 is realized by a CPU (central processing unit) or the like.
  • the manipulator 50 may include a storage unit 40 that stores various programs to be read to execute the processing in the control unit 10 and various data to be read when the programs are executed.
  • With the above configuration, a gripping position and a gripping target can be specified from among a plurality of grippable targets, and gripping can then be performed. Therefore, for each take-out operation, the target object whose removal has the least influence on the remaining objects can be selected and taken out. That is, the objects can be taken out automatically and efficiently while suppressing damage to or breakage of the objects.
  • FIG. 2 is a diagram showing an application scene of the manipulator according to the embodiment of the present invention.
  • FIG. 2A shows a state in which the box 3 is placed on the flow rack 2 and the object 1 is loaded in the box 3.
  • FIG. 2B is an enlarged view of a portion where the box 3 of FIG. 2A is placed.
  • The predetermined spatial region in which the target objects 1 are stacked is a region in which a plurality of target objects 1 can be stacked while interacting under their own weights.
  • That is, it is a region on a placement surface where, when a plurality of target objects 1 are stacked, the weight of one target object 1 is applied as a load to at least one other target object 1.
  • Examples of such a placement surface include a surface inclined with respect to the vertical direction and a surface parallel to the vertical direction (that is, at 90° to the horizontal direction).
  • Such a placement surface is, for example, the surface of the article platform in the flow rack 2 shown in FIG. 2A.
  • The "flow rack" is a rack in which the platform on which articles such as commodities are placed is inclined, descending from the rear, where articles are loaded, toward the front, from which they are taken out.
  • the flow rack 2 is generally used in the field of physical distribution and the like.
  • the predetermined space region is an inner region of a box in which the object is laid.
  • The box 3 is, for example, a cardboard box used for carrying or storing articles such as commodities, in which a plurality of articles can be spread and loaded. In a further specific example, the predetermined spatial region is therefore the region inside the box 3 placed on the flow rack 2, as shown in FIG. 2A. FIG. 2A shows an example in which the placement surface of the box 3 is parallel to the vertical direction.
  • The robot arm 30 has a gripper 31 that grips the target object 1 and an arm 32 to whose tip the gripper 31 is attached. The gripper 31 performs the gripping operation by holding the target object 1 with vacuum suction. Alternatively, a configuration having a plurality of openable and closable claws may be employed to realize the gripping operation.
  • the measuring unit 11 measures a grippable point on the target object and a normal direction of a surface at the grippable point in a predetermined space area.
  • The following are examples of methods for measuring the grippable point on a target object and the normal direction of the surface at that point:
  • A periodic two-dimensional pattern is projected, and the three-dimensional shape is specified from the distortion of the pattern.
  • Linear light is irradiated, and the three-dimensional shape is specified from the distortion of the line.
  • The TOF (time-of-flight) method is used to measure the distance for each pixel.
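  • The measurement approaches above all yield three-dimensional points on the object surfaces. As an illustrative sketch only (not part of the patent), the normal direction at a grippable point could be estimated from three nearby measured points with a cross product; the function name and the sample points are hypothetical:

```python
import math

def surface_normal(p0, p1, p2):
    """Unit normal of the plane through three non-collinear 3-D points.

    Hypothetical helper: a real measurement unit would typically fit a
    local plane to many neighbouring points of the pattern/TOF measurement.
    """
    v1 = [b - a for a, b in zip(p0, p1)]
    v2 = [b - a for a, b in zip(p0, p2)]
    n = [v1[1] * v2[2] - v1[2] * v2[1],   # cross product v1 x v2
         v1[2] * v2[0] - v1[0] * v2[2],
         v1[0] * v2[1] - v1[1] * v2[0]]
    length = math.sqrt(sum(c * c for c in n))
    return [c / length for c in n]

# Three points on a face parallel to the X-Y plane -> unit normal along +Z
normal = surface_normal((0, 0, 0), (1, 0, 0), (0, 1, 0))
```

In practice the sign of the normal would also have to be disambiguated so that it points away from the placement surface.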
  • The measurement unit 11 also measures the position information of the plurality of objects in the predetermined spatial region in the normal direction, and the position information of the objects 1 in the direction along the placement surface from the lowermost part of the predetermined spatial region.
  • Here, the lowermost part means the lowest position in the vertical direction within the predetermined spatial region.
  • the two-dimensional image or three-dimensional image captured by the imaging unit 5 may be a still image or a moving image.
  • Examples of the camera of the imaging unit 5 include a monocular camera, a stereo camera, and a TOF (Time of Flight) camera.
  • the manipulator 50 may include a light irradiating unit that irradiates the object with pattern light or linear light.
  • the imaging unit 5 is provided on the robot arm 30. Note that the imaging unit 5 is not limited to the configuration provided in the robot arm 30, and may be arranged at a fixed position where an area in the box 3 placed on the flow rack 2 can be imaged.
  • the two-dimensional image or the three-dimensional image obtained by the imaging unit 5 is supplied to the measurement unit 11.
  • the measuring unit 11 is a device that performs image processing on image data of an image captured by the image capturing unit 5 to acquire information on a grippable point on the object 1 and information on a normal direction of a surface at the grippable point.
  • the measurement unit 11 includes, for example, a position calculation unit, a distance calculation unit, an angle calculation unit, and the like.
  • Each of the plurality of target objects 1 has at least one surface, and the objects are placed on the placement surface with their surface normals pointing in various directions.
  • Each of the plurality of objects 1 has, for example, a cubic or rectangular parallelepiped shape. Further, the plurality of objects 1 may have the same shape or different shapes.
  • the normal direction of the mounting surface is defined as the Z direction
  • the direction along the mounting surface from the lowermost part of the predetermined space area is defined as the Y direction.
  • The distance from the lowermost part of the predetermined spatial region to the gripping position of the target object 1 along the placement surface is referred to as the first distance (also called the Y value in this specification),
  • and the distance from the placement surface to the gripping position in the normal direction is referred to as the second distance (also called the Z value in this specification).
  • The gripping position specifying unit 12 specifies a gripping position of each object based on the grippable points calculated from the measurement results of the measurement unit 11 and the normal directions of the surfaces at those points.
  • Specifically, the gripping position specifying unit 12 specifies as a gripping position a point on a surface of a placed target object 1 whose normal direction is within a predetermined range of the Z direction. As described later, from among the grippable points serving as gripping candidates, a candidate point can thereby be determined that minimizes the influence on the other objects in the normal direction of the surface of the target object 1.
  • The target object specifying unit 13 sorts the gripping positions in descending order of the first distance (Y value) and selects the one with the largest value as a candidate gripping position. When the distance obtained by adding a predetermined distance to the second distance (Z value) of the candidate gripping position is larger than the second distance of every other gripping position within a predetermined distance range along the placement surface from the candidate gripping position, the object corresponding to the candidate gripping position is specified as the object to be gripped.
  • the storage unit 40 connected to the control unit 10 stores various programs to be read out for executing the processing in the control unit 10 and various data to be read out when executing the programs.
  • the storage unit 40 may include, for example, a control object specifying result storage unit and a robot arm control storage unit.
  • The robot arm control unit 20 controls the operation of the robot arm 30 so that the robot arm 30 grips the target object 1 specified by the target object specifying unit 13.
  • the robot arm control unit 20 moves the arm unit 32 to a position where the grip unit 31 can grip the target object 1.
  • the robot arm control unit 20 controls the operation of the grip unit 31 so as to grip the target object 1.
  • The robot arm control unit 20 may further control the operation of moving the grasped target object 1 to a target position and releasing it there.
  • FIG. 3 is a flowchart showing a flow of robot arm control in the manipulator 50 according to the embodiment of the present invention illustrated in FIG.
  • First, the measurement unit 11 measures the grippable points on the target objects and the normal directions of the surfaces at those points in the predetermined spatial region (step S101).
  • the grippable point on the object and the normal direction data of the surface at the grippable point measured by the measuring unit 11 are sequentially stored in the storage unit 40.
  • Next, the gripping position specifying unit 12 specifies the gripping positions of the objects 1 based on the measurement results of the measurement unit 11 (step S102). Specifically, the gripping position specifying unit 12 specifies, as a gripping position, a grippable point whose surface normal direction is within a predetermined range. By specifying such a point, the possibility that the robot arm 30 hits another target object 1 while gripping the target object 1 can be suppressed. Data on the gripping positions specified by the gripping position specifying unit 12 are also sequentially stored in the storage unit 40. The gripping position specifying unit 12 then supplies the obtained gripping position information of the target objects 1 to the target object specifying unit 13.
  • Next, the target object specifying unit 13 specifies the target object 1 to be gripped based on the gripping position information received from the gripping position specifying unit 12 (step S103). Specifically, the target object specifying unit 13 specifies the target object 1 to be gripped based on the first distance (Y value) of each gripping position, measured from the lowermost part of the spatial region in the direction along the placement surface, and its second distance (Z value) in the normal direction from the placement surface.
  • Then, the robot arm control unit 20 controls the operation of the robot arm 30 so as to grip the target object 1 specified by the target object specifying unit 13 at the gripping position specified by the gripping position specifying unit 12 (step S104).
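  • The overall control flow of steps S101 to S104 can be sketched as a small pipeline. All hook names below (`measure`, `specify_positions`, `specify_target`, `move_arm`) are hypothetical stand-ins for units 11, 12, 13, and 20; they are not names used in the patent:

```python
def control_cycle(measure, specify_positions, specify_target, move_arm):
    """One take-out cycle of the manipulator (hypothetical hook names).

    measure            -> S101: measurement unit 11
    specify_positions  -> S102: gripping position specifying unit 12
    specify_target     -> S103: target object specifying unit 13
    move_arm           -> S104: robot arm control unit 20
    """
    candidates = measure()                     # grippable points + normals
    positions = specify_positions(candidates)  # keep points whose normal fits
    target = specify_target(positions)         # pick the object to grip
    if target is not None:
        move_arm(target)                       # grip at the specified position
    return target

# Minimal dry run with stub units
taken = control_cycle(
    measure=lambda: ["p1", "p2"],
    specify_positions=lambda c: c[:1],
    specify_target=lambda p: p[0] if p else None,
    move_arm=lambda t: None,
)
```

A real system would loop this cycle until the spatial region is empty.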
  • FIG. 4 is a flowchart showing the flow of specifying the gripping position in step S102 and specifying the target in step S103.
  • FIG. 5 is a diagram showing a state transition of the object when the manipulator according to the embodiment of the present invention performs a grasping operation of the object.
  • the robot arm 30 has a gripper 31 for gripping the target object 1 and an arm 32 to which the gripper 31 is attached at the tip.
  • Here, the distance from the axis center of the gripper 31 to the side surface of the arm 32 is defined as a predetermined distance A,
  • and the distance from the connection between the gripper 31 and the arm 32 to the tip of the gripper 31 is defined as a predetermined distance B.
  • In step S201, the gripping position specifying unit 12 calculates all grippable points that may be gripped by the robot arm 30, based on the normal-direction information derived from the measurement results of the measurement unit 11.
  • In the example of FIG. 5, grippable points <i> to <viii> are calculated.
  • In step S202, it is determined whether the normal direction of the surface at each grippable point is within a predetermined range; specifically, whether the normal direction at the grippable point is within a predetermined angle range of the direction perpendicular to the placement surface.
  • If the normal direction at the grippable point is within the predetermined range (Yes), the process proceeds to step S204. If it is outside the predetermined range (No), the process proceeds to step S203, and the currently selected point is excluded from the gripping candidate points. Steps S201 to S204 are repeated in this way.
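  • The check of steps S202 and S203 amounts to an angle threshold on each measured normal. A minimal sketch follows; the 20° threshold and the example normals are invented for illustration and are not values from the patent:

```python
import math

def within_angle(normal, axis=(0.0, 0.0, 1.0), max_deg=20.0):
    """True if `normal` deviates from `axis` (the direction perpendicular
    to the placement surface) by at most `max_deg` degrees."""
    dot = sum(a * b for a, b in zip(normal, axis))
    norms = math.sqrt(sum(a * a for a in normal)) * math.sqrt(sum(a * a for a in axis))
    angle = math.degrees(math.acos(max(-1.0, min(1.0, dot / norms))))
    return angle <= max_deg

# Hypothetical measured normals for two grippable points
candidates = {"i": (0.5, 0.0, 0.87), "ii": (0.0, 0.0, 1.0)}
kept = [name for name, n in candidates.items() if within_angle(n)]  # "i" is ~30 deg off
```

Points whose normals fail the test correspond to the exclusions in step S203.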
  • In the example of FIG. 5, grippable points <i> to <viii> are calculated in step S201.
  • In step S202, the grippable points <ii>, <iv>, <vi>, <vii>, and <viii> are determined to have normal directions within the predetermined range (Yes), and the process proceeds to step S204.
  • The grippable points <i>, <iii>, and <v> are determined to have normal directions outside the predetermined range (No), and the process proceeds to step S203, where they are excluded from the gripping candidate points.
  • In step S204, the target object specifying unit 13 sorts the grippable points whose normal directions were determined to be within the predetermined range in descending order of Y value, and selects the point with the largest Y value as the candidate gripping point.
  • Here, the grippable point <ii>, which has the largest Y value among the candidate gripping points, is selected.
  • In step S205, it is determined whether, among the other gripping points whose Y values lie within the predetermined distance A of the candidate gripping point's Y value, there is none whose Z value is larger than the Z value of the candidate gripping point plus the predetermined distance B. If there is none (Yes), the process proceeds to step S207, and the candidate gripping point is determined as the gripping point. In (a) of FIG. 5, no other gripping point has a Z value larger than the Z value of the candidate gripping point <ii> plus the predetermined distance B,
  • so in step S207 the candidate gripping point <ii> is determined as the gripping point. If there is a larger one (No), the process proceeds to step S206, and the currently selected point is excluded. Steps S204 to S207 are repeated in this way.
  • FIG. 5B shows the state after the target object 1 has been gripped at the grippable point <ii> and taken out.
  • Next, the grippable point <iv>, which now has the largest Y value, is selected as the candidate gripping point in step S204. In step S205, it is determined that, within the range of the predetermined distance A from the Y value of the candidate gripping point, there is no other gripping point whose Z value is larger than the Z value of the candidate gripping point <iv> plus the predetermined distance B (Yes). The process proceeds to step S207, and the candidate gripping point <iv> is determined as the gripping point.
  • FIG. 5C shows the state after the target object has been gripped at the grippable point <iv> and taken out.
  • Next, the grippable point <vi>, which has the largest Y value among the grippable points <vi> to <viii>, is selected as the candidate gripping point in step S204.
  • However, the nearby grippable point <vii> has a Z value larger than the Z value of <vi> plus the predetermined distance B.
  • Therefore, the determination in step S205 is No, and the process proceeds to step S206, where the grippable point <vi> is excluded from the gripping candidate points.
  • Then, the grippable point <vii>, which has the largest Y value of the remaining grippable points <vii> and <viii>, is selected as the candidate gripping point.
  • In step S205, it is determined that no other gripping point within the range has a Z value larger than the Z value of the candidate gripping point <vii> plus the predetermined distance B (Yes). The process proceeds to step S207, and the candidate gripping point <vii> is determined as the gripping point.
  • FIG. 5D shows the state after the target object has been gripped at the grippable point <vii> and taken out.
  • Next, in step S204, the grippable point <vi>, which has the larger Y value of the grippable points <vi> and <viii>, is selected as the candidate gripping point.
  • However, the grippable point <viii> has a Z value larger than the Z value of <vi> plus the predetermined distance B.
  • Therefore, the determination in step S205 is No, and the process proceeds to step S206, where the grippable point <vi> is excluded from the gripping candidate points.
  • After this exclusion, the remaining grippable point <viii> is selected as the candidate gripping point.
  • In step S205, it is determined that, among the other gripping points whose Y values lie within the predetermined distance A of the candidate gripping point's Y value, none has a Z value larger than the Z value of the candidate gripping point <viii> plus the predetermined distance B (Yes). The process proceeds to step S207, and the candidate gripping point <viii> is determined as the gripping point.
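  • The complete selection loop of steps S204 to S207, repeated after each take-out, can be sketched as follows. The Y/Z coordinates and the values of A and B are invented for illustration and chosen so that the resulting take-out order reproduces the FIG. 5 walkthrough (<ii>, <iv>, <vii>, <viii>, and finally <vi>):

```python
def pick_order(points, A=1.0, B=0.2):
    """Order in which grip points are taken out (sketch of S204-S207).

    points: name -> (Y, Z).  A: clearance range along the placement
    surface, B: gripper protrusion beyond the arm (cf. FIG. 4/5).
    """
    remaining = dict(points)
    order = []
    while remaining:
        # S204: candidates sorted in descending order of Y value
        for name, (y, z) in sorted(remaining.items(), key=lambda kv: -kv[1][0]):
            # S205: blocked if a neighbour within A sticks up above Z + B
            blocked = any(
                oz > z + B
                for other, (oy, oz) in remaining.items()
                if other != name and abs(oy - y) <= A
            )
            if not blocked:          # S207: grip this point and take it out
                order.append(name)
                del remaining[name]
                break
        else:                        # no candidate passed the S205 check
            break
    return order

# Hypothetical layout mimicking FIG. 5
pts = {"ii": (9.0, 1.0), "iv": (8.0, 1.0), "vi": (7.0, 0.5),
       "vii": (6.8, 1.0), "viii": (6.5, 1.0)}
order = pick_order(pts)
```

Note how <vi> is excluded twice (S206) while taller neighbours remain, and is gripped only once <vii> and <viii> have been removed.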
  • FIG. 6 is a block diagram showing a main configuration of the mobile robot 100 according to the embodiment of the present invention.
  • the mobile robot 100 includes the manipulator 50 and the automatic guided vehicle 60.
  • the manipulator 50 is mounted on the upper part of the automatic guided vehicle 60.
  • the automatic guided vehicle 60 may include, for example, a communication unit that communicates with an external system and an automatic guided vehicle control unit that controls the operation of the automatic guided vehicle.
  • the communication with the external system is performed by, for example, wireless communication via a wireless communication network.
  • One example of the external system is a system that transmits, to the automatic guided vehicle control unit, a signal instructing the mobile robot 100 to move, and thereby controls the operation of the automatic guided vehicle 60.
  • the automatic guided vehicle 60 may include a pair of wheels and a motor that drives each of the pair of wheels simultaneously or separately.
  • the manipulator 50 includes the imaging unit 5, and the imaging unit 5 may be provided at the tip of the robot arm 30.
  • the control block of the manipulator 50 (in particular, the control unit 10) may be realized by a logic circuit (hardware) formed on an integrated circuit (IC chip) or the like, or may be realized by software.
  • the manipulator 50 includes a computer that executes instructions of a program that is software for realizing each function.
  • the computer includes, for example, one or more processors and a computer-readable recording medium storing the program. Then, in the computer, the object of the present invention is achieved when the processor reads the program from the recording medium and executes the program.
  • the processor for example, a CPU (Central Processing Unit) can be used.
  • Examples of the recording medium include "non-transitory tangible media" such as a ROM (Read Only Memory), tapes, disks, cards, semiconductor memories, and programmable logic circuits.
  • The computer may further include a RAM (Random Access Memory) into which the above program is loaded.
  • the program may be supplied to the computer via an arbitrary transmission medium (a communication network, a broadcast wave, or the like) capable of transmitting the program.
  • one aspect of the present invention can also be realized in the form of a data signal embedded in a carrier wave, in which the program is embodied by electronic transmission.
  • A manipulator according to one aspect of the present invention includes: a robot arm that performs a gripping operation on a target object; a measurement unit that measures, in a predetermined spatial region in which a plurality of target objects can be stacked while interacting under their own weights, a grippable point of each target object and the normal direction of the surface at that grippable point;
  • a gripping position specifying unit that specifies a gripping position of each target object based on the respective normal directions; a target object specifying unit that specifies the target object to be gripped based on the spatial positions of the gripping positions specified by the gripping position specifying unit;
  • and a control unit that controls the operation of the robot arm so that the target object specified by the target object specifying unit is gripped at the gripping position specified by the gripping position specifying unit.
  • the grip position specifying unit may specify the surface whose normal direction is within a predetermined range as the grip position.
  • In the manipulator according to the one aspect, the target objects may be placed in the spatial region on a placement surface inclined with respect to the vertical direction or on a placement surface parallel to the vertical direction, and the target object specifying unit may specify the target object to be gripped based on the first distance, in the direction along the placement surface from the lowermost part of the spatial region, of each gripping position specified by the gripping position specifying unit, and its second distance in the normal direction from the placement surface.
  • According to the above configuration, an optimal gripping point can be determined from among the gripping candidate points based on the positional relationship between the target object and the other objects, each of which bears the weight of others as a load. Thereby, when taking out the target object, the object can be taken out while minimizing the risk that the robot arm contacts another object or that another object is damaged or broken.
  • In the manipulator according to the one aspect, the target object specifying unit may sort the gripping positions in descending order of the first distance, select the gripping position with the largest value as a candidate gripping position, and,
  • when the distance obtained by adding a predetermined distance to the second distance of the candidate gripping position is larger than the second distance of the other gripping positions within a predetermined distance range in the direction along the placement surface from the candidate gripping position,
  • specify the object corresponding to the candidate gripping position as the object to be gripped.
  • According to the above configuration, the target object can be taken out from among a plurality of objects, each of which bears the weight of others as a load, while minimizing the risk that the robot arm contacts an object other than the target or that another object is damaged or broken.
  • a mobile robot includes a manipulator and an automatic guided vehicle.
  • Since the automatic guided vehicle allows the manipulator to be moved automatically, an operator does not need to move the manipulator manually. Therefore, work efficiency can be further improved.
  • the present invention can be used for manipulators and mobile robots.

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Human Computer Interaction (AREA)
  • Manipulator (AREA)

Abstract

The objective of the present invention is to provide a novel manipulator and mobile robot. A manipulator according to one aspect of the present invention is provided with: a robot arm which performs a grasping operation with respect to a target object; a measuring unit which measures a graspable point of the target object and a normal line direction of a surface at the graspable point, in a predetermined spatial region in which a plurality of target objects can be stacked in a state in which the target objects interact with one another as a result of the self weights thereof; a grasping position identifying unit which identifies a grasping position for each target object on the basis of the respective normal line directions; a target object identifying unit which identifies the target object to be grasped, on the basis of the spatial positions of the grasping positions identified by the grasping position identifying unit; and a control unit which controls the operation of the robot arm in such a way that the target object identified by the target object identifying unit is grasped at the grasping position identified by the grasping position identifying unit.

Description

マニピュレータおよび移動ロボットManipulators and mobile robots
 本発明はマニピュレータおよび移動ロボットに関する。 The present invention relates to a manipulator and a mobile robot.
 従来、物流センターなどにおいて、棚に積載された複数の商品等の物品の中から所定の物品を所定の個数取り出し、所定の搬送先に移送する作業が行われる。例えば、棚に段ボール箱などの箱に入った物品の中から所定の物品を取り出す。この作業は、ロボットによる自動化がされている。その際、複数の物品の積載状況に応じて物品の積み上げ位置を決定したり、載置されている姿勢を考慮して搬送する物品を選択する技術が開発されている。 2. Description of the Related Art Conventionally, in a distribution center or the like, an operation of extracting a predetermined number of predetermined articles from a plurality of articles or the like loaded on a shelf and transferring the same to a predetermined destination is performed. For example, a predetermined article is taken out of articles contained in a box such as a cardboard box on a shelf. This work is automated by a robot. At this time, a technique has been developed in which a stacking position of the articles is determined in accordance with a loading situation of a plurality of articles, and an article to be conveyed is selected in consideration of a mounted posture.
 特許文献1には、ロボット仕分けシステムであって、レーザセンサにより搬入側カーゴに載置された複数の荷物それぞれの上面までの距離情報を取得し、上面が最も高い位置に存在する荷物を特定し、ビジョンセンサにより上記特定された特定の荷物の外形情報を取得すると共に特定の荷物の仕向地情報の取得を図り、特定の荷物の形状及び大きさを算出し、特定の荷物に対応した特定の仕向地区域を決定し、特定の荷物を吸着パッドにより持ち上げつつ特定の仕向地区域に対応した特定の搬出側カーゴへ積み付けるように、ロボットの動作を制御するロボット仕分けシステムが記載されている。そして、前記算出手段により算出された前記特定の物品の形状及び大きさ、前記特定の第2載置部における前記物品の載置状況、及び、前記パターン記憶手段に記憶された前記積み付けパターン、に基づき、前記特定の第2載置部への前記特定の物品の積み付け位置を決定することが記載されている。 Patent Literature 1 discloses a robot sorting system that obtains distance information to a top surface of each of a plurality of packages placed on a carry-in side cargo by using a laser sensor, and specifies a package whose top surface is at the highest position. By acquiring the outer shape information of the specific package specified by the vision sensor and obtaining the destination information of the specific package, calculating the shape and size of the specific package, and calculating the specific shape corresponding to the specific package. A robot sorting system is described which determines a destination area and controls the operation of the robot such that a specific load is lifted by a suction pad and loaded onto a specific unloading side cargo corresponding to the specific destination area. Then, the shape and size of the specific article calculated by the calculating unit, the mounting state of the article on the specific second mounting unit, and the stacking pattern stored in the pattern storage unit, It is described that the position of stacking the specific article on the specific second placing portion is determined based on the information.
 Patent Literature 2 describes an article conveying apparatus including article take-out means that removes articles arranged on a circulating conveyor while the conveyor is rotating and conveys the removed articles to a predetermined conveyance position, in which an article having a posture that matches a predetermined reference posture in the article arrangement information is preferentially selected.
Patent Literature 1: Japanese Unexamined Patent Application Publication No. 2013-086914 (published May 13, 2013)
Patent Literature 2: Japanese Unexamined Patent Application Publication No. 2017-205841 (published November 24, 2017)
 In conventional picking of bulk-loaded articles such as merchandise, when one article is taken out from among a plurality of tightly packed articles, the load due to the weight of the surrounding articles is applied to the target article, so the extraction may fail. There are also problems such as collapsing the packed articles and thereby damaging or breaking objects, or articles striking the surrounding cardboard or the like.
 The present invention has been made in view of the above problems, and an object thereof is to provide a manipulator and a mobile robot that enable automatic and efficient removal of articles.
 To solve the above problems, the present invention adopts the following configuration.
 That is, a manipulator according to one aspect of the present invention includes: a robot arm that performs a gripping operation on a target object; a measurement unit that measures, in a predetermined spatial region in which a plurality of target objects can be loaded in a state where they receive mutual interaction due to their own weight, a grippable point on each target object and the normal direction of the surface at that grippable point; a gripping position specifying unit that specifies a gripping position of each target object based on the normal direction; a target object specifying unit that specifies the target object to be gripped based on the spatial position of the gripping position specified by the gripping position specifying unit; and a control unit that controls the operation of the robot arm so as to grip the target object specified by the target object specifying unit at the gripping position specified by the gripping position specifying unit.
 According to one aspect of the present invention, it is possible to improve the success rate of automatically taking out an object from a predetermined region.
FIG. 1 is a block diagram showing the main configuration of a manipulator according to an embodiment of the present invention.
FIG. 2 is a diagram showing an application scene of the manipulator according to the embodiment of the present invention.
FIG. 3 is a flowchart showing control in the manipulator according to the embodiment of the present invention.
FIG. 4 is a flowchart showing the flow of gripping position specification and target object specification according to the embodiment of the present invention.
FIG. 5 is a diagram showing state transitions of target objects when a gripping operation is executed by the manipulator according to the embodiment of the present invention.
FIG. 6 is a block diagram showing the main configuration of a mobile robot according to an embodiment of the present invention.
 [Embodiment 1: Configuration of Manipulator]

 §1 Application Example

 First, the outline of the configuration of the manipulator of the present invention will be described with reference to FIG. 1.
 FIG. 1 is a block diagram showing the main configuration of the manipulator according to the embodiment of the present invention.
 The manipulator 50 includes at least a robot arm 30, a measurement unit 11, a gripping position specifying unit 12, a target object specifying unit 13, and a robot arm control unit 20.
 The manipulator 50 may further include an imaging unit 5 that captures an image of the predetermined spatial region and obtains a two-dimensional or three-dimensional image, as a configuration for detecting information on the grippable points of the target objects in the spatial region and the normal directions of the surfaces at those points. An imaging device such as a camera can be used as the imaging unit 5.
 The imaging unit 5 generates an image including the target objects in the predetermined spatial region by imaging that region.
 The measurement unit 11 is connected to the imaging unit 5 and, based on the image data obtained by the imaging unit 5, measures the grippable points on the target objects and the normal directions of the surfaces at those points in the predetermined spatial region.
 The gripping position specifying unit 12 is connected to the measurement unit 11 and specifies the gripping position of each target object based on the measurement results obtained by the measurement unit 11.
 Based on the gripping positions specified by the gripping position specifying unit 12, the target object specifying unit 13, which is connected to the gripping position specifying unit 12, specifies the target object to be gripped.
 The robot arm control unit 20 is connected to the target object specifying unit 13 and controls the operation of the robot arm 30 so that the robot arm 30 grips the selected target object at the specified gripping position.
 The robot arm 30 has a configuration capable of gripping a target object.
 The measurement unit 11, the gripping position specifying unit 12, the target object specifying unit 13, and the robot arm control unit 20 may be configured as functional blocks included in a control unit 10. The control unit 10 is realized by a CPU (central processing unit) or the like.
 The manipulator 50 may also include a storage unit 40 that stores various programs read out to execute the processing in the control unit 10 and various data read out when those programs are executed.
 According to the above configuration, for target objects loaded in a state where a plurality of objects receive mutual interaction due to their own weight, gripping can be executed by specifying the gripping position and the object to be gripped from among the plurality of grippable objects. Therefore, for each take-out operation, it is possible to select and take out the object that minimizes the influence on the objects other than the one being removed. That is, objects can be taken out automatically and efficiently while suppressing damage to or breakage of the objects.
 §2 Configuration Example

 Next, an example of the configuration of the manipulator 50 according to the present embodiment will be described in detail with reference to FIG. 2 in addition to FIG. 1.
 FIG. 2 is a diagram showing an application scene of the manipulator according to the embodiment of the present invention. FIG. 2(a) shows a state in which a box 3 is placed on a flow rack 2 and target objects 1 are loaded in the box 3. FIG. 2(b) is an enlarged view of the portion of FIG. 2(a) where the box 3 is placed.
 Here, the predetermined spatial region in which the target objects 1 are loaded is a region in which a plurality of target objects 1 can be loaded in a state where they receive mutual interaction due to their own weight.
 For example, it is a region on a placement surface on which the target objects 1 are loaded such that, when a plurality of target objects 1 are stacked, the weight of one target object 1 is applied as a load to at least one other target object 1. Specific examples include a placement surface inclined with respect to the vertical direction, or a placement surface parallel to the vertical direction (that is, at 90° to the horizontal direction).
 An example of such a placement surface is the surface of the article-placement platform of the flow rack 2 illustrated in FIG. 2. A "flow rack" is a rack in which the platform on which articles such as merchandise are placed is inclined, descending vertically from the rear toward the front surface from which articles are taken out. Flow racks 2 are commonly used in fields such as physical distribution.
 In one example, the predetermined spatial region is the inner region of a box in which the target objects are packed. The box 3 is, for example, a box such as a cardboard box that is used for transporting or storing articles such as merchandise and in which a plurality of articles can be packed and loaded. Therefore, in a further specific example, the predetermined spatial region is the region inside the box 3 placed on the flow rack 2 as shown in FIG. 2. The top row of FIG. 2(a) shows an example in which the placement surface of the box 3 is parallel to the vertical direction.
 In the example of FIG. 2, the robot arm 30 has a gripping unit 31 that grips the target object 1 and an arm unit 32 to whose tip the gripping unit 31 is attached. The gripping unit 31 performs the gripping operation by holding the target object 1 by vacuum suction. Alternatively, to realize the gripping operation, the gripping unit may have a plurality of claws that can open and close.
 The measurement unit 11 measures the grippable points on the target objects and the normal directions of the surfaces at those points in the predetermined spatial region. Examples of methods for measuring the grippable points and the surface normal directions include: (1) projecting light in a two-dimensional pattern (a periodic pattern) or linear light, and determining the three-dimensional shape from the distortion of the pattern; (2) a TOF (time of flight) method, which measures the distance for each pixel; (3) stereo imaging with two cameras to determine the three-dimensional shape; or (4) detecting edges in a two-dimensional image and determining the three-dimensional shape by pattern matching against a model of the target object.
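 The patent does not give an implementation of the normal computation; methods (1)–(3) above all yield a point cloud, from which surface normals are commonly estimated by local PCA. The following is a minimal sketch under that assumption — the function name, neighborhood size, and sensor-at-origin orientation convention are illustrative, not from the source:

```python
import numpy as np

def estimate_normal(points, k_neighbors=20, query_index=0):
    """Estimate the surface normal at one point of a point cloud by PCA
    over its k nearest neighbors: the eigenvector of the local covariance
    with the smallest eigenvalue approximates the surface normal."""
    query = points[query_index]
    # k nearest neighbors by Euclidean distance (brute force for clarity)
    dists = np.linalg.norm(points - query, axis=1)
    neighbors = points[np.argsort(dists)[:k_neighbors]]
    centered = neighbors - neighbors.mean(axis=0)
    cov = centered.T @ centered
    eigvals, eigvecs = np.linalg.eigh(cov)  # eigenvalues in ascending order
    normal = eigvecs[:, 0]                  # smallest-eigenvalue direction
    # Orient the normal toward the sensor, assumed here to sit at the origin.
    if np.dot(normal, -query) < 0:
        normal = -normal
    return normal / np.linalg.norm(normal)

# Synthetic patch of the plane z = 1: the estimated normal is (anti)parallel to Z.
rng = np.random.default_rng(0)
xy = rng.uniform(-1.0, 1.0, size=(200, 2))
cloud = np.column_stack([xy, np.full(200, 1.0)])
n = estimate_normal(cloud, query_index=5)
print(np.round(np.abs(n), 3))  # ~ [0. 0. 1.]
```

 In practice a k-d tree (or a library such as a point-cloud processing toolkit) would replace the brute-force neighbor search, but the eigen-decomposition step is the same.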
 The measurement unit 11 also measures, for the plurality of target objects in the predetermined spatial region, position information in the normal direction and position information of the target objects 1 in the direction along the placement surface from the lowermost part of the predetermined spatial region. Here, the lowermost part means the lowest position in the vertical direction within the predetermined spatial region.
 The two-dimensional or three-dimensional image captured by the imaging unit 5 may be a still image or a moving image. Examples of the camera of the imaging unit 5 include a monocular camera, a stereo camera, and a TOF (time of flight) camera. The manipulator 50 may also include a light projection unit that projects pattern light or linear light onto the target objects. In the present embodiment, the imaging unit 5 is provided on the robot arm 30. However, the imaging unit 5 is not limited to this arrangement; for example, it may be arranged at a fixed position from which the region inside the box 3 placed on the flow rack 2 can be imaged.
 The two-dimensional or three-dimensional image obtained by the imaging unit 5 is supplied to the measurement unit 11.
 The measurement unit 11 is a device that performs image processing on the image data captured by the imaging unit 5 to acquire the grippable points on the target objects 1 and information on the normal directions of the surfaces at those points. The measurement unit 11 includes, for example, a position calculation unit, a distance calculation unit, and an angle calculation unit.
 In the illustration of FIG. 2, each of the plurality of target objects 1 has at least one surface, and the objects are placed on the placement surface with the normal directions of their surfaces pointing in various directions. Each of the plurality of target objects 1 has, for example, a cubic or rectangular parallelepiped shape. The plurality of target objects 1 may have the same shape or different shapes.
 Here, as also illustrated in FIG. 2, the normal direction of the placement surface is defined as the Z direction, and the direction along the placement surface from the lowermost part of the predetermined spatial region is defined as the Y direction.
 The distance from the lowermost part of the predetermined spatial region to the gripping position of a target object 1 along the placement surface is defined as a first distance (also referred to herein as the Y value), and the distance from the placement surface to the gripping position in the normal direction is defined as a second distance (also referred to herein as the Z value).
 The gripping position specifying unit 12 specifies the gripping position of each target object based on the grippable points on the target objects and the normal directions of the surfaces at those points, which are calculated from the measurement results of the measurement unit 11.
 In the present embodiment, the gripping position specifying unit 12 specifies, as a gripping position, a surface of a placed target object 1 whose normal direction (Z direction) is within a predetermined range. As described later, from among the grippable points that are gripping candidates, a candidate gripping point can be determined that minimizes the influence on the other target objects in the normal direction of the surface of the target object 1.
 The target object specifying unit 13 sorts the gripping positions in descending order of the first distance (Y value) and selects the gripping position with the largest value as a candidate gripping position. If the distance obtained by adding a predetermined distance to the second distance (Z value) of the candidate gripping position is larger than the second distances of the other gripping positions lying within a predetermined distance range from the candidate gripping position in the direction along the placement surface, the target object corresponding to the candidate gripping position is specified as the target object to be gripped.
 The above embodiment, the control of each block of the manipulator, and the control of the gripping position specifying unit 12 are described in detail below in (Flow of Manipulator Control) and (Detailed Description of the Flow of Gripping Position Specification and Target Object Specification).
 The storage unit 40 connected to the control unit 10 stores the various programs read out to execute the processing in the control unit 10 and the various data read out when those programs are executed. Specifically, the storage unit 40 may include, for example, a target object specification result storage unit and a robot arm control storage unit.
 The robot arm control unit 20 controls the operation of the robot arm 30 so as to grip the target object 1 specified by the target object specifying unit 13 in the control unit 10.
 In the examples of FIGS. 1 and 2, the robot arm control unit 20 moves the arm unit 32 to a position where the gripping unit 31 can grip the target object 1. Next, the robot arm control unit 20 controls the operation of the gripping unit 31 so as to grip the target object 1. The robot arm control unit 20 may further control the operation of moving the gripped target object 1 to a destination position and releasing it.
 §3 Operation Example

 (Flow of Manipulator Control)

 The flow of manipulator control in the present embodiment will be described below with reference to FIG. 3. FIG. 3 is a flowchart showing the flow of robot arm control in the manipulator 50 according to the embodiment of the present invention illustrated in FIG. 1.
 First, based on the image data captured by the imaging unit 5, the measurement unit 11 measures the grippable points on the target objects and the normal directions of the surfaces at those points in the predetermined spatial region (step S101). The grippable points and the surface normal direction data measured by the measurement unit 11 are sequentially stored in the storage unit 40.
 Next, the gripping position specifying unit 12 specifies the gripping position of each target object 1 based on the measurement results of the measurement unit 11 (step S102). Specifically, the gripping position specifying unit 12 specifies, as gripping positions, the grippable points whose surface normal directions are within a predetermined range. By specifying such surfaces as gripping positions, the possibility that the robot arm 30 hits other target objects 1 when gripping the intended target object 1 can be reduced. The data on the gripping positions specified by the gripping position specifying unit 12 are also sequentially stored in the storage unit 40. The gripping position specifying unit 12 then supplies the acquired gripping position information of each target object 1 to the target object specifying unit 13.
 Next, the target object specifying unit 13 specifies the target object 1 to be gripped based on the gripping position information received from the gripping position specifying unit 12 (step S103). Specifically, the target object specifying unit 13 specifies the target object 1 to be gripped based on the first distance (Y value) of each gripping position specified by the gripping position specifying unit 12, measured from the lowermost part of the spatial region in the direction along the placement surface, and the second distance (Z value) from the placement surface in the normal direction.
 The robot arm control unit 20 controls the operation of the robot arm 30 so as to grip the target object 1 specified by the target object specifying unit 13 at the gripping position specified by the gripping position specifying unit 12 (step S104).
 (Detailed Description of the Flow of Gripping Position Specification and Target Object Specification)

 FIG. 4 is a flowchart showing the flow of gripping position specification in step S102 and target object specification in step S103.
 FIG. 5 is a diagram showing the state transitions of the target objects when the manipulator according to the embodiment of the present invention executes gripping operations on the target objects.
 As shown in FIG. 2, the robot arm 30 has the gripping unit 31 that grips the target object 1 and the arm unit 32 to whose tip the gripping unit 31 is attached. Here, the distance from the center of the axis of the gripping unit 31 to the side surface of the arm unit 32 is defined as a predetermined distance A, and the distance from the connection between the gripping unit 31 and the arm unit 32 to the tip of the gripping unit 31 is defined as a predetermined distance B.
 First, in step S201, the gripping position specifying unit 12 calculates all grippable points that the robot arm 30 may grip, based on the normal direction information calculated from the measurement results of the measurement unit 11. In FIG. 5(a), grippable points <i> to <viii> are calculated.
 Next, in step S202, it is determined whether the normal direction of the surface at each grippable point is within a predetermined range. Specifically, it is determined whether the normal direction at the grippable point is within a predetermined angular range from the direction perpendicular to the placement surface.
 If the normal direction at the grippable point is within the predetermined range (Yes), the process proceeds to step S204. If the normal direction at the grippable point is outside the predetermined range (No), the process proceeds to step S203, and the currently selected gripping point is excluded from the gripping candidate points. In this way, steps S201 to S204 are repeated.
 First, in FIG. 5(a), grippable points <i> to <viii> are calculated in step S201. Next, in step S202, grippable points <ii>, <iv>, <vi>, <vii>, and <viii> are determined to have normal directions within the predetermined range (Yes), and the process proceeds to step S204. On the other hand, grippable points <i>, <iii>, and <v> are determined to have normal directions outside the predetermined range (No), and the process proceeds to step S203, where they are excluded from the candidate gripping points.
 Next, in step S204, the target object specifying unit 13 sorts the grippable points whose normal directions were determined to be within the predetermined range in descending order of Y value, and selects the point with the largest Y value as the candidate gripping point. In FIG. 5(a), grippable point <ii>, which has the largest Y value among the candidate gripping points, is selected.
 Next, in step S205, it is determined whether any of the other gripping points lying within the predetermined distance A from the Y value of the candidate gripping point has a Z value larger than the Z value of the candidate gripping point plus the predetermined distance B. If there is no such point (Yes), the process proceeds to step S207, and the candidate gripping point is determined as the gripping point. In FIG. 5(a), none of the gripping points other than candidate gripping point <ii> has a Z value larger than the Z value of candidate gripping point <ii> plus the predetermined distance B (Yes), so the process proceeds to step S207 and candidate gripping point <ii> is determined as the gripping point. If there is such a point (No), the process proceeds to step S206, and the currently selected gripping point is excluded. In this way, steps S204 to S207 are repeated.
 Next, FIG. 5(b) shows the state after the target object 1 has been gripped at grippable point <ii> and taken out. In the state of FIG. 5(b), grippable point <iv>, which now has the largest Y value, is selected as the candidate gripping point in step S204. In step S205, it is determined that none of the other gripping points within the predetermined distance A from the Y value of the candidate gripping point has a Z value larger than the Z value of candidate gripping point <iv> plus the predetermined distance B (Yes), so the process proceeds to step S207 and candidate gripping point <iv> is determined as the gripping point.
 FIG. 5(c) shows the state after the target object has been gripped at grippable point <iv> and taken out. In the state of FIG. 5(c), grippable point <vi>, which has the largest Y value among grippable points <vi> to <viii>, is selected as the candidate gripping point in step S204. However, among the other gripping points within the predetermined distance A from the Y value of the candidate gripping point, the Z value of grippable point <vii> is larger than the Z value of the current candidate gripping point <vi> plus the predetermined distance B. Therefore, the determination in step S205 is No, and the process proceeds to step S206, where grippable point <vi> is excluded from the gripping candidate points. Returning to step S204, grippable point <vii>, which has the largest Y value among the remaining grippable points <vii> and <viii>, is selected as the candidate gripping point. In step S205, it is determined that none of the other gripping points within the predetermined distance A from the Y value of the candidate gripping point has a Z value larger than the Z value of candidate gripping point <vii> plus the predetermined distance B (Yes), so the process proceeds to step S207 and candidate gripping point <vii> is determined as the gripping point.
 FIG. 5(d) shows the state after the target object has been gripped at grippable point <vii> and taken out. In the state of FIG. 5(d), grippable point <vi>, which has the larger Y value of grippable points <vi> and <viii>, is selected as the candidate gripping point in step S204. However, among the other gripping points within the predetermined distance A from the Y value of the candidate gripping point, the Z value of grippable point <viii> is larger than the Z value of the current candidate gripping point <vi> plus the predetermined distance B. Therefore, the determination in step S205 is No, and the process proceeds to step S206, where grippable point <vi> is excluded from the gripping candidate points. Returning to step S204, the remaining grippable point <viii> is selected as the candidate gripping point. In step S205, it is determined that none of the other gripping points within the predetermined distance A from its Y value has a Z value larger than the Z value of candidate gripping point <viii> plus the predetermined distance B (Yes), so the process proceeds to step S207 and candidate gripping point <viii> is determined as the gripping point.
By determining the grip points of the plurality of objects 1 in this manner, it is possible to suitably prevent other objects from being displaced by the removal of the target object to be gripped, and to prevent the robot arm 30 from contacting those other objects. As a result, when taking out a target object from among a plurality of objects that bear one another's weight as a load, the target object can be removed while minimizing the risk of scratching or breaking the other objects.
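The candidate-selection loop walked through above (steps S204 to S207) can be sketched as follows. This is a minimal illustration under stated assumptions, not the patent's implementation: the function and variable names, the tuple representation of graspable points, and the sample coordinates are hypothetical, with Y taken along the placement surface from the lowest part of the spatial region and Z taken in the normal direction from the surface, as in FIG. 5.

```python
# Sketch of the grip-point selection of steps S204-S207 (assumed names).
# Each graspable point is (label, y, z): y = distance along the placement
# surface from the lowest part of the region, z = height above the surface.

def select_grip_point(points, a, b):
    """Return the label of the point to grip, or None if none qualifies.

    points: list of (label, y, z) tuples for the remaining graspable points.
    a: search range along the surface (predetermined distance A).
    b: allowed height margin (predetermined distance B).
    """
    # Step S204: consider candidates in descending order of Y.
    candidates = sorted(points, key=lambda p: p[1], reverse=True)
    for label, y, z in candidates:
        # Step S205: does any other point within distance A in Y stand
        # higher than the candidate's Z plus the margin B?
        blocked = any(
            other_label != label
            and abs(other_y - y) <= a
            and other_z > z + b
            for other_label, other_y, other_z in points
        )
        if not blocked:
            return label  # Step S207: decide this point as the grip point.
        # Step S206: otherwise exclude the candidate and try the next one.
    return None

# A FIG. 5(c)-like situation: <vii> sits more than B above <vi> within
# distance A, so <vi> is excluded and <vii> is chosen.
points = [("vi", 3.0, 1.0), ("vii", 2.5, 2.0), ("viii", 1.0, 1.0)]
print(select_grip_point(points, a=1.0, b=0.5))  # prints "vii"
```

With the sample points, <vi> is excluded because <vii> lies more than B above it within distance A, and <vii> is then accepted, matching the FIG. 5(c) walkthrough.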
[Embodiment 2: Mobile robot]
FIG. 6 is a block diagram showing the main configuration of a mobile robot 100 according to an embodiment of the present invention.
The mobile robot 100 includes a manipulator 50 and an automatic guided vehicle 60.
Providing the automatic guided vehicle 60 enables the mobile robot to move automatically, so no operator needs to operate it in order to move it. Work efficiency can therefore be improved.
In the example of FIG. 6, the manipulator 50 is mounted on the upper part of the automatic guided vehicle 60. The automatic guided vehicle 60 may include, for example, a communication unit that communicates with an external system and an automatic guided vehicle control unit that controls the operation of the automatic guided vehicle. Communication with the external system is performed, for example, by wireless communication via a wireless communication network.
One example of the external system is a system that controls the operation of the automatic guided vehicle 60 by transmitting, to the automatic guided vehicle control unit, a signal instructing the mobile robot 100 to move.
The automatic guided vehicle 60 may also include a pair of wheels and motors that drive the pair of wheels simultaneously or independently. In some embodiments, the manipulator 50 includes the imaging unit 5, and the imaging unit 5 may be provided at the tip of the robot arm 30.
[Embodiment 3: Implementation by software]
The control blocks of the manipulator 50 (in particular, the control unit 10) may be implemented by logic circuits (hardware) formed in an integrated circuit (IC chip) or the like, or may be implemented by software.
In the latter case, the manipulator 50 includes a computer that executes instructions of a program, which is software implementing each function. The computer includes, for example, one or more processors and a computer-readable recording medium storing the program. In the computer, the object of the present invention is achieved when the processor reads the program from the recording medium and executes it. The processor may be, for example, a CPU (Central Processing Unit). The recording medium may be a "non-transitory tangible medium" such as a ROM (Read Only Memory), a tape, a disk, a card, a semiconductor memory, or a programmable logic circuit. The computer may further include a RAM (Random Access Memory) into which the program is loaded. The program may also be supplied to the computer via any transmission medium capable of transmitting it (such as a communication network or a broadcast wave). One aspect of the present invention can also be realized in the form of a data signal embedded in a carrier wave, in which the program is embodied by electronic transmission.
[Summary]
The present invention employs the following configurations.
That is, a manipulator according to one aspect of the present invention includes: a robot arm that performs a gripping operation on an object; a measurement unit that measures, in a predetermined spatial region in which a plurality of objects can be stacked such that they bear one another's weight, graspable points on the objects and the normal direction of the surface at each graspable point; a gripping position specifying unit that specifies a gripping position for each object based on the normal direction; an object specifying unit that specifies the object to be gripped based on the spatial position of the gripping position specified by the gripping position specifying unit; and a control unit that controls the operation of the robot arm so as to grip the object specified by the object specifying unit at the gripping position specified by the gripping position specifying unit.
According to the above configuration, when taking out one object from among a plurality of objects stacked in a spatial region such that they bear one another's weight, the optimal gripping position and gripping target can be specified so as to minimize the influence on the objects other than the one being gripped. Objects can therefore be taken out automatically and efficiently while suppressing damage to them.
In the manipulator according to one aspect of the present invention, the gripping position specifying unit may specify, as the gripping position, the surface whose normal direction is within a predetermined range.
According to the above configuration, from among the graspable points that are grip candidates, a grip candidate point that minimizes the influence on the other objects can be determined based on the normal direction of the object's surface.
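As a concrete illustration of such a normal-direction check, the test can be framed as an angle threshold against a reference approach direction. The reference axis and the 30-degree threshold below are assumptions for this sketch; the disclosure states only that the normal must fall within a predetermined range.

```python
import math

def is_graspable_surface(normal, reference=(0.0, 1.0, 0.0), max_angle_deg=30.0):
    """Return True if the surface normal lies within max_angle_deg of the
    reference direction (e.g. the approach axis of the gripper).

    normal, reference: 3D direction vectors (need not be unit length).
    """
    dot = sum(n * r for n, r in zip(normal, reference))
    norm = math.sqrt(sum(n * n for n in normal)) * math.sqrt(sum(r * r for r in reference))
    # Clamp to avoid domain errors from floating-point rounding.
    angle = math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))
    return angle <= max_angle_deg

# A surface facing the gripper (normal almost along the reference axis)
# passes the test; a steeply tilted surface does not.
print(is_graspable_surface((0.1, 1.0, 0.0)))  # prints True
print(is_graspable_surface((1.0, 0.2, 0.0)))  # prints False
```

Surfaces whose normals fail this test would simply not yield grip candidates, which corresponds to excluding faces the gripper cannot approach squarely.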
In the manipulator according to one aspect of the present invention, the objects may be stacked in the spatial region on a placement surface inclined with respect to the vertical direction or parallel to the vertical direction, and the object specifying unit may specify the object to be gripped based on a first distance of the gripping position specified by the gripping position specifying unit, measured from the lowermost part of the spatial region in the direction along the placement surface, and a second distance measured from the placement surface in the normal direction.
According to the above configuration, the optimal grip point can be determined as the grip candidate point from among a plurality of objects bearing one another's weight as a load, based on the positional relationship between the target object and the other objects. As a result, the target object can be taken out while minimizing the risk that the robot arm contacts the other objects or that the other objects are scratched or broken.
In the manipulator according to one aspect of the present invention, the object specifying unit may sort the gripping positions in descending order of the first distance, select the gripping position with the largest first distance as a candidate gripping position, and, when the distance obtained by adding a predetermined distance to the second distance of the candidate gripping position is larger than the second distance of every other gripping position within a predetermined distance range from the candidate gripping position in the direction along the placement surface, specify the object corresponding to the candidate gripping position as the object to be gripped.
According to the above configuration, the optimal grip point can be determined as the grip candidate point based on the positional relationship among the target object, the robot arm, and the other objects. Contact of the robot arm with other objects when taking out the target object can therefore be suitably suppressed. As a result, only the target object can be taken out from among a plurality of objects bearing one another's weight as a load, while minimizing the risk that the robot arm contacts the other objects or that they are scratched or broken.
A mobile robot according to one aspect of the present invention includes a manipulator and an automatic guided vehicle.
According to the above configuration, providing the automatic guided vehicle allows the manipulator to move automatically, so no operator needs to operate the manipulator in order to move it. Work efficiency can therefore be further improved.
The present invention is not limited to the embodiments described above; various modifications are possible within the scope of the claims, and embodiments obtained by appropriately combining the technical means disclosed in different embodiments are also included in the technical scope of the present invention. Furthermore, new technical features can be formed by combining the technical means disclosed in the respective embodiments.
The present invention can be used for manipulators and mobile robots.
REFERENCE SIGNS LIST
1 object
2 flow rack
3 box
5 imaging unit
10 control unit
11 measurement unit
12 gripping position specifying unit
13 object specifying unit
20 robot arm control unit
30 robot arm
31 gripping unit
32 arm unit
40 storage unit
50 manipulator
60 automatic guided vehicle
100 mobile robot

Claims (5)

1. A manipulator comprising:
   a robot arm that performs a gripping operation on an object;
   a measurement unit that measures, in a predetermined spatial region in which a plurality of objects can be stacked such that they bear one another's weight, graspable points on the objects and a normal direction of a surface at each graspable point;
   a gripping position specifying unit that specifies a gripping position for each object based on the normal direction;
   an object specifying unit that specifies the object to be gripped based on a spatial position of the gripping position specified by the gripping position specifying unit; and
   a control unit that controls an operation of the robot arm so as to grip the object specified by the object specifying unit at the gripping position specified by the gripping position specifying unit.
2. The manipulator according to claim 1, wherein the gripping position specifying unit specifies, as the gripping position, the surface whose normal direction is within a predetermined range.
3. The manipulator according to claim 1 or 2, wherein the objects are stacked in the spatial region on a placement surface inclined with respect to the vertical direction or parallel to the vertical direction, and
   the object specifying unit specifies the object to be gripped based on a first distance of the gripping position specified by the gripping position specifying unit, measured from the lowermost part of the spatial region in the direction along the placement surface, and a second distance measured from the placement surface in the normal direction.
4. The manipulator according to claim 3, wherein the object specifying unit:
   sorts the gripping positions in descending order of the first distance and selects the gripping position with the largest first distance as a candidate gripping position; and
   when the distance obtained by adding a predetermined distance to the second distance of the candidate gripping position is larger than the second distance of every other gripping position within a predetermined distance range from the candidate gripping position in the direction along the placement surface, specifies the object corresponding to the candidate gripping position as the object to be gripped.
5. A mobile robot comprising: the manipulator according to any one of claims 1 to 4; and an automatic guided vehicle.
PCT/JP2019/009737 2018-09-07 2019-03-11 Manipulator and mobile robot WO2020049774A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018168030A JP7021620B2 (en) 2018-09-07 2018-09-07 Manipulators and mobile robots
JP2018-168030 2018-09-07

Publications (1)

Publication Number Publication Date
WO2020049774A1 true WO2020049774A1 (en) 2020-03-12

Family

ID=69721524

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/009737 WO2020049774A1 (en) 2018-09-07 2019-03-11 Manipulator and mobile robot

Country Status (2)

Country Link
JP (1) JP7021620B2 (en)
WO (1) WO2020049774A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112578767A (en) * 2020-12-03 2021-03-30 斯比泰电子(嘉兴)有限公司 Quick check out test set of electronic tail-gate control unit of car

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012187666A (en) * 2011-03-10 2012-10-04 Fuji Electric Co Ltd Robot control device, article pick-out system, program, and robot control method
JP2013043271A (en) * 2011-08-26 2013-03-04 Canon Inc Information processing device, method for controlling the same, and program
JP2017100216A (en) * 2015-11-30 2017-06-08 キヤノン株式会社 Information processing device and information processing method
JP2017170567A (en) * 2016-03-24 2017-09-28 シグマ株式会社 Workpiece recognition method and random picking method

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4640499B2 (en) * 2008-12-12 2011-03-02 トヨタ自動車株式会社 Grip control device



Also Published As

Publication number Publication date
JP7021620B2 (en) 2022-02-17
JP2020040144A (en) 2020-03-19

Similar Documents

Publication Publication Date Title
JP6738112B2 (en) Robot system control device and control method
US11288810B2 (en) Robotic system with automated package registration mechanism and methods of operating the same
US10562188B1 (en) Automated package registration systems, devices, and methods
CN111328408B (en) Shape information generating device, control device, loading/unloading device, distribution system, program, and control method
CN112802105A (en) Object grabbing method and device
JP7175487B1 (en) Robotic system with image-based sizing mechanism and method for operating the robotic system
JP7398662B2 (en) Robot multi-sided gripper assembly and its operating method
WO2020049774A1 (en) Manipulator and mobile robot
CN113307042B (en) Object unstacking method and device based on conveyor belt, computing equipment and storage medium
CN112802107A (en) Robot-based control method and device for clamp group
US11459190B2 (en) Systems and methods for die transfer
CN115003613A (en) Device and method for separating piece goods
JP2020040796A (en) Picking system
JP7264387B2 (en) Robotic gripper assembly for openable objects and method for picking objects
US20240173866A1 (en) Robotic system with multi-location placement control mechanism
US20230071488A1 (en) Robotic system with overlap processing mechanism and methods for operating the same
CN115703238A (en) System and method for robotic body placement
CN115609569A (en) Robot system with image-based sizing mechanism and method of operating the same

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19858054

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19858054

Country of ref document: EP

Kind code of ref document: A1