WO2021044473A1 - Multi-joint robot-arm control device and multi-joint robot arm device - Google Patents


Info

Publication number
WO2021044473A1
Authority
WO
WIPO (PCT)
Prior art keywords
robot arm
articulated robot
control device
image
arm control
Application number
PCT/JP2019/034393
Other languages
French (fr)
Japanese (ja)
Inventor
航 石井
佳典 原田
雅史 上野山
和弘 中谷
Original Assignee
Yamaha Motor Co., Ltd.
Application filed by Yamaha Motor Co., Ltd.
Priority to PCT/JP2019/034393
Publication of WO2021044473A1

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00Controls for manipulators
    • B25J13/08Controls for manipulators by means of sensing devices, e.g. viewing or touching devices

Definitions

  • the present invention relates to an articulated robot arm control device and an articulated robot arm device.
  • An articulated robot arm control device that controls an articulated robot arm based on an image taken by a fixed camera and a hand camera provided at the tip of the arm is known.
  • the articulated robot arm control device corrects the measurement data obtained from an image captured by the fixed camera by using measurement data obtained from an image captured by the hand camera, in order to improve the accuracy of the articulated robot arm.
  • Patent Document 1 discloses a robot device having a robot arm, a first visual sensor that measures a measurement range including the movable range of the robot arm, a second visual sensor positioned at the tip of the robot arm, and a control device that controls the position and orientation of the robot arm.
  • the control device controls the position and orientation of the robot arm based on the measured value of the first visual sensor
  • the control device generates a command value to be given to the robot arm by using the measured value of the second visual sensor.
  • the appropriate arrangement of the fixed camera is determined in consideration of the usage environment of the articulated robot arm, the type of object to be handled, the degree of freedom of the work area, and the like. Therefore, when the usage environment of the articulated robot arm, the type of object to be handled, the degree of freedom of the work area, or the like changes, there was a possibility that the articulated robot arm controlled on the basis of the image captured by the fixed camera, as described in Patent Document 1, could not handle the change.
  • the type of work includes, for example, the usage environment of the articulated robot arm (outside, weather, vehicle mounting / fixing, etc.), the type of object (industrial parts, agricultural products, cooking utensils, etc.), the degree of freedom in the work area, and the like.
  • the present invention provides an articulated robot arm control device and an articulated robot arm device that can increase the versatility of the articulated robot arm by increasing the types of work that can be handled by the articulated robot arm.
  • starting from the conventionally proposed fixed camera and hand camera, the present inventors examined an articulated robot arm control device and an articulated robot arm device that can increase the versatility of the articulated robot arm by increasing the types of work that can be handled by the articulated robot arm. As a result of diligent study, the present inventors arrived at the following configuration.
  • the articulated robot arm control device includes an imaging unit provided on the articulated robot arm, an image processing unit that detects an image of an object from an image captured by the imaging unit, a coordinate information processing unit that calculates the position coordinates of the object, and a drive control unit that drives the actuators of the articulated robot arm.
  • the imaging unit images the work area at a first imaging position away from the work area where the articulated robot arm works on the object, the image processing unit detects the image of the object from the image of the work area, and the coordinate information processing unit calculates the position coordinates of the object in the coordinate system of the articulated robot arm based on the detected image of the object.
  • the imaging unit images the work area at the first imaging position away from the work area where the articulated robot arm works on the object, so the object can be searched for in the area including the work area. Further, the image processing unit detects the image of the object from the entire image of the captured work area, and the coordinate information processing unit calculates the position coordinates of the object in the coordinate system of the articulated robot arm based on the image of the object. Thereby, the articulated robot arm control device can identify the positions of a plurality of objects in the work area. As described above, the imaging unit, which corresponds to the conventionally proposed hand camera, images the region including the work area at the first imaging position away from the work area, thereby also playing the role of the conventionally proposed fixed camera. Therefore, the articulated robot arm control device can omit the fixed camera of the conventional proposal that uses both a fixed camera and a hand camera.
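The flow above (detect the object in the captured image, then express it in the robot coordinate system) can be sketched as follows; the pinhole intrinsics, helper names, and camera pose are illustrative assumptions, not values from this publication:

```python
# Hypothetical pinhole intrinsics for the arm-mounted camera
# (illustrative values only, not taken from this publication).
FX = FY = 600.0          # focal length in pixels
CX, CY = 320.0, 240.0    # principal point in pixels

def pixel_to_camera(u, v, depth):
    """Back-project a detected image point (u, v) at a known depth
    into camera-frame coordinates using the pinhole model."""
    return ((u - CX) * depth / FX, (v - CY) * depth / FY, depth)

def camera_to_robot(p_cam, cam_pose):
    """Transform a camera-frame point into the robot coordinate system.
    cam_pose is a 3x4 [R | t] matrix (row-major nested lists) describing
    the camera position/orientation, known from the arm's joint angles."""
    x, y, z = p_cam
    return tuple(r[0] * x + r[1] * y + r[2] * z + r[3] for r in cam_pose)

# Example: camera axes aligned with the robot axes, lens at (0.2, 0, 0.5).
pose = [[1.0, 0.0, 0.0, 0.2],
        [0.0, 1.0, 0.0, 0.0],
        [0.0, 0.0, 1.0, 0.5]]
p_obj = camera_to_robot(pixel_to_camera(320.0, 240.0, 1.0), pose)
print(p_obj)  # (0.2, 0.0, 1.5)
```

The same back-projection applies at either imaging position once the camera pose at that position is known from the joint encoders.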
  • because the articulated robot arm control device calculates the position coordinates of a plurality of objects in the work area, it can calculate the optimum work path for the articulated robot arm based on predetermined conditions. As a result, the articulated robot arm control device can efficiently cause the articulated robot arm to perform work on the objects even if a plurality of objects are randomly arranged in the work area.
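If the predetermined condition is taken to be total travel distance, the work path over the calculated position coordinates could be approximated greedily; a minimal sketch (the condition and all names are illustrative, as the publication does not specify the path criterion):

```python
import math

def plan_work_order(start, objects):
    """Greedy nearest-neighbour ordering over the calculated position
    coordinates (x, y, z) of the objects in the robot coordinate system."""
    remaining = list(objects)
    order = []
    current = start
    while remaining:
        nearest = min(remaining, key=lambda p: math.dist(current, p))
        remaining.remove(nearest)
        order.append(nearest)
        current = nearest
    return order

# Three randomly arranged objects; the arm starts at the origin.
targets = [(3.0, 0.0, 0.0), (1.0, 0.0, 0.0), (2.0, 0.0, 0.0)]
print(plan_work_order((0.0, 0.0, 0.0), targets))
# [(1.0, 0.0, 0.0), (2.0, 0.0, 0.0), (3.0, 0.0, 0.0)]
```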
  • the degree of freedom in the usage environment of the articulated robot arm device can be increased by increasing the types of work that can be handled by the articulated robot arm. Therefore, the versatility of the articulated robot arm device can be increased.
  • the articulated robot arm control device of the present invention preferably includes the following configurations.
  • the imaging unit further photographs the working area including the object at a second imaging position different from the first imaging position.
  • the articulated robot arm control device further images the work area where the object is located at the second imaging position different from the first imaging position, so that a specific area or a specific object imaged at the first imaging position can be imaged at a different angle or at a higher magnification.
  • the articulated robot arm control device can acquire an image suitable for the state of the work area and the shape of the object.
  • this increases the types of work that can be handled by the articulated robot arm, which makes it possible to increase the degree of freedom in the usage environment of the articulated robot arm device. Therefore, the versatility of the articulated robot arm device can be increased.
  • the articulated robot arm control device of the present invention preferably includes the following configurations.
  • the second imaging position is a position closer to the working area than the first imaging position.
  • the articulated robot arm control device can acquire an entire image of the work area including the object from the image of the work area captured at the first imaging position. Further, by imaging at the second imaging position, the articulated robot arm control device can acquire a detailed image of a specific object among the plurality of objects imaged at the first imaging position. As a result, the articulated robot arm control device can search the work area over a wide area using the image of the work area captured at the first imaging position, and can locally search the work area using the image of the work area captured at the second imaging position.
  • with the articulated robot arm control device that images the work area where the object is located at the second imaging position closer to the work area than the first imaging position, the types of work that can be handled by the articulated robot arm are increased, and the degree of freedom in the usage environment of the articulated robot arm device can be increased. Therefore, the versatility of the articulated robot arm device can be increased.
  • the articulated robot arm control device of the present invention preferably includes the following configurations.
  • the imaging unit images the working area at an angle of view of 90 degrees or more in the imaging range.
  • the articulated robot arm control device can secure an imaging position at which the work area is entirely included in the imaging range of the imaging unit within the movable range of the articulated robot arm.
  • by increasing the types of work that can be handled by the articulated robot arm, it is possible to increase the degree of freedom in the usage environment of the articulated robot arm device. Therefore, the versatility of the articulated robot arm device can be increased.
  • the articulated robot arm control device of the present invention preferably includes the following configurations.
  • the imaging unit includes a monocular camera that can move in an arbitrary direction by driving the articulated robot arm and can acquire a parallax image of the work area.
  • the articulated robot arm control device can measure the distance to the work area or the object by moving the monocular camera with the articulated robot arm and capturing images. As a result, the articulated robot arm control device can both image the work area or the object with the monocular camera and measure the distance to it.
  • by increasing the types of work that can be handled by the articulated robot arm, the degree of freedom in the usage environment of the articulated robot arm device can be increased. Therefore, the versatility of the articulated robot arm device can be increased.
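Acquiring a parallax image by moving a single camera reduces distance measurement to two-view triangulation; a sketch under the usual pinhole assumptions (the focal length, movement amount, and disparity values below are illustrative):

```python
def depth_from_motion_stereo(focal_px, baseline_m, disparity_px):
    """Triangulate the distance to a point seen in two images taken by one
    camera that the arm translated by baseline_m across the viewing direction."""
    if disparity_px <= 0:
        raise ValueError("the object must shift between the two images")
    return focal_px * baseline_m / disparity_px

# Example: 600 px focal length, 5 cm arm movement, 20 px image shift.
print(depth_from_motion_stereo(600.0, 0.05, 20.0))  # 1.5 (metres)
```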
  • the articulated robot arm control device of the present invention preferably includes the following configurations.
  • the amount of movement of the monocular camera in an arbitrary direction for capturing the parallax image differs between the first imaging position and the second imaging position.
  • the articulated robot arm control device varies the amount by which the monocular camera is moved to capture a parallax image between the first imaging position and the second imaging position, which is closer to the work area than the first imaging position. The articulated robot arm control device can thereby measure the distance to the work area or the object with a measurement accuracy suited to the positional relationship between each imaging position and the work area.
  • in the articulated robot arm device in which the articulated robot arm moves the monocular camera in accordance with the movement amount, the types of work that can be handled by the articulated robot arm are increased, so the degree of freedom in the usage environment of the articulated robot arm device can be increased. Therefore, the versatility of the articulated robot arm device can be increased.
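Why the movement amount might be varied between the imaging positions: in triangulation, depth uncertainty grows with the square of the distance and shrinks with the baseline, so a more distant first imaging position benefits from a larger camera movement. A sketch using the standard stereo error approximation (not a formula given in this publication):

```python
def depth_resolution(focal_px, baseline_m, depth_m, disparity_err_px=1.0):
    """Standard stereo approximation dZ = Z^2 * dd / (f * B): at the same
    range, a larger camera movement (baseline B) gives a finer measurement."""
    return depth_m ** 2 * disparity_err_px / (focal_px * baseline_m)

# Same 1 px disparity error at a 1.5 m range, two movement amounts:
print(depth_resolution(600.0, 0.02, 1.5))  # small movement: coarser depth
print(depth_resolution(600.0, 0.10, 1.5))  # large movement: finer depth
```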
  • the articulated robot arm control device of the present invention preferably includes the following configurations.
  • the imaging unit captures images from different imaging directions at the first imaging position and the second imaging position.
  • the articulated robot arm control device images from an imaging direction at the second imaging position that differs from that at the first imaging position, so that situations in which it is difficult to image the object due to backlight, obstacles, or the like can be avoided. As a result, the articulated robot arm control device can change the imaging position according to the surrounding conditions.
  • by increasing the types of work that can be handled by the articulated robot arm, the degree of freedom in the usage environment of the articulated robot arm device can be increased. Therefore, the versatility of the articulated robot arm device can be increased.
  • the articulated robot arm control device of the present invention preferably includes the following configurations.
  • the drive control unit corrects the control amount of the actuator of the articulated robot arm based on the calculated fluctuation of the position coordinates of the object.
  • because the drive control unit corrects the control amount of the actuators of the articulated robot arm based on the calculated fluctuation of the position coordinates of the object, the articulated robot arm can maintain the position of the imaging unit even if the position of the articulated robot arm changes. As a result, the articulated robot arm control device can continue the work even if the state around the articulated robot arm changes.
  • by increasing the types of work that can be handled by the articulated robot arm, it is possible to increase the degree of freedom in the usage environment of the articulated robot arm device. Therefore, the versatility of the articulated robot arm device can be increased.
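A minimal sketch of correcting the actuator control amount from the observed fluctuation of the object's position coordinates; the proportional control law and gain are illustrative assumptions, as the publication does not specify the correction scheme:

```python
def correct_control_amount(command, fluctuation, gain=0.5):
    """Correct the per-axis actuator command by a fraction of the observed
    fluctuation of the object's position coordinates (hypothetical
    proportional scheme; the control law is not specified in the source)."""
    return tuple(c + gain * f for c, f in zip(command, fluctuation))

# The object drifted +2 mm in X between two images; nudge the command.
print(correct_control_amount((100.0, 50.0, 30.0), (2.0, 0.0, 0.0)))
# (101.0, 50.0, 30.0)
```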
  • the articulated robot arm device of the present invention preferably includes the following configurations.
  • the articulated robot arm device includes the articulated robot arm, and the articulated robot arm is controlled by the articulated robot arm control device according to any one of the above.
  • because the articulated robot arm is controlled by the articulated robot arm control device, the imaging unit provided on the articulated robot arm can be arranged at an arbitrary position within the movable range of the articulated robot arm, and information on the work area or the object can be acquired. As a result, the articulated robot arm device can continue the work even if the state of the work area, the object, or the surroundings of the articulated robot arm changes.
  • the degree of freedom in the usage environment of the articulated robot arm device can be increased by increasing the types of work that can be handled by the articulated robot arm. Therefore, the versatility of the articulated robot arm device can be increased.
  • the articulated robot arm device of the present invention preferably includes the following configurations.
  • the articulated robot arm has a tip rotating portion that can rotate around the axis of the distal-most link, and the tip rotating portion is provided with the imaging unit.
  • because the imaging unit is provided on the tip rotating portion that can rotate around the axis of the link, the imaging unit can be moved to a position where the object can be imaged. As a result, the articulated robot arm device can continue imaging even if the state around the articulated robot arm changes.
  • in the articulated robot arm device in which the imaging unit is provided on the tip rotating portion that can rotate around the axis of the link, the types of work that can be handled by the articulated robot arm are increased, so the degree of freedom in the usage environment of the articulated robot arm device can be increased. Therefore, the versatility of the articulated robot arm device can be increased.
  • the articulated robot arm means a robot arm having a plurality of joint portions connecting a plurality of links.
  • the articulated robot arm includes a vertical articulated robot arm.
  • the vertical articulated robot arm is a robot arm of a serial link mechanism in which links are connected in series from the root to the tip by a rotary joint or a linear motion joint having one degree of freedom.
  • the vertical articulated robot arm has a plurality of joints.
  • the work area means the area through which the articulated robot arm may pass in the process of approaching the object to be worked on when the articulated robot arm works on the object. The area through which the articulated robot arm passes when it leaves the position of one object is excluded.
  • the angle of view means an angle indicating a range actually captured by the camera.
  • the angle of view is a diagonal angle of view indicating the angle formed at the optical center of the lens by two diagonally opposite points on the light receiving surface of the image sensor. That is, the diagonal angle of view is determined by the effective focal length of the lens and the length of the diagonal of the light receiving surface.
  • the diagonal angle of view becomes smaller as the effective focal length of the lens becomes longer, and becomes smaller as the diagonal line of the light receiving surface becomes shorter.
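The relationship stated above can be written as theta = 2 * arctan(d / 2f), where d is the sensor diagonal and f the effective focal length; a sketch with illustrative sensor dimensions (not values from this publication):

```python
import math

def diagonal_angle_of_view_deg(focal_mm, sensor_diagonal_mm):
    """Diagonal angle of view: the angle subtended at the optical center of
    the lens by the two diagonal corners of the light receiving surface."""
    return math.degrees(2 * math.atan(sensor_diagonal_mm / (2 * focal_mm)))

# Illustrative sensor with a 7.7 mm diagonal (values are assumptions):
wide = diagonal_angle_of_view_deg(3.0, 7.7)    # short focal length
tele = diagonal_angle_of_view_deg(10.0, 7.7)   # longer focal length
print(wide > 90, tele < wide)  # True True
```

This is consistent with the statement above: a longer focal length or a shorter diagonal makes the angle of view smaller.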
  • according to the present invention, it is possible to provide an articulated robot arm control device and an articulated robot arm device capable of increasing the versatility of the articulated robot arm by increasing the types of work that can be handled by the articulated robot arm.
  • FIG. 1 is a schematic view of a remote-controlled vehicle 1 provided with an articulated robot arm device 10 according to a first embodiment of the present invention.
  • FIG. 2 is a block diagram of the control configuration of the remote-controlled vehicle 1.
  • for the remote-controlled vehicle 1, the direction from the end at which the articulated robot arm is not arranged toward the end at which the articulated robot arm is arranged, of the two ends in the traveling direction, is defined as the front direction of the remote-controlled vehicle 1.
  • arrows X, Y, and Z in the figures shown below indicate the directions of the coordinate axes of the Cartesian coordinate system of the articulated robot arm device 10 (hereinafter simply referred to as the "robot coordinate system").
  • in the robot coordinate system, the X-axis direction coincides with the front direction of the remote-controlled vehicle 1. Therefore, in the robot coordinate system, if the front direction of the remote-controlled vehicle 1 is the X-axis direction and the vertically upward direction is the Z-axis direction, the left direction when facing the front of the remote-controlled vehicle 1 is the Y-axis direction.
  • the origin of the robot coordinate system is the intersection of the axis of the S-axis motor unit 12 and the axis of the L-axis motor unit 14 in the articulated robot arm 11, which will be described later.
  • the remote-controlled vehicle 1 is a four-wheeled vehicle that is remotely controlled by an external control signal.
  • the remote-controlled vehicle 1 includes a vehicle body 2, a pair of wheels 3, a pair of wheels 4, a drive motor 5 for driving the wheels 3 and 4, a steering motor 6 for steering the wheels 3 and 4, a communication device 7, a battery 8, and a vehicle control device 9.
  • the pair of wheels 3 are located at the front of the vehicle body 2, and the pair of wheels 4 are located at the rear. That is, the pair of wheels 3 are front wheels and the pair of wheels 4 are rear wheels. Each of the pairs of wheels 3 and 4 is steerable by a steering device (not shown) on the vehicle body 2.
  • a harvest box H for storing the harvested material is mounted on the upper surface of the vehicle body 2.
  • the drive motor 5 is an actuator that applies a driving force to each of the pair of wheels 3.
  • the drive motor 5 is provided on a pair of wheels 3.
  • the drive motor 5 applies a driving force to the pair of wheels 3 via, for example, a speed reducer (not shown).
  • the steering motor 6 is an actuator that steers a pair of wheels 3 and 4, respectively.
  • the steering motor 6 is provided in a steering device (not shown).
  • the steering motor 6 steers a pair of wheels 3 and 4 by driving a steering device.
  • the communication device 7 transmits and receives control signals to and from the external operation terminal C.
  • the communication device 7 is provided on the vehicle body 2.
  • the communication device 7 receives a control signal transmitted from the external operation terminal C. Further, the communication device 7 transmits a control signal output from the vehicle control device 9 to the operation terminal C.
  • the operation terminal C transmits a signal for remotely controlling the vehicle 1 to the communication device 7 as a control signal.
  • Battery 8 is a battery that can be charged and discharged.
  • the battery 8 is provided on the vehicle body 2.
  • the battery 8 is, for example, a lead storage battery, an alkaline storage battery, a lithium ion battery, or the like.
  • the battery 8 supplies electricity to the drive motor 5, the steering motor 6, the communication device 7, the vehicle control device 9, and the articulated robot arm device 10 described later.
  • the vehicle control device 9 is a device that controls the remote-controlled vehicle 1.
  • the vehicle control device 9 may have a configuration in which a CPU, ROM, RAM, HDD, and the like are connected by a bus, or may have a configuration consisting of a one-chip LSI or the like.
  • the vehicle control device 9 stores various programs and data for controlling the operation of the drive motor 5, the steering motor 6, the communication device 7, and the like.
  • the vehicle control device 9 is communicably connected to the drive circuit of the drive motor 5. As a result, the vehicle control device 9 can transmit a control signal to the drive circuit of the drive motor 5.
  • the vehicle control device 9 is communicably connected to the drive circuit of the steering motor 6. As a result, the vehicle control device 9 can transmit a control signal to the drive circuit of the steering motor 6.
  • the vehicle control device 9 is communicably connected to the communication device 7. As a result, the vehicle control device 9 can acquire the control signal transmitted from the external operation terminal C and received by the communication device 7. Further, the vehicle control device 9 can transmit a control signal to the external operation terminal C via the communication device 7.
  • the vehicle control device 9 is electrically connected to the battery 8. As a result, electric power is supplied to the vehicle control device 9 from the battery 8.
  • the vehicle control device 9 is communicably connected to the battery 8. As a result, the vehicle control device 9 can acquire information about the state of the battery 8.
  • the remote-controlled vehicle 1 configured as described above is remotely controlled by a control signal transmitted from the external operation terminal C.
  • FIG. 3 is a schematic view of the articulated robot arm device 10 according to the first embodiment of the present invention.
  • FIG. 4A is a plan view of the end effector according to the first embodiment of the present invention.
  • FIG. 4B is a side view of the end effector according to the first embodiment of the present invention.
  • FIG. 4C is a diagram schematically showing a viewing angle in a monocular camera provided on an end effector.
  • FIG. 5 is a block diagram of the articulated robot arm control device 25.
  • the articulated robot arm device 10 includes an articulated robot arm 11, an end effector 23, and an articulated robot arm control device 25.
  • in the present embodiment, the articulated robot arm 11 is a robot arm of a serial link mechanism in which the links are connected in series from the base end to the tip end by rotary joints each having one degree of freedom.
  • the articulated robot arm 11 is, for example, a 6-axis vertical articulated robot arm.
  • the articulated robot arm 11 is provided on the front portion of the upper surface of the vehicle body 2 in the remote-controlled vehicle 1.
  • in the articulated robot arm 11, the S-axis motor unit 12, the L-axis motor unit 14, the U-axis motor unit 16, the B-axis motor unit 18, the R-axis motor unit 20, and the T-axis motor unit 22 are connected in series by links, in this order from the base end portion fixed to the remote-controlled vehicle 1.
  • the motor unit of each axis constitutes a rotary joint.
  • the motor unit of each axis includes a motor, a speed reducer, an encoder and a drive circuit (not shown).
  • the articulated robot arm 11 is controlled by the articulated robot arm control device 25.
  • the drive circuit of each axis of the articulated robot arm 11 acquires a control signal from the articulated robot arm control device 25. Further, the articulated robot arm 11 transmits information on the output of the motor of each axis's motor unit and information from the encoders to the articulated robot arm control device 25.
  • the S-axis motor unit 12 is provided in the remote-controlled vehicle 1.
  • the S-axis motor unit 12 is a rotary joint that rotates the entire articulated robot arm 11.
  • the S-axis motor unit 12 is arranged so that the axis of the S-axis motor unit 12 extends in a direction perpendicular to the installation surface of the articulated robot arm 11.
  • a base member 13 is fixed to the output shaft of the S-axis motor unit 12.
  • the base member 13 is provided with an L-axis motor unit 14.
  • the L-axis motor unit 14 is a rotary joint that swings the lower arm link 15.
  • the L-axis motor unit 14 is arranged so that the axis of the L-axis motor unit 14 extends in a direction perpendicular to the axis of the S-axis motor unit 12.
  • one end of the lower arm link 15 is fixed to the output shaft of the L-axis motor unit 14.
  • a U-axis motor unit 16 is provided at the other end of the lower arm link 15.
  • the U-axis motor unit 16 is a rotary joint that swings the upper arm link 17.
  • the U-axis motor unit 16 is arranged so that the axis of the U-axis motor unit 16 extends in a direction parallel to the axis of the L-axis motor unit 14.
  • one end of the upper arm link 17 is fixed to the output shaft of the U-axis motor unit 16.
  • a B-axis motor unit 18 is provided at the other end of the upper arm link 17.
  • the B-axis motor unit 18 is a rotary joint that swings the wrist vertical link 19.
  • the B-axis motor unit 18 is arranged so that the axis of the B-axis motor unit 18 extends in a direction parallel to the axis of the U-axis motor unit 16.
  • a wrist vertical link 19 is fixed to the output shaft of the B-axis motor unit 18.
  • an R-axis motor unit 20 is provided on the wrist vertical link 19.
  • the R-axis motor unit 20 is a rotary joint that rotates the wrist rotation link 21.
  • the R-axis motor unit 20 is arranged so that the axis of the R-axis motor unit 20 extends in a direction perpendicular to the axis of the B-axis motor unit 18.
  • a wrist rotation link 21 is fixed to the output shaft of the R-axis motor unit 20.
  • the wrist rotation link 21 is provided with a T-axis motor unit 22.
  • the T-axis motor unit 22 is a rotary joint that rotates the end effector 23.
  • the T-axis motor unit 22 is arranged so that the axis of the T-axis motor unit 22 extends in a direction perpendicular to the axis of the R-axis motor unit 20.
  • An end effector 23 is fixed to the output shaft of the T-axis motor unit 22.
  • the articulated robot arm 11 configured in this way has six degrees of freedom provided by the motor units of the axes: three translational degrees of freedom in the X-, Y-, and Z-axis directions and three rotational degrees of freedom around the X-, Y-, and Z-axes. Therefore, the articulated robot arm 11 can move the end effector 23 fixed to the output shaft of the T-axis to an arbitrary position and into an arbitrary posture within the movable space of the articulated robot arm 11.
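How serially connected rotary joints place the tip at an arbitrary position can be illustrated with a planar simplification of forward kinematics (a 2-D sketch for illustration only, not the 6-axis kinematics of the articulated robot arm 11):

```python
import math

def planar_forward_kinematics(link_lengths, joint_angles):
    """Tip position of a serial link mechanism in a plane: each rotary
    joint adds its angle to the accumulated orientation, and each link
    translates the tip along that orientation."""
    x = y = theta = 0.0
    for length, angle in zip(link_lengths, joint_angles):
        theta += angle
        x += length * math.cos(theta)
        y += length * math.sin(theta)
    return x, y, theta

# Two 1 m links with both joints at 90 degrees: the tip folds back to (-1, 1).
x, y, theta = planar_forward_kinematics([1.0, 1.0], [math.pi / 2, math.pi / 2])
print(round(x, 6), round(y, 6))  # -1.0 1.0
```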
  • the end effector 23 is a device that works on an object.
  • the end effector 23 according to the first embodiment of the present invention is a harvesting device for harvesting an agricultural product as an object.
  • the end effector 23 is fixed to the output shaft of the T-axis of the articulated robot arm 11.
  • the end effector 23 includes a gripping device 23a for gripping the crop and a cutting device 23b for separating the crop from the branches and stems.
  • the gripping device 23a and the cutting device 23b are driven by, for example, a motor.
  • the end effector 23 harvests the object by cutting the stem on the trunk side of the gripping position with the cutting device 23b while the stem of the object to be harvested is gripped by the gripping device 23a.
  • the articulated robot arm control device 25 is a device that controls the articulated robot arm 11, the end effector 23, and the monocular camera 24.
  • the articulated robot arm control device 25 includes a monocular camera 24 which is an imaging unit.
  • the monocular camera 24 is a camera that captures an object from a single viewpoint at a time.
  • the monocular camera 24 is a digital camera using a CCD sensor or a CMOS sensor.
  • the monocular camera 24 has a diagonal angle of view of 90 degrees or more, formed at the optical center of the lens by two diagonally opposite points on the light receiving surface of the CCD sensor or the CMOS sensor.
  • the monocular camera 24 is provided on the end effector 23.
  • the monocular camera 24 is arranged in the end effector 23 so that the gripping device 23a or the cutting device 23b of the end effector 23 is included in the angle of view of the monocular camera 24.
  • the monocular camera 24 is provided so as to be rotatable around the T-axis by the T-axis motor unit 22, which is the tip rotating portion of the articulated robot arm 11. As a result, the monocular camera 24 can be arranged on a circumference centered on the end effector 23.
  • the articulated robot arm control device 25 may have a configuration in which a CPU, ROM, RAM, HDD, and the like are connected by a bus, or may have a configuration consisting of a one-chip LSI or the like.
  • the articulated robot arm control device 25 stores various programs and data for controlling the operations of the articulated robot arm 11, the monocular camera 24, and the end effector 23.
  • the articulated robot arm control device 25 is connected to the battery 8 and can be supplied with electric power from the battery 8 and can acquire information about the state of the battery.
  • the articulated robot arm control device 25 is connected to each of the drive circuits of the motors included in the S-axis motor unit 12, the L-axis motor unit 14, the U-axis motor unit 16, the B-axis motor unit 18, the R-axis motor unit 20, and the T-axis motor unit 22 of the articulated robot arm 11, and can transmit a control signal to the drive circuit of the motor of each axis. Further, the articulated robot arm control device 25 can acquire rotational position information (encoder signals) of the motors from the motor units of the axes.
  • the articulated robot arm control device 25 is connected to the monocular camera 24 and can acquire an image captured by the monocular camera 24.
  • the articulated robot arm control device 25 is communicably connected to the drive circuit of the end effector motor that drives the gripping device 23a and the cutting device 23b of the end effector 23, and can transmit a control signal to that drive circuit.
• The articulated robot arm control device 25 is communicably connected to the vehicle control device 9 of the remote-controlled vehicle 1, and can acquire control signals from the vehicle control device 9 and transmit control signals to it.
• The articulated robot arm control device 25 is communicably connected to the communication device 7 of the remote-controlled vehicle 1 and can acquire control signals from the external operation terminal C received by the communication device 7. Further, the articulated robot arm control device 25 can continuously transmit the control signals it generates, or the images captured by the monocular camera 24, to the external operation terminal C via the communication device 7.
• The articulated robot arm device 10 configured as described above can remotely control the articulated robot arm 11, the monocular camera 24, and the end effector 23 by control signals from the external operation terminal C. That is, the articulated robot arm device 10 uses the articulated robot arm 11 as an actuator for moving the monocular camera 24 and the end effector 23 to an arbitrary position in an arbitrary posture. Because the monocular camera 24 captures images from the viewpoint of the articulated robot arm 11, remote control based on the images captured by the monocular camera 24 is easy. Further, the articulated robot arm device 10 can automatically perform a predetermined work based on control signals from the operation terminal C.
  • FIG. 6 shows a relationship diagram between the movement amount T of the monocular camera and the distance L from the imaging position of the monocular camera to the crop to be harvested (target grape G (n)).
  • FIG. 7 shows a processed image in which the target grape G (n) and the harvest order A are calculated.
  • FIG. 8 shows a schematic diagram showing an example of the movement of the end effector for generating a parallax image by the monocular camera.
• The articulated robot arm control device 25 controls the articulated robot arm 11 and the monocular camera 24 so as to image the work area W for harvesting crops at the first imaging position P1 (see the black arrow), away from the work area W.
• The first imaging position P1 is a position within the movable range of the articulated robot arm 11 at which the entire work area W is included within the angle of view θ of the monocular camera 24. In other words, it is a position from which the area to be searched for harvestable crops can be imaged at once.
  • the articulated robot arm control device 25 controls the articulated robot arm 11 and the monocular camera 24 so as to image crops that can be harvested at a second imaging position P2 (see the black-painted arrow) different from the first imaging position P1.
  • the second imaging position P2 is within the movable range of the articulated robot arm 11 and is closer to the working area W than the first imaging position P1. That is, it is a position where the crop to be harvested can be imaged in detail.
  • the articulated robot arm control device 25 calculates the detailed position coordinates of the crop to be harvested at the second imaging position P2.
• The articulated robot arm device 10 moves the monocular camera 24 to an arbitrary position and direction by using the articulated robot arm 11 as a moving actuator, so the fixed camera used in conventional configurations can be omitted.
  • the articulated robot arm control device 25 can increase the degree of freedom in the usage environment of the articulated robot arm device 10. Therefore, the articulated robot arm control device 25 and the articulated robot arm device 10 can enhance the versatility of the articulated robot arm 11.
  • the articulated robot arm control device 25 includes an image processing unit 26, a coordinate information processing unit 27, and a drive control unit 28.
• The image processing unit 26 is a control device that detects, for example, grapes G (n) (hereinafter simply referred to as "target grapes G (n)") as the crop to be harvested from the image of the work area W captured by the monocular camera 24.
• The image processing unit 26 stores in advance the crop detection data D (see FIG. 7) acquired by learning from various images of the target grape G (n).
• The "(n)" in target grape G (n) is a subscript (n is an integer) for distinguishing individual grapes.
• The image processing unit 26 causes the monocular camera 24 to image the work area W and detects, based on the detection data D, all the target grapes G (n) present in the captured image. Further, the image processing unit 26 continuously transmits the captured image to the external operation terminal C via the communication device 7 of the remote-controlled vehicle 1. As a result, the articulated robot arm control device 25 can detect all the target grapes G (n) among the crops present in the work area W.
  • the coordinate information processing unit 27 is a control device that calculates the position coordinates G (n) (x, y, z) of the target grape G (n) detected by the image processing unit 26 in the robot coordinate system.
  • the coordinate information processing unit 27 acquires an image of the work area W of the target grape G (n) from the image processing unit 26. Further, the coordinate information processing unit 27 acquires the posture information of the articulated robot arm 11 from the drive control unit 28.
• The coordinate information processing unit 27 converts the position coordinates of the target grape G (n) in the image of the work area W into the position coordinates G (n) (x, y, z) in the robot coordinate system, using the mounting position of the monocular camera 24 on the articulated robot arm 11 and the posture information of the articulated robot arm 11. As a result, the articulated robot arm control device 25 can control the articulated robot arm 11 with reference to the image of the work area W.
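The conversion into the robot coordinate system can be sketched as a rigid transform from the camera frame, whose pose follows from the arm's posture information and the camera mounting position. The sketch below is a minimal illustration using a simplified planar pose (a translation plus one rotation about the robot Z axis); a real implementation would chain one transform per joint of the arm, and all numeric values here are hypothetical, not taken from the patent.

```python
import math

def camera_to_robot(point_cam, cam_pose):
    """Transform a point measured in the camera frame into the robot
    coordinate system.

    point_cam : (x, y, z) in the camera frame, metres
    cam_pose  : (tx, ty, tz, yaw) - camera origin in the robot frame plus a
                rotation about the robot Z axis (a simplified 4-DoF pose
                standing in for the full arm kinematics)
    """
    x, y, z = point_cam
    tx, ty, tz, yaw = cam_pose
    c, s = math.cos(yaw), math.sin(yaw)
    # rotate about the robot Z axis, then translate into the robot frame
    return (c * x - s * y + tx,
            s * x + c * y + ty,
            z + tz)

# A grape seen 0.3 m in front of the camera, with the camera at
# (1.0, 0.5, 1.2) in the robot frame and yawed 90 degrees:
G = camera_to_robot((0.3, 0.0, 0.0), (1.0, 0.5, 1.2, math.pi / 2))
```

The same function applies to either imaging position; only the pose argument changes as the arm moves.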
• Hereinafter, the position coordinates of the target grape G (n) based on the first image Im1 from the first imaging position are written G (n) (x1, y1, z1), and the position coordinates based on the second image Im2 from the second imaging position are written G (n) (x2, y2, z2).
• The coordinate information processing unit 27 generates a parallax image from the first image Im1 captured at the first imaging position P1 away from the work area W and the auxiliary image Is1 captured after moving the monocular camera 24 by an arbitrary movement amount T in an arbitrary direction, and calculates the distance L to the target grape G (n). That is, the coordinate information processing unit 27 uses two images captured by the monocular camera 24 with imaging positions shifted by the movement amount T: the first image Im1 and its auxiliary image Is1. In this way, the articulated robot arm control device 25 can calculate the distance L to the target grape G (n) with the single monocular camera 24.
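The distance calculation from the image pair can be sketched as motion stereo: with a camera focal length f (in pixels) and a baseline equal to the movement amount T, a target that shifts by d pixels between Im1 and Is1 lies at L = f·T/d. The function below is a minimal illustration under a pinhole-camera assumption; the focal length, baseline, and disparity values are hypothetical, not from the patent.

```python
def distance_from_parallax(focal_px: float, baseline_t: float,
                           disparity_px: float) -> float:
    """Distance L from motion-stereo parallax.

    focal_px     : camera focal length in pixels (calibration value)
    baseline_t   : movement amount T between the two shots, in metres
    disparity_px : pixel shift of the target between Im1 and Is1
    """
    if disparity_px <= 0:
        raise ValueError("target not matched between the two images")
    return focal_px * baseline_t / disparity_px

# Example: 800 px focal length, camera moved T = 0.05 m,
# target shifted 20 px between the two images.
L = distance_from_parallax(800.0, 0.05, 20.0)  # → 2.0 (metres)
```

Note that the depth resolution scales with T, which is why the patent lets the movement amount be chosen freely rather than fixing it as a stereo camera would.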
  • the lines Om and Os in FIG. 6 are the main optical axes of the monocular camera 24.
• The coordinate information processing unit 27 calculates the harvest order A for performing the harvesting work.
• Based on the calculated position coordinates G1 (x1, y1, z1), G2 (x1, y1, z1), and G3 (x1, y1, z1) of the plurality of target grapes G1, G2, and G3, the coordinate information processing unit 27 calculates, for example, a harvest order A in which the harvesting work is performed in descending order of the Z coordinate, starting with the target grape G1 closest to the articulated robot arm 11.
• The coordinate information processing unit 27 transmits the calculated position coordinates G1 (x1, y1, z1), G2 (x1, y1, z1), and G3 (x1, y1, z1) of the target grapes G1, G2, and G3 and the information of the harvest order A to the external operation terminal C via the communication device 7 of the remote-controlled vehicle 1. As a result, the articulated robot arm control device 25 can efficiently perform the harvesting work in the work area W.
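As a minimal illustration of the ordering step, the sketch below sorts detected grapes by their Z coordinate, on the stated assumption that a larger Z coordinate means closer to the articulated robot arm; the labels and coordinate values are made up for the example.

```python
def harvest_order(grapes):
    """Order detected grapes for harvesting.

    grapes : dict mapping a grape label to its robot-frame (x, y, z).
    Returns labels sorted so that grapes with the larger Z coordinate
    (assumed here to be closer to the arm) come first.
    """
    return sorted(grapes, key=lambda g: grapes[g][2], reverse=True)

# Illustrative detections from the first image Im1:
coords = {"G1": (0.2, 0.9, 1.4), "G2": (0.5, 1.1, 1.1), "G3": (0.8, 1.0, 0.7)}
A = harvest_order(coords)  # → ["G1", "G2", "G3"]
```

Any other predetermined condition (for example a secondary sort on the Y coordinate) can be expressed by changing the sort key.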
  • the drive control unit 28 is a control device that controls the motor of the end effector 23 and the motor unit of each axis of the articulated robot arm 11.
• The drive control unit 28 acquires the control signal for the articulated robot arm 11 from the external operation terminal C via the communication device 7 of the remote-controlled vehicle 1, and transmits the posture information of the articulated robot arm 11 to the external operation terminal C. Further, the drive control unit 28 acquires information on the position coordinates G (n) (x1, y1, z1) of the target grape G (n) and the harvest order A from the coordinate information processing unit 27, and acquires the rotational position information of the motor from the motor unit of each axis.
• The drive control unit 28 transmits a control signal to the drive circuit of the motor of each axis based on the control signal acquired from the external operation terminal C, the acquired position coordinates G (n) (x1, y1, z1) of the target grape G (n), the information on the harvest order A, and the rotational position information from the motor unit of each axis.
  • the articulated robot arm control device 25 can arrange the end effector 23 at an arbitrary position and in an arbitrary posture by the articulated robot arm 11.
• The drive control unit 28 transmits a control signal to the drive circuits of the motors of the gripping device 23a and the cutting device 23b in the end effector 23.
  • the articulated robot arm control device 25 can perform the harvesting operation of the target grape G (n) by the end effector 23.
• The articulated robot arm device 10 harvests the target grapes G (n) from, for example, grapevines planted in a row.
• The remote-controlled vehicle 1 moves along the row of vines. It is assumed that the remote-controlled vehicle 1 has moved the articulated robot arm device 10 to a position where a vine to be harvested is within the movable range of the articulated robot arm 11.
• When the articulated robot arm control device 25 acquires a control signal to start imaging from the external operation terminal C via the vehicle control device 9 or the communication device 7 of the remote-controlled vehicle 1, it starts imaging the work area W.
• The articulated robot arm control device 25 controls the motor units of each axis of the articulated robot arm 11 by the drive control unit 28 to move the monocular camera 24 to the first imaging position P1, which is separated from the work area W by a predetermined distance and located approximately at the center of the work area W in the Z-axis direction and in the X-axis direction of the robot coordinate system.
  • the drive control unit 28 controls the motor units of each axis of the articulated robot arm 11 to move the monocular camera 24 so that the imaging direction of the monocular camera 24 faces the work area W.
• The monocular camera 24 is arranged so that the entire work area W is included in its angle of view. Since the monocular camera 24 has a diagonal angle of view of 90 degrees or more, the entire work area W can fit within its angle of view even in the vicinity of the work area W.
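The claim that a wide diagonal angle of view covers the whole work area from nearby can be checked with pinhole geometry: the area fits once the camera is at least half the area's diagonal divided by tan(θ/2) away. The sketch below is illustrative only; the work-area dimensions are hypothetical.

```python
import math

def min_imaging_distance(work_w: float, work_h: float,
                         fov_diag_deg: float = 90.0) -> float:
    """Smallest camera-to-work-area distance at which a rectangular work
    area W fits inside the camera's diagonal angle of view.
    Assumes a pinhole model with the camera centred on and facing the area."""
    half_diag = math.hypot(work_w, work_h) / 2.0
    return half_diag / math.tan(math.radians(fov_diag_deg) / 2.0)

# A 1.2 m x 0.9 m work area with a 90-degree diagonal angle of view:
d = min_imaging_distance(1.2, 0.9)  # → 0.75 (metres)
```

With θ = 90 degrees, tan(θ/2) = 1, so the required distance equals half the diagonal, which is why the camera can stay close to the work area.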
• The articulated robot arm control device 25 controls the monocular camera 24 by the image processing unit 26 to image the work area W from the first imaging position P1. As shown in FIG. 7, the image processing unit 26 detects, based on the detection data D, all the target grapes G (n) to be harvested present in the captured first image Im1.
  • the detected target grapes G (n) are grapes G1, grapes G2, and grapes G3.
• The articulated robot arm control device 25 controls the motor units of each axis of the articulated robot arm 11 by the drive control unit 28 to move the monocular camera 24 from the first imaging position P1 by an arbitrary movement amount T in an arbitrary direction.
• The articulated robot arm control device 25 controls the monocular camera 24 by the image processing unit 26 to image the work area W from the position moved by the arbitrary movement amount T in the arbitrary direction, and acquires the auxiliary image Is1 of the first image Im1.
  • the articulated robot arm control device 25 transmits the captured first image Im1 and the auxiliary image Is1 of the first image Im1 to the external operation terminal C via the communication device 7 of the remote control vehicle 1.
• The articulated robot arm control device 25 uses the coordinate information processing unit 27 to calculate the position coordinates G1 (x1, z1), G2 (x1, z1), and G3 (x1, z1) of the grape G1, the grape G2, and the grape G3 on the XZ plane in the robot coordinate system from the posture information of the articulated robot arm 11 at the time of capturing the first image Im1 and the mounting position of the monocular camera 24.
• Similarly, the coordinate information processing unit 27 calculates the position coordinates of the grape G1, the grape G2, and the grape G3 on the XZ plane in the robot coordinate system in the auxiliary image Is1 of the first image Im1.
• The coordinate information processing unit 27 generates a parallax image from the first image Im1 and the auxiliary image Is1 of the first image Im1, and calculates the Y coordinates of the grape G1, the grape G2, and the grape G3 in the robot coordinate system. As a result, the coordinate information processing unit 27 calculates the position coordinates G1 (x1, y1, z1), G2 (x1, y1, z1), and G3 (x1, y1, z1) of the grape G1, the grape G2, and the grape G3 in the robot coordinate system.
• When the coordinate information processing unit 27 cannot properly calculate the position coordinates of at least one of the grapes G1, G2, and G3, for example the position coordinates G2 (x1, y1, z1) of the grape G2, the drive control unit 28 transmits a control signal for moving the monocular camera 24 to the motor units of each axis of the articulated robot arm 11.
• The articulated robot arm control device 25 then calculates the position coordinates G1 (x1, y1, z1), G2 (x1, y1, z1), and G3 (x1, y1, z1) of the grape G1, the grape G2, and the grape G3 in the robot coordinate system again.
• The articulated robot arm control device 25 calculates the harvest order A for performing the harvesting work from the position coordinates G1 (x1, y1, z1), G2 (x1, y1, z1), and G3 (x1, y1, z1) of the grape G1, the grape G2, and the grape G3 calculated by the coordinate information processing unit 27.
• The coordinate information processing unit 27 calculates, for example, a harvest order A in which the harvesting work is performed in order from grapes with larger Z coordinates and smaller Y coordinates.
• The articulated robot arm control device 25 sets, for example, a harvest order A in which the harvesting work is performed in the order of the grape G1, the grape G2, and the grape G3.
• The articulated robot arm control device 25 transmits the calculated position coordinates of the grape G1, the grape G2, and the grape G3 and the information of the harvest order A to the external operation terminal C via the communication device 7 of the remote-controlled vehicle 1.
  • the articulated robot arm control device 25 calculates detailed position coordinates in the order of grape G1, grape G2, and grape G3 based on the harvesting order A, and performs harvesting work.
• The articulated robot arm control device 25 controls the motor units of each axis of the articulated robot arm 11 by the drive control unit 28 to move the monocular camera 24 to the second imaging position P2, which is closer to the grape G1 than the first imaging position P1.
• The articulated robot arm control device 25 controls the monocular camera 24 by the image processing unit 26 to image the grape G1 from the second imaging position P2 and acquire the second image Im2 (see FIG. 6).
• The articulated robot arm control device 25 controls the motor units of each axis of the articulated robot arm 11 by the drive control unit 28 to move the monocular camera 24 from the second imaging position P2 by an arbitrary movement amount T in an arbitrary direction.
• The articulated robot arm control device 25 controls the monocular camera 24 by the image processing unit 26 to image the grape G1 from the position moved by the arbitrary movement amount T in the arbitrary direction, and acquires the auxiliary image Is2 of the second image Im2.
  • the articulated robot arm control device 25 transmits the captured second image Im2 and the auxiliary image Is2 of the second image Im2 to the external operation terminal C via the communication device 7 of the remote control vehicle 1.
• The articulated robot arm control device 25 converts the position coordinates of the grape G1 in the second image Im2 into the position coordinates G1 (x2, z2) on the XZ plane in the robot coordinate system by the coordinate information processing unit 27.
• Similarly, the coordinate information processing unit 27 converts the position coordinates of the grape G1 in the auxiliary image Is2 of the second image Im2 into position coordinates on the XZ plane in the robot coordinate system.
• The coordinate information processing unit 27 generates a parallax image from the second image Im2 and the auxiliary image Is2 of the second image Im2, and calculates the Y coordinate of the grape G1 in the robot coordinate system. As a result, the coordinate information processing unit 27 calculates the position coordinates G1 (x2, y2, z2) of the grape G1 in the robot coordinate system. Since the position coordinates G1 (x2, y2, z2) of the grape G1 are calculated from the second image Im2 captured at the second imaging position P2, which is closer to the grape G1 than the first imaging position P1, they are more accurate than the position coordinates G1 (x1, y1, z1).
• The articulated robot arm control device 25 controls the motor units of each axis of the articulated robot arm 11 by the drive control unit 28 to move the end effector 23 to the calculated position coordinates G1 (x2, y2, z2) of the grape G1.
• The drive control unit 28 controls the motor of the end effector 23 to grip the stalk of the grape G1 with the gripping device 23a and separate the stalk of the grape G1 from the branch of the grapevine with the cutting device 23b. After that, the drive control unit 28 controls the motor units of each axis of the articulated robot arm 11 to store the harvested grape G1 in the harvest box H of the remote-controlled vehicle 1 (see FIG. 1).
• Similarly, the articulated robot arm control device 25 calculates the detailed position coordinates G2 (x2, y2, z2) and G3 (x2, y2, z2) of the grape G2 and the grape G3 based on the harvest order A, and performs the harvesting work with the end effector 23.
  • FIG. 9 shows a control flow diagram of the articulated robot arm control device 25 according to the first embodiment of the present invention.
• In step S110, the articulated robot arm control device 25 controls the motor units of each axis of the articulated robot arm 11 by the drive control unit 28 to move the monocular camera 24 to the first imaging position P1, and shifts the step to step S120.
• In step S120, the articulated robot arm control device 25 controls the monocular camera 24 by the image processing unit 26 to capture the first image Im1 of the work area W from the first imaging position P1. Further, the articulated robot arm control device 25 detects the target grapes G (n) from the first image Im1 based on the detection data D by the image processing unit 26, and shifts the step to step S130.
• In step S130, the articulated robot arm control device 25 captures the auxiliary image Is1 of the first image Im1 from a position to which the monocular camera 24 has been moved from the first imaging position P1 by an arbitrary movement amount T in an arbitrary direction, and shifts the step to step S140.
• In step S140, the articulated robot arm control device 25 calculates, by the coordinate information processing unit 27, the position coordinates G (n) (x1, y1, z1) of the target grapes G (n) in the robot coordinate system from the first image Im1 and the auxiliary image Is1 of the first image Im1, and shifts the step to step S150.
• In step S150, the articulated robot arm control device 25 determines whether all the position coordinates G (n) (x1, y1, z1) of the target grapes G (n) could be calculated. When all of them could be calculated, the articulated robot arm control device 25 shifts the step to step S160. On the other hand, when a target grape G (n) could not be properly imaged and not all the position coordinates G (n) (x1, y1, z1) could be calculated, the articulated robot arm control device 25 shifts the step to step S161.
• In step S160, the articulated robot arm control device 25 calculates the harvest order A for performing the harvesting work based on the position coordinates G (n) (x1, y1, z1) of all the target grapes G (n) calculated by the coordinate information processing unit 27, and shifts the step to step S170.
• In step S170, the articulated robot arm control device 25 controls the motor units of each axis of the articulated robot arm 11 by the drive control unit 28 to move the monocular camera 24 to the second imaging position P2 of the target grape G (n) selected based on the harvest order A, and shifts the step to step S180.
• In step S180, the articulated robot arm control device 25 controls the monocular camera 24 by the image processing unit 26 to capture the second image Im2 of the target grape G (n) from the second imaging position P2. Further, the articulated robot arm control device 25 captures the auxiliary image Is2 of the second image Im2 from the position to which the monocular camera 24 has been moved from the second imaging position P2 by an arbitrary movement amount T in an arbitrary direction, and shifts the step to step S190.
• In step S190, the articulated robot arm control device 25 calculates, by the coordinate information processing unit 27, the position coordinates G (n) (x2, y2, z2) of the target grape G (n) in the robot coordinate system from the second image Im2 and the auxiliary image Is2 of the second image Im2. Further, the articulated robot arm control device 25 controls the motor units of each axis of the articulated robot arm 11 by the drive control unit 28, harvests the target grape G (n) at the calculated position coordinates G (n) (x2, y2, z2), and shifts the step to step S200.
• In step S200, the articulated robot arm control device 25 determines whether all the target grapes G (n) have been harvested. If all the target grapes G (n) have been harvested, the articulated robot arm control device 25 ends the harvesting work and ends this flow. Otherwise, the articulated robot arm control device 25 shifts the step to step S170.
• In step S161, the articulated robot arm control device 25 controls the motor units of each axis of the articulated robot arm 11 by the drive control unit 28 to move the monocular camera 24 to a new first imaging position P1 that is closer to the work area W than the previous first imaging position P1, and shifts the step to step S162.
• In step S162, the articulated robot arm control device 25 controls the monocular camera 24 by the image processing unit 26 to capture a new first image Im1 of the work area W from the new first imaging position P1, detects the target grapes G (n) from the new first image Im1 based on the detection data D, and shifts the step to step S163.
• In step S163, the articulated robot arm control device 25 captures the auxiliary image Is1 of the new first image Im1 from a position to which the monocular camera 24 has been moved from the new first imaging position P1 by an arbitrary movement amount T in an arbitrary direction, and shifts the step to step S164.
• In step S164, the articulated robot arm control device 25 calculates, by the coordinate information processing unit 27, new position coordinates G (n) (x1, y1, z1) of the target grapes G (n) in the robot coordinate system from the new first image Im1 and the auxiliary image Is1 of the new first image Im1, and shifts the step to step S150.
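The flow of steps S110 through S200, including the S161 retry branch, can be condensed into a schematic sketch. Here `detect`, `locate`, and `harvest` stand in for the camera, coordinate-processing, and drive operations; all names, the retry limit, and the Z-descending ordering rule are illustrative assumptions, not taken verbatim from the patent.

```python
def harvest_flow(detect, locate, harvest, max_retries=3):
    """Schematic version of the control flow of FIG. 9 (steps S110-S200).

    detect()   : image the work area and return the labels of detected grapes
    locate(g)  : return (x, y, z) for grape g, or None if it cannot be imaged
    harvest(g) : move the end effector to grape g and harvest it
    """
    grapes = detect()                        # S110-S120: image W, detect G(n)
    coords = {}
    for _ in range(max_retries):             # S130-S150, retrying via S161-S164
        coords = {g: locate(g) for g in grapes}
        if all(c is not None for c in coords.values()):
            break
    # S160: harvest order A, here by descending Z (closest first, assumed)
    order = sorted(coords, key=lambda g: coords[g][2], reverse=True)
    harvested = []
    for g in order:                          # S170-S200: refine and harvest
        harvest(g)
        harvested.append(g)
    return harvested

# Minimal stand-ins for the camera and drive operations:
order = harvest_flow(lambda: ["G1", "G2"],
                     lambda g: {"G1": (0.0, 0.0, 2.0), "G2": (0.0, 0.0, 1.0)}[g],
                     lambda g: None)  # → ["G1", "G2"]
```

In the actual device each stand-in would call the image processing unit 26, the coordinate information processing unit 27, and the drive control unit 28 respectively.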
  • the articulated robot arm control device 25 includes a monocular camera 24 provided on the articulated robot arm 11, an image processing unit 26, a coordinate information processing unit 27, and a drive control unit 28.
  • the monocular camera 24 images the work area W at the first imaging position P1 in which the articulated robot arm 11 is separated from the work area W and the work area W is completely included in the angle of view of the monocular camera 24.
• The image processing unit 26 detects the target grapes G (n) from the first image Im1 of the work area W.
• The coordinate information processing unit 27 calculates the position coordinates G (n) (x1, y1, z1) of the target grape G (n) in the robot coordinate system of the articulated robot arm 11 based on the first image Im1 of the detected target grape G (n) and the auxiliary image Is1 of the first image Im1.
• By imaging the work area W at the first imaging position P1 with the monocular camera 24, the articulated robot arm control device 25 can search for the target grapes G (n) in the area including the work area W. Further, the coordinate information processing unit 27 can calculate the position coordinates of the target grapes G (n) based on the first image Im1 and the auxiliary image Is1 of the first image Im1. In this way, the monocular camera 24 provided on the articulated robot arm 11 plays both the role of calculating detailed position coordinates to improve the work accuracy of the end effector 23 and the role of detecting the target grapes G (n) included in the entire work area W.
• The articulated robot arm control device 25 can image targets with different imaging ranges, namely the work area W and the target grape G (n), by moving the monocular camera 24 with the articulated robot arm 11. As a result, the articulated robot arm control device 25 can omit the fixed camera required in the conventional proposal using a fixed camera and a hand camera.
• Because the coordinate information processing unit 27 calculates the position coordinates G (n) (x1, y1, z1) of the plurality of target grapes G (n) in the work area W, the optimum harvest order A for the harvesting operation of the articulated robot arm 11 can be calculated based on predetermined conditions. As a result, the articulated robot arm control device 25 can efficiently harvest target grapes G (n) located at random positions.
  • the monocular camera 24 further images the work area W including the target grape G (n) at the second imaging position P2 different from the first imaging position P1.
• A specific region or a specific target grape G (n) can be imaged from the second imaging position P2 at an angle different from that of the first image Im1 captured from the first imaging position P1, or can be imaged magnified from a position closer than the first imaging position P1.
• The articulated robot arm control device 25 can search the work area W over a wide area using the first image Im1 captured from the first imaging position P1, and can search the work area W locally using the second image Im2 captured from the second imaging position P2. That is, the articulated robot arm control device 25 can acquire images suited to the state of the work area W, the state around the target grape G (n), and the shape of the target grape G (n).
• Since the monocular camera 24 images with an angle of view θ of 90 degrees or more, the entire work area W can be imaged simply by moving the monocular camera 24 within the movable range of the articulated robot arm 11.
  • the articulated robot arm control device 25 can capture the entire working area W within the movable range of the articulated robot arm 11 with the monocular camera 24.
  • the monocular camera 24 can generate a parallax image of the work area W by moving in an arbitrary direction by driving the articulated robot arm 11 and imaging the work area W at a plurality of places.
• The articulated robot arm control device 25 can measure the distance L to the work area W or to the target grape G (n) by moving the monocular camera 24 and imaging. Further, by setting the movement amount T of the monocular camera 24 in an arbitrary direction according to the work environment and work state, the articulated robot arm control device 25 can, compared with a stereo camera whose inter-camera distance is fixed, expand the range of distances at which the target grape G (n) can be recognized and expand the range that can be measured with a given accuracy or better.
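The advantage over a fixed-baseline stereo camera can be quantified with the usual motion-stereo error model: a one-pixel disparity error corresponds to a depth uncertainty of roughly L²·e/(f·T), so enlarging the movement amount T tightens the measurement at long range. The sketch below uses illustrative numbers under this approximation; none of the values come from the patent.

```python
def depth_resolution(distance_l: float, focal_px: float,
                     baseline_t: float, disp_err_px: float = 1.0) -> float:
    """Approximate depth uncertainty of motion stereo:
    dL ≈ L^2 * e / (f * T), where e is the disparity matching error
    in pixels. First-order approximation, illustrative only."""
    return distance_l ** 2 * disp_err_px / (focal_px * baseline_t)

# At L = 2 m with an 800 px focal length, widening T from 5 cm to 15 cm
# cuts the per-pixel depth uncertainty to a third:
e_narrow = depth_resolution(2.0, 800.0, 0.05)  # ≈ 0.10 m
e_wide   = depth_resolution(2.0, 800.0, 0.15)  # ≈ 0.033 m
```

This is why making T adjustable per imaging position (as described below) lets the device trade baseline against the working distance on each shot.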
• The articulated robot arm control device 25 images the plurality of target grapes G (n) in the work area W with the monocular camera 24, and can measure the distance L to each target grape G (n) based on a parallax image suited to the state of that target grape G (n).
• The articulated robot arm 11 has, at its most distal end, a T-axis motor unit 22 that can rotate around the axis of the wrist rotation link 21, and the end effector 23 and the monocular camera 24 are provided on the output shaft of the T-axis motor unit 22, which is the tip rotating portion. The monocular camera 24 is arranged so that at least a part of the gripping device 23a and the cutting device 23b of the end effector 23 is included in the angle of view of the monocular camera 24.
• The articulated robot arm control device 25 can move the monocular camera 24 to a position where the target grape G (n) can be imaged by rotating the monocular camera 24 around the axis of the wrist rotation link 21. As a result, even if the state around the target grape G (n) changes, the articulated robot arm device 10 can continue imaging by rotating the monocular camera 24 to a position from which imaging is possible. Further, since the end effector 23 appears within the angle of view of the monocular camera 24, remote control by the external operation terminal C is easy.
• In this way, the articulated robot arm control device 25 and the articulated robot arm device 10 including it use the articulated robot arm 11 as a moving actuator to move the monocular camera 24 to an arbitrary position and direction. That is, the articulated robot arm control device 25 and the articulated robot arm device 10 including it can image the work range from a position and direction suited to the work environment and work state. Therefore, they can increase the types of work that the articulated robot arm 11 can handle and enhance its versatility.
  • FIG. 10 shows a block diagram of the articulated robot arm control device 25 according to another embodiment of the present invention.
  • The articulated robot arm device 10 calculates the distance to the target grape G(n) using a first image Im1 captured by the monocular camera 24 provided on the articulated robot arm 11 and an auxiliary image Is1 of the first image Im1 captured after moving the camera an arbitrary amount in an arbitrary direction.
  • The articulated robot arm device 10 may further include a distance measuring unit 29, such as a laser ranging sensor, and measure the distance to the target grape G(n) without relying on the monocular camera 24.
  • The articulated robot arm device 10 configured in this way images the work area W with the monocular camera 24, and the coordinate information processing unit 27 acquires the value measured by the distance measuring unit 29, so that the position coordinates G(n)(x1, y1, z1) of the target grape G(n) can be measured.
  • The articulated robot arm control device 25 may control the movement amount T of the monocular camera 24 in an arbitrary direction for capturing a parallax image so that it differs between the first imaging position P1 and the second imaging position P2. By changing this movement amount T, the measurement accuracy when measuring the distance L to the work area W or the target grape G(n) can be adjusted.
  • Even if the distance L from the first imaging position P1 and from the second imaging position P2 to the work area W differs, the articulated robot arm control device 25 can measure the distance L to the work area W or the target grape G(n) with a measurement accuracy suited to the work environment and work state of the articulated robot arm 11.
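The patent states only that the camera's movement amount T sets the baseline of the parallax pair; it gives no formula. As a hedged sketch under a standard pinhole-stereo assumption (not stated in the source), the distance follows from the pixel disparity between the two views; the focal length and pixel values below are purely illustrative:

```python
def distance_from_parallax(focal_px: float, baseline_T: float,
                           u_first: float, u_second: float) -> float:
    """Estimate the distance L to a target from two images taken by one
    monocular camera translated sideways by baseline_T (metres).

    focal_px          : focal length expressed in pixels (assumed known)
    u_first, u_second : horizontal pixel position of the target in the
                        images taken at the two imaging positions
    """
    disparity = abs(u_first - u_second)  # pixel shift between the views
    if disparity == 0:
        raise ValueError("zero disparity: target at infinity or no motion")
    return focal_px * baseline_T / disparity

# A larger movement amount T yields a larger disparity for the same
# distance, which reduces the relative error of the estimate.
L_near = distance_from_parallax(focal_px=800.0, baseline_T=0.05,
                                u_first=400.0, u_second=420.0)  # 2.0 m
```

This also illustrates why varying T between the first and second imaging positions adjusts measurement accuracy: accuracy improves with disparity, and disparity scales with T.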
  • The articulated robot arm control device 25 may capture images from different imaging directions at the first imaging position P1 and the second imaging position P2.
  • FIG. 11 shows a layout diagram of the articulated robot arm control device 25 according to another embodiment of the present invention, in a state where the imaging directions at the first imaging position P1 and the second imaging position P2 differ.
  • Suppose the articulated robot arm control device 25 moves the monocular camera 24 to the second imaging position P2 while keeping the posture the camera had at the first imaging position P1. If this results in a backlit positional relationship in which the lens of the monocular camera 24 faces the sunlight SL (see the white outlined arrow), or in a positional relationship in which grapes, leaves, or the like other than the target grape G(n) lie between the monocular camera 24 and the target grape G(n), the monocular camera 24 cannot image the target grape G(n) from the second imaging position.
  • When the articulated robot arm control device 25 cannot acquire the position coordinates at the second imaging position because of backlight or an obstacle, it acquires images from different imaging directions at the first imaging position P1 and the second imaging position P2. Even when it is difficult to image the target grape G(n) because of backlight, an obstacle, or the like, imaging can thus be performed in a state where the monocular camera 24 is able to capture the target, by changing the imaging direction and imaging position.
  • Because the articulated robot arm control device 25 changes the position and posture of the monocular camera 24 according to changes in the work environment and work state, the target grape G(n) can be imaged even when the state of the work area W changes frequently. That is, by increasing the accuracy with which the monocular camera 24 recognizes the object, the articulated robot arm control device 25 can increase the degree of freedom in the usage environment of the articulated robot arm device 10.
  • Based on the calculated fluctuation amount of the position coordinates of the target grape G(n), the articulated robot arm control device 25 may control the motor units of each axis of the articulated robot arm 11 via the drive control unit 28 so that the changed position coordinates G(n)(xc, yc, zc) of the target grape G(n) are corrected to the position coordinates G(n)(x, y, z) before the change.
  • FIG. 12 shows a schematic view of a remote-controlled vehicle provided with an articulated robot arm device according to another embodiment of the present invention.
  • the position of the remote-controlled vehicle 1 may change due to the influence of the muddy ground or the change in the weight balance of the remote-controlled vehicle 1 due to the change in the posture of the articulated robot arm 11.
  • In that case, the origin of the robot coordinate system in the articulated robot arm device 10 moves. That is, the position coordinates G(n)(x1, y1, z1) or G(n)(x2, y2, z2) of the target grape G(n) calculated by the articulated robot arm control device 25 before the origin of the robot coordinate system moved differ from the position coordinates of the target grape G(n) in the robot coordinate system after the origin moved.
  • The articulated robot arm control device 25 determines that the origin of the robot coordinate system has moved if the position coordinates G(n)(x1c, y1c, z1c) of the target grape G(n) based on the first image Im1 calculated by the coordinate information processing unit 27 differ from the position coordinates G(n)(x1, y1, z1) based on a first image Im1 previously captured from the same imaging position.
  • The articulated robot arm control device 25 controls the motor unit of each axis of the articulated robot arm 11 so that the position coordinates G(n)(x1c, y1c, z1c) of the target grape G(n) after the origin of the robot coordinate system moved coincide with the position coordinates G(n)(x1, y1, z1) before the origin moved.
  • The monocular camera 24 is thereby restored to the position and posture it had before the origin of the robot coordinate system moved (see the black arrow). Therefore, the articulated robot arm control device 25 can continue the work performed under the control of the articulated robot arm 11 even if the state surrounding the articulated robot arm 11 changes. That is, the articulated robot arm control device 25 can increase the degree of freedom in the usage environment of the articulated robot arm device 10.
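The origin-shift check and correction described above can be sketched as follows. The coordinate values and the noise tolerance are illustrative assumptions, not values from the source:

```python
import numpy as np

# Position of the target grape computed earlier from a first image Im1
# captured from this imaging position (robot coordinate system, metres).
stored = np.array([0.80, 0.10, 1.20])   # G(n)(x1, y1, z1)

# Position recomputed now from the same imaging position.
current = np.array([0.83, 0.10, 1.17])  # G(n)(x1c, y1c, z1c)

# Differences larger than this are treated as an origin shift rather
# than measurement noise (hypothetical threshold, not from the source).
TOLERANCE = 0.005

# If the two results differ, the origin of the robot coordinate system
# has moved (e.g. the vehicle shifted on muddy ground or its weight
# balance changed with the arm's posture).
origin_moved = bool(np.linalg.norm(current - stored) > TOLERANCE)

# Translation the drive control unit must compensate so the camera
# returns to the position and posture it had before the origin moved.
correction = stored - current if origin_moved else np.zeros(3)
```

Driving the arm by `correction` makes the recomputed coordinates coincide with the stored ones, which is the condition the control device enforces.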
  • the articulated robot arm control device 25 determines the harvesting order A of the target grape G (n) in the work area W by a predetermined program based on the position of the target grape G (n).
  • the articulated robot arm control device 25 may determine the harvest order A by AI (artificial intelligence) or by applying a specific standard such as the size order of the target grape G (n).
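The source names two ways to determine the harvest order A: a specific standard such as size order, or a position-based program. The two helper functions below are hypothetical illustrations of each policy, not the patent's actual algorithm:

```python
# Each detected target grape: (index n, position (x, y, z), apparent size).
targets = [
    (1, (0.9, 0.4, 1.5), 120.0),
    (2, (0.2, 0.1, 1.2), 200.0),
    (3, (0.5, 0.3, 1.4), 150.0),
]

def order_by_size(targets):
    """Harvest order A by a specific standard: largest target first."""
    return [n for n, _pos, size in sorted(targets, key=lambda t: -t[2])]

def order_by_path(targets, start=(0.0, 0.0, 0.0)):
    """Greedy nearest-neighbour order, shortening the arm's travel path."""
    remaining = list(targets)
    here, order = start, []
    while remaining:
        nxt = min(remaining,
                  key=lambda t: sum((a - b) ** 2
                                    for a, b in zip(t[1], here)))
        order.append(nxt[0])
        here = nxt[1]
        remaining.remove(nxt)
    return order
```

Either function yields a complete ordering of the detected targets; an AI-based ranking, also mentioned in the source, would simply replace the sort key.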
  • the articulated robot arm 11 is a 6-axis vertical articulated robot arm.
  • In the articulated robot arm 11, for example, an S-axis motor unit 12, an L-axis motor unit 14, a U-axis motor unit 16, a B-axis motor unit 18, an R-axis motor unit 20, and a T-axis motor unit 22 are connected in series by links.
  • However, the articulated robot arm 11 may have any structure in which the motor units of the respective axes are connected, in whatever connection order and axial orientation, that constitutes an articulated robot arm.
  • the articulated robot arm 11 is provided on the upper surface of the vehicle body 2 in the remote-controlled vehicle 1 and in front of the vehicle body 2, but this is an example and is not limited thereto.
  • the articulated robot arm 11 may be provided on the upper surface of the vehicle body 2 and on the rear side or the left or right side of the vehicle body 2.
  • the articulated robot arm 11 may be provided on the front-rear side surface or the left-right side surface of the remote-controlled vehicle 1.
  • the articulated robot arm device 10 performs harvesting work of the target grape G (n), which is an agricultural product, but this is an example and is not limited thereto.
  • The articulated robot arm device 10 carries out work according to the type of end effector 23 mounted on the articulated robot arm 11, regardless of whether the work is outdoors or indoors, for example not only harvesting agricultural products outdoors but also handling industrial parts indoors.
  • the end effector 23 is a device including a gripping device 23a and a cutting device 23b, and grips and cuts the target grape G (n), but is not limited thereto.
  • the end effector 23 may be any device that performs a predetermined operation on the object.
  • the articulated robot arm device 10 semi-automatically harvests the target grape G (n) by the control signal from the external communication terminal and the control signal from the articulated robot arm control device 25.
  • the articulated robot arm device 10 may be configured to control the motor unit, the end effector 23, and the monocular camera 24 of each axis of the articulated robot arm 11 by a control signal from an external operation terminal C.
  • When a control signal is acquired from the operation terminal C, the remote-controlled vehicle 1 is controlled by the vehicle control device 9 so as to move along a predetermined route, but this is an example and is not limited thereto.
  • the remote-controlled vehicle 1 may be configured to move independently based on position information and map data from GNSS (Global Navigation Satellite System).
  • The remote-controlled vehicle 1 is a four-wheeled vehicle including a pair of wheels 3 and a pair of wheels 4, but this is an example and is not limited thereto.
  • the remote-controlled vehicle may be a vehicle other than a four-wheeled vehicle, for example, a three-wheeled vehicle or a two-wheeled vehicle.

Abstract

The present invention addresses the problem of obtaining a multi-joint robot-arm control device and a multi-joint robot arm device that can enhance versatility of the multi-joint robot arm by increasing types of work that the multi-joint robot arm can handle. The present invention comprises: a single monocular camera 24 provided in a multi-joint robot arm 11; an image processing unit 26 that detects an image of a target grape G(n) from images photographed by the monocular camera 24; a coordinate information processing unit 27 that calculates a position coordinate of the target grape G(n); and a driving control unit 28 that drives an actuator of the multi-joint robot arm 11. The monocular camera 24 captures an image of a work area W in a first image-capturing position P1 positioned at a predetermined distance from the work area W. The image processing unit 26 detects an image of the target grape G(n) from the image of the work area W. On the basis of the detected image of the target grape G(n), the coordinate information processing unit 27 calculates a position coordinate of the target grape G(n) in a coordinate system of the multi-joint robot arm 11.

Description

Articulated robot arm control device and articulated robot arm device
 The present invention relates to an articulated robot arm control device and an articulated robot arm device.
 An articulated robot arm control device that controls an articulated robot arm based on images captured by a fixed camera and by a hand camera provided at the tip of the arm is known. To improve the accuracy of the articulated robot arm, this control device corrects the measurement data obtained from the image captured by the fixed camera using the measurement data obtained from the image captured by the hand camera.
 For example, Patent Document 1 discloses a robot device having a robot arm, a first visual sensor that measures a measurement range including the movable range of the robot arm, a second visual sensor positioned at the tip of the robot arm, and a control device that controls the position and orientation of the robot arm. When the control device controls the position and orientation of the robot arm based on the measured values of the first visual sensor, it generates command values to be given to the robot arm using the measured values of the second visual sensor.
JP-A-2018-202608
 In the control device of Patent Document 1, the appropriate placement of the fixed camera is determined in consideration of the usage environment of the articulated robot arm, the type of object to be handled, the degree of freedom of the work area, and so on. Therefore, if the usage environment of the articulated robot arm, the type of object to be handled, the degree of freedom of the work area, or the like changes, control based on images captured by the fixed camera described in Patent Document 1 may no longer allow the articulated robot arm to cope.
 It is thus desired to increase the versatility of the articulated robot arm by increasing the types of work it can handle. The type of work includes, for example, the usage environment of the articulated robot arm (outdoors, weather, vehicle-mounted or fixed, etc.), the type of object (industrial parts, agricultural products, cooking utensils, etc.), and the degree of freedom of the work area.
 The present invention provides an articulated robot arm control device and an articulated robot arm device that can increase the versatility of an articulated robot arm by increasing the types of work the articulated robot arm can handle.
 The present inventors examined articulated robot arm control devices and articulated robot arm devices that could increase the versatility of an articulated robot arm, using the conventionally proposed fixed camera and hand camera, by increasing the types of work the arm can handle. As a result of diligent study, the inventors arrived at the following configuration.
 An articulated robot arm control device according to an embodiment of the present invention includes an imaging unit provided on an articulated robot arm, an image processing unit that detects an image of an object from an image captured by the imaging unit, a coordinate information processing unit that calculates the position coordinates of the object, and a drive control unit that drives the actuators of the articulated robot arm. The imaging unit images a work area, in which the articulated robot arm works on the object, from a first imaging position away from the work area; the image processing unit detects the image of the object from the image of the work area; and the coordinate information processing unit calculates the position coordinates of the object in the coordinate system of the articulated robot arm based on the detected image of the object.
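The source describes the coordinate information processing unit only at the block-diagram level. As one hedged sketch of how a pixel detection could be mapped into the arm's coordinate system, assuming a pinhole camera with known intrinsics and a camera pose derived from the joint angles (none of these details are specified in the patent, and all numeric values are illustrative):

```python
import numpy as np

def pixel_to_robot(u, v, depth, fx, fy, cx, cy, T_robot_cam):
    """Convert a detected object's pixel position (u, v) plus a measured
    depth into robot-frame coordinates.

    fx, fy, cx, cy : assumed pinhole intrinsics (focal lengths, principal
                     point) in pixels
    T_robot_cam    : 4x4 pose of the camera in the arm's coordinate
                     system, assumed known from the arm's joint angles
    """
    # Back-project the pixel into the camera frame.
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    p_cam = np.array([x, y, depth, 1.0])  # homogeneous camera-frame point
    # Transform into the robot (arm) coordinate system.
    return (T_robot_cam @ p_cam)[:3]

# Identity pose: camera frame coincides with the robot frame.
p = pixel_to_robot(u=420.0, v=260.0, depth=2.0,
                   fx=800.0, fy=800.0, cx=400.0, cy=240.0,
                   T_robot_cam=np.eye(4))  # → [0.05, 0.05, 2.0]
```

The depth here could come from the parallax measurement or a ranging sensor described elsewhere in the document; the drive control unit would then move the arm toward the returned robot-frame coordinates.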
 As described above, in the articulated robot arm control device, the imaging unit images the work area from the first imaging position, which is away from the work area in which the articulated robot arm works on the object, so the object can be searched for in a region that includes the work area. Further, the image processing unit detects the image of the object from the entire captured image of the work area, and the coordinate information processing unit calculates the position coordinates of the object in the coordinate system of the articulated robot arm based on the image of the object. The articulated robot arm control device can thereby identify the positions of a plurality of objects within the work area. In this way, the imaging unit, which corresponds to the conventionally proposed hand camera, also plays the role of the conventionally proposed fixed camera by imaging the region including the work area from the first imaging position away from the work area. The articulated robot arm control device can therefore omit the fixed camera used in the conventional proposal that combines a fixed camera and a hand camera.
 Moreover, as described above, by calculating the position coordinates of a plurality of objects within the work area, the articulated robot arm control device can calculate, based on predetermined conditions, an optimal work path for the articulated robot arm to follow when working. The control device can thereby make the articulated robot arm work on the objects efficiently even if a plurality of objects are randomly arranged in the work area.
 Therefore, in the articulated robot arm control device, the degree of freedom in the usage environment of the articulated robot arm device can be increased by increasing the types of work that the articulated robot arm can handle. The versatility of the articulated robot arm device can thus be increased.
 From another point of view, the articulated robot arm control device of the present invention preferably includes the following configuration: the imaging unit further images the work area containing the object from a second imaging position different from the first imaging position.
 As described above, by further imaging the work area in which the object is located from the second imaging position, which differs from the first imaging position, the articulated robot arm control device can image a specific region or a specific object within the image of the work area captured at the first imaging position from a different angle, or at a larger magnification. The control device can thereby acquire an image suited to the state of the work area and the shape of the object.
 Therefore, in the articulated robot arm control device that further images the work area in which the object is located from the second imaging position different from the first imaging position, the degree of freedom in the usage environment of the articulated robot arm device can be increased by increasing the types of work the articulated robot arm can handle. The versatility of the articulated robot arm device can thus be increased.
 From another point of view, the articulated robot arm control device of the present invention preferably includes the following configuration: the second imaging position is closer to the work area than the first imaging position.
 As described above, the articulated robot arm control device can obtain an image of the whole work area, including the object, from the image captured at the first imaging position, and can obtain a detailed image of one specific object among the plurality of objects imaged at the first imaging position by imaging at the second imaging position. With a single imaging unit, the control device can thereby search the work area broadly using the image captured at the first imaging position and search it locally using the image captured at the second imaging position.
 Therefore, in the articulated robot arm control device that images the work area in which the object is located from the second imaging position closer to the work area than the first imaging position, the degree of freedom in the usage environment of the articulated robot arm device can be increased by increasing the types of work the articulated robot arm can handle. The versatility of the articulated robot arm device can thus be increased.
 From another point of view, the articulated robot arm control device of the present invention preferably includes the following configuration: the imaging unit images the work area at an angle of view of 90 degrees or more over its imaging range.
 As described above, because the imaging unit has an angle of view of 90 degrees or more over the range actually captured, the entire work area can be imaged from close to the work area. The articulated robot arm control device can thereby secure, within the movable range of the articulated robot arm, an imaging position at which the whole work area is contained in the imaging range of the imaging unit.
 Therefore, in the articulated robot arm control device having an imaging unit that images the work area at a diagonal angle of view of 90 degrees or more, the degree of freedom in the usage environment of the articulated robot arm device can be increased by increasing the types of work the articulated robot arm can handle. The versatility of the articulated robot arm device can thus be increased.
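The claim that a wide angle of view lets the whole work area be imaged from nearby can be quantified with basic pinhole geometry; the formula and the 2 m work-area width below are illustrative assumptions, not values from the source:

```python
import math

def min_distance_to_cover(width_m: float, fov_deg: float) -> float:
    """Smallest camera-to-area distance at which a work area of the given
    width fits inside the horizontal angle of view (pinhole assumption)."""
    return width_m / (2.0 * math.tan(math.radians(fov_deg) / 2.0))

# A 90-degree angle of view covers a 2 m wide work area from 1.0 m away,
# whereas a 60-degree angle of view needs roughly 1.73 m of clearance.
d90 = min_distance_to_cover(width_m=2.0, fov_deg=90.0)
d60 = min_distance_to_cover(width_m=2.0, fov_deg=60.0)
```

The wider the angle of view, the shorter the standoff distance the arm must provide, which is why an imaging position covering the whole work area can be secured within the arm's movable range.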
 From another point of view, the articulated robot arm control device of the present invention preferably includes the following configuration: the imaging unit includes a monocular camera that can be moved in an arbitrary direction by driving the articulated robot arm and that can acquire parallax images of the work area.
 As described above, by moving the monocular camera with the articulated robot arm and capturing images, the articulated robot arm control device can measure the distance to the work area or the object. The control device can thereby image the work area or the object with the monocular camera and, at the same time, measure the distance to the work area or the object.
 Therefore, in the articulated robot arm control device that images the work area with the monocular camera, the degree of freedom in the usage environment of the articulated robot arm device can be increased by increasing the types of work the articulated robot arm can handle. The versatility of the articulated robot arm device can thus be increased.
 From another point of view, the articulated robot arm control device of the present invention preferably includes the following configuration: the amount by which the monocular camera moves in an arbitrary direction to capture the parallax images differs between the first imaging position and the second imaging position.
 As described above, by changing the amount by which the monocular camera moves in an arbitrary direction to capture parallax images between the first imaging position and the second imaging position, which is closer to the work area, the articulated robot arm control device can adjust the measurement accuracy when measuring the distance to the work area or the object. The control device can thereby measure the distance to the work area or the object with a measurement accuracy based on the positional relationship between the first and second imaging positions and the work area.
 Therefore, in the articulated robot arm control device in which the amount by which the monocular camera moves in an arbitrary direction to capture the parallax images differs between the first imaging position and the second imaging position, the degree of freedom in the usage environment of the articulated robot arm device can be increased by increasing the types of work the articulated robot arm can handle. The versatility of the articulated robot arm device can thus be increased.
 From another point of view, the articulated robot arm control device of the present invention preferably includes the following configuration: the imaging unit captures images from different imaging directions at the first imaging position and the second imaging position.
 As described above, by imaging at the second imaging position from a direction different from that at the first imaging position, the articulated robot arm control device can avoid situations in which imaging the object is difficult because of backlight, obstacles, or the like. The control device can thereby change the imaging position according to the surrounding conditions.
 Therefore, in the articulated robot arm control device that captures images from different imaging directions at the first imaging position and the second imaging position, the degree of freedom in the usage environment of the articulated robot arm device can be increased by increasing the types of work the articulated robot arm can handle. The versatility of the articulated robot arm device can thus be increased.
 From another point of view, the articulated robot arm control device of the present invention preferably includes the following configuration: the drive control unit corrects the control amounts of the actuators of the articulated robot arm based on the calculated fluctuation of the position coordinates of the object.
 As described above, because the drive control unit corrects the control amounts of the actuators of the articulated robot arm based on the calculated fluctuation of the position coordinates of the object, the position of the imaging unit can be maintained by the articulated robot arm even if the position of the articulated robot arm changes. The articulated robot arm control device can thereby continue the work even if the state surrounding the articulated robot arm changes.
 Therefore, in the articulated robot arm control device that corrects the control amounts of the actuators of the articulated robot arm based on the fluctuation of the position coordinates of the object, the degree of freedom in the usage environment of the articulated robot arm device can be increased by increasing the types of work the articulated robot arm can handle. The versatility of the articulated robot arm device can thus be increased.
 From another point of view, the articulated robot arm device of the present invention preferably includes the following configuration: the articulated robot arm device includes the articulated robot arm, and the articulated robot arm is controlled by any one of the articulated robot arm control devices described above.
 As described above, because the articulated robot arm is controlled by the articulated robot arm control device, the imaging unit provided on the articulated robot arm can be placed at an arbitrary position within the movable range of the articulated robot arm to acquire information about the work area or the object. The articulated robot arm device can thereby continue the work even if the state of the work area, the object, or the surroundings of the articulated robot arm changes.
 Therefore, in the articulated robot arm device controlled by the articulated robot arm control device, the degree of freedom in the usage environment of the articulated robot arm device can be increased by increasing the types of work the articulated robot arm can handle. The versatility of the articulated robot arm device can thus be increased.
 他の観点によれば、本発明の多関節ロボットアーム装置は、以下の構成を含むことが好ましい。前記多関節ロボットアームは、最も先端のリンクの軸線まわりに回転可能な先端回転部を有し、前記先端回転部に前記撮像部が設けられている。 From another point of view, the articulated robot arm device of the present invention preferably includes the following configurations. The articulated robot arm has a tip rotating portion that can rotate around the axis of the most advanced link, and the tip rotating portion is provided with the imaging unit.
 上述のように、前記撮像部が前記リンクの軸線まわりに回転可能な先端回転部に設けられているので、対象物を撮像可能な位置に前記撮像部を移動することができる。これにより、前記多関節ロボットアーム装置は、前記多関節ロボットアームの周囲の状態が変化しても撮像を継続することができる。 As described above, since the imaging unit is provided at the tip rotating unit that can rotate around the axis of the link, the imaging unit can be moved to a position where an object can be imaged. As a result, the articulated robot arm device can continue imaging even if the surrounding state of the articulated robot arm changes.
 従って、前記撮像部が前記リンクの軸線まわりに回転可能な先端回転部に設けられている多関節ロボットアーム装置において、前記多関節ロボットアームで対応できる作業の種類を増やすことで多関節ロボットアーム装置の使用環境の自由度を高めることができる。よって、多関節ロボットアーム装置の汎用性を高めることができる。 Therefore, in the articulated robot arm device in which the imaging unit is provided at the tip rotating portion that can rotate around the axis of the link, the articulated robot arm device can be handled by increasing the types of work that can be handled by the articulated robot arm. It is possible to increase the degree of freedom in the usage environment of. Therefore, the versatility of the articulated robot arm device can be increased.
The terminology used herein is used only for the purpose of describing particular embodiments, and is not intended to limit the invention.
As used herein, "and/or" includes any and all combinations of one or more of the associated listed items.
As used herein, the use of "including", "comprising", or "having" and variations thereof specifies the presence of the stated features, steps, elements, components, and/or their equivalents, but may include one or more of steps, operations, elements, components, and/or groups thereof.
As used herein, "attached", "connected", "coupled", and/or their equivalents are used in a broad sense and encompass both direct and indirect attachment, connection, and coupling. Furthermore, "connected" and "coupled" are not limited to physical or mechanical connections or couplings, and can include direct or indirect connections or couplings.
Unless otherwise defined, all terms used herein (including technical and scientific terms) have the same meaning as commonly understood by one of ordinary skill in the art to which the present invention belongs.
Terms defined in commonly used dictionaries should be interpreted as having a meaning consistent with their meaning in the context of the related art and the present disclosure, and are not to be interpreted in an idealized or overly formal sense unless expressly so defined herein.
It is understood that a number of techniques and steps are disclosed in the description of the present invention. Each of these has individual benefit, and each can also be used together with one or more, or in some cases all, of the other disclosed techniques.
Accordingly, for the sake of clarity, this description refrains from unnecessarily repeating every possible combination of the individual steps. Nevertheless, the specification and claims should be read with the understanding that all such combinations are entirely within the scope of the present invention.
[Articulated robot arm]
In this specification, an articulated robot arm means a robot arm having a plurality of joint portions that connect a plurality of links. The articulated robot arm includes a vertical articulated robot arm. Specifically, the vertical articulated robot arm is a robot arm with a serial link mechanism in which links are connected in series from the root to the tip by rotary joints or linear motion joints each having one degree of freedom. The vertical articulated robot arm has a plurality of joint portions.
[Work area]
In this specification, the work area means the region through which the articulated robot arm may pass while it approaches the object on which it is working, or on which it is scheduled to work, when the articulated robot arm performs work on an object. The region through which the articulated robot arm passes when it moves away from the position of one object is excluded.
[Angle of view]
In this specification, the angle of view means the angle indicating the range actually captured by the camera. Specifically, the angle of view is the diagonal angle of view, that is, the angle formed at the optical center of the lens by the two diagonally opposite points of the light receiving surface of the image sensor. The diagonal angle of view is therefore determined by the effective focal length of the lens and the length of the diagonal of the light receiving surface. For a light receiving surface of a given size, the diagonal angle of view becomes smaller as the effective focal length of the lens becomes longer, and it also becomes smaller as the diagonal of the light receiving surface becomes shorter.
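The dependence of the diagonal angle of view on the effective focal length and the sensor diagonal described above can be sketched numerically. The formula θ = 2·arctan(d / 2f) and the sample focal lengths and diagonals below are illustrative assumptions, not values taken from this disclosure:

```python
import math

def diagonal_angle_of_view_deg(effective_focal_length_mm, sensor_diagonal_mm):
    """Diagonal angle of view (degrees) determined by the effective focal
    length of the lens and the diagonal of the light receiving surface."""
    return 2.0 * math.degrees(
        math.atan(sensor_diagonal_mm / (2.0 * effective_focal_length_mm)))

# Same light receiving surface (43.3 mm diagonal assumed): a longer
# effective focal length gives a smaller diagonal angle of view.
wide = diagonal_angle_of_view_deg(20.0, 43.3)  # ≈ 94.5 degrees
tele = diagonal_angle_of_view_deg(50.0, 43.3)  # ≈ 46.8 degrees
# Same focal length, shorter diagonal: the angle also becomes smaller.
small_sensor = diagonal_angle_of_view_deg(20.0, 8.0)
```

Note that the first sample above already exceeds the 90-degree diagonal angle of view that the monocular camera of Embodiment 1 is described as having.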
According to one embodiment of the present invention, it is possible to provide an articulated robot arm control device and an articulated robot arm device capable of increasing the versatility of the articulated robot arm by increasing the types of work that the articulated robot arm can handle.
[Brief description of drawings]
A schematic view of a remote-controlled vehicle provided with the articulated robot arm device according to Embodiment 1 of the present invention.
A block diagram of the control configuration of the remote-controlled vehicle 1.
A schematic view of the articulated robot arm device according to Embodiment 1 of the present invention.
A plan view of the end effector according to Embodiment 1 of the present invention.
A side view of the end effector according to Embodiment 1 of the present invention.
A diagram schematically showing the viewing angle of the monocular camera provided on the end effector.
A block diagram of the articulated robot arm control device.
A diagram showing the relationship between the amount of movement of the monocular camera and the distance from the image-forming position of the monocular camera to the crop to be harvested.
A diagram showing an example of a processed image, calculated by the articulated robot arm control device, of the target grapes and the harvesting order.
A schematic view showing an example of the movement of the end effector for generating parallax images with the monocular camera.
A control flow diagram of the articulated robot arm control device according to Embodiment 1 of the present invention.
A block diagram of an articulated robot arm control device according to another embodiment of the present invention.
A diagram showing a state in which the imaging direction differs between a first imaging position and a second imaging position in an articulated robot arm control device according to another embodiment of the present invention.
A schematic view of a remote-controlled vehicle provided with an articulated robot arm device according to another embodiment of the present invention.
Each embodiment will be described below with reference to the drawings. In each figure, the same parts are given the same reference numerals, and the description of those parts is not repeated. Note that the dimensions of the components in each figure do not faithfully represent the actual dimensions of the components, the dimensional ratios between the components, and the like.
<Overall configuration>
The remote-controlled vehicle 1 according to Embodiment 1 of the present invention will be described with reference to FIGS. 1 and 2. FIG. 1 is a schematic view of the remote-controlled vehicle 1 provided with the articulated robot arm device 10 according to Embodiment 1 of the present invention. FIG. 2 is a block diagram of the control configuration of the remote-controlled vehicle 1. In the present embodiment, of the two ends of the remote-controlled vehicle 1 in the traveling direction, the direction from the end where the articulated robot arm is not arranged toward the end where the articulated robot arm is arranged is defined as the forward direction of the remote-controlled vehicle 1.
The arrows X, Y, and Z in the figures below indicate the coordinate axis directions of the Cartesian coordinate system of the articulated robot arm device 10 (hereinafter simply referred to as the "robot coordinate system"). In the robot coordinate system, the X-axis direction coincides with the forward direction of the remote-controlled vehicle 1. Accordingly, in the robot coordinate system, with the forward direction of the remote-controlled vehicle 1 as the X-axis direction and the vertically upward direction as the Z-axis direction, the leftward direction when facing the forward direction of the remote-controlled vehicle 1 is the Y-axis direction. The origin of the robot coordinate system is the intersection of the axis of the S-axis motor unit 12 and the axis of the L-axis motor unit 14 of the articulated robot arm 11, which will be described later.
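The axis convention just stated is consistent with a right-handed coordinate frame. The short check below is illustrative only and is not part of this disclosure:

```python
def cross(a, b):
    """Cross product of two 3-vectors given as tuples."""
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

# Robot coordinate system as defined above:
# X = forward direction of the remote-controlled vehicle, Z = vertically up.
x_axis = (1.0, 0.0, 0.0)
z_axis = (0.0, 0.0, 1.0)
# In a right-handed frame, Y = Z x X, which points to the left
# when facing the forward direction, as stated in the text.
y_axis = cross(z_axis, x_axis)
```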
As shown in FIG. 1, the remote-controlled vehicle 1 is a four-wheeled vehicle that is remotely operated by external control signals. The remote-controlled vehicle 1 includes a vehicle body 2, a pair of wheels 3, a pair of wheels 4, a drive motor 5 that drives the wheels 3 and 4, a steering motor 6 that steers the wheels 3 and 4, a communication device 7, a battery 8, and a vehicle control device 9.
(Remote-controlled vehicle)
In the remote-controlled vehicle 1, the pair of wheels 3 are located at the front of the vehicle body 2, and the pair of wheels 4 are located at the rear of the vehicle body 2. Thus, for example, the pair of wheels 3 are the front wheels and the pair of wheels 4 are the rear wheels. Each of the wheels 3 and 4 is provided on the vehicle body 2 so as to be steerable by a steering device (not shown). A harvest box H for storing harvested crops is mounted on the upper surface of the vehicle body 2.
The drive motor 5 is an actuator that applies driving force to each of the pair of wheels 3. The drive motor 5 is provided on the pair of wheels 3. The drive motor 5 applies driving force to the pair of wheels 3 via, for example, a speed reducer (not shown).
The steering motor 6 is an actuator that steers the pairs of wheels 3 and 4. The steering motor 6 is provided in the steering device (not shown). The steering motor 6 steers the pairs of wheels 3 and 4 by driving the steering device.
The communication device 7 transmits and receives control signals to and from an external operation terminal C. The communication device 7 is provided on the vehicle body 2. The communication device 7 receives control signals transmitted from the external operation terminal C. The communication device 7 also transmits control signals output from the vehicle control device 9 to the operation terminal C. The operation terminal C transmits, as control signals, signals for remotely operating the remote-controlled vehicle 1 to the communication device 7.
The battery 8 is a chargeable and dischargeable battery. The battery 8 is provided on the vehicle body 2. The battery 8 is, for example, a lead storage battery, an alkaline storage battery, or a lithium-ion battery. The battery 8 supplies electric power to the drive motor 5, the steering motor 6, the communication device 7, the vehicle control device 9, and the articulated robot arm device 10 described later.
The vehicle control device 9 is a device that controls the remote-controlled vehicle 1. The vehicle control device 9 may in practice be configured with a CPU, ROM, RAM, HDD, and the like connected by a bus, or may be configured as a one-chip LSI or the like. The vehicle control device 9 stores various programs and data for controlling the operation of the drive motor 5, the steering motor 6, the communication device 7, and the like.
As shown in FIG. 2, the vehicle control device 9 is communicably connected to the drive circuit of the drive motor 5. The vehicle control device 9 can thereby transmit control signals to the drive circuit of the drive motor 5.
The vehicle control device 9 is communicably connected to the drive circuit of the steering motor 6. The vehicle control device 9 can thereby transmit control signals to the drive circuit of the steering motor 6.
The vehicle control device 9 is communicably connected to the communication device 7. The vehicle control device 9 can thereby acquire the control signals transmitted from the external operation terminal C and received by the communication device 7. The vehicle control device 9 can also transmit control signals to the external operation terminal C via the communication device 7.
The vehicle control device 9 is electrically connected to the battery 8, so that electric power is supplied to the vehicle control device 9 from the battery 8. The vehicle control device 9 is also communicably connected to the battery 8, so that the vehicle control device 9 can acquire information about the state of the battery 8.
The remote-controlled vehicle 1 configured as described above is remotely operated by control signals transmitted from the external operation terminal C.
(Articulated robot arm device 10)
Next, the overall configuration of the articulated robot arm device 10 according to Embodiment 1 of the present invention will be described with reference to FIGS. 3 to 5. FIG. 3 is a schematic view of the articulated robot arm device 10 according to Embodiment 1 of the present invention. FIG. 4A is a plan view of the end effector according to Embodiment 1 of the present invention. FIG. 4B is a side view of the end effector according to Embodiment 1 of the present invention. FIG. 4C is a diagram schematically showing the viewing angle of the monocular camera provided on the end effector. FIG. 5 is a block diagram of the articulated robot arm control device 25. The articulated robot arm device 10 includes an articulated robot arm 11, an end effector 23, and an articulated robot arm control device 25.
(Articulated robot arm 11)
As shown in FIG. 3, the articulated robot arm 11 in the present embodiment is a robot arm with a serial link mechanism in which links are connected in series from the base end to the tip by rotary joints each having one degree of freedom. The articulated robot arm 11 is, for example, a six-axis vertical articulated robot arm. The articulated robot arm 11 is provided on the front portion of the upper surface of the vehicle body 2 of the remote-controlled vehicle 1.
In the articulated robot arm 11, an S-axis motor unit 12, an L-axis motor unit 14, a U-axis motor unit 16, a B-axis motor unit 18, an R-axis motor unit 20, and a T-axis motor unit 22 are connected in series by links, in this order from the base end portion fixed to the remote-controlled vehicle 1. The motor unit of each axis constitutes a rotary joint. The motor unit of each axis includes a motor, a speed reducer, an encoder, and a drive circuit (none of which are shown). The articulated robot arm 11 is controlled by the articulated robot arm control device 25. The articulated robot arm 11 receives control signals from the articulated robot arm control device 25 at the drive circuit of each axis. The articulated robot arm 11 also transmits information about the output of the motor of each axis motor unit, and information from the encoders, to the articulated robot arm control device 25.
The S-axis motor unit 12 is provided on the remote-controlled vehicle 1. The S-axis motor unit 12 is a rotary joint that turns the entire articulated robot arm 11. The S-axis motor unit 12 is arranged so that its axis extends in the direction perpendicular to the installation surface of the articulated robot arm 11. A base member 13 is fixed to the output shaft of the S-axis motor unit 12. The L-axis motor unit 14 is provided on the base member 13.
The L-axis motor unit 14 is a rotary joint that swings a lower arm link 15. The L-axis motor unit 14 is arranged so that its axis extends in the direction perpendicular to the axis of the S-axis motor unit 12. One end of the lower arm link 15 is fixed to the output shaft of the L-axis motor unit 14. The U-axis motor unit 16 is provided at the other end of the lower arm link 15.
The U-axis motor unit 16 is a rotary joint that swings an upper arm link 17. The U-axis motor unit 16 is arranged so that its axis extends in the direction parallel to the axis of the L-axis motor unit 14. One end of the upper arm link 17 is fixed to the output shaft of the U-axis motor unit 16. The B-axis motor unit 18 is provided at the other end of the upper arm link 17.
The B-axis motor unit 18 is a rotary joint that swings a wrist vertical link 19. The B-axis motor unit 18 is arranged so that its axis extends in the direction parallel to the axis of the U-axis motor unit 16. The wrist vertical link 19 is fixed to the output shaft of the B-axis motor unit 18. The R-axis motor unit 20 is provided on the wrist vertical link 19.
The R-axis motor unit 20 is a rotary joint that rotates a wrist rotation link 21. The R-axis motor unit 20 is arranged so that its axis extends in the direction perpendicular to the axis of the B-axis motor unit 18. The wrist rotation link 21 is fixed to the output shaft of the R-axis motor unit 20. The T-axis motor unit 22 is provided on the wrist rotation link 21.
The T-axis motor unit 22 is a rotary joint that rotates the end effector 23. The T-axis motor unit 22 is arranged so that its axis extends in the direction perpendicular to the axis of the R-axis motor unit 20. The end effector 23 is fixed to the output shaft of the T-axis motor unit 22.
The articulated robot arm 11 configured in this way has, through the motor units of the respective axes, a total of six degrees of freedom: three translational degrees of freedom in the X-, Y-, and Z-axis directions and three rotational degrees of freedom around the X, Y, and Z axes. Therefore, within the movable space of the articulated robot arm 11, the articulated robot arm 11 can move the end effector 23 fixed to the output shaft of the T axis to an arbitrary position and place it in an arbitrary posture.
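As an illustrative sketch only, the pose of the T-axis output can be computed by chaining one homogeneous transform per motor unit. The local joint axes below follow the perpendicular/parallel relations stated above, but the axis assignments and link lengths are assumptions for illustration, not the actual geometry of the arm:

```python
import math

def matmul(A, B):
    """Product of two 4x4 homogeneous transforms (nested lists)."""
    return [[sum(A[i][k] * B[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def rot_z(t):
    c, s = math.cos(t), math.sin(t)
    return [[c, -s, 0, 0], [s, c, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]]

def rot_y(t):
    c, s = math.cos(t), math.sin(t)
    return [[c, 0, s, 0], [0, 1, 0, 0], [-s, 0, c, 0], [0, 0, 0, 1]]

def trans_z(d):
    return [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, d], [0, 0, 0, 1]]

def forward_kinematics(q, lower_arm=0.30, upper_arm=0.25):
    """Pose of the T-axis output for six joint angles q (rad) of a
    simplified S-L-U-B-R-T chain: S and R rotate about the local Z axis,
    L, U, B, and T about the local Y axis; link lengths are assumed."""
    T = rot_z(q[0])                      # S axis: turns the whole arm
    T = matmul(T, rot_y(q[1]))           # L axis, perpendicular to S
    T = matmul(T, trans_z(lower_arm))    # lower arm link 15
    T = matmul(T, rot_y(q[2]))           # U axis, parallel to L
    T = matmul(T, trans_z(upper_arm))    # upper arm link 17
    T = matmul(T, rot_y(q[3]))           # B axis, parallel to U
    T = matmul(T, rot_z(q[4]))           # R axis, perpendicular to B
    T = matmul(T, rot_y(q[5]))           # T axis, perpendicular to R
    return T

# With all joints at zero, the tool sits straight up the Z axis.
pose = forward_kinematics([0.0] * 6)
tool_xyz = (pose[0][3], pose[1][3], pose[2][3])  # ≈ (0.0, 0.0, 0.55)
```

Six independent joint transforms give the six degrees of freedom noted above; the control device would solve the inverse of this map to command an arbitrary position and posture.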
(End effector)
As shown in FIGS. 3, 4A, and 4B, the end effector 23 is a device that performs work on an object. The end effector 23 according to Embodiment 1 of the present invention is a harvesting device that harvests a crop, which is the object. The end effector 23 is fixed to the output shaft of the T axis of the articulated robot arm 11. The end effector 23 includes a gripping device 23a that grips the crop and a cutting device 23b that separates the crop from its branch or stem. The gripping device 23a and the cutting device 23b are driven by, for example, motors. The end effector 23 harvests the object by gripping the stem of the object to be harvested with the gripping device 23a and, in that state, cutting the stem on the trunk side of the gripping position with the cutting device 23b.
(Articulated robot arm control device 25)
Next, the configuration of the articulated robot arm control device 25 will be described with reference to FIGS. 3 to 5. The articulated robot arm control device 25 is a device that controls the articulated robot arm 11, the end effector 23, and a monocular camera 24. The articulated robot arm control device 25 includes the monocular camera 24, which is an imaging unit.
(Monocular camera 24)
As shown in FIGS. 3, 4A, 4B, and 4C, the monocular camera 24 is a camera that images an object from a single viewpoint at a time. The monocular camera 24 is a digital camera using a CCD sensor or a CMOS sensor. The monocular camera 24 has a diagonal angle of view of 90 degrees or more, this being the angle θ formed at the optical center of the lens by the two diagonally opposite points of the light receiving surface of the CCD sensor or CMOS sensor.
The monocular camera 24 is provided on the end effector 23. The monocular camera 24 is arranged on the end effector 23 so that the gripping device 23a or the cutting device 23b of the end effector 23 is included within the angle of view of the monocular camera 24. The monocular camera 24 is provided so as to be rotatable around the axis of the T axis by the T-axis motor unit 22, which is the tip rotating portion of the articulated robot arm 11. The monocular camera 24 is thereby arranged on a circumference centered on the end effector 23.
 図5に示すように、多関節ロボットアーム制御装置25は、実体的には、CPU、ROM、RAM、HDD等がバスで接続された構成であってもよく、あるいはワンチップのLSI等からなる構成であってもよい。多関節ロボットアーム制御装置25には、多関節ロボットアーム11、単眼カメラ24及びエンドエフェクタ23の動作を制御するために種々のプログラムやデータが格納されている。 As shown in FIG. 5, the articulated robot arm control device 25 may actually have a configuration in which a CPU, ROM, RAM, HDD, etc. are connected by a bus, or may consist of a one-chip LSI or the like. It may be a configuration. The articulated robot arm control device 25 stores various programs and data for controlling the operations of the articulated robot arm 11, the monocular camera 24, and the end effector 23.
 多関節ロボットアーム制御装置25は、バッテリー8に接続され、バッテリー8から電力を供給されるととともに、バッテリーの状態についての情報を取得することができる。 The articulated robot arm control device 25 is connected to the battery 8 and can be supplied with electric power from the battery 8 and can acquire information about the state of the battery.
 多関節ロボットアーム制御装置25は、多関節ロボットアーム11の、S軸モータユニット12、L軸モータユニット14、U軸モータユニット16、B軸モータユニット18、R軸モータユニット20及びT軸モータユニット22に含まれるモータの駆動回路にそれぞれ接続され、各軸のモータの駆動回路に制御信号を送信することができる。また、多関節ロボットアーム制御装置25は、各軸のモータユニットからモータの回転位置情報(エンコーダ信号)を取得することができる。 The articulated robot arm control device 25 includes an S-axis motor unit 12, an L-axis motor unit 14, a U-axis motor unit 16, a B-axis motor unit 18, an R-axis motor unit 20, and a T-axis motor unit of the articulated robot arm 11. It is connected to each of the drive circuits of the motor included in 22, and can transmit a control signal to the drive circuit of the motor of each axis. Further, the articulated robot arm control device 25 can acquire the rotation position information (encoder signal) of the motor from the motor unit of each axis.
 多関節ロボットアーム制御装置25は、単眼カメラ24に接続され、単眼カメラ24が撮像した画像を取得することができる。 The articulated robot arm control device 25 is connected to the monocular camera 24 and can acquire an image captured by the monocular camera 24.
The articulated robot arm control device 25 is communicably connected to the drive circuits of the end-effector motors that drive the gripping device 23a and the cutting device 23b of the end effector 23, and can transmit control signals to those drive circuits.
The articulated robot arm control device 25 is communicably connected to the vehicle control device 9 of the remote-controlled vehicle 1, and can acquire control signals from, or transmit control signals to, the vehicle control device 9 of the remote-controlled vehicle 1.
The articulated robot arm control device 25 is communicably connected to the communication device 7 of the remote-controlled vehicle 1 and can acquire control signals received by the communication device 7 from the external operation terminal C. The articulated robot arm control device 25 can also continuously transmit control signals it generates, or images captured by the monocular camera 24, to the external operation terminal C via the communication device 7.
The articulated robot arm device 10 configured as described above can remotely operate the articulated robot arm 11, the monocular camera 24, and the end effector 23 by control signals from the external operation terminal C. That is, the articulated robot arm device 10 uses the articulated robot arm 11 as an actuator that moves the monocular camera 24 and the end effector 23 to an arbitrary position in an arbitrary posture. Since the monocular camera 24 captures images from the viewpoint of the articulated robot arm 11, remote operation based on the images captured by the monocular camera 24 is easy. The articulated robot arm device 10 can also automatically perform predetermined work based on control signals from the operation terminal C.
(Control of the articulated robot arm control device 25)
Next, the control performed by the articulated robot arm control device 25 will be described with reference to FIGS. 3 and 5 to 8. FIG. 6 shows the relationship between the movement amount T of the monocular camera and the distance L from the imaging position of the monocular camera to the crop to be harvested (the target grape G(n)). FIG. 7 shows a processed image in which the target grapes G(n) and the harvest order A have been calculated. FIG. 8 is a schematic diagram showing an example of the movement of the end effector for generating a parallax image with the monocular camera.
As shown in FIG. 3, the articulated robot arm control device 25 controls the articulated robot arm 11 and the monocular camera 24 so as to image the work area W from a first imaging position P1 (see the black arrow) away from the work area W in which crops are harvested. The first imaging position P1 is a position within the movable range of the articulated robot arm 11 at which the entire work area W is included within the angle of view θ of the monocular camera 24. In other words, it is a position from which the area to be searched for harvestable crops can be imaged at once.
The articulated robot arm control device 25 controls the articulated robot arm 11 and the monocular camera 24 so as to image a harvestable crop from a second imaging position P2 (see the black arrow) different from the first imaging position P1. The second imaging position P2 is a position within the movable range of the articulated robot arm 11 that is closer to the work area W than the first imaging position P1, that is, a position from which the crop to be harvested can be imaged in detail. The articulated robot arm control device 25 calculates the detailed position coordinates of the crop to be harvested at the second imaging position P2.
In this way, the articulated robot arm device 10 uses the articulated robot arm 11 as a moving actuator to move the monocular camera 24 to an arbitrary position and orientation, so the fixed camera of the conventional proposal that uses both a fixed camera and a hand camera can be omitted. The articulated robot arm control device 25 can thereby increase the degree of freedom of the usage environment of the articulated robot arm device 10. Accordingly, the articulated robot arm control device 25 and the articulated robot arm device 10 can enhance the versatility of the articulated robot arm 11.
As shown in FIG. 5, the articulated robot arm control device 25 includes an image processing unit 26, a coordinate information processing unit 27, and a drive control unit 28.
The image processing unit 26 is a control device that detects, from the image of the work area W captured by the monocular camera 24, the crop to be harvested, for example a grape G(n) (hereinafter simply referred to as the "target grape G(n)"). The image processing unit 26 stores in advance crop detection data D (see FIG. 7) obtained by learning from various images of target grapes G(n). Hereinafter, the (n) in target grape G(n) is a subscript (n is an integer) for distinguishing individual grapes.
The image processing unit 26 causes the monocular camera 24 to image the work area W and, based on the detection data D, detects all the target grapes G(n) present in the captured image. The image processing unit 26 also continuously transmits the captured images to the external operation terminal C via the communication device 7 of the remote-controlled vehicle 1. The articulated robot arm control device 25 can thereby detect all the target grapes G(n) among the crops present in the work area W.
The coordinate information processing unit 27 is a control device that calculates the position coordinates G(n)(x, y, z) in the robot coordinate system of the target grapes G(n) detected by the image processing unit 26. The coordinate information processing unit 27 acquires the image of the work area W containing the target grapes G(n) from the image processing unit 26, and acquires the posture information of the articulated robot arm 11 from the drive control unit 28.
The coordinate information processing unit 27 converts the position coordinates of a target grape G(n) in the image of the work area W into position coordinates G(n)(x, y, z) in the robot coordinate system, based on the mounting position of the monocular camera 24 on the articulated robot arm 11 and the posture information of the articulated robot arm 11. The articulated robot arm control device 25 can thereby control the articulated robot arm 11 with the image of the work area W as a reference. In the following description, position coordinates of a target grape G(n) based on the first image Im1 from the first imaging position are written as G(n)(x1, y1, z1), and position coordinates based on the second image Im2 from the second imaging position are written as G(n)(x2, y2, z2).
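The conversion from the camera frame into the robot coordinate system described above can be sketched with homogeneous transforms. This is a minimal illustration, not the patent's implementation: the names `T_base_flange` (arm posture) and `T_flange_cam` (camera mounting position) are hypothetical, and a real system would obtain them from the encoder-based kinematics and a calibration step.

```python
import numpy as np

def camera_point_to_robot(T_base_flange, T_flange_cam, p_cam):
    """Convert a point measured in the camera frame into the robot base frame.

    T_base_flange -- 4x4 pose of the arm tip in the robot base frame (from posture info)
    T_flange_cam  -- 4x4 fixed transform of the camera mounting position on the arm
    p_cam         -- (x, y, z) of the target in the camera frame
    """
    p = np.append(np.asarray(p_cam, dtype=float), 1.0)  # homogeneous coordinates
    return (T_base_flange @ T_flange_cam @ p)[:3]
```

For example, with an identity mounting transform and an arm pose that is a pure translation of (1, 2, 3), a camera-frame point (0.1, 0.2, 0.3) maps to (1.1, 2.2, 3.3) in the robot frame.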
As shown in FIG. 6, the coordinate information processing unit 27 generates a parallax image from the first image Im1, captured at the first imaging position P1 away from the work area W, and the auxiliary image Is1 of the first image Im1, captured after moving by an arbitrary movement amount T in an arbitrary direction, and calculates the distance L to the target grape G(n). Specifically, the coordinate information processing unit 27 calculates the distance L from the monocular camera 24 to the target grape G(n) from the two images (the first image Im1 and its auxiliary image Is1) captured by the monocular camera 24 with the imaging position shifted by the movement amount T, the movement amount T by which the monocular camera 24 moved between the two images, the focal length f of the monocular camera 24, and the position coordinates Gm1 and Gs1 of the target grape G(n) in the two images. The articulated robot arm control device 25 can thereby calculate the distance L to the target grape G(n) with the single monocular camera 24. The lines Om and Os in FIG. 6 are the principal optical axes of the monocular camera 24.
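The geometry above is standard stereo triangulation, with the camera movement T acting as the baseline. A minimal sketch under the assumption of a pure sideways translation and a pinhole model (the function name and parameters are illustrative, not taken from the patent):

```python
def distance_from_parallax(f_px, movement_t, u_main, u_aux):
    """Distance L to a target from two images taken by one camera
    displaced sideways by movement_t (stereo by motion).

    f_px       -- focal length f expressed in pixels (assumed calibration value)
    movement_t -- movement amount T between the two shots, in metres
    u_main     -- horizontal image position of the target in the main image (px)
    u_aux      -- position of the same target in the auxiliary image (px)
    """
    disparity = abs(u_main - u_aux)  # pixel shift between Gm1 and Gs1
    if disparity == 0:
        raise ValueError("zero disparity: target too distant or camera not moved")
    return f_px * movement_t / disparity  # L = f * T / disparity
```

For instance, with f = 1000 px, T = 0.05 m, and a 25-pixel disparity, the estimated distance is L = 1000 × 0.05 / 25 = 2.0 m.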
As shown in FIG. 7, when a plurality of target grapes G1, G2, and G3 are present in the first image Im1 of the work area W, the coordinate information processing unit 27 calculates a harvest order A in which the harvesting work is to be performed. Based on the calculated position coordinates G1(x1, y1, z1), G2(x1, y1, z1), and G3(x1, y1, z1) of the target grapes G1, G2, and G3, the coordinate information processing unit 27 calculates a harvest order A in which harvesting proceeds, for example, from the position with the largest Z coordinate and from the target grape G1 closest to the articulated robot arm 11.
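The harvest order described above can be expressed as a multi-key sort. A minimal sketch, assuming the convention given later in the flow (Z coordinate in descending order, then Y coordinate, i.e. distance from the arm, in ascending order); the function and data names are illustrative:

```python
def harvest_order(targets):
    """Compute harvest order A for detected grape coordinates.

    targets -- dict mapping a label (e.g. "G1") to its (x, y, z) position
               in the robot coordinate system
    Returns the labels sorted with larger Z first, then smaller Y first.
    """
    return sorted(targets, key=lambda k: (-targets[k][2], targets[k][1]))
```

For example, a grape hanging higher (larger Z) is picked before a lower one, and of two grapes at the same height the one nearer the arm (smaller Y) comes first.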
The coordinate information processing unit 27 also transmits the calculated position coordinates G1(x1, y1, z1), G2(x1, y1, z1), and G3(x1, y1, z1) of the target grapes G1, G2, and G3 and the information on the harvest order A to the external operation terminal C via the communication device 7 of the remote-controlled vehicle 1. The articulated robot arm control device 25 can thereby perform the harvesting work in the work area W efficiently.
The drive control unit 28 is a control device that controls the motors of the end effector 23 and the motor units of the respective axes of the articulated robot arm 11. The drive control unit 28 acquires control signals for the articulated robot arm 11 from the external operation terminal C via the communication device 7 of the remote-controlled vehicle 1, and transmits the posture information of the articulated robot arm 11 to the external operation terminal C. The drive control unit 28 also acquires the position coordinates G(n)(x1, y1, z1) of the target grapes G(n) and the harvest order A from the coordinate information processing unit 27. Further, the drive control unit 28 acquires the motor rotation position information from the motor unit of each axis.
The drive control unit 28 transmits control signals to the drive circuit of the motor of each axis based on the control signals acquired from the external operation terminal C, or on the acquired position coordinates G(n)(x1, y1, z1) of the target grapes G(n) and the harvest order A, together with the rotation position information of the motor unit of each axis. The articulated robot arm control device 25 can thereby place the end effector 23 at an arbitrary position in an arbitrary posture by means of the articulated robot arm 11.
When the end effector 23 has been moved to a predetermined position with respect to a target grape G(n), the drive control unit 28 transmits control signals to the drive circuits of the motors of the gripping device 23a and the cutting device 23b of the end effector 23. The articulated robot arm control device 25 can thereby perform the harvesting work on the target grape G(n) with the end effector 23.
<Control>
Next, the harvesting of the target grapes G(n) by the articulated robot arm device 10 will be described. In the present embodiment, the articulated robot arm device 10 harvests the target grapes G(n), which are crops, from, for example, vines planted in a row. The remote-controlled vehicle 1 moves along the row of vines. It is assumed that the remote-controlled vehicle 1 has moved the articulated robot arm device 10 to a position at which the vine to be harvested is within the movable range of the articulated robot arm 11.
As shown in FIG. 3, when the articulated robot arm control device 25 acquires a control signal to start imaging from the vehicle control device 9 of the remote-controlled vehicle 1, or from the external operation terminal C via the communication device 7 of the remote-controlled vehicle 1, it starts imaging the work area W.
The articulated robot arm control device 25 controls the motor units of the respective axes of the articulated robot arm 11 with the drive control unit 28 to move the monocular camera 24 to the first imaging position P1, which is separated from the work area W by a predetermined distance and located approximately at the center of the work area W in the Z-axis direction of the robot coordinate system and approximately at the center of the work area W in the X-axis direction of the robot coordinate system. The drive control unit 28 also controls the motor units of the respective axes of the articulated robot arm 11 to move the monocular camera 24 so that its imaging direction faces the work area W.
At this time, the monocular camera 24 is arranged so that the entire work area W is included within its angle of view. Since the monocular camera 24 has a diagonal angle of view of 90 degrees or more, the entire work area W can fit within the angle of view of the monocular camera 24 even in the vicinity of the work area W.
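A rough pinhole-model estimate illustrates why a wide angle of view permits such a short standoff. This sketch treats the stated angle of view as if it spanned the work area's largest extent, which is a simplifying assumption for illustration only:

```python
import math

def min_standoff(work_extent_m, fov_deg):
    """Approximate distance at which a work area of extent work_extent_m
    just fits inside an angle of view of fov_deg (pinhole approximation)."""
    half_angle = math.radians(fov_deg) / 2.0
    return (work_extent_m / 2.0) / math.tan(half_angle)
```

With a 90-degree angle of view, a 2 m extent fits from only about 1 m away, whereas a narrower 45-degree view would require roughly 2.4 m of standoff.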
The articulated robot arm control device 25 controls the monocular camera 24 with the image processing unit 26 to image the work area W from the first imaging position P1. As shown in FIG. 7, the image processing unit 26 detects, based on the detection data D, all the target grapes G(n) to be harvested that are present in the captured first image Im1. In the present embodiment, the detected target grapes G(n) are grapes G1, G2, and G3.
As shown in FIG. 6, the articulated robot arm control device 25 controls the motor units of the respective axes of the articulated robot arm 11 with the drive control unit 28 to move the monocular camera 24 from the first imaging position P1 by an arbitrary movement amount T in an arbitrary direction.
The articulated robot arm control device 25 controls the monocular camera 24 with the image processing unit 26 to image the work area W from the position moved by the arbitrary movement amount T in the arbitrary direction, and acquires the auxiliary image Is1 of the first image Im1. The articulated robot arm control device 25 transmits the captured first image Im1 and its auxiliary image Is1 to the external operation terminal C via the communication device 7 of the remote-controlled vehicle 1.
The articulated robot arm control device 25 uses the coordinate information processing unit 27 to calculate, from the posture information of the articulated robot arm 11 at the time the first image Im1 was captured and the mounting position of the monocular camera 24, the position coordinates G1(x1, z1), G2(x1, z1), and G3(x1, z1) on the X-Z plane of the robot coordinate system of the grapes G1, G2, and G3 in the first image Im1. Similarly, the coordinate information processing unit 27 calculates the position coordinates on the X-Z plane of the robot coordinate system of the grapes G1, G2, and G3 in the auxiliary image Is1 of the first image Im1.
The coordinate information processing unit 27 generates a parallax image from the first image Im1 and its auxiliary image Is1, and calculates the Y coordinates of the grapes G1, G2, and G3 in the robot coordinate system. The coordinate information processing unit 27 thereby calculates the position coordinates G1(x1, y1, z1), G2(x1, y1, z1), and G3(x1, y1, z1) of the grapes G1, G2, and G3 in the robot coordinate system.
When the coordinate information processing unit 27 cannot correctly calculate the position coordinates of at least one of the grapes G1, G2, and G3, for example the position coordinates G2(x1, y1, z1) of the grape G2, the articulated robot arm control device 25 causes the drive control unit 28 to transmit a control signal for moving the monocular camera 24 to the motor units of the respective axes of the articulated robot arm 11. The articulated robot arm control device 25 then calculates the position coordinates G1(x1, y1, z1), G2(x1, y1, z1), and G3(x1, y1, z1) of the grapes G1, G2, and G3 in the robot coordinate system again.
As shown in FIG. 7, the articulated robot arm control device 25 calculates the harvest order A for the harvesting work based on the position coordinates G1(x1, y1, z1), G2(x1, y1, z1), and G3(x1, y1, z1) of the grapes G1, G2, and G3 calculated by the coordinate information processing unit 27. The coordinate information processing unit 27 calculates a harvest order A in which harvesting proceeds, for example, in descending order of Z coordinate and in ascending order of Y coordinate. In the present embodiment, a harvest order A in which harvesting is performed, for example, in the order of grape G1, grape G2, grape G3 is set in the articulated robot arm control device 25. The articulated robot arm control device 25 transmits the calculated position coordinates of the grapes G1, G2, and G3 and the information on the harvest order A to the external operation terminal C via the communication device 7 of the remote-controlled vehicle 1.
Next, the articulated robot arm control device 25 calculates detailed position coordinates in the order of grape G1, grape G2, grape G3 based on the harvest order A, and performs the harvesting work.
As shown in FIG. 3, the articulated robot arm control device 25 controls the motor units of the respective axes of the articulated robot arm 11 with the drive control unit 28 to move the monocular camera 24 to the second imaging position P2, which is closer to the grape G1 than the first imaging position P1. The articulated robot arm control device 25 controls the monocular camera 24 with the image processing unit 26 to image the grape G1 from the second imaging position P2 and acquires the second image Im2 (see FIG. 6).
As shown in FIGS. 6 and 8, the articulated robot arm control device 25 controls the motor units of the respective axes of the articulated robot arm 11 with the drive control unit 28 to move the monocular camera 24 from the second imaging position P2 by an arbitrary movement amount T in an arbitrary direction. The articulated robot arm control device 25 controls the monocular camera 24 with the image processing unit 26 to image the grape G1 from the moved position, and acquires the auxiliary image Is2 of the second image Im2.
The articulated robot arm control device 25 transmits the captured second image Im2 and its auxiliary image Is2 to the external operation terminal C via the communication device 7 of the remote-controlled vehicle 1.
The articulated robot arm control device 25 uses the coordinate information processing unit 27 to convert the position coordinates of the grape G1 in the second image Im2 into position coordinates G1(x2, z2) on the X-Z plane of the robot coordinate system. Similarly, the coordinate information processing unit 27 converts the position coordinates of the grape G1 in the auxiliary image Is2 of the second image Im2 into position coordinates on the X-Z plane of the robot coordinate system.
Further, the coordinate information processing unit 27 generates a parallax image from the second image Im2 and its auxiliary image Is2, and calculates the Y coordinate of the grape G1 in the robot coordinate system. The coordinate information processing unit 27 thereby calculates the position coordinates G1(x2, y2, z2) of the grape G1 in the robot coordinate system. Since the position coordinates G1(x2, y2, z2) are calculated from the second image Im2, captured at the second imaging position P2 closer to the grape G1 than the first imaging position P1, they are more accurate than G1(x1, y1, z1).
The articulated robot arm control device 25 controls the motor units of the respective axes of the articulated robot arm 11 with the drive control unit 28 to move the end effector 23 to the calculated position coordinates G1(x2, y2, z2) of the grape G1. The drive control unit 28 controls the motors of the end effector 23 so that the gripping device 23a grips the stem of the grape G1 while the cutting device 23b cuts the stem of the grape G1 from the branch of the vine. The drive control unit 28 then controls the motor units of the respective axes of the articulated robot arm 11 to store the harvested grape G1 in the harvest box H of the remote-controlled vehicle 1 (see FIG. 1).
Similarly, the articulated robot arm control device 25 calculates the detailed position coordinates G2(x2, y2, z2) and G3(x2, y2, z2) of the grapes G2 and G3 based on the harvest order A, and performs the harvesting work with the end effector 23.
(Operation of the articulated robot arm control device 25)
Next, the operation of the articulated robot arm control device 25 having the above configuration will be specifically described with reference to FIG. 9. FIG. 9 shows a control flow diagram of the articulated robot arm control device 25 according to Embodiment 1 of the present invention.
As shown in FIG. 9, when the flow of the harvesting work by the articulated robot arm control device 25 starts, in step S110 the articulated robot arm control device 25 controls the motor units of the respective axes of the articulated robot arm 11 with the drive control unit 28 to move the monocular camera 24 to the first imaging position P1, and proceeds to step S120.
In step S120, the articulated robot arm control device 25 controls the monocular camera 24 with the image processing unit 26 to capture the first image Im1 of the work area W from the first imaging position P1. Further, the articulated robot arm control device 25 detects the target grapes G(n) in the first image Im1 with the image processing unit 26 based on the detection data D, and proceeds to step S130.
In step S130, the articulated robot arm control device 25 captures the auxiliary image Is1 of the first image Im1 from a position to which the monocular camera 24 has been moved from the first imaging position P1 by an arbitrary movement amount T in an arbitrary direction, and proceeds to step S140.
In step S140, the articulated robot arm control device 25 calculates, with the coordinate information processing unit 27, the position coordinates G(n)(x1, y1, z1) of the target grapes G(n) in the robot coordinate system from the first image Im1 and its auxiliary image Is1, and proceeds to step S150.
In step S150, the articulated robot arm control device 25 determines whether the position coordinates G(n)(x1, y1, z1) of all the target grapes G(n) could be calculated.
If all the position coordinates G(n)(x1, y1, z1) of the target grapes G(n) could be calculated, the articulated robot arm control device 25 proceeds to step S160.
If, on the other hand, a target grape G(n) could not be properly imaged and not all of the position coordinates G(n)(x1, y1, z1) of the target grapes G(n) could be calculated, the articulated robot arm control device 25 proceeds to step S161.
In step S160, the articulated robot arm control device 25 calculates the harvest order A for the harvesting work based on the position coordinates G(n)(x1, y1, z1) of all the target grapes G(n) calculated by the coordinate information processing unit 27, and proceeds to step S170.
In step S170, the articulated robot arm control device 25 causes the drive control unit 28 to control the motor units of the respective axes of the articulated robot arm 11 so as to move the monocular camera 24 to the second imaging position P2 of the target grape G(n) selected according to the harvest order A, and then proceeds to step S180.
In step S180, the articulated robot arm control device 25 causes the image processing unit 26 to control the monocular camera 24 to capture the second image Im2 of the target grape G(n) from the second imaging position P2. Furthermore, the articulated robot arm control device 25 captures the auxiliary image Is2 of the second image Im2 from a position reached by moving the monocular camera 24 from the second imaging position P2 in an arbitrary direction by an arbitrary movement amount T, and then proceeds to step S190.
In step S190, the articulated robot arm control device 25 causes the coordinate information processing unit 27 to calculate the position coordinates G(n)(x2, y2, z2) of the target grape G(n) in the robot coordinate system from the second image Im2 and the auxiliary image Is2 of the second image Im2. Furthermore, the articulated robot arm control device 25 causes the drive control unit 28 to control the motor units of the respective axes of the articulated robot arm 11 to harvest the target grape G(n) at the calculated position coordinates G(n)(x2, y2, z2), and then proceeds to step S200.
In step S200, the articulated robot arm control device 25 determines whether all the target grapes G(n) have been harvested.
If all the target grapes G(n) have been harvested, the articulated robot arm control device 25 ends the harvesting work and ends this flow.
If, on the other hand, not all the target grapes G(n) have been harvested, the articulated robot arm control device 25 returns to step S170.
In step S161, the articulated robot arm control device 25 causes the drive control unit 28 to control the motor units of the respective axes of the articulated robot arm 11 so as to move the monocular camera 24 to a new first imaging position P1 closer to the work area W than the previous first imaging position P1, and then proceeds to step S162.
In step S162, the articulated robot arm control device 25 causes the image processing unit 26 to control the monocular camera 24 to capture a new first image Im1 of the work area W from the new first imaging position P1, detects the target grapes G(n) from the new first image Im1 based on the detection data D, and then proceeds to step S163.
In step S163, the articulated robot arm control device 25 captures the auxiliary image Is1 of the new first image Im1 from a position reached by moving the monocular camera 24 from the new first imaging position P1 in an arbitrary direction by an arbitrary movement amount, and then proceeds to step S164.
In step S164, the articulated robot arm control device 25 causes the coordinate information processing unit 27 to calculate new position coordinates G(n)(x1, y1, z1) of the target grapes G(n) in the robot coordinate system from the new first image Im1 and the auxiliary image Is1 of the new first image Im1, and then returns to step S150.
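The retry logic of steps S150 and S161 to S164 can be sketched as a loop that keeps moving the first imaging position closer to the work area W until every target has been located. This is a minimal illustrative sketch, not code from the patent: the per-grape "visibility range" used to fake detection, and all function and variable names, are assumptions made for the example.

```python
def locate_all(grapes, p1_distance, step=0.5):
    """Keep retrying from a closer first imaging position until every
    target grape yields position coordinates (steps S150, S161-S164).

    grapes: dict n -> {"pos": (x, y, z), "visible_within": max camera
    distance at which detection succeeds}. The visibility range is a
    stand-in for "imaged properly"; the patent does not define it.
    """
    while True:
        # S140/S164: coordinates are obtained only for grapes that were
        # imaged properly from the current distance.
        coords = {n: g["pos"] for n, g in grapes.items()
                  if p1_distance <= g["visible_within"]}
        if len(coords) == len(grapes):   # S150: all coordinates found?
            return coords, p1_distance   # -> S160, plan the harvest order
        p1_distance -= step              # S161: new P1 closer to work area W
```

Each pass through the loop corresponds to one traversal of the S161 to S164 branch in the flowchart.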
As described above, in the present embodiment, the articulated robot arm control device 25 includes the monocular camera 24 provided on the articulated robot arm 11, the image processing unit 26, the coordinate information processing unit 27, and the drive control unit 28. The monocular camera 24 images the work area W from the first imaging position P1, at which the articulated robot arm 11 is away from the work area W and the entire work area W is contained within the angle of view of the monocular camera 24. The image processing unit 26 detects the first image Im1 of the target grapes G(n) from the image of the work area W. The coordinate information processing unit 27 calculates the position coordinates G(n)(x1, y1, z1) of the target grapes G(n) in the robot coordinate system of the articulated robot arm 11 based on the detected first image Im1 of the target grapes G(n) and the auxiliary image Is1 of the first image Im1.
As described above, by having the monocular camera 24 image the work area W from the first imaging position P1, the articulated robot arm control device 25 can search for the target grapes G(n) in the region containing the work area W. In addition, the coordinate information processing unit 27 can calculate the position coordinates of the target grapes G(n) based on the first image Im1 and the auxiliary image Is1 of the first image Im1. In this way, the monocular camera 24 provided on the articulated robot arm 11 serves both to calculate detailed position coordinates for improving the working accuracy of the end effector 23 and to detect the target grapes G(n) contained in the entire work area W. That is, by moving the monocular camera 24 with the articulated robot arm 11, the articulated robot arm control device 25 can image both the work area W and the target grapes G(n), imaging subjects with different imaging ranges. The articulated robot arm control device 25 can therefore dispense with the fixed camera used in the conventional proposal that combines a fixed camera and a hand camera.
Moreover, as described above, because the coordinate information processing unit 27 calculates the position coordinates G(n)(x1, y1, z1) of the plurality of target grapes G(n) in the work area W, the optimum harvest order A for the harvesting work of the articulated robot arm 11 can be calculated based on predetermined conditions. The articulated robot arm control device 25 can thus efficiently harvest target grapes G(n) located at random positions.
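The patent leaves the "predetermined conditions" for the harvest order A open. One plausible concrete choice, shown here purely as an illustration and not as the patented rule, is a greedy nearest-neighbour ordering that shortens arm travel between consecutive picks:

```python
import math

def plan_harvest_order(targets, start=(0.0, 0.0, 0.0)):
    """Order targets greedily by nearest neighbour from the current
    position. This is one example of a condition for the harvest
    order A; the patent does not prescribe this rule.

    targets: dict n -> (x, y, z) in the robot coordinate system.
    """
    remaining = dict(targets)
    order, pos = [], start
    while remaining:
        nearest = min(remaining, key=lambda n: math.dist(pos, remaining[n]))
        order.append(nearest)
        pos = remaining.pop(nearest)
    return order
```

Greedy ordering is cheap to compute on an embedded controller; an exact shortest tour would require solving a travelling-salesman instance, which is rarely worth it for a handful of grapes.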
The monocular camera 24 further images the work area W including the target grape G(n) from the second imaging position P2, which differs from the first imaging position P1.
As described above, from the second imaging position P2, a specific region or a specific target grape G(n) can be imaged at a different angle from the first image Im1 captured from the first imaging position P1, or imaged at larger magnification from a position closer than the first imaging position P1. The articulated robot arm control device 25 can thus search the work area W over a wide area using the first image Im1 captured from the first imaging position P1, and search the work area W locally using the second image Im2 captured from the second imaging position P2. That is, the articulated robot arm control device 25 can acquire images suited to the state of the work area W, the surroundings of the target grape G(n), and the shape of the target grape G(n).
Since the monocular camera 24 images with an angle of view θ of 90 degrees or more, the entire work area W can be imaged simply by moving the monocular camera 24 within the movable range of the articulated robot arm 11. The articulated robot arm control device 25 can thus capture the entire work area W within the movable range of the articulated robot arm 11 with the monocular camera 24.
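The relation between angle of view and coverage can be made concrete with basic trigonometry; the function below is an illustrative sketch, not part of the patent. With an angle of view of 90 degrees or more, the covered width is at least twice the standoff distance, which is why a modest camera move suffices to sweep the whole work area W.

```python
import math

def footprint_width(distance_m, fov_deg):
    """Scene width covered at distance Z by a camera with horizontal
    angle of view fov_deg: w = 2 * Z * tan(fov / 2).
    At fov = 90 deg this gives w = 2 * Z."""
    return 2.0 * distance_m * math.tan(math.radians(fov_deg) / 2.0)
```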
The monocular camera 24, moved in arbitrary directions by driving the articulated robot arm 11, can generate a parallax image of the work area W by imaging the work area W from a plurality of positions.
As described above, the articulated robot arm control device 25 can measure the distance L to the work area W or to a target grape G(n) by moving the monocular camera 24 between exposures. Furthermore, by setting the movement amount T of the monocular camera 24 in an arbitrary direction according to the work environment and work state, the articulated robot arm control device 25 can, compared with a conventional stereo camera in which the distance between the two cameras is fixed, extend both the range of distances at which a target grape G(n) can be recognized and the range that can be measured at or above a given accuracy. The articulated robot arm control device 25 can thus image the plurality of target grapes G(n) in the work area W with the monocular camera 24 and measure the distance L to each target grape G(n) based on a parallax image suited to its state.
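Moving the camera by T and re-imaging is geometrically equivalent to a stereo pair with baseline T, so the distance follows the standard triangulation relation. The sketch below shows that relation; the parameter names are illustrative and the patent does not state this formula explicitly.

```python
def depth_from_parallax(focal_px, baseline_m, disparity_px):
    """Distance from two views of the same point taken before and after
    moving the camera by baseline_m (the movement amount T): treating
    the pair as a stereo rig gives Z = f * T / d, where d is the pixel
    disparity of the point between the first image and its auxiliary
    image."""
    if disparity_px <= 0:
        raise ValueError("point must shift between the two views")
    return focal_px * baseline_m / disparity_px
```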
The articulated robot arm 11 has, at its tip, a T-axis motor unit 22 rotatable about the axis of the wrist rotation link 21, and the end effector 23 and the monocular camera 24 are attached to the output shaft of the T-axis motor unit 22, which serves as the tip rotating portion. The monocular camera 24 is arranged so that at least part of the gripping device 23a or the cutting device 23b of the end effector 23 falls within the angle of view of the monocular camera 24.
The articulated robot arm control device 25 can move the monocular camera 24 to a position from which a target grape G(n) can be imaged by rotating the monocular camera 24 about the axis of the wrist rotation link 21. The articulated robot arm device 10 can therefore continue imaging even when the surroundings of the target grape G(n) change, by rotating the monocular camera 24 to a position from which imaging remains possible. In addition, since the end effector 23 appears within the angle of view of the monocular camera 24, the articulated robot arm device 10 can easily be operated remotely from the external operation terminal C.
The articulated robot arm control device 25 configured in this way, and the articulated robot arm device 10 including it, can move the monocular camera 24 to an arbitrary position and orientation using the articulated robot arm 11 as a moving actuator. That is, the articulated robot arm control device 25 and the articulated robot arm device 10 including it can image the working range from positions and directions suited to the work environment and work state. They can therefore increase the types of work the articulated robot arm 11 can handle and enhance its versatility.
(Other embodiments)
Although an embodiment of the present invention has been described above, the embodiment described above is merely an example for carrying out the present invention. The present invention is therefore not limited to the embodiment described above, which may be modified as appropriate without departing from the spirit of the invention.
FIG. 10 shows a block diagram of an articulated robot arm control device 25 according to another embodiment of the present invention. In the embodiment described above, the articulated robot arm device 10 calculates the distance to a target grape G(n) using the first image Im1 captured by the monocular camera 24 provided on the articulated robot arm 11 and the auxiliary image Is1 of the first image Im1 captured after moving the camera in an arbitrary direction by an arbitrary movement amount. However, as shown in FIG. 10, the articulated robot arm device 10 may further include a distance measuring unit 29 including a laser ranging sensor or the like, and may measure the distance to the target grape G(n) without relying on the monocular camera 24. The articulated robot arm device 10 configured in this way can measure the position coordinates G(n)(x1, y1, z1) of a target grape G(n) by having the coordinate information processing unit 27 acquire the value measured by the distance measuring unit 29 while the monocular camera 24 images the work area W.
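When the distance measuring unit 29 supplies the range, one conventional way to combine it with the camera detection is pinhole back-projection. The sketch below shows that combination under the usual pinhole model; the patent does not spell out this math, and the intrinsic parameters (fx, fy, cx, cy) are assumed to come from a prior camera calibration.

```python
def backproject(u, v, depth, fx, fy, cx, cy):
    """Combine a pixel detection (u, v) with a range reading (e.g. from
    the laser ranging sensor of the distance measuring unit 29) into a
    3-D point in the camera frame via the pinhole model:
    x = (u - cx) * Z / fx, y = (v - cy) * Z / fy, z = Z."""
    return ((u - cx) * depth / fx, (v - cy) * depth / fy, depth)
```

A further transform from the camera frame into the robot coordinate system, using the arm's forward kinematics, would still be needed to obtain G(n)(x1, y1, z1).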
The articulated robot arm control device 25 may also perform control so that the movement amount T of the monocular camera 24 in an arbitrary direction for capturing a parallax image differs between the first imaging position P1 and the second imaging position P2.
As described above, by varying the movement amount T of the monocular camera 24 for capturing a parallax image, the articulated robot arm control device 25 can adjust the measurement accuracy when measuring the distance L to the work area W or to a target grape G(n). The articulated robot arm control device 25 can thus, even when the distance L to the work area W differs between the first imaging position P1 and the second imaging position P2, measure the distance L to the work area W or to a target grape G(n) with a measurement accuracy suited to that distance and to the work environment and work state of the articulated robot arm 11.
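The reason a larger movement amount helps at longer range follows from the standard first-order error analysis of two-view triangulation, sketched below as an illustration (the patent does not state this analysis). The depth uncertainty grows with the square of the distance, so keeping a fixed accuracy at the far first imaging position P1 needs a larger baseline than at the close-in second imaging position P2.

```python
def required_baseline(depth_m, focal_px, match_err_px, depth_tol_m):
    """First-order depth uncertainty of a two-view pair is
    dZ ~= Z**2 * dd / (f * T), where dd is the pixel matching error.
    Solving for T gives the baseline (movement amount) needed to keep
    the depth error within depth_tol_m; it grows as Z**2."""
    return depth_m ** 2 * match_err_px / (focal_px * depth_tol_m)
```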
The articulated robot arm control device 25 may also image from different imaging directions at the first imaging position P1 and the second imaging position P2.
FIG. 11 shows a layout diagram of an articulated robot arm control device 25 according to another embodiment of the present invention, in a state in which the imaging directions at the first imaging position P1 and the second imaging position P2 differ. As shown in FIG. 11, the articulated robot arm control device 25 moves the monocular camera 24 to the second imaging position P2 while maintaining the posture the camera had at the first imaging position P1. In this case, when the camera ends up in a backlit arrangement in which its lens faces the sunlight SL (see the white arrow), or in an arrangement in which grapes, leaves, or the like other than the target grape G(n) lie between the monocular camera 24 and the target grape G(n), the monocular camera 24 cannot image the target grape G(n) from the second imaging position.
When the position coordinates cannot be acquired at the second imaging position because of backlight or an obstacle, the articulated robot arm control device 25 images from different imaging directions at the first imaging position P1 and the second imaging position P2. Even when backlight, obstacles, or the like make it difficult to image the target grape G(n), imaging can thus be continued by changing the imaging direction and imaging position of the monocular camera 24.
The articulated robot arm control device 25 can thereby image outdoor target grapes G(n), for which the state of the work area W changes frequently, by changing the position and posture of the monocular camera 24 in response to changes in the work environment and work state. That is, by improving the object recognition accuracy of the monocular camera 24, the articulated robot arm control device 25 can increase the freedom of the environments in which the articulated robot arm device 10 can be used.
The articulated robot arm control device 25 may also, based on the calculated amount of variation in the position coordinates of a target grape G(n), cause the drive control unit 28 to control the motor units of the respective axes of the articulated robot arm 11 so as to correct the post-variation position coordinates G(n)(xc, yc, zc) of the target grape G(n) back to the pre-variation position coordinates G(n)(x, y, z).
FIG. 12 shows a schematic view of a remote-controlled vehicle provided with an articulated robot arm device according to another embodiment of the present invention. As shown in FIG. 12, the position of the remote-controlled vehicle 1 may shift because of muddy ground or a change in the weight balance of the remote-controlled vehicle 1 caused by a change in the posture of the articulated robot arm 11. Such a shift moves the origin of the robot coordinate system of the articulated robot arm device 10. That is, the position coordinates G(n)(x1, y1, z1) or G(n)(x2, y2, z2) of a target grape G(n) calculated by the articulated robot arm control device 25 before the origin of the robot coordinate system moved differ from the position coordinates of the target grape G(n) in the robot coordinate system after the origin has moved.
When the position coordinates G(n)(x1c, y1c, z1c) of a target grape G(n) calculated by the coordinate information processing unit 27 from the first image Im1 differ from the position coordinates G(n)(x1, y1, z1) based on a first image Im1 previously captured from the same first imaging position, the articulated robot arm control device 25 determines that the origin of the robot coordinate system has moved.
The articulated robot arm control device 25 controls the motor units of the respective axes of the articulated robot arm 11 so that the position coordinates G(n)(x1c, y1c, z1c) of the target grape G(n) after the origin of the robot coordinate system has moved coincide with the position coordinates G(n)(x1, y1, z1) before the origin moved. The monocular camera 24 is thereby restored to the position and posture it had before the origin of the robot coordinate system moved (see the black arrow). The articulated robot arm control device 25 can therefore continue the work by controlling the articulated robot arm 11 even when the surroundings of the articulated robot arm 11 change. That is, the articulated robot arm control device 25 can increase the freedom of the environments in which the articulated robot arm device 10 can be used.
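A software-side view of this compensation, shown here only as an illustrative model and not as the patented method (the patent drives the arm until old and new coordinates coincide), is to estimate the origin offset from one re-measured reference target and apply it to all stored coordinates, under the simplifying assumption that the drift is a pure translation:

```python
def correct_origin_shift(stored, ref_id, remeasured_ref):
    """Model the origin drift of the robot coordinate system as a pure
    translation (an assumption; ground settling can also tilt the
    vehicle): estimate the offset of one re-measured reference grape
    G(n)(x1c, y1c, z1c) against its stored G(n)(x1, y1, z1), then shift
    every stored coordinate into the current frame."""
    dx = remeasured_ref[0] - stored[ref_id][0]
    dy = remeasured_ref[1] - stored[ref_id][1]
    dz = remeasured_ref[2] - stored[ref_id][2]
    return {n: (x + dx, y + dy, z + dz) for n, (x, y, z) in stored.items()}
```

If the vehicle can also tilt, a full rigid transform estimated from three or more reference points would be needed instead of a single translation.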
In the embodiment described above, the articulated robot arm control device 25 determines the harvest order A of the target grapes G(n) in the work area W by a predetermined program based on the positions of the target grapes G(n), but this is merely an example. For instance, the articulated robot arm control device 25 may determine the harvest order A using AI (artificial intelligence), or by applying a specific criterion such as ordering the target grapes G(n) by size.
In the embodiment described above, the articulated robot arm 11 is a six-axis vertical articulated robot arm in which, as an example, the S-axis motor unit 12, the L-axis motor unit 14, the U-axis motor unit 16, the B-axis motor unit 18, the R-axis motor unit 20, and the T-axis motor unit 22 are connected in series by links. The order in which the motor units of the respective axes are connected, the axial directions in which they are connected, and the like may be any structure that functions as an articulated robot arm.
In the embodiment described above, the articulated robot arm 11 is provided on the upper surface of the vehicle body 2 of the remote-controlled vehicle 1, at the front of the vehicle body 2, but this is merely an example. For instance, the articulated robot arm 11 may be provided on the upper surface of the vehicle body 2 at the rear or on either side, or on a front, rear, or lateral side surface of the remote-controlled vehicle 1.
In the embodiment described above, the articulated robot arm device 10 harvests the target grapes G(n), an agricultural product, but this is merely an example. The articulated robot arm device 10 can perform work according to the type of end effector 23 attached to the articulated robot arm 11, whether outdoors or indoors, such as not only outdoor harvesting of agricultural products but also outdoor handling of industrial parts.
In the embodiment described above, the end effector 23 includes the gripping device 23a and the cutting device 23b and grips and cuts the target grape G(n), but the end effector is not limited to this. The end effector 23 may be any device that performs a predetermined operation on an object.
In the embodiment described above, the articulated robot arm device 10 harvests the target grapes G(n) semi-automatically in response to control signals from an external communication terminal and control signals from the articulated robot arm control device 25, but this is merely an example. For instance, the articulated robot arm device 10 may be configured so that the motor units of the respective axes of the articulated robot arm 11, the end effector 23, and the monocular camera 24 are controlled by control signals from the external operation terminal C.
In the embodiment described above, the remote-controlled vehicle 1 is controlled by the vehicle control device 9 so as to travel along a predetermined route when it receives a control signal from the operation terminal C, but this is merely an example. For instance, the remote-controlled vehicle 1 may be configured to travel autonomously based on position information from a GNSS (Global Navigation Satellite System) and map data.
In the embodiment described above, the remote-controlled vehicle 1 is a four-wheeled vehicle including a pair of wheels 3 and a pair of wheels 4, but this is merely an example. The remote-controlled vehicle may be a vehicle other than a four-wheeled vehicle, for example a three-wheeled or two-wheeled vehicle.
1 Remote-controlled vehicle
2 Vehicle body
3, 4 Wheels
5 Drive motor
6 Steering motor
7 Communication device
8 Battery
9 Control device
10 Articulated robot arm device
11 Articulated robot arm
23 End effector
23a Gripping device
23b Cutting device
24 Monocular camera
25 Articulated robot arm control device
26 Image processing unit
27 Coordinate information processing unit
28 Drive control unit
P1 First imaging position
Im1 First image
Is1 Auxiliary image of the first image
P2 Second imaging position
Im2 Second image
Is2 Auxiliary image of the second image

Claims (10)

1.  An articulated robot arm control device comprising:
    an imaging unit provided on an articulated robot arm;
    an image processing unit that detects an image of an object from an image captured by the imaging unit;
    a coordinate information processing unit that calculates position coordinates of the object; and
    a drive control unit that drives actuators of the articulated robot arm,
    wherein the imaging unit images a work area, in which the articulated robot arm performs work on the object, from a first imaging position located at a predetermined distance from the work area,
    the image processing unit detects the image of the object from the image of the work area, and
    the coordinate information processing unit calculates the position coordinates of the object in a coordinate system of the articulated robot arm based on the detected image of the object.
2.  The articulated robot arm control device according to claim 1, wherein
    the imaging unit further images the work area including the object from a second imaging position different from the first imaging position.
3.  The articulated robot arm control device according to claim 2, wherein
    the second imaging position is closer to the work area than the first imaging position.
  4.  The articulated robot arm control device according to any one of claims 1 to 3, wherein the imaging unit images the work area at an angle of view of 90 degrees or more.
  5.  The articulated robot arm control device according to any one of claims 1 to 4, wherein the imaging unit includes a monocular camera that is movable in an arbitrary direction by driving the articulated robot arm and that is capable of acquiring parallax images of the work area.
  6.  The articulated robot arm control device according to claim 5, wherein an amount of movement in the arbitrary direction for capturing the parallax images of the monocular camera differs between the first imaging position and the second imaging position.
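Claims 5 and 6 describe a single monocular camera that the arm itself displaces to obtain parallax (motion stereo). A minimal sketch, assuming a rectified image pair, a known baseline equal to the camera's movement amount, and a focal length in pixels, none of which the claims specify:

```python
def depth_from_motion_stereo(disparity_px: float, baseline_m: float, focal_px: float) -> float:
    """Triangulate depth from two views taken by one camera moved by a known baseline."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

def depth_resolution(z_m: float, baseline_m: float, focal_px: float,
                     disparity_err_px: float = 1.0) -> float:
    """Approximate depth uncertainty for a given disparity error: grows with the
    square of the distance and shrinks with a larger baseline."""
    return z_m * z_m * disparity_err_px / (focal_px * baseline_m)
```

Because depth uncertainty grows with distance squared, the farther first imaging position benefits from a larger camera displacement than the closer second position, which is one plausible reading of why claim 6 makes the movement amount differ between the two positions.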
  7.  The articulated robot arm control device according to any one of claims 2 to 6, wherein the imaging unit captures images from different imaging directions at the first imaging position and at the second imaging position.
  8.  The articulated robot arm control device according to any one of claims 1 to 7, wherein the drive control unit corrects a control amount of the actuator of the articulated robot arm based on a variation in the calculated position coordinates of the object.
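The correction in claim 8, adjusting the actuator control amount from the variation in the object's measured coordinates, can be illustrated as a simple visual-servo-style target offset. The gain and the notion of an "expected" coordinate are hypothetical additions for the sketch, not taken from the claims:

```python
def corrected_target(commanded, measured_obj, expected_obj, gain=1.0):
    """Shift the commanded arm target by the observed drift of the object's
    measured position from where it was expected to be."""
    drift = [m - e for m, e in zip(measured_obj, expected_obj)]
    return [c + gain * d for c, d in zip(commanded, drift)]
```

If the object is re-measured 0.1 units off along one axis, the commanded target is shifted by the same amount along that axis before the drive control unit converts it into joint actuator commands.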
  9.  An articulated robot arm device comprising:
     the articulated robot arm control device according to any one of claims 1 to 8; and
     an articulated robot arm,
     wherein the articulated robot arm is controlled by the articulated robot arm control device.
  10.  The articulated robot arm device according to claim 9, wherein the articulated robot arm includes:
     a plurality of joints;
     a plurality of links rotatably connected to the plurality of joints about an axis of each joint; and
     a tip rotating portion rotatable about an axis of a link located at a most distal position among the plurality of links,
     wherein the imaging unit is provided on the tip rotating portion.
PCT/JP2019/034393 2019-09-02 2019-09-02 Multi-joint robot-arm control device and multi-joint robot arm device WO2021044473A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2019/034393 WO2021044473A1 (en) 2019-09-02 2019-09-02 Multi-joint robot-arm control device and multi-joint robot arm device


Publications (1)

Publication Number Publication Date
WO2021044473A1 (en) 2021-03-11

Family

ID=74852324

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/034393 WO2021044473A1 (en) 2019-09-02 2019-09-02 Multi-joint robot-arm control device and multi-joint robot arm device

Country Status (1)

Country Link
WO (1) WO2021044473A1 (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113858170A (en) * 2021-09-08 2021-12-31 江苏叁拾叁信息技术有限公司 Intelligent manipulator for picking grapes
CN116483016A (en) * 2023-06-25 2023-07-25 北京瀚科智翔科技发展有限公司 Universal kit for unmanned modification of manned control equipment and implementation method
CN116483016B (en) * 2023-06-25 2023-08-29 北京瀚科智翔科技发展有限公司 Universal kit for unmanned modification of manned control equipment and implementation method
WO2023203726A1 (en) * 2022-04-21 2023-10-26 ヤマハ発動機株式会社 Image acquisition device
WO2024069779A1 (en) * 2022-09-28 2024-04-04 日本電気株式会社 Control system, control method, and recording medium

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH1052145A (en) * 1996-08-08 1998-02-24 Iseki & Co Ltd Harvesting device of robot for agriculture
JP2003230959A (en) * 2002-02-06 2003-08-19 Toshiba Corp Remote operation welding robot system
JP2006003263A (en) * 2004-06-18 2006-01-05 Hitachi Ltd Visual information processor and application system
JP2009241247A (en) * 2008-03-10 2009-10-22 Kyokko Denki Kk Stereo-image type detection movement device
JP2011194498A (en) * 2010-03-18 2011-10-06 Denso Wave Inc Visual inspection system
JP2015085458A (en) * 2013-10-31 2015-05-07 セイコーエプソン株式会社 Robot control device, robot system and robot
KR20180038869A (en) * 2016-10-07 2018-04-17 엘지전자 주식회사 Robot for airport and method thereof



Similar Documents

Publication Publication Date Title
WO2021044473A1 (en) Multi-joint robot-arm control device and multi-joint robot arm device
Zhao et al. Dual-arm robot design and testing for harvesting tomato in greenhouse
Mueller-Sim et al. The Robotanist: A ground-based agricultural robot for high-throughput crop phenotyping
Davidson et al. Proof-of-concept of a robotic apple harvester
KR100784830B1 (en) Harvesting robot system for bench cultivation type strawberry
CN111673755B (en) Picking robot control system and method based on visual servo
JP5606241B2 (en) Visual cognitive system and method for humanoid robot
Zou et al. Fault-tolerant design of a limited universal fruit-picking end-effector based on vision-positioning error
Kohan et al. Robotic harvesting of rosa damascena using stereoscopic machine vision
Hu et al. Simplified 4-DOF manipulator for rapid robotic apple harvesting
KR102094004B1 (en) Method for controlling a table tennis robot and a system therefor
CN113812262B (en) Tea-oil camellia fruit picking robot based on machine vision
CN114029945A (en) Grabbing path control method of spherical-like fruit picking mechanical arm
CN114080905A (en) Picking method based on digital twins and cloud picking robot system
JP2024505669A (en) Robotic harvesting system with gantry system
Benet et al. Development of autonomous robotic platforms for sugar beet crop phenotyping using artificial vision
Li A visual recognition and path planning method for intelligent fruit-picking robots
Dario et al. The Agrobot project for greenhouse automation
WO2022013979A1 (en) Stereo camera
US20220168909A1 (en) Fusing a Static Large Field of View and High Fidelity Moveable Sensors for a Robot Platform
Yang Research on the application of rigid-flexible compound driven fruit picking robot design in realizing fruit picking
Luojia et al. Research progress of apple production intelligent chassis and weeding and harvesting equipment technology
Gharakhani et al. Evaluating object detection and stereoscopic localization of a robotic cotton harvester under real field conditions
KR100433272B1 (en) An apparatus of detecting 3-D position of fruits and method of detecting 3-D position of it
WO2023218568A1 (en) End effector

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 19943933; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 19943933; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: JP)