WO2021117479A1 - Information processing device, method, and program - Google Patents

Information processing device, method, and program

Info

Publication number
WO2021117479A1
Authority
WO
WIPO (PCT)
Prior art keywords
robot arm
information
trajectory
camera
processing device
Prior art date
Application number
PCT/JP2020/043675
Other languages
English (en)
Japanese (ja)
Inventor
学嗣 浅谷
Original Assignee
株式会社エクサウィザーズ
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社エクサウィザーズ
Publication of WO2021117479A1

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00Controls for manipulators
    • B25J13/08Controls for manipulators by means of sensing devices, e.g. viewing or touching devices

Definitions

  • the present invention relates to an information processing device, a method and a program for controlling a robot arm.
  • Non-Patent Document 1 discloses an algorithm called MPNet (Motion Planning Networks) for generating a trajectory of a robot arm by machine learning.
  • the trajectory of the robot arm is generated using an image of the working space of the robot arm taken by a camera.
  • Although Patent Document 1 can calculate the trajectory of the robot arm from the current position to the target position, it was difficult to improve the accuracy of the processing operation of the robot arm with respect to the object (grasping an article, painting, etc.).
  • An object of one aspect of the present invention is to realize an information processing apparatus, method, and program capable of improving both the accuracy of generating the trajectory to be traced by the robot arm and the accuracy of generating the processing motion of the robot arm with respect to an object.
  • The information processing apparatus according to one aspect of the present invention includes: a trajectory generation unit that generates trajectory information indicating the trajectory to be traced by the robot arm, using the first image data generated by the first camera, the state information regarding the state of the robot arm, and the first trained model; and a processing operation generation unit that generates processing operation information indicating the processing operation to be performed by the robot arm, using the second image data generated by the second camera, the state information, and the second trained model.
  • According to the above configuration, both the accuracy of generating the trajectory to be traced by the robot arm and the accuracy of generating the processing motion of the robot arm with respect to an object can be improved.
  • FIG. 1 is a block diagram showing a schematic configuration of a control system 1 according to an embodiment of the present invention.
  • the control system 1 is a system for controlling the robot arm 10.
  • the control system 1 includes a robot arm 10, a first camera 20, a second camera 30, and an information processing device 40.
  • the robot arm 10 is a device that performs processing operations related to an object. Processing operations relating to the object are, for example, gripping, painting, polishing, cutting, drilling, or assembling an article.
  • the first camera 20 and the second camera 30 are cameras that photograph the work space of the robot arm 10.
  • The first camera 20 is fixed, for example, at a position at which the robot arm 10 and the work space of the robot arm 10 are included in the shooting range, so that it can capture an image suitable for generating information indicating the trajectory to be traced by the robot arm 10 (hereinafter referred to as "trajectory information").
  • the first camera 20 outputs the first image data representing the captured image.
  • The second camera 30 is fixed, for example, at a position at which at least a part of the periphery of the robot arm 10 (for example, the area in front of the robot arm 10) is included in the shooting range, so that it can capture an image suitable for generating information indicating the processing operation to be performed by the robot arm 10 (hereinafter referred to as "processing operation information"). The second camera 30 outputs the second image data representing the captured image.
  • The information processing device 40 includes a trajectory generation unit 41, a processing operation generation unit 42, a first trained model 43, and a second trained model 44.
  • The trajectory generation unit 41 generates trajectory information using the first image data generated by the first camera 20, information on the state of the robot arm 10 (hereinafter referred to as "state information"), and the first trained model 43.
  • the state information is arbitrary information regarding the state of the robot arm 10.
  • the state information includes, but is not limited to, information indicating the joint angles of the joint 12 and the end effector 11 and the coordinates of each part of the robot arm 10 including the end effector 11.
  • the trajectory information is information indicating the trajectory (change in state) traced by each part of the robot arm 10 until the robot arm 10 moves from the current position (start point of the trajectory) to the target position (end point of the trajectory).
  • the trajectory information can be time-series data of the state information of the robot arm 10 from the current position to the target position.
  • the current position of the robot arm 10 refers to the position of the robot arm 10 and the state of the robot arm 10 at each processing time point.
  • the target position refers to the state of the robot arm 10 when the robot arm 10 moves to the periphery of the object to be processed.
  • the target position is, for example, the state of the robot arm 10 when the robot arm 10 moves to a position where the processing operation can be started.
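  • As an illustration only, the state information and trajectory information described above could be represented as in the following Python sketch; the field names (joint_angles, effector_position) and the use of dataclasses are assumptions for illustration and are not specified by the present disclosure.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class StateInfo:
    """State information of the robot arm at one point in time."""
    joint_angles: List[float]       # joint angle of each joint 12 (rad)
    effector_position: List[float]  # coordinates of the end effector 11 (x, y, z)

# Trajectory information: time-series state information from the current
# position (start point of the trajectory) to the target position (end point).
Trajectory = List[StateInfo]

def trajectory_endpoints(trajectory: Trajectory) -> Tuple[StateInfo, StateInfo]:
    """Return the start (current position) and end (target position) states."""
    return trajectory[0], trajectory[-1]
```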
  • the processing motion generation unit 42 generates processing motion information using the second image data generated by the second camera 30, the state information of the robot arm 10, and the second trained model.
  • The first trained model 43 is a trained model obtained by machine learning the correlation between the state information of the robot arm 10 at the current position and the target position together with information about the work space of the robot arm 10 (hereinafter referred to as "spatial information"), and the trajectory information.
  • the spatial information includes, for example, the coordinates of an obstacle included in the work space of the robot arm 10 and information indicating the coordinates of an object to be processed by the robot arm 10.
  • Spatial information is specified, for example, from a captured image of the work space of the robot arm 10.
  • the first trained model 43 is, for example, MPNet, but is not limited thereto.
  • The first trained model 43 can be any machine learning model capable of generating trajectory information based on state information and spatial information.
  • The first trained model 43 can be realized by a CNN (Convolutional Neural Network), an RNN (Recurrent Neural Network), an LSTM (Long Short-Term Memory), a DNN (Deep Neural Network), or a combination thereof.
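  • The following is a minimal sketch of what a planner network playing the role of the first trained model 43 might look like; the architecture, layer sizes, and the step-by-step prediction of the next state are assumptions (loosely inspired by MPNet) rather than details given in the present disclosure.

```python
import torch
import torch.nn as nn

class TrajectoryPlannerNet(nn.Module):
    """Illustrative stand-in for the first trained model 43.

    It takes the current state, the target state, and an encoding of the
    spatial information, and predicts the next state on the trajectory.
    All dimensions are placeholders.
    """
    def __init__(self, state_dim: int = 7, space_dim: int = 64, hidden: int = 256):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(2 * state_dim + space_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, state_dim),
        )

    def forward(self, current, target, space_code):
        return self.mlp(torch.cat([current, target, space_code], dim=-1))

def roll_out(model, current, target, space_code, steps: int = 20):
    """Generate trajectory information as a sequence of predicted next states."""
    trajectory = [current]
    with torch.no_grad():
        for _ in range(steps):
            current = model(current, target, space_code)
            trajectory.append(current)
    return trajectory
```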
  • the second trained model 44 is a model different from the first trained model 43, and is a trained model in which the correlation between the second image data and the state information and the processing operation information is machine-learned.
  • the second trained model 44 can be any machine learning model capable of generating processing operation information from the second image data and state information.
  • the second trained model 44 can be realized by CNN, RNN, LSTM, DNN, or a combination thereof.
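  • Similarly, a network playing the role of the second trained model 44 might fuse image features with the state information as in the sketch below; the CNN encoder, the dimensions, and the interpretation of the output as a motion command are illustrative assumptions only.

```python
import torch
import torch.nn as nn

class ProcessingMotionNet(nn.Module):
    """Illustrative stand-in for the second trained model 44.

    A small CNN encodes the second image data and an MLP fuses the image
    features with the state information to output the next processing-operation
    command (e.g. joint and gripper targets). All dimensions are placeholders.
    """
    def __init__(self, state_dim: int = 7, action_dim: int = 8):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.head = nn.Sequential(
            nn.Linear(32 + state_dim, 128), nn.ReLU(),
            nn.Linear(128, action_dim),
        )

    def forward(self, image, state):
        features = self.encoder(image)  # (batch, 32)
        return self.head(torch.cat([features, state], dim=-1))
```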
  • The information processing device 40 configured as described above generates the trajectory information of the robot arm 10 using the first trained model 43, and generates the processing operation information of the robot arm 10 using the second trained model 44.
  • By using a trained model suitable for each of the generation of the trajectory information and the generation of the processing motion information, it is possible to improve the accuracy of both the generation of the trajectory information and the generation of the processing motion information.
  • Likewise, it is possible to improve the accuracy of both by using captured images suitable for each of the generation of the trajectory information and the generation of the processing operation information.
  • FIG. 2 is a diagram schematically showing the appearance of the control system 1 according to the embodiment of the present invention.
  • the same reference numerals will be added to the members having the same functions as the members described in the first embodiment, and the description will not be repeated.
  • the control system 1 is a system that controls a series of operations in which the robot arm 10 is moved to the vicinity of the object 50 to grip the object 50.
  • an end effector 11 is provided at the end of the robot arm 10.
  • the end effector 11 is, for example, a multi-finger hand or a gripper.
  • the end effector 11 has a mechanism for gripping an article.
  • the processing operation performed by the robot arm 10 is not limited to gripping the article, and may be configured so that another processing operation can be performed by exchanging the end effector 11.
  • the robot arm 10 includes one or more joints 12, and operates by driving each joint 12.
  • the joint 12 may be a joint of an arm or a joint of an end effector 11 (for example, a multi-fingered hand).
  • the robot arm 10 also includes one or more sensors 13 (see FIG. 1).
  • Each sensor 13 may include, for example, an angle sensor that detects the joint angle of each joint 12, a force sensor that detects a force sense at a specific position of the robot arm 10, and the like. The detection result of each sensor 13 is output to the information processing device 40.
  • the first camera 20 photographs the work space of the robot arm 10 and generates photographing data (first image data) representing the photographed image.
  • the first camera 20 is fixed at a position where the robot arm 10 and the work space of the robot arm 10 are included in the photographing range.
  • The first camera 20 includes a depth sensor, and outputs captured data representing a captured image in the RGB color space (hereinafter referred to as an "RGB image") and a depth image having distance data for each pixel.
  • the captured image is not limited to a color image and may be a monochrome image.
  • a plurality of cameras may be provided as the first camera 20 for photographing the work space of the robot arm 10.
  • the second camera 30 is a camera fixed to the robot arm 10 and outputs shooting data (second image data) representing a shot image.
  • the second camera 30 is provided at a position (for example, the fifth axis of the robot arm 10) in which at least a part of the end effector 11 and the periphery of the end effector 11 are included in the photographing range.
  • the second camera 30 is fixed in a direction in which the axial direction of the fifth axis of the robot arm 10 and the photographing direction are parallel to each other.
  • the second camera 30 may include a depth sensor.
  • the shooting data output by the second camera 30 may include data representing a depth image.
  • a plurality of cameras may be provided as the second camera 30.
  • the second camera 30 may be fixed to the fourth axis or the sixth axis of the robot arm 10.
  • the object 50 is an object of a processing operation performed by the robot arm 10, for example, an article held by the robot arm 10.
  • the obstacles 61 and 62 are arbitrary objects other than the object 50 existing in the work space of the robot arm 10. Obstacles 61 and 62 can be obstacles in the movement of the robot arm 10 or the processing operation of the robot arm. Obstacles 61 and 62 are, for example, articles. In the example of FIG. 2, two obstacles 61 and 62 are shown, but the number of obstacles included in the work space of the robot arm 10 is not limited to 2, and may be more or less than this.
  • the information processing device 40 includes a trajectory generation unit 41, a processing motion generation unit 42, a first trained model 43, a second trained model 44, a robot control unit 45, and a trajectory storage unit 46.
  • the trajectory generation unit 41 uses the first image data, the state information, and the first trained model 43 to generate trajectory information indicating the trajectory of the robot arm 10 from the current position to the target position.
  • When there is an obstacle in the work space, the trajectory generation unit 41 generates trajectory information indicating a route along which the robot arm 10 moves from the current position to the target position while avoiding the obstacles 61 and 62. Further, the trajectory generation unit 41 repeatedly executes the trajectory information generation process during the period in which the robot arm 10 moves from the current position to the target position.
  • The generated trajectory information is stored in the trajectory storage unit 46.
  • the processing operation generation unit 42 generates processing operation information using the second image data, the state information, and the second trained model 44.
  • the processing operation information can be, for example, time-series data of the state information of the robot arm 10.
  • The processing operation information generated by the processing operation generation unit 42 is information indicating the processing operation to be performed by the robot arm 10 after the robot arm 10 has moved to the target position along the trajectory generated by the trajectory generation unit 41.
  • the first trained model 43 is a trained model in which the correlation between the state information and spatial information of the robot arm 10 at the current position and the target position and the trajectory information is machine-learned.
  • the spatial information includes the coordinate information of the obstacle and the coordinate information of the object.
  • the input information of the first trained model 43 is not limited to the state information and the spatial information, and may include other information.
  • the input information may include the first image data.
  • the first trained model 43 is pre-generated by the information processing device 40 or another device.
  • The information processing device 40 or another device generates the first trained model 43 by machine learning the correlation between the state information and the spatial information, and the trajectory information, using pairs of the state information and the spatial information, and the trajectory information.
  • the second trained model 44 is generated in advance by the information processing device 40 or another device.
  • The second trained model 44 is a trained model in which the correlation between the second image data and the state information, and the processing operation information when the end effector 11 performs the processing operation of gripping the object 50, is machine-learned.
  • For example, the information processing device 40 or another device generates the second trained model 44 by machine learning the correlation between the second image data and the state information, and the processing operation information, using pairs of the second image data and the state information, and the processing operation information.
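  • As a hedged illustration of how such correlations might be machine-learned from pairs of inputs and outputs, a plain supervised regression loop is sketched below; the loss function, optimizer, and data format are assumptions, since the present disclosure does not specify the learning procedure.

```python
import torch
import torch.nn as nn

def train_model(model, dataset, epochs: int = 10, lr: float = 1e-3):
    """Supervised regression on pairs of (inputs, target output).

    For the first trained model a sample could be ((state, target_state,
    space_code), next_state); for the second, ((image, state), action).
    """
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        for inputs, target in dataset:
            optimizer.zero_grad()
            prediction = model(*inputs)      # unpack the paired inputs
            loss = loss_fn(prediction, target)
            loss.backward()
            optimizer.step()
    return model
```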
  • the robot control unit 45 controls the robot arm 10 by using the trajectory information generated by the trajectory generation unit 41 and the processing operation information generated by the processing operation generation unit 42. That is, the robot control unit 45 changes the state of the robot arm 10 using the trajectory information, and controls the robot arm 10 so that the state of the robot arm 10 approaches the state indicated by the trajectory information. Further, the robot control unit 45 changes the state of the robot arm 10 by using the processing operation information.
  • The trajectory storage unit 46 stores the trajectory information generated by the trajectory generation unit 41.
  • The trajectory storage unit 46 stores the state information of the robot arm 10 as the trajectory information in chronological order.
  • FIG. 3 is a flowchart showing an example of the flow of control of the robot arm 10 by the control system 1. Note that some steps may be performed in parallel or in a different order.
  • In step S101, the work space is photographed by the first camera 20.
  • the first camera 20 supplies the first image data representing the captured work space to the information processing device 40.
  • the trajectory generation unit 41 recognizes an object (object and obstacle) existing in the work space by using the RGB image and the depth image included in the first image data.
  • Various methods can be used as the object recognition method.
  • For example, MMSS (Multi-modal Sharable and Specific Feature Learning for RGB-D Object Recognition) or a three-dimensional object recognition method that applies a CNN to 3D data can be used.
  • the estimated spatial information can be output as a point cloud with the type (label) of each object.
  • the object 50, the obstacle 61, and the obstacle 62 are recognized as the objects existing in the work space.
  • the type of obstacle does not have to be specified.
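  • For illustration, spatial information (coordinates of the object and the obstacles) might be derived from such a labelled point cloud as in the following sketch; representing each object by the centroid of its points is an assumption made here for simplicity.

```python
import numpy as np

def extract_spatial_info(points: np.ndarray, labels: np.ndarray, target_label: int):
    """Derive spatial information from a labelled point cloud.

    points: (N, 3) xyz coordinates; labels: (N,) class label of each point.
    Returns the centroid of the target object and the centroids of all other
    labelled objects, which are treated here as obstacles.
    """
    target_centroid = points[labels == target_label].mean(axis=0)
    obstacle_centroids = [
        points[labels == label].mean(axis=0)
        for label in np.unique(labels) if label != target_label
    ]
    return target_centroid, obstacle_centroids
```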
  • the trajectory generation unit 41 determines the target position for movement of the robot arm 10.
  • the target position is determined, for example, as follows.
  • the trajectory generation unit 41 identifies an object from the objects existing in the work space based on the type of the object recognized in step S102.
  • the type of the object may be preset by the user.
  • the object 50 is specified.
  • the trajectory generation unit 41 determines the state (position) of the robot arm 10 in which the positional relationship between the object 50 and the end effector 11 of the robot arm 10 satisfies a predetermined condition as a movement target position.
  • the predetermined condition may be, for example, a condition that the distance from the end effector 11 to the object 50 is equal to or less than a predetermined threshold value.
  • the distance from the end effector 11 to the object 50 is, for example, the distance from a predetermined portion of the end effector 11 to the center of gravity of the object 50, or the shortest distance from a predetermined portion of the end effector 11 to the surface of the object 50.
  • the target position of the robot arm 10 means, for example, a state of the robot arm 10 in which the shortest distance from a predetermined portion of the end effector 11 to the surface of the object 50 is equal to or less than a predetermined threshold value.
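  • The distance condition described above might be checked as in the following sketch; the threshold value and the use of the nearest point of the object's point cloud as its "surface" are illustrative assumptions.

```python
import numpy as np

def reached_target(effector_point: np.ndarray, object_points: np.ndarray,
                   threshold: float = 0.05) -> bool:
    """True when the shortest distance from a predetermined portion of the end
    effector to the surface of the object (approximated by its point cloud) is
    at or below the threshold. The value 0.05 m is an arbitrary placeholder."""
    shortest = np.linalg.norm(object_points - effector_point, axis=1).min()
    return bool(shortest <= threshold)
```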
  • The trajectory generation unit 41 generates trajectory information using the first trained model 43, the state information at the current position and the target position of the robot arm 10, and the spatial information (the coordinate information of the object 50 and the obstacles 61 and 62).
  • That is, the trajectory generation unit 41 inputs the state information at the current position and the target position of the robot arm 10 and the spatial information into the first trained model 43, and acquires the trajectory information output from the first trained model 43.
  • The trajectory generation unit 41 may acquire the detection result of the sensor 13 as the current state information of the robot arm 10, may calculate the state information from the detection result, or may specify the state information by performing image analysis on the first image data.
  • the trajectory generation unit 41 stores the acquired trajectory information in the trajectory storage unit 46, and supplies the trajectory information to the robot control unit 45.
  • In step S105, the robot control unit 45 controls the robot arm 10 according to the supplied trajectory information and moves the robot arm 10. That is, the robot control unit 45 controls the robot arm 10 so that the robot arm 10 moves along the trajectory indicated by the trajectory information. At this time, the detection result of the sensor 13 may be used for controlling the robot arm 10.
  • In step S106, the robot control unit 45 determines whether the robot arm 10 has moved to the target position by comparing the current position of the robot arm 10 with the target position. When the current position and the target position match, or when the difference between the current position and the target position is less than a threshold value, the robot control unit 45 determines that the robot arm 10 has moved to the target position. On the other hand, when the difference between the current position and the target position is equal to or greater than the threshold value, the robot control unit 45 determines that the robot arm 10 has not moved to the target position.
  • When the robot arm 10 has moved to the target position (step S106; YES), the information processing device 40 proceeds to the process of step S107. On the other hand, when it has not moved to the target position (step S106; NO), the information processing device 40 returns to the process of step S101 and acquires the first image data from the first camera 20. That is, the information processing device 40 repeats the processes of steps S101 to S106 at predetermined time intervals until the robot arm 10 reaches the target position.
  • The processes of steps S101 to S106 are thus repeatedly executed until the robot arm 10 moves to the target position. That is, the trajectory generation unit 41 repeatedly executes the trajectory information generation process until the robot arm 10 moves to the target position. By repeatedly executing these processes, the robot arm 10 gradually moves along a trajectory that does not collide with the obstacle 61 or the obstacle 62, and approaches the object 50.
  • FIG. 4 is a diagram illustrating the trajectory of the end effector 11 of the robot arm 10.
  • Under the control of the robot control unit 45, the end effector 11 gradually moves along the trajectory generated so as not to collide with the obstacle 61 and the obstacle 62, and approaches the object 50.
  • The trajectories q11 to q19 indicate the trajectories of the end effector 11 generated by the trajectory generation unit 41.
  • In FIG. 4, only the trajectory of the end effector 11 is shown, and the trajectory of the entire robot arm 10 is omitted in order to prevent the drawing from becoming complicated.
  • In step S107, shooting is performed by the second camera 30, and the second image data of the second camera 30 is supplied to the information processing device 40.
  • In step S108, the processing motion generation unit 42 generates processing motion information using the state information at the current position of the robot arm 10, the second image data, and the second trained model 44.
  • That is, the processing motion generation unit 42 inputs the state information at the current position of the robot arm 10 and the second image data into the second trained model 44, and acquires the processing motion information output from the second trained model 44.
  • the acquired processing operation information is supplied to the robot control unit 45.
  • In step S109, the robot control unit 45 controls the robot arm 10, particularly the end effector 11, according to the supplied processing operation information, and causes the robot arm 10 to perform a gripping operation.
  • In step S110, the information processing device 40 determines whether the robot arm 10 has completed the processing operation.
  • When the processing operation is completed (step S110; YES), the information processing device 40 ends the processing of FIG. 3.
  • When the processing operation has not been completed (step S110; NO), the information processing device 40 returns to the process of step S107 and continues the processing operation control using the second image data. That is, the information processing device 40 repeatedly executes the processes of steps S107 to S109 at predetermined time intervals until the processing operation of the robot arm 10 is completed. By repeatedly executing these processes, the robot arm 10 performs the gripping operation of the object 50.
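  • The overall flow of FIG. 3 could be organized as in the following sketch; every object and method name here (capture, plan, follow, etc.) is a hypothetical placeholder standing in for the corresponding unit of the information processing device 40 or the control system 1, not an API defined by the present disclosure.

```python
def control_robot_arm(first_camera, second_camera, robot, planner, policy,
                      recognizer, trajectory_store):
    """Sketch of the control flow of FIG. 3 (steps S101 to S110)."""
    # Movement control: steps S101 to S106, repeated until the target position.
    while True:
        image1 = first_camera.capture()                               # S101
        space_info, target = recognizer.recognize(image1)             # S102, S103
        trajectory = planner.plan(robot.state(), target, space_info)  # S104
        trajectory_store.extend(trajectory)
        robot.follow(trajectory)                                      # S105
        if robot.reached(target):                                     # S106
            break
    # Processing-operation control: steps S107 to S109, repeated until done.
    while not robot.operation_done():                                 # S110
        image2 = second_camera.capture()                              # S107
        action = policy.act(image2, robot.state())                    # S108
        robot.apply(action)                                           # S109
```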
  • the control system 1 executes the process shown in FIG. 3 to control the robot arm 10, so that the robot arm 10 grips the object 50. After the robot arm 10 completes the gripping motion of the object 50, the robot arm 10 may perform another motion.
  • the robot arm 10 may execute an operation of moving the object 50 to another place.
  • For example, the robot control unit 45 of the information processing device 40 may control the robot arm 10 using the trajectory information stored in the trajectory storage unit 46 while the robot arm 10 holds the object 50.
  • That is, the robot control unit 45 can read out the time-series state information stored in the trajectory storage unit 46 in reverse order and control the robot arm 10 according to the read state information, thereby returning the robot arm 10 from the target position to the initial position while it holds the object 50.
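  • A minimal sketch of this reverse playback is shown below; the method name move_to_state is a placeholder assumed for illustration.

```python
def return_to_initial_position(robot, trajectory_store):
    """Replay the stored time-series state information in reverse order so the
    robot arm retraces its trajectory from the target position back to the
    initial position."""
    for state in reversed(trajectory_store):
        robot.move_to_state(state)
```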
  • Alternatively, the information processing device 40 may set a new target position, generate trajectory information to the new target position using the state information at the current position of the robot arm 10 and at the new target position, the spatial information acquired from the first image data, and the first trained model 43, and move the robot arm 10 along the trajectory information.
  • As described above, the information processing device 40 generates the trajectory information of the robot arm 10 using the captured image of the first camera 20 and the first trained model 43, and generates the processing operation information of the robot arm 10 using the captured image of the second camera 30 and the second trained model 44. In this way, by using trained models and captured images suitable for each of the generation of trajectory information and the generation of processing operation information, the accuracy of trajectory information generation and the accuracy of processing operation information generation can both be improved.
  • the processing operation control of the robot arm 10 requires finer control than the movement control of the robot arm 10.
  • This is because the robot arm 10 may collide with the object 50, scratching or damaging the object 50.
  • Therefore, the position of the robot arm 10 at which the positional relationship between the object 50 and the end effector 11 satisfies a predetermined condition is set as the target position, and when the robot arm 10 has moved to the target position, the control of the robot arm 10 is switched from movement control to processing operation control.
  • the trajectory information of the robot arm 10 may be generated by a mathematical search.
  • the information processing apparatus 40 can generate the orbit information in a short time by generating the orbit information using the first trained model 43. Therefore, real-time trajectory information of the robot arm 10 can be generated. That is, even when the obstacle 61 or the object 50 moves while the robot arm 10 is moving, the information processing device 40 can update the trajectory information of the robot arm 10 in real time.
  • the trajectory generating unit 41 can accurately generate trajectory information that does not collide with the obstacles 61 and 62.
  • When generating trajectory information and processing operation information from images taken by a single camera, it is necessary to install a high-resolution camera equipped with a depth sensor at a position from which the work space can be photographed, and such cameras are expensive. On the other hand, in this embodiment, it is possible to generate trajectory information and processing operation information by combining two inexpensive cameras without using an expensive camera. As a result, the control system 1 can be constructed at low cost.
  • The trajectory generation unit 41 determines, as the target position, the position of the robot arm 10 at which the positional relationship between the object 50 and the shooting direction of the second camera 30 satisfies a predetermined condition, and generates trajectory information indicating the trajectory of the robot arm 10 up to this target position.
  • For example, the trajectory generation unit 41 determines, as the target position, a position at which a part or all of the object 50 is included in the shooting range of the second camera 30. More specifically, for example, the trajectory generation unit 41 determines, as the target position, the position of the robot arm 10 at which the entire object 50 is included in the angle of view of the second camera 30.
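  • Whether the entire object 50 falls within the angle of view of the second camera 30 might be checked as in the following sketch, which models the field of view as a cone around the shooting direction; the half-angle value and the point-cloud representation of the object are assumptions.

```python
import numpy as np

def object_in_view(object_points: np.ndarray, camera_position: np.ndarray,
                   shooting_direction: np.ndarray,
                   half_angle_deg: float = 30.0) -> bool:
    """True when every point of the object lies inside a cone around the
    shooting direction of the second camera 30, used here as a simplified
    stand-in for 'the entire object 50 is included in the angle of view'."""
    direction = shooting_direction / np.linalg.norm(shooting_direction)
    vectors = object_points - camera_position
    vectors = vectors / np.linalg.norm(vectors, axis=1, keepdims=True)
    angles = np.degrees(np.arccos(np.clip(vectors @ direction, -1.0, 1.0)))
    return bool(np.all(angles <= half_angle_deg))
```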
  • In this case, the robot control unit 45 of the information processing device 40 may perform the determination of whether the robot arm 10 has moved to the target position (the process of step S106 of FIG. 3) not by comparing the current position with the target position but, for example, as follows.
  • That is, the robot control unit 45 recognizes objects by analyzing the image captured by the first camera 20, and identifies the positional relationship between the object 50 included in that image and the shooting direction of the second camera 30.
  • the robot control unit 45 determines whether the specified positional relationship satisfies a predetermined condition. When the specified positional relationship satisfies a predetermined condition, the robot control unit 45 determines that the robot arm 10 has moved to the target position. On the other hand, when the positional relationship does not satisfy the predetermined condition, the robot control unit 45 determines that the robot arm 10 has not moved to the target position.
  • the method for determining the target position of the robot arm 10 is not limited to that shown in the above-described embodiment, and other methods may be used.
  • the trajectory generation unit 41 may determine a position where the ratio of the region of the object 50 in the image captured by the second camera 30 satisfies a predetermined condition as a target position.
  • the trajectory generation unit 41 determines a position at which the ratio of the object 50 occupies 50% or more in the image captured by the second camera as the target position.
  • In this case, the robot control unit 45 of the information processing device 40 may perform the determination of whether the robot arm 10 has moved to the target position (the process of step S106 of FIG. 3) not by comparing the current position with the target position but, for example, as follows.
  • the robot control unit 45 analyzes the captured image of the first camera 20 and recognizes the object 50 and the second camera 30 included in the captured image of the first camera 20.
  • The robot control unit 45 then calculates the ratio occupied by the object 50 in the image captured by the second camera 30, using information such as the positional relationship between the recognized object 50 and the second camera 30, the shooting direction of the second camera 30, and the angle of view of the second camera 30. When the calculated ratio is 50% or more, the robot control unit 45 determines that the robot arm 10 has moved to the target position. On the other hand, when the calculated ratio is less than 50%, the robot control unit 45 determines that the robot arm 10 has not moved to the target position.
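  • The ratio condition might be evaluated as in the following sketch, assuming a per-pixel mask of the object 50 is available (for example from a segmentation step, which is outside the scope of this sketch).

```python
import numpy as np

def object_ratio_reached(object_mask: np.ndarray, ratio_threshold: float = 0.5) -> bool:
    """True when the region of the object 50 occupies at least the given ratio
    of the image captured by the second camera 30. object_mask is a boolean
    per-pixel mask of the object."""
    return bool(object_mask.mean() >= ratio_threshold)
```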
  • the target position can be determined according to the desired shooting state of the object 50 by the second camera 30 at the start of the processing operation control. As a result, processing operation control can be smoothly performed.
  • The trajectory generation unit 41 may change the method of determining the target position using at least one of the characteristics of the object 50, the type of processing operation performed by the robot arm 10 on the object 50, and the positions of the obstacle 61 and the obstacle 62.
  • the characteristic of the object 50 is a feature or property of the object 50, for example, the size or shape of the object 50.
  • the processing operation performed by the robot arm 10 is, for example, gripping, painting, polishing, cutting, drilling, or assembling an article.
  • the trajectory generating unit 41 may determine the target position so that the larger the size of the object 50, the larger the threshold value of the distance between the end effector 11 and the object 50.
  • As another example, the trajectory generation unit 41 may vary the threshold value, or the angle of the robot arm 10 with respect to the object 50 at the target position, depending on the type of processing operation performed by the robot arm 10. Further, as another example, when the obstacle 61 or the obstacle 62 is near the object 50 (when the distance between the object 50 and the obstacle 61 or 62 is equal to or less than a predetermined threshold value), the trajectory generation unit 41 may determine the target position from the positional relationship between the object 50 and the second camera 30, while in other cases it may determine the target position from the positional relationship between the object 50 and the end effector 11.
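  • One way such switching of the determination method could be organized is sketched below; the rule names, thresholds, and the size-dependent distance are illustrative assumptions only.

```python
def choose_target_rule(object_size: float, operation: str,
                       object_to_obstacle_distance: float,
                       obstacle_threshold: float = 0.10) -> dict:
    """Switch the target-position determination method according to the size of
    the object, the type of processing operation, and the proximity of
    obstacles. All rules and numeric values here are illustrative."""
    if object_to_obstacle_distance <= obstacle_threshold:
        # An obstacle is near the object: determine the target position from the
        # positional relationship between the object and the second camera.
        return {"rule": "object_vs_second_camera"}
    # Otherwise determine it from the object / end-effector relationship, with a
    # distance threshold that grows with the size of the object.
    return {"rule": "object_vs_end_effector",
            "distance_threshold": 0.05 + 0.5 * object_size,
            "operation": operation}
```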
  • the information processing device 40 can determine a target position suitable for, for example, the size of the object 50. As a result, the information processing device 40 can smoothly start controlling the processing operation of the robot arm 10 with respect to the object 50.
  • FIG. 5 is a block diagram showing a schematic configuration of the control system 1A according to the present embodiment.
  • the same reference numerals will be added to the members having the same functions as the members described in the second embodiment, and the description will not be repeated.
  • The control system 1A shown in FIG. 5 differs from the control system 1 according to the second embodiment in that the camera 70 is provided in place of the first camera 20 and the second camera 30, and the information processing device 40A is provided in place of the information processing device 40.
  • the camera 70 photographs the working space of the robot arm 10.
  • the camera 70 outputs shooting data representing an RGB image and a depth image.
  • The information processing device 40A includes a trajectory generation unit 41A and a processing operation generation unit 42A in place of the trajectory generation unit 41 and the processing operation generation unit 42.
  • the trajectory generation unit 41A generates trajectory information using the state information, the shooting data of the camera 70, and the first trained model 43.
  • the processing operation generation unit 42A generates processing operation information using the state information, the shooting data of the camera 70, and the second trained model 44.
  • the shooting data of the common camera 70 is used in the generation of the trajectory information of the robot arm 10 and the generation of the processing operation information.
  • a trained model suitable for each of the generation of trajectory information and the generation of processing operation information is used. Thereby, the accuracy of both the generation of the trajectory information and the generation of the processing operation information can be improved.
  • the second camera 30 is installed on the robot arm 10, but the position where the second camera 30 is installed is not limited to that shown in the above-described embodiment.
  • the second camera 30 may be fixed to something other than the robot arm 10.
  • the second camera may be fixed to the end portion of the end effector 11. In this case, the robot arm 10 (or the end effector 11) may not be included in the shooting range of the second camera 30.
  • the trajectory generating unit 41 repeatedly executes the trajectory information generation process for a period until the robot arm 10 moves from the current position to the target position (step S104 in FIG. 3).
  • the trajectory generation unit 41 may perform the trajectory information generation process only once at the initial position of the robot arm 10 without repeatedly executing the trajectory information generation process.
  • the robot control unit 45 controls the robot arm 10 according to the trajectory information generated at the initial position of the robot arm 10, and controls the robot arm 10 to be gradually moved to the target position.
  • the functions realized by the information processing device 40 according to each of the above-described embodiments may be shared and realized by a plurality of devices.
  • the first device including the trajectory generation unit 41 and the processing motion generation unit 42 and the second device including the robot control unit 45 may be configured as separate devices.
  • a part of the plurality of devices may be a so-called cloud server connected to another device in a communication network.
  • The control block of the information processing device 40 (particularly, the trajectory generation unit 41, the processing operation generation unit 42, and the robot control unit 45) may be realized by a logic circuit (hardware) formed in an integrated circuit (IC chip) or the like, or may be realized by software. In the latter case, the information processing device 40 is configured by using, for example, a computer (electronic computer).
  • FIG. 6 is a block diagram illustrating the physical configuration of a computer used as the information processing device 40.
  • the information processing device 40 can be configured by a computer including a bus 410, a processor 401, a main memory 402, an auxiliary memory 403, and an input / output interface 404.
  • the processor 401, the main memory 402, the auxiliary memory 403, and the input / output interface 404 are connected to each other via the bus 410.
  • As the processor 401, for example, a microprocessor, a digital signal processor, a microcontroller, or a combination thereof is used.
  • As the main memory 402, for example, a semiconductor RAM (Random Access Memory) or the like is used.
  • As the auxiliary memory 403, for example, a flash memory, an HDD (Hard Disk Drive), an SSD (Solid State Drive), or a combination thereof is used.
  • the auxiliary memory 403 stores a program for causing the processor 401 to execute the operation of the information processing device 40 described above.
  • the processor 401 expands the program stored in the auxiliary memory 403 on the main memory 402, and executes each instruction included in the expanded program.
  • As the input/output interface 404, for example, a USB interface, a short-range communication interface such as infrared or Bluetooth (registered trademark), or a combination thereof is used.
  • The information processing apparatus according to aspect 1 of the present invention includes: a trajectory generation unit that generates trajectory information indicating the trajectory to be traced by the robot arm, using the first image data generated by the first camera, the state information regarding the state of the robot arm, and the first trained model; and a processing operation generation unit that generates processing operation information indicating the processing operation to be performed by the robot arm, using the second image data generated by the second camera, the state information, and the second trained model.
  • The trajectory generation unit may repeatedly execute the trajectory information generation process until the robot arm moves to a predetermined target position.
  • The processing operation information generated by the processing operation generation unit may be information indicating the processing operation to be performed by the robot arm after it has moved to the predetermined target position along the trajectory.
  • The trajectory generation unit may determine, as the target position, a position of the robot arm at which the positional relationship between the object to be processed by the robot arm and the end effector of the robot arm satisfies a predetermined condition, and may generate trajectory information indicating the trajectory of the robot arm up to the target position.
  • The trajectory generation unit may determine, as the target position, a position of the robot arm at which the distance from the end effector of the robot arm to the object satisfies the condition.
  • the second camera is provided on the robot arm.
  • The trajectory generation unit may determine, as the target position, a position of the robot arm at which the positional relationship between the object to be processed by the robot arm and the shooting direction of the second camera satisfies a predetermined condition, and may generate trajectory information indicating the trajectory of the robot arm up to the target position.
  • The trajectory generation unit may determine, as the target position, a position at which a part or all of the object is included in the shooting range of the second camera.
  • The trajectory generation unit may determine, as the target position, a position at which the ratio of the region of the object in the image captured by the second camera satisfies a predetermined condition.
  • The trajectory generation unit may change the above condition using at least one of the characteristics of the object, the type of processing operation performed by the robot arm on the object, and the positions of obstacles.
  • the second camera may be provided on the robot arm in the first to fifth aspects.
  • the first camera may include a depth sensor
  • the first image data may include image data representing the depth
  • The method according to aspect 12 of the present invention is a method in which an information processing apparatus executes: a step of generating trajectory information indicating the trajectory to be traced by the robot arm, using the first image data generated by the first camera, the state information regarding the state of the robot arm, and the first trained model; and a step of generating processing operation information indicating the processing operation to be performed by the robot arm, using the second image data generated by the second camera, the state information, and the second trained model.
  • the program according to aspect 13 of the present invention is a program for operating a computer as the above-mentioned information processing device, and is a program for operating the computer as each of the above-mentioned parts.
  • The information processing device may be realized by a computer. In this case, a program of the information processing device that realizes the information processing device by the computer by causing the computer to operate as each unit (software element) included in the information processing device, and a computer-readable recording medium on which the program is recorded, are also included in the scope of the present invention.
  • 10 Robot arm, 40, 40A Information processing device, 11 End effector, 12 Joint, 13 Sensor, 20 First camera, 30 Second camera, 41, 41A Trajectory generation unit, 42, 42A Processing motion generation unit, 43 First trained model, 44 Second trained model, 45 Robot control unit, 46 Trajectory storage unit, 50 Object, 61, 62 Obstacles, 70 Camera, 401 Processor, 402 Main memory, 403 Auxiliary memory, 404 Input/output interface, 410 Bus

Landscapes

  • Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Manipulator (AREA)

Abstract

The present invention improves the accuracy of both the generation of a trajectory to be traced by a robot arm and the generation of a processing motion of the robot arm with respect to a target object. To this end, an information processing device (40) comprises: a trajectory generation unit (41) that generates trajectory information indicating a trajectory to be traced by a robot arm (10), using first image data generated by a first camera (20), state information regarding a state of the robot arm, and a first trained model (43); and a processing motion generation unit (42) that generates processing motion information indicating a processing motion to be performed by the robot arm (10), using second image data generated by a second camera (30), the state information, and a second trained model (44).
PCT/JP2020/043675 2019-12-12 2020-11-24 Dispositif, procédé et programme de traitement d'informations WO2021117479A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019224624A JP6792230B1 (ja) 2019-12-12 2019-12-12 情報処理装置、方法およびプログラム
JP2019-224624 2019-12-12

Publications (1)

Publication Number Publication Date
WO2021117479A1 true WO2021117479A1 (fr) 2021-06-17

Family

ID=73452896

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/043675 WO2021117479A1 (fr) 2019-12-12 2020-11-24 Dispositif, procédé et programme de traitement d'informations

Country Status (2)

Country Link
JP (1) JP6792230B1 (fr)
WO (1) WO2021117479A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20210237270A1 (en) * 2020-02-05 2021-08-05 Denso Corporation Trajectory generation apparatus, multi-link system, and trajectory generation method

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113119073A (zh) * 2021-04-16 2021-07-16 中国科学技术大学 面向3c装配场景基于计算机视觉及机器学习的机械臂系统

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH05318363A (ja) * 1992-05-21 1993-12-03 Sanyo Electric Co Ltd ロボットの制御方式
JPH08212329A (ja) * 1995-02-06 1996-08-20 Fujitsu Ltd 適応的認識システム
JP2007245326A (ja) * 2006-02-17 2007-09-27 Toyota Motor Corp ロボットとロボットの制御方法
WO2018146770A1 (fr) * 2017-02-09 2018-08-16 三菱電機株式会社 Dispositif et procédé de commande de position
JP2019508273A (ja) * 2016-03-03 2019-03-28 グーグル エルエルシー ロボットの把持のための深層機械学習方法および装置
JP2019509905A (ja) * 2016-03-03 2019-04-11 グーグル エルエルシー ロボットの把持のための深層機械学習方法および装置
JP2019155554A (ja) * 2018-03-14 2019-09-19 オムロン株式会社 ロボットの制御装置

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH05318363A (ja) * 1992-05-21 1993-12-03 Sanyo Electric Co Ltd ロボットの制御方式
JPH08212329A (ja) * 1995-02-06 1996-08-20 Fujitsu Ltd 適応的認識システム
JP2007245326A (ja) * 2006-02-17 2007-09-27 Toyota Motor Corp ロボットとロボットの制御方法
JP2019508273A (ja) * 2016-03-03 2019-03-28 グーグル エルエルシー ロボットの把持のための深層機械学習方法および装置
JP2019509905A (ja) * 2016-03-03 2019-04-11 グーグル エルエルシー ロボットの把持のための深層機械学習方法および装置
WO2018146770A1 (fr) * 2017-02-09 2018-08-16 三菱電機株式会社 Dispositif et procédé de commande de position
JP2019155554A (ja) * 2018-03-14 2019-09-19 オムロン株式会社 ロボットの制御装置

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20210237270A1 (en) * 2020-02-05 2021-08-05 Denso Corporation Trajectory generation apparatus, multi-link system, and trajectory generation method
US11673271B2 (en) * 2020-02-05 2023-06-13 Denso Corporation Trajectory generation apparatus, multi-link system, and trajectory generation method

Also Published As

Publication number Publication date
JP2021091067A (ja) 2021-06-17
JP6792230B1 (ja) 2020-11-25

Similar Documents

Publication Publication Date Title
US20200215686A1 (en) Deep machine learning methods and apparatus for robotic grasping
CN104936748B (zh) 徒手机器人路径教导
JP5949242B2 (ja) ロボットシステム、ロボット、ロボット制御装置、ロボット制御方法、およびロボット制御プログラム
JP6978454B2 (ja) 物体検出装置、制御装置及び物体検出用コンピュータプログラム
KR20180114217A (ko) 로봇 파지용 심층 기계 학습 방법 및 장치
CN111085997A (zh) 基于点云获取和处理的抓取训练方法及系统
WO2021117479A1 (fr) Dispositif, procédé et programme de traitement d'informations
CN114097004A (zh) 基于视觉嵌入的自主任务性能
JP2012218119A (ja) 情報処理装置、情報処理装置の制御方法、およびプログラム
WO2020190166A1 (fr) Procédé et système de saisie d'objet à l'aide d'un dispositif robotisé
JP2008000884A (ja) ロボットの相互作用のためのビジュアル・プロト−オブジェクトの評価
JP7458741B2 (ja) ロボット制御装置及びその制御方法及びプログラム
JP6075888B2 (ja) 画像処理方法、ロボットの制御方法
Ottenhaus et al. Visuo-haptic grasping of unknown objects based on gaussian process implicit surfaces and deep learning
Teke et al. Real-time and robust collaborative robot motion control with Microsoft Kinect® v2
JP7359577B2 (ja) ロボット教示装置及びロボットシステム
CN112805127A (zh) 用于创建机器人控制程序的方法和设备
JP7376318B2 (ja) アノテーション装置
JPH05150835A (ja) ロボツトによる組み立て装置
Xu et al. A fast and straightforward hand-eye calibration method using stereo camera
RU2756437C1 (ru) Способ и система планирования движения робота-манипулятора путем коррекции опорных траекторий
TW202021754A (zh) 自動定位方法以及自動控制裝置
WO2023286138A1 (fr) Système de commande de robot, système de robot, procédé de commande de robot et programme de commande de robot
KR20230175122A (ko) 대상물의 조작, 특히 픽업을 위한 로봇 제어 방법
WO2024023934A1 (fr) Dispositif de retrait de pièce, procédé de retrait de pièce et dispositif de commande

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20900465

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 08/09/2022)

122 Ep: pct application non-entry in european phase

Ref document number: 20900465

Country of ref document: EP

Kind code of ref document: A1