CN116133801A - Robot control system, robot control device, robot control method, and program


Info

Publication number: CN116133801A
Application number: CN202180059579.7A
Authority: CN
Prior art keywords: teaching, data, reproduction, unit, robot
Legal status: Pending
Other languages: Chinese (zh)
Inventors: 平泉一城, 目直子, 原田宽
Current Assignee: Corporation Youzhi
Original Assignee: Corporation Youzhi
Family has litigation (first worldwide family litigation filed)
Application filed by Corporation Youzhi
Publication of CN116133801A

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 13/00: Controls for manipulators
    • B25J 13/08: Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
    • B25J 9/00: Programme-controlled manipulators
    • B25J 9/16: Programme controls
    • B25J 9/1656: Programme controls characterised by programming, planning systems for manipulators

Landscapes

  • Engineering & Computer Science
  • Robotics
  • Mechanical Engineering
  • Human Computer Interaction
  • Manipulator
  • Numerical Control

Abstract

A robot control system (1000) includes a three-dimensional measurement camera (150), a robot arm (160), and a robot control device (100). The three-dimensional measurement camera (150) is mounted on the robot arm (160) and measures the three-dimensional position of a target object. The robot control device (100) includes: a reference position acquisition unit (111) that acquires a reference position, which is the three-dimensional position of the target object at the time of teaching; a reproduction position acquisition unit (112) that acquires a reproduction position, which is the three-dimensional position of the target object at the time of reproduction; a conversion data calculation unit (113) that calculates conversion data for converting the reference position acquired by the reference position acquisition unit (111) into the reproduction position acquired by the reproduction position acquisition unit (112); and a reproduction unit (114) that converts the teaching data using the conversion data calculated by the conversion data calculation unit (113) and then reproduces the converted teaching data, thereby moving the robot arm (160).

Description

Robot control system, robot control device, robot control method, and program
Technical Field
The present invention relates to a robot control system, a robot control device, a robot control method, and a program.
Background
In an automatic assembly process for products and the like, pickup devices that automatically pick up components are known. For example, patent document 1 discloses a pickup robot in which the position of a component is taught in advance and the component is picked up based on the taught position.
[Prior art literature]
[Patent literature]
Patent document 1: Japanese Patent Laid-Open No. 2007-319997
Disclosure of Invention
[Problems to be solved by the invention]
The pickup robot disclosed in patent document 1 can reduce teaching man-hours by sharing the gripping position among a plurality of workpieces (pickup target objects). However, this pickup robot presupposes that there is no deviation between the position of a workpiece at the time of teaching and its position at the time of the pickup process (at the time of reproduction) based on the teaching data produced by the teaching process. When the workpiece position deviates between teaching and reproduction, proper pickup processing therefore cannot be performed.
The present invention has been made in view of the above circumstances, and an object thereof is to provide a robot control system, a robot control device, a robot control method, and a program capable of removing the influence of a deviation in the position of a target object between teaching and reproduction.
[Solution to the problem]
In order to achieve the above object, a robot control system according to a first aspect of the present invention includes a three-dimensional measurement camera, a robot arm, and a robot control device.
The three-dimensional measurement camera is mounted on the robot arm and measures the three-dimensional position of a target object.
The robot control device moves the robot arm by reproducing teaching data taught in advance, and includes:
a reference position acquisition unit that acquires, from the three-dimensional measurement camera, a reference position that is the three-dimensional position of the target object at the time of teaching;
a reproduction position acquisition unit that acquires, from the three-dimensional measurement camera, a reproduction position that is the three-dimensional position of the target object at the time of reproduction;
a conversion data calculation unit that calculates conversion data for converting the reference position acquired by the reference position acquisition unit into the reproduction position acquired by the reproduction position acquisition unit; and
a reproduction unit that converts the teaching data using the conversion data calculated by the conversion data calculation unit and then reproduces the converted teaching data, thereby moving the robot arm.
The robot control device may further include a teaching unit that, for teaching data requiring correction (correction-target teaching data) among the teaching data, reproduces the teaching data from the first teaching data up to the correction-target teaching data, then receives teaching again and corrects the correction-target teaching data.
Further, a robot control device according to a second aspect of the present invention moves a robot arm, to which a three-dimensional measurement camera for measuring the three-dimensional position of a target object is attached, by reproducing teaching data taught in advance, and includes:
a reference position acquisition unit that acquires, from the three-dimensional measurement camera, a reference position that is the three-dimensional position of the target object at the time of teaching;
a reproduction position acquisition unit that acquires, from the three-dimensional measurement camera, a reproduction position that is the three-dimensional position of the target object at the time of reproduction;
a conversion data calculation unit that calculates conversion data for converting the reference position acquired by the reference position acquisition unit into the reproduction position acquired by the reproduction position acquisition unit; and
a reproduction unit that converts the teaching data using the conversion data calculated by the conversion data calculation unit and then reproduces the converted teaching data, thereby moving the robot arm.
The robot control device may further include a teaching unit that, for correction-target teaching data among the teaching data, reproduces the teaching data from the first teaching data up to the correction-target teaching data, then receives teaching again and corrects the correction-target teaching data.
The robot control device may further include a point group data editing unit that edits point group data representing the reference position;
in this case, the reference position acquisition unit acquires the reference position represented by the point group data edited by the point group data editing unit.
Further, a robot control method according to a third aspect of the present invention moves a robot arm, to which a three-dimensional measurement camera for measuring the three-dimensional position of a target object is attached, by reproducing teaching data taught in advance, and includes:
a reference position acquisition step of acquiring, from the three-dimensional measurement camera, a reference position that is the three-dimensional position of the target object at the time of teaching;
a reproduction position acquisition step of acquiring, from the three-dimensional measurement camera, a reproduction position that is the three-dimensional position of the target object at the time of reproduction;
a conversion data calculation step of calculating conversion data for converting the reference position acquired in the reference position acquisition step into the reproduction position acquired in the reproduction position acquisition step; and
a reproduction step of converting the teaching data using the conversion data calculated in the conversion data calculation step and then reproducing the converted teaching data, thereby moving the robot arm.
The robot control method may further include a re-teaching step of, for correction-target teaching data among the teaching data, reproducing the teaching data from the first teaching data up to the correction-target teaching data, then receiving teaching again and correcting the correction-target teaching data.
Further, a program according to a fourth aspect of the present invention causes a computer of a robot control device, which moves a robot arm to which a three-dimensional measurement camera for measuring the three-dimensional position of a target object is attached by reproducing teaching data taught in advance, to execute:
a reference position acquisition step of acquiring, from the three-dimensional measurement camera, a reference position that is the three-dimensional position of the target object at the time of teaching;
a reproduction position acquisition step of acquiring, from the three-dimensional measurement camera, a reproduction position that is the three-dimensional position of the target object at the time of reproduction;
a conversion data calculation step of calculating conversion data for converting the reference position acquired in the reference position acquisition step into the reproduction position acquired in the reproduction position acquisition step; and
a reproduction step of converting the teaching data using the conversion data calculated in the conversion data calculation step and then reproducing the converted teaching data, thereby moving the robot arm.
The program may further cause the computer to execute a re-teaching step of, for correction-target teaching data among the teaching data, reproducing the teaching data from the first teaching data up to the correction-target teaching data, then receiving teaching again and correcting the correction-target teaching data.
[Effects of the invention]
According to the present invention, even if the position of the target object deviates between teaching and reproduction, the influence of the deviation can be removed.
Drawings
Fig. 1 is a block diagram showing an example of the functional configuration of the robot control system according to the embodiment.
Fig. 2 is a flowchart of the teaching process according to the embodiment.
Fig. 3 is a diagram showing an example of a target object photographed by the three-dimensional measurement camera according to the embodiment.
Fig. 4 is a diagram illustrating an example of the operation of the robot arm during teaching.
Fig. 5 is a flowchart of the reproduction process according to the embodiment.
Fig. 6 is a diagram illustrating an example of the operation of the robot arm during reproduction.
Fig. 7 is a diagram illustrating correction of an error in the movable region of the robot arm according to the embodiment.
Fig. 8 is a flowchart of the re-teaching process according to the embodiment.
Fig. 9 is a diagram illustrating an example of the operation of the robot arm during re-teaching.
Fig. 10 is a block diagram showing an example of the functional configuration of a robot control system including a robot control device and an image processing device.
Detailed Description
Hereinafter, embodiments of the present invention will be described with reference to the drawings. In addition, the same or corresponding portions in the drawings are denoted by the same reference numerals.
(Embodiment)
As shown in fig. 1, the robot control system 1000 according to the embodiment includes a three-dimensional measurement camera 150, a robot arm 160, and a robot control device 100. The robot control system 1000 is a system that receives teaching about the operation of the robot arm 160 from a user (operator) in advance and moves the robot arm 160 by reproducing the taught content (teaching data).
The three-dimensional measurement camera 150 is a camera capable of measuring the distance to a photographed object, and is also called a three-dimensional sensor, depth sensor, distance measurement sensor, or the like. Such a camera can be configured with 2 cameras, with 1 camera and 1 projector, with 2 cameras and 1 projector, and so on; the three-dimensional measurement camera 150 may have any of these configurations. As shown in fig. 1, the three-dimensional measurement camera 150 is attached to the robot arm 160 and photographs the target object 310 in front of the robot arm 160. The robot control device 100 can acquire the three-dimensional position of the target object 310 by obtaining point group data, described later, from the image data of the target object 310 captured by the three-dimensional measurement camera 150.
The robot arm 160 includes a robot hand 161 at its distal end. When the robot control device 100 reproduces teaching data, the robot arm 160 can perform operations such as grasping and lifting the target object 310 with the robot hand 161.
As shown in fig. 1, the robot control device 100 includes a control unit 110, a storage unit 120, an image input unit 131, a display unit 132, an operation input unit 133, and a robot control unit 134.
The control unit 110 is configured by a CPU (Central Processing Unit: central processing unit), an FPGA (Field Programmable Gate Array: field programmable gate array), and the like, and controls each part of the robot control device 100.
The storage unit 120 is configured by a RAM (Random Access Memory: random access Memory), a ROM (Read Only Memory), and the like. The storage unit 120 stores therein programs and the like for performing various processes by the control unit 110. The storage unit 120 stores image data captured by the three-dimensional measurement camera 150, data used by the control unit 110 for arithmetic processing, and the like.
The image input unit 131 is an interface between the control unit 110 and the three-dimensional measurement camera 150. Image data captured by the three-dimensional measurement camera 150 is input to the control unit 110 via the image input unit 131. Further, an imaging command to the three-dimensional measurement camera 150 is transmitted from the control unit 110 to the three-dimensional measurement camera 150 via the image input unit 131.
The display unit 132 is a device for displaying image data captured by the three-dimensional measurement camera 150, an operation UI (user interface) screen for the robot arm 160, and the like. For example, the display unit 132 is a liquid crystal display or an organic EL (Electro-Luminescence) display. The robot control device 100 may include such a display as the display unit 132, or may include the display unit 132 as an interface for connecting an external display. When the display unit 132 is provided as an interface, the robot control device 100 displays image data and the like on an external display connected via the display unit 132.
The operation input unit 133 is a device that receives an operation input from a user to the robot control device 100, and is, for example, a keyboard, a mouse, a touch panel, or the like. The robot control device 100 receives an instruction or the like from a user via the operation input unit 133.
The robot control unit 134 is an interface between the control unit 110 and the robot arm 160. A command to move the robot arm 160 is transmitted from the control unit 110 to the robot arm 160 via the robot control unit 134. The control unit 110 can acquire the current three-dimensional position of the robot arm 160 via the robot control unit 134.
The control unit 110 functions as a reference position acquisition unit 111, a reproduction position acquisition unit 112, a conversion data calculation unit 113, a reproduction unit 114, and a teaching unit 115 by executing programs stored in the storage unit 120.
The reference position acquisition unit 111 acquires, from the image captured by the three-dimensional measurement camera 150, a reference position that is the three-dimensional position of the photographed target object at the time of teaching.
The reproduction position acquisition unit 112 acquires, from the image captured by the three-dimensional measurement camera 150, a reproduction position that is the three-dimensional position of the photographed target object at the time of reproduction.
The conversion data calculation unit 113 calculates conversion data for converting the reference position acquired by the reference position acquisition unit 111 into the reproduction position acquired by the reproduction position acquisition unit 112.
The reproduction unit 114 converts the teaching data using the conversion data calculated by the conversion data calculation unit 113, and moves the robot arm 160 by transmitting operation commands for the robot arm 160 from the robot control unit 134 to the robot arm 160 based on the converted teaching data. Hereinafter, transmitting operation commands for the robot arm 160 from the robot control unit 134 to the robot arm 160 based on teaching data is referred to as "reproducing the teaching data".
The teaching unit 115 receives teaching about the operation of the robot arm 160 from the user via the operation input unit 133, acquires teaching data, and stores the acquired teaching data in the storage unit 120. The teaching data includes information on the coordinates of the movement destinations (teaching points) of the robot arm 160 and information on the type of motion (grasping, releasing, etc.) of the robot hand 161 at each movement destination. This concludes the description of the configuration of the robot control system 1000.
(Teaching process)
Next, the teaching process of the robot control system 1000 will be described with reference to fig. 2. The teaching process starts when the user instructs the robot control device 100 to start it via the operation input unit 133.
First, the user operates the robot arm 160 via the operation input unit 133 to move it to the imaging position (step S101). In step S101, the control unit 110 outputs the user operation instructions received by the operation input unit 133 to the robot control unit 134, whereby the robot arm 160 moves to the imaging position. As shown in fig. 3, when the robot arm 160 is moved to the imaging position 170, the target object 310 to be handled by the robot arm 160 has been set in advance in the imaging direction of the three-dimensional measurement camera 150 (the user sets the target object 310 in front of the three-dimensional measurement camera 150 at the imaging position and then instructs the robot control device 100 to start the teaching process). The imaging position may differ from the teaching points described later. Here, as shown in fig. 3, the target object 310 is placed on the base 320.
Next, the control unit 110 issues an imaging command to the three-dimensional measurement camera 150, acquires the image data of the target object 310 captured by the three-dimensional measurement camera 150 via the image input unit 131, and acquires the current three-dimensional position of the robot arm 160 via the robot control unit 134 (step S102). The three-dimensional position of the robot arm 160 acquired here is expressed as coordinates (robot coordinates) in a coordinate system (robot coordinate system) based on the control mechanism of the robot arm 160, but it may also be expressed, for example, as coordinates in a world coordinate system referenced to the earth.
Next, the reference position acquisition unit 111 acquires data on the three-dimensional positions of a point group of the target object 310 (point group data) (step S103). Step S103 is also referred to as a reference position acquisition step. The three-dimensional positions of the point group acquired here are expressed as coordinates (camera coordinates) in a coordinate system (camera coordinate system) observed from the three-dimensional measurement camera 150, but, like the three-dimensional position of the robot arm 160, they may also be expressed, for example, as coordinates in a world coordinate system referenced to the earth. A point group is a set of three-dimensional coordinate (XYZ coordinate) data obtained by discretizing a three-dimensional object in point units (for example, 0.1 mm units).
In the present embodiment, the three-dimensional coordinates of each feature point are obtained from the distance to that feature point determined from the image data captured by the three-dimensional measurement camera 150 and from the three-dimensional position of the three-dimensional measurement camera 150 at the time of capture (which, since the three-dimensional measurement camera 150 is attached to the robot arm 160, can be calculated from the three-dimensional position of the robot arm 160); the set of three-dimensional coordinate data of the feature points is used as the point group data. A feature point is a point indicating a characteristic portion (an edge portion, a color boundary portion, or the like) included in the image data. The point group data acquired here is data indicating the reference position, that is, the three-dimensional position of the target object 310 at the time of teaching. The point group is not limited to feature points and may be a set of three-dimensional coordinate data obtained by discretizing the entire surface of the object included in the image data. In the present embodiment, limiting the point group to feature points poses no particular problem and reduces the amount of calculation, so the set of three-dimensional coordinate data of the feature points is treated as the point group.
In the example of fig. 3, a set of feature points representing the edge portions of the target object 310, the edge portions of the base 320, and so on is acquired as the point group data. That is, the point group data acquired here includes not only the target object 310 but also points of unnecessary portions such as the background (the base 320 and the like). The 0.1 mm unit mentioned above is merely an example; the point unit may be set to an appropriate value based on the movement accuracy of the robot arm 160, the size of the target object 310, and so on, such as 0.5 mm units or 1 mm units.
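The point group construction described above can be illustrated as follows. The sketch below is a minimal example, assuming a pinhole camera model with known intrinsics (fx, fy, cx, cy) and a known 4x4 camera pose derived from the robot arm position; all function and variable names are illustrative, not from the patent, and the patent does not prescribe any particular implementation or language.

```python
import numpy as np

def backproject(us, vs, depths, fx, fy, cx, cy):
    # Pinhole model: X = (u - cx) * Z / fx, Y = (v - cy) * Z / fy.
    xs = (us - cx) * depths / fx
    ys = (vs - cy) * depths / fy
    return np.stack([xs, ys, depths], axis=1)  # (N, 3) points in camera coordinates

def to_robot_coords(points_cam, T_robot_camera):
    # Express camera-coordinate points in robot coordinates using the camera
    # pose (homogeneous 4x4) computed from the robot arm position at capture time.
    homog = np.hstack([points_cam, np.ones((len(points_cam), 1))])
    return (T_robot_camera @ homog.T).T[:, :3]
```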
Next, the reference position acquisition unit 111 edits the point group data acquired in step S103 (step S104). Step S104 is also referred to as an editing step. In step S104, the reference position acquisition unit 111 functions as a point group data editing unit. Since the point group data acquired in step S103 includes points other than the target object 310, such as the background, these unnecessary points are deleted in step S104. In step S104, the resolution is also adjusted as necessary so that the matching with the main data in the conversion data calculation step, described later, is performed appropriately.
Specifically, the editing in step S104 is performed, for example, as follows: the point group is displayed on the display unit 132 (with the captured image superimposed on it if necessary), and the user deletes unnecessary points, such as the background, from the displayed point group via the operation input unit 133. Alternatively, the user may input a height serving as the boundary in the Z direction (vertical direction), and points whose Z coordinate does not reach the input height may be erased as background.
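A minimal sketch of this editing step, assuming the point group is an (N, 3) array in robot coordinates with the Z axis pointing up; the function names and the voxel-grid downsampling used to adjust the resolution are implementation assumptions, not specified in the patent.

```python
import numpy as np

def remove_background(points, z_min):
    # Erase as background all points whose Z coordinate does not reach z_min.
    return points[points[:, 2] >= z_min]

def adjust_resolution(points, voxel):
    # Crude voxel-grid downsampling: keep one point per occupied voxel.
    keys = np.floor(points / voxel).astype(int)
    _, idx = np.unique(keys, axis=0, return_index=True)
    return points[np.sort(idx)]
```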
Next, the reference position acquisition unit 111 stores the point group data edited in step S104 in the storage unit 120 as main data (step S105). As described above, the coordinates of the point group data are expressed as coordinates (camera coordinates) in the coordinate system (camera coordinate system) observed from the three-dimensional measurement camera 150, but when stored as main data they are stored in a form convertible into coordinates (robot coordinates) in the robot coordinate system. For example, the point group data in camera coordinates and the three-dimensional position (robot coordinates) of the robot arm 160 acquired in step S102 are stored in the storage unit 120 as a set. Alternatively, the point group data in camera coordinates may be converted into robot coordinates, using the three-dimensional position (robot coordinates) of the robot arm 160 acquired in step S102, before being stored in the storage unit 120.
If the coordinates of the robot arm 160 and the coordinates of the point group data are both expressed in the same coordinate system (for example, the world coordinate system), the coordinates of the point group data may be stored directly in the storage unit 120 as main data, without conversion into the robot coordinate system.
Next, the teaching unit 115 receives teaching about the operation of the robot arm 160 from the user and stores the teaching data thus obtained in the storage unit 120 (step S106). In step S106, the user inputs the operations to be performed by the robot arm 160 in order via the operation input unit 133, teaching while actually moving the robot arm 160. For example, as shown in fig. 4, the user teaches the following actions: the robot arm 160 first moves to the teaching point 301, then moves to the teaching point 302, where the object 300 is grasped by the robot hand 161, and then moves to the teaching point 303. The teaching unit 115 stores the coordinates of all the teaching points taught by the user and the motions (grasping, releasing, etc.) of the robot hand 161 as teaching data in the storage unit 120.
After the teaching data is stored in step S106, the teaching unit 115 ends the teaching process. This concludes the description of the teaching process.
(Reproduction process)
Next, the reproduction process, in which the robot control system 1000 reproduces the teaching data obtained by the teaching process, will be described with reference to fig. 5. The reproduction process starts when the user instructs the robot control device 100 to start it via the operation input unit 133. As in the teaching process, the user sets the target object 310 in the imaging direction of the three-dimensional measurement camera 150 at the imaging position before instructing the start of the reproduction process.
First, the user operates the robot arm 160 via the operation input unit 133 to move the robot arm 160 to the imaging position (step S201). Step S201 is the same process as step S101 of the teaching process (fig. 2).
Next, the control unit 110 issues an imaging command to the three-dimensional measurement camera 150, acquires image data of the target object 310 imaged by the three-dimensional measurement camera 150 from the image input unit 131, and acquires the current three-dimensional position of the robot arm 160 via the robot control unit 134 (step S202). Step S202 is the same process as step S102 of the teaching process.
Next, the reproduction position acquisition unit 112 acquires point group data of the target object 310 (step S203). Step S203 is the same processing as step S103 of the teaching process, but the point group data acquired in step S203 is data indicating the reproduction position, that is, the three-dimensional position of the target object 310 at the time of reproduction. Step S203 is also referred to as a reproduction position acquisition step. As in step S103 of the teaching process, the point group data acquired here includes not only the target object 310 but also points of unnecessary portions such as the background (the base 320 and the like).
Then, the conversion data calculation unit 113 matches the point group data acquired in step S203 against the main data stored in the storage unit 120 in step S105 of the teaching process, and calculates conversion data for converting the coordinates at the time of teaching (the reference position) into the coordinates at the time of reproduction (the reproduction position) (step S204). Step S204 is also referred to as a conversion data calculation step. For example, the conversion data calculation unit 113 converts the point group data acquired in step S203 (coordinates at the time of reproduction) and the main data (coordinates at the time of teaching) into coordinates in the same coordinate system (for example, the robot coordinate system), matches them using an ICP (Iterative Closest Point) algorithm or the like, and calculates a coordinate transformation matrix A that converts the coordinates at the time of teaching into the coordinates at the time of reproduction.
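As one possible realization, the matching in step S204 could be written with an off-the-shelf ICP implementation. The sketch below uses the Open3D library (an implementation choice; the patent names only the ICP algorithm, not a library) and assumes master and scene are (N, 3) arrays of the teaching-time and reproduction-time point groups, already expressed in the same coordinate system.

```python
import numpy as np
import open3d as o3d

def compute_conversion_matrix(master, scene, max_corr_dist=5.0):
    # Register the teaching-time point group (source) onto the
    # reproduction-time point group (target) with point-to-point ICP.
    src = o3d.geometry.PointCloud(
        o3d.utility.Vector3dVector(np.asarray(master, dtype=np.float64)))
    dst = o3d.geometry.PointCloud(
        o3d.utility.Vector3dVector(np.asarray(scene, dtype=np.float64)))
    result = o3d.pipelines.registration.registration_icp(
        src, dst, max_corr_dist, np.eye(4),
        o3d.pipelines.registration.TransformationEstimationPointToPoint())
    return result.transformation  # 4x4 coordinate transformation matrix A
```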
In step S203, a plurality of target objects 310 may be present on the base 320, and thus a plurality of target objects 310 may appear in the image data captured by the three-dimensional measurement camera 150. In this case, in step S204 the conversion data calculation unit 113 selects the one target object 310 considered most appropriate from among the plurality and matches the point group data of that target object 310 against the main data. One way to select the most appropriate target object 310 is, for example, to evaluate the matching degree of the point group of each target object 310 and select the target object 310 whose matching degree with the main data is highest. This method is described below.
Let n denote the number of target objects 310 present in the image data acquired in step S202, and let N denote the number of points included in the main data. For the i-th target object 310 (i is an integer from 1 to n), let Mi denote the number of points in the main data that have a corresponding point (a point within a predetermined threshold distance) in the point group of that target object 310. The matching degree of the i-th target object 310 is then Mi/N. The conversion data calculation unit 113 selects the target object 310 with the largest matching degree Mi/N. In this way, for example, the conversion data calculation unit 113 can select the one target object 310 considered most appropriate from among the plurality of target objects 310.
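A sketch of this Mi/N computation, assuming each candidate object's point group has already been segmented out as its own (Ni, 3) array; the use of SciPy's cKDTree for the nearest-neighbour search is an implementation assumption.

```python
import numpy as np
from scipy.spatial import cKDTree

def matching_degree(master, candidate_points, threshold):
    # Mi/N: the fraction of master points that have a point of this
    # candidate within the predetermined threshold distance.
    dists, _ = cKDTree(candidate_points).query(master)
    return np.count_nonzero(dists <= threshold) / len(master)

def select_best_object(master, candidates, threshold):
    # Return the index i (0-based) of the candidate with the largest Mi/N.
    return int(np.argmax([matching_degree(master, c, threshold)
                          for c in candidates]))
```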
Next, the reproduction unit 114 converts the coordinates of the teaching points included in the teaching data stored in the storage unit 120 into coordinates at the time of reproduction, using the conversion data (coordinate transformation matrix A) calculated in step S204 (step S205). By the processing of step S205, as shown for example in fig. 6, the taught coordinates (coordinates in the coordinate system C at the time of teaching) are converted into coordinates at the time of reproduction (coordinates obtained by projecting the coordinates in the coordinate system C onto the coordinate system C'). Step S205 is also referred to as a conversion step.
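Continuing the sketch, applying the matrix A from step S204 to the taught coordinates is a single homogeneous transform per teaching point (names illustrative; the motion types attached to each teaching point are carried over unchanged).

```python
import numpy as np

def convert_teaching_points(teach_points, A):
    # Project teaching-time coordinates (coordinate system C) into the
    # reproduction-time frame C' using the 4x4 matrix A from step S204.
    homog = np.hstack([teach_points, np.ones((len(teach_points), 1))])
    return (A @ homog.T).T[:, :3]
```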
In addition, depending on the robot, the following function may be provided: a coordinate system C' used when specifying the position of the robot arm 160 can be defined, and the reference coordinate system can be switched from the coordinate system C at the time of teaching to the defined coordinate system C'. With this function, by instructing the robot to switch the reference coordinate system from the coordinate system C (the coordinate system at the time of teaching) to the coordinate system C' (the reference coordinate system of the current reproduction position), a position specified for the robot arm 160 or a coordinate received in teaching is treated as a coordinate in the coordinate system C'. When this function is used, instead of converting the coordinates as described above, the coordinate system C' may be defined and the coordinate systems switched.
That is, in steps S204 and S205, instead of performing the coordinate conversion described above, the reproduction unit 114 may define the coordinate system C' at the time of reproduction and instruct the robot to switch from the coordinate system C at the time of teaching to the coordinate system C' at the time of reproduction, whereby the coordinates at the time of teaching shown in fig. 4 (coordinates in the coordinate system C) are used directly as the coordinates at the time of reproduction shown in fig. 6 (coordinates projected onto the coordinate system C'). By switching the coordinate system handled by the robot in this way, the coordinates at the time of teaching are converted into the coordinates at the time of reproduction without changing the coordinate values themselves; since the robot internally converts the teaching-time coordinates into reproduction-time coordinates (for example, by the coordinate transformation matrix A), this too is included in the conversion of the teaching data.
Then, the reproduction unit 114 reproduces the teaching data converted in step S205 to control the robot arm 160 (step S206), and ends the reproduction process. In step S206, as shown for example in fig. 6, the reproduction unit 114 first moves the robot arm 160 to the teaching point 301', then moves it to the teaching point 302', where the object 300 is grasped by the robot hand 161, and then moves the robot arm 160 to the teaching point 303'. Step S206 is also referred to as a reproduction step.
Performing the reproduction process after the teaching process means that even if the position or inclination of the object 300 deviates between teaching and reproduction (and even if the position or inclination of the robot including the robot arm 160 deviates), the robot arm 160 can appropriately reproduce at reproduction time what was taught about the object 300 at teaching time. That is, even if the position of the target object deviates between teaching and reproduction, the influence of the deviation can be removed.
Separately from any positional deviation of the target object between teaching and reproduction, an object positioning error of the three-dimensional measurement camera 150 may arise between teaching and reproduction. Viewed from the three-dimensional measurement camera 150, such an error is indistinguishable from a positional deviation of the target object, so the processing described above also removes the influence of object positioning errors.
Furthermore, separately from positional deviations of the target object and object positioning errors, there may be an error within the movable region of the robot arm 160 between the commanded coordinates (instruction coordinates) and the coordinates actually reached (actual coordinates). Because the three-dimensional measurement camera 150 is attached to the robot arm 160, the processing described above also removes the influence of this error. This is explained with reference to fig. 7.
Consider a case where the object has moved by a known movement amount T from its position at the time of teaching (coordinate P). To follow this movement, the robot arm 160 is instructed to move by T. If the arm's movement error were 0, the position of the object within the camera view would be unchanged between teaching and reproduction, and there would be no problem. Suppose, however, that the movement of the robot arm 160 has an error ΔT.
Due to this error, the object within the camera view at the time of reproduction appears to have moved by −ΔT. Therefore, in the measurement by the three-dimensional measurement camera 150 at the time of reproduction, the movement amount of the object is measured as T − ΔT.
As a result, when the teaching point (coordinate P) is moved according to the conversion data in the reproduction process described above, the reproduction unit 114 instructs the robot arm 160 to move to the coordinate Q = P + (T − ΔT) at the time of reproduction.
The robot arm 160 again deviates by ΔT from the instructed coordinate Q, so its actual destination becomes Q + ΔT. Since Q = P + (T − ΔT), we have Q + ΔT = P + T, and the actual destination of the robot arm 160 (Q + ΔT) coincides with the actual position of the object (P + T).
As described above, in the present embodiment, because the three-dimensional measurement camera 150 is attached to the robot arm 160, movement errors of the robot arm 160 are also captured in the measurements of the three-dimensional measurement camera 150, and the error between the instruction coordinates and the actual coordinates within the movable region of the robot arm 160 can be corrected.
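The cancellation argument can be checked numerically. The one-dimensional figures below are made up purely for illustration:

```python
# Teaching point P, true object motion T, arm movement error dT (illustrative values).
P, T, dT = 100.0, 30.0, 2.0

measured = T - dT        # the arm-mounted camera measures the motion as T - dT
Q = P + measured         # commanded destination at reproduction time: Q = P + (T - dT)
actual = Q + dT          # the arm again deviates by dT when executing the command

assert actual == P + T   # the arm ends up exactly at the object's actual position
```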
(Re-teaching process)
In some cases, only a part of the teaching data needs to be corrected. With the teaching process described above, however, unless the target object 310 is set at exactly the same position as when the main data was created, correcting only the teaching points in question produces coordinate deviations relative to the other teaching points. Since it is difficult to set the target object 310 at exactly the same position, with the teaching process described above all teaching points must be re-taught even when only a part of the teaching data is to be corrected.
As a process that solves this problem, a re-teaching process that can reproduce the teaching points requiring no correction and then re-teach only the teaching points requiring correction will be described with reference to fig. 8. Re-teaching means teaching a teaching point again after teaching has been completed, thereby correcting the coordinates of the teaching point, the motion at the teaching point, and so on. The re-teaching process starts when the user instructs the robot control device 100 to start it via the operation input unit 133. As in the teaching process and the reproduction process, the user sets the target object 310 in the imaging direction of the three-dimensional measurement camera 150 at the imaging position before instructing the start of the re-teaching process. Further, the robot used here has a function of switching the coordinate system used when specifying the position of the robot arm 160.
Steps S301 to S304 are the same as steps S201 to S204 of the reproduction process (fig. 5), and their description is therefore omitted.
After step S304, the teaching unit 115 defines the coordinate system C' based on the conversion data calculated in step S304 (step S305). Here, "defining the coordinate system C'" means instructing the robot to switch the reference coordinate system from the coordinate system C (the coordinate system at the time of teaching) to the coordinate system C' (the reference coordinate system of the current reproduction position), so that positions specified for the robot arm 160 and coordinates received in teaching are treated as coordinates in the coordinate system C'. A specified position (a coordinate in the coordinate system C at the time of teaching = a coordinate in the coordinate system C' at the time of reproduction; the two differ only in coordinate system, the coordinate values themselves being equal) is then handled as follows: on the robot side, it is converted with the conversion data (for example, the coordinate transformation matrix A) into a coordinate in real space at the time of reproduction. Conversely, a taught position (a coordinate in real space) is handled as follows: on the robot side, it is converted with the conversion data (for example, the inverse of the coordinate transformation matrix A) into a coordinate in the coordinate system C' (= a coordinate in the coordinate system C).
Then, after step S305, the teaching unit 115 receives re-teaching from the user in the coordinate system C' with respect to the operation of the robot arm 160, stores the teaching data thus obtained in the storage unit 120 (step S306), and ends the re-teaching process. Step S306 is also referred to as a re-teaching step. "Receiving the re-teaching in the coordinate system C'" means that the re-taught coordinates in real space are converted on the robot side, based on the conversion data calculated in step S304, into coordinates in the coordinate system C' (= the coordinates in the coordinate system C of the original teaching), and the teaching unit 115 acquires them as coordinates in the coordinate system C.
In steps S305 and S306 (for example, when the robot used does not have a function of switching the coordinate system used when specifying the position of the robot arm 160), instead of defining the coordinate system C' and receiving the re-teaching in the coordinate system C', the teaching unit 115 may convert the re-taught coordinates in real space into the coordinate system C (the coordinate system of the original teaching) based on the conversion data calculated in step S304 (for example, the inverse of the coordinate transformation matrix A), and store the converted data in the storage unit 120 as teaching data in the coordinate system C (the teaching data obtained by the re-teaching).
The teaching data stored in the storage unit 120 by the teaching process (fig. 2) is expressed in coordinates in the coordinate system C, so the teaching data obtained by re-teaching must also be expressed in coordinates in the coordinate system C. By performing the re-teaching process as described above, the teaching data obtained in the re-teaching can be stored in the storage unit 120 as teaching data expressed in coordinates in the coordinate system C, the coordinate system at the time of teaching, regardless of whether or not the robot has a function of switching the coordinate system used when specifying the position of the robot arm 160.
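A minimal sketch of the coordinate bookkeeping described above, for the case without a coordinate-system switching function: a re-taught real-space point is mapped back into the teaching-time coordinate system C with the inverse of the matrix A (names illustrative).

```python
import numpy as np

def to_teaching_coords(point_realspace, A):
    # Convert a re-taught reproduction-time point into coordinate system C
    # using the inverse of the coordinate transformation matrix A.
    homog = np.append(np.asarray(point_realspace, dtype=float), 1.0)
    return (np.linalg.inv(A) @ homog)[:3]
```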
In step S306, the teaching unit 115 reproduces the teaching data in order from the first teaching data up to the teaching data the user wants to correct (the correction-target teaching data), moving the robot arm 160 from its initial position. When reproduction reaches the correction-target teaching data, the user is allowed to specify new coordinates and motions (to re-teach), and the teaching unit 115 corrects the teaching data by receiving this teaching again.
For example, suppose that in the teaching process, as shown in fig. 4, the following teaching data was stored: the robot arm 160 moves to the teaching point 301, then moves to the teaching point 302, where the object 300 is grasped by the robot hand 161, and then moves to the teaching point 303. Consider the case where the user wants to correct the second teaching point 302 from near the center of gravity of the object 300 to near its upper end.
In this case, in the re-teaching process, as shown in fig. 9, the robot arm 160 first moves to the teaching point 301' and then moves to the teaching point 302'. Since the teaching point 302' corresponds to the teaching data the user wants to correct (the correction-target teaching data), at this point the user instructs the robot control device 100 to re-teach, moving the robot arm 160 from the teaching point 302' to the teaching point 302''. As a result, the teaching unit 115 stores in the storage unit 120 teaching data in which the teaching point 302' is corrected to the teaching point 302''.
If correction-target teaching data still remains, the teaching unit 115 reproduces the teaching data up to the next correction-target teaching data (for example, the teaching point 303') and accepts re-teaching in the same way. When all of the correction-target teaching data has been re-taught, the re-teaching process ends.
The processing in step S306 described above is merely an example, and the teaching unit 115 need not reproduce the teaching data in order from the beginning. For example, the teaching unit 115 may receive from the user a designation of the correction-target teaching data, move the robot arm 160 directly to the teaching point indicated by that teaching data, and accept re-teaching from the user.
Through the re-teaching process described above, the robot control device 100 can efficiently correct a part of the teaching data.
(Modifications)
In the above-described embodiment, the user moves the robot arm 160 to the imaging position in steps S101, S201, and S301, but this user operation is not essential. For example, if the imaging position is specified in advance, the control unit 110 may move the robot arm 160 to the imaging position in steps S101, S201, and S301 without any user operation.
In the above embodiment, the point group data is edited in step S104, but editing of the point group data is not essential. The editing in step S104 is performed so that the point group data at the time of teaching (the main data) and the point group data at the time of reproduction can be matched easily in step S204 of the reproduction process (and step S304 of the re-teaching process). Depending on the point group data obtained from the target object 310, the background, and so on, the matching may succeed without any problem even if the point group data is not edited; in that case, the processing of step S104 is unnecessary.
Conversely, point group data editing similar to step S104 of the teaching process may be performed between steps S203 and S204 of the reproduction process. Depending on the point group data obtained from the target object 310, the background, and so on, editing the point group data during reproduction can make the matching with the main data in step S204 easier; in such cases, the point group data may be edited between steps S203 and S204. The same applies between steps S303 and S304 of the re-teaching process.
In the above-described embodiment, teaching is performed in step S106 of the teaching process by actually moving the robot arm 160. This is because teaching while actually moving the robot arm 160 is, in many cases, easier for the user to understand and enables accurate teaching. However, this is merely one example of teaching. For example, teaching may be performed while checking the operation of the robot arm 160 on the display unit 132, without actually moving the robot arm 160.
In the above embodiment, the three-dimensional position of the robot arm 160 is expressed as coordinates in the robot coordinate system and the coordinates of the point group data are expressed as coordinates in the camera coordinate system, but these coordinate systems may be chosen arbitrarily. For example, both may be expressed as coordinates in a world coordinate system referenced to the earth.
In the above-described embodiment, the robot control device 100 controls both the three-dimensional measurement camera 150 and the robot arm 160, but the robot control device 100 does not necessarily need to control both. For example, as shown in fig. 10, a robot control system 1001 may include an image processing device 200 in addition to the robot control device 101, with the two devices able to communicate with each other via a communication unit 140 and a communication unit 240. In this case, the robot control device 101 controls the robot arm 160, and the image processing device 200 controls the three-dimensional measurement camera 150.
In this case, the control unit 210, the storage unit 220, the image input unit 231, the display unit 232, and the operation input unit 233 shown in fig. 10 have functions and configurations similar to those of the control unit 110, the storage unit 120, the image input unit 131, the display unit 132, and the operation input unit 133 of the robot control device 100 (fig. 1) described above, respectively. The communication unit 140 and the communication unit 240 each comprise a device capable of communicating with the other; the communication may be either wireless or wired. For example, if the communication unit 140 and the communication unit 240 each include a wireless LAN (Local Area Network) device, they can perform data communication with each other.
Then, in step S102 of the teaching process and step S202 of the reproduction process, the control unit 110 transmits an imaging command for the three-dimensional measurement camera 150, together with the current three-dimensional position of the robot arm 160, to the control unit 210 via the communication unit 140. On receiving these, the control unit 210 acquires the image data of the target object 310 captured by the three-dimensional measurement camera 150 via the image input unit 231, and acquires the three-dimensional position of the robot arm 160 via the communication unit 240.
Then, in step S205 of the reproduction process, the reproduction unit 114 acquires the conversion data calculated by the conversion data calculation unit 113 of the image processing device 200 via the communication unit 140, and uses it to convert the coordinates of the teaching points included in the teaching data stored in the storage unit 120 into coordinates in the coordinate system at the time of reproduction.
As described above, the control of the robot arm 160 and the control and image processing of the three-dimensional measurement camera 150 may be divided between the robot control device 101 and the image processing device 200. The configuration described above and shown in fig. 10 is only one example of how the functions of the robot control system may be divided; the functions may also be divided differently from fig. 10.
The functions of the control unit 110 of the robot control devices 100 and 101 and of the control unit 210 of the image processing device 200 can be implemented by a computer such as an ordinary PC (Personal Computer). Specifically, the above embodiment assumes that the programs for the processes performed by the control unit 110 of the robot control devices 100 and 101 are stored in advance in the ROM of the storage unit 120, and that the programs for the processes performed by the control unit 210 of the image processing device 200 are stored in advance in the ROM of the storage unit 220. However, the programs may be stored and distributed on a computer-readable storage medium such as a flexible disk, a CD-ROM (Compact Disc Read Only Memory), a DVD (Digital Versatile Disc), an MO (Magneto-Optical disc), a memory card, or a USB (Universal Serial Bus) memory, and a computer capable of realizing the above functions can be configured by reading and installing the programs on the computer.
While preferred embodiments of the present invention have been described above, the present invention can be embodied and modified in various ways without departing from its broad spirit and scope. The above-described embodiments are merely illustrative of the present invention and do not limit its scope. That is, the scope of the present invention is indicated not by the embodiments but by the claims, and various modifications made within the scope of the claims and within the meaning of an invention equivalent thereto are regarded as falling within the scope of the present invention.
The present application is based on Japanese Patent Application No. 2020-89195 filed on May 21, 2020, and the entire specification, claims, and drawings of Japanese Patent Application No. 2020-89195 are incorporated herein by reference.
[Industrial applicability]
The present invention is applicable to a robot control system, a robot control device, a robot control method, and a program capable of removing the influence of a deviation in the position of a target object between teaching and reproduction.
[Description of reference numerals]
100, 101 … robot control device
110, 210 … control unit
111 … reference position acquisition unit
112 … reproduction position acquisition unit
113 … conversion data calculation unit
114 … reproduction unit
115 … teaching unit
120, 220 … storage unit
131, 231 … image input unit
132, 232 … display unit
133, 233 … operation input unit
134 … robot control unit
140, 240 … communication unit
150 … three-dimensional measurement camera
160 … robot arm
161 … robot hand
170 … imaging position
200 … image processing device
300 … object
301, 301', 302', 302'', 303' … teaching points
310 … target object
320 … base
1000, 1001 … robot control system
C, C' … coordinate system
P, Q … coordinates
T … movement amount

Claims (9)

1. A robot control system comprising a three-dimensional measurement camera, a robot arm, and a robot control device, wherein:
the three-dimensional measurement camera is mounted on the robot arm and measures a three-dimensional position of a target object; and
the robot control device moves the robot arm by reproducing teaching data taught in advance, and includes:
a reference position acquisition unit that acquires, from the three-dimensional measurement camera, a reference position that is the three-dimensional position of the target object at the time of teaching,
a reproduction position acquisition unit that acquires, from the three-dimensional measurement camera, a reproduction position that is the three-dimensional position of the target object at the time of reproduction,
a conversion data calculation unit that calculates conversion data for converting the reference position acquired by the reference position acquisition unit into the reproduction position acquired by the reproduction position acquisition unit, and
a reproduction unit that converts the teaching data using the conversion data calculated by the conversion data calculation unit and then reproduces the converted teaching data, thereby moving the robot arm.
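The claims leave the concrete form of the conversion data open. Purely as an illustration, the sketch below assumes the conversion data is a rigid transform (a rotation R and a translation t) estimated from corresponding three-dimensional points measured at teaching time and at reproduction time, using the SVD-based Kabsch method; the function names are hypothetical and are not taken from this patent.

```python
import numpy as np

def compute_conversion_data(reference_pts, reproduction_pts):
    """Estimate a rigid transform (R, t) with Q ~= R @ P + t, mapping
    teaching-time points P onto reproduction-time points Q (Kabsch/SVD)."""
    P = np.asarray(reference_pts, dtype=float)     # N x 3, measured at teaching
    Q = np.asarray(reproduction_pts, dtype=float)  # N x 3, measured at reproduction
    cp, cq = P.mean(axis=0), Q.mean(axis=0)        # centroids
    H = (P - cp).T @ (Q - cq)                      # 3 x 3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))         # avoid an improper (reflected) solution
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cq - R @ cp
    return R, t

def convert_teaching_points(teaching_pts, R, t):
    """Apply the conversion data to taught 3D positions before reproduction."""
    return np.asarray(teaching_pts, dtype=float) @ R.T + t

# Tiny check: the object translated by (5, 0, 0), so a taught point follows it.
ref = np.array([[0., 0., 0.], [1., 0., 0.], [0., 1., 0.], [0., 0., 1.]])
rep = ref + np.array([5., 0., 0.])
R, t = compute_conversion_data(ref, rep)
print(convert_teaching_points([[2., 2., 2.]], R, t))   # ~ [[7. 2. 2.]]
```

Because the same (R, t) is applied to every taught point, the whole taught trajectory follows the measured displacement of the target object, which is the effect the claim attributes to the reproduction unit.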
2. The robot control system according to claim 1, wherein
the robot control device further includes a teaching unit that reproduces, of the teaching data, the portion from the initial teaching data up to correction-required teaching data, that is, teaching data requiring correction, then accepts teaching again and corrects the correction-required teaching data.
3. A robot control device that moves a robot arm, to which a three-dimensional measurement camera for measuring a three-dimensional position of a target object is attached, by reproducing teaching data taught in advance,
the robot control device comprising:
a reference position acquisition unit that acquires, from the three-dimensional measurement camera, a reference position that is the three-dimensional position of the target object at the time of teaching,
a reproduction position acquisition unit that acquires, from the three-dimensional measurement camera, a reproduction position that is the three-dimensional position of the target object at the time of reproduction,
a conversion data calculation unit that calculates conversion data for converting the reference position acquired by the reference position acquisition unit into the reproduction position acquired by the reproduction position acquisition unit, and
a reproduction unit that converts the teaching data using the conversion data calculated by the conversion data calculation unit and then reproduces the converted teaching data, thereby moving the robot arm.
4. The robot control device according to claim 3, further comprising
a teaching unit that reproduces, of the teaching data, the portion from the initial teaching data up to correction-required teaching data, that is, teaching data requiring correction, then accepts teaching again and corrects the correction-required teaching data.
5. The robot control device according to claim 3 or 4, further comprising
a point group data editing unit that edits point group data representing the reference position, wherein
the reference position acquisition unit acquires the reference position represented by the point group data edited by the point group data editing unit.
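Claim 5 does not specify the editing operation itself. One plausible operation, sketched below under that assumption, is cropping the measured point group to an axis-aligned box around the target object so that background points (for example, points on the base 320) do not distort the reference position; the function names and the centroid reduction are illustrative only.

```python
import numpy as np

def crop_point_group(points, lower, upper):
    """Keep only points inside an axis-aligned bounding box -- one simple
    form of point-group editing that removes background measurements."""
    pts = np.asarray(points, dtype=float)    # N x 3 measured point group
    lo = np.asarray(lower, dtype=float)      # box corner, e.g. [x0, y0, z0]
    hi = np.asarray(upper, dtype=float)      # opposite box corner
    mask = np.all((pts >= lo) & (pts <= hi), axis=1)
    return pts[mask]

def reference_position_from_point_group(points):
    """Collapse the edited point group into a single reference position
    (here, simply its centroid)."""
    return np.asarray(points, dtype=float).mean(axis=0)
```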
6. A robot control method for moving a robot arm, to which a three-dimensional measurement camera for measuring a three-dimensional position of a target object is attached, by reproducing teaching data taught in advance,
the robot control method comprising:
a reference position acquisition step of acquiring, from the three-dimensional measurement camera, a reference position that is the three-dimensional position of the target object at the time of teaching,
a reproduction position acquisition step of acquiring, from the three-dimensional measurement camera, a reproduction position that is the three-dimensional position of the target object at the time of reproduction,
a conversion data calculation step of calculating conversion data for converting the reference position acquired in the reference position acquisition step into the reproduction position acquired in the reproduction position acquisition step, and
a reproduction step of converting the teaching data using the conversion data calculated in the conversion data calculation step and then reproducing the converted teaching data, thereby moving the robot arm.
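Tying the four claimed steps together, the hypothetical driver below reuses compute_conversion_data() and convert_teaching_points() from the sketch after claim 1; the camera and arm objects are stand-in stubs, not a real sensor or robot API.

```python
import numpy as np

class StubCamera:
    """Stand-in for the three-dimensional measurement camera."""
    def __init__(self, points):
        self._points = points
    def measure(self):
        return self._points

class StubArm:
    """Stand-in for the robot arm controller."""
    def move_to(self, position):
        print("moving to", np.round(position, 3))

def reproduce_with_correction(camera, arm, teaching_pts, reference_pts):
    reproduction_pts = camera.measure()                      # reproduction position acquisition step
    R, t = compute_conversion_data(reference_pts,
                                   reproduction_pts)         # conversion data calculation step
    corrected = convert_teaching_points(teaching_pts, R, t)  # convert the teaching data
    for p in corrected:                                      # reproduction step
        arm.move_to(p)
```

Keeping measurement, transform estimation, and motion in separate objects mirrors the step decomposition in the claim.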
7. The robot control method according to claim 6, further comprising
a re-teaching step of reproducing, of the teaching data, the portion from the initial teaching data up to correction-required teaching data, that is, teaching data requiring correction, then accepting teaching again and correcting the correction-required teaching data.
8. A program for causing a computer of a robot control device, which moves a robot arm having a three-dimensional measurement camera for measuring a three-dimensional position of a target object attached thereto by reproducing teaching data taught in advance, to execute:
a reference position acquisition step of acquiring, from the three-dimensional measurement camera, a reference position that is the three-dimensional position of the target object at the time of teaching,
a reproduction position acquisition step of acquiring, from the three-dimensional measurement camera, a reproduction position that is the three-dimensional position of the target object at the time of reproduction,
a conversion data calculation step of calculating conversion data for converting the reference position acquired in the reference position acquisition step into the reproduction position acquired in the reproduction position acquisition step, and
a reproduction step of converting the teaching data using the conversion data calculated in the conversion data calculation step and then reproducing the converted teaching data, thereby moving the robot arm.
9. The program according to claim 8, for causing the computer to further execute a re-teaching step of reproducing, of the teaching data, the portion from the initial teaching data up to correction-required teaching data, that is, teaching data requiring correction, then accepting teaching again and correcting the correction-required teaching data.
CN202180059579.7A 2019-10-30 2021-02-16 Robot control system, robot control device, robot control method, and program Pending CN116133801A (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2019197058 2019-10-30
JP2020-089195 2020-05-21
JP2020089195A JP7199101B2 (en) 2019-10-30 2020-05-21 ROBOT CONTROL SYSTEM, ROBOT CONTROL DEVICE, ROBOT CONTROL METHOD AND PROGRAM
PCT/JP2021/005620 WO2021235030A1 (en) 2019-10-30 2021-02-16 Robot control system, robot control device, robot control method and program

Publications (1)

Publication Number Publication Date
CN116133801A 2023-05-16

Family

ID=75712192

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202180059579.7A Pending CN116133801A (en) 2019-10-30 2021-02-16 Robot control system, robot control device, robot control method, and program

Country Status (3)

Country Link
JP (1) JP7199101B2 (en)
CN (1) CN116133801A (en)
WO (1) WO2021235030A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023102647A1 (en) * 2021-12-06 2023-06-15 University Of Manitoba Method for automated 3d part localization and adjustment of robot end-effectors

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4675502A (en) * 1985-12-23 1987-06-23 General Electric Company Real time tracking control for taught path robots
JP4794011B2 (en) * 2008-04-03 2011-10-12 関東自動車工業株式会社 Image processing apparatus and robot control system
JP5239901B2 (en) * 2009-01-27 2013-07-17 株式会社安川電機 Robot system and robot control method
JP2018001393A (en) * 2016-07-08 2018-01-11 キヤノン株式会社 Robot device, robot control method, program and recording medium
KR102096897B1 (en) * 2018-12-31 2020-04-03 (주) 엠엔비젼 The auto teaching system for controlling a robot using a 3D file and teaching method thereof

Also Published As

Publication number Publication date
JP7199101B2 (en) 2023-01-05
WO2021235030A1 (en) 2021-11-25
JP2021070149A (en) 2021-05-06

Similar Documents

Publication Publication Date Title
CN108297096B (en) Calibration device, calibration method, and computer-readable medium
CN105269578B (en) Pointing device and robot system
JP5233601B2 (en) Robot system, robot control apparatus, and robot control method
JP6892286B2 (en) Image processing equipment, image processing methods, and computer programs
US10675759B2 (en) Interference region setting apparatus for mobile robot
KR20140008262A (en) Robot system, robot, robot control device, robot control method, and robot control program
CN104802186A (en) Robot programming apparatus for creating robot program for capturing image of workpiece
CN109648568B (en) Robot control method, system and storage medium
CN109556510B (en) Position detection device and computer-readable storage medium
KR102111655B1 (en) Automatic calibration method and apparatus for robot vision system
CN114080590A (en) Robotic bin picking system and method using advanced scanning techniques
CN116133801A (en) Robot control system, robot control device, robot control method, and program
JP5573275B2 (en) Feature point extraction device, motion teaching device and motion processing device using the same
JP6410411B2 (en) Pattern matching apparatus and pattern matching method
CN110853102A (en) Novel robot vision calibration and guide method, device and computer equipment
JP2014144516A (en) Robot control device, robot system, robot, control method and program
WO2018096669A1 (en) Laser processing device, laser processing method, and laser processing program
JP2022055100A (en) Control method, control device and robot system
TWI813480B (en) Cluster data synthesis apparatus, method, system, and computer-readable medium
JPH08272451A (en) Calibration method in robot with visual sensor
JP2012236266A (en) Robot control system, robot system, and program
JP2014061578A (en) Robot, robot system, robot control device, robot control method, and program
CN112643718B (en) Image processing apparatus, control method therefor, and storage medium storing control program therefor
US20240123611A1 (en) Robot simulation device
WO2023171167A1 (en) Work recognition device, work recognition method, and work recognition program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination