CN111890371B - Method for verifying and updating calibration information for robot control and control system - Google Patents


Info

Publication number
CN111890371B
Authority
CN
China
Prior art keywords
verification
camera
validation
image
robot
Prior art date
Legal status
Active
Application number
CN202010831931.1A
Other languages
Chinese (zh)
Other versions
CN111890371A (en
Inventor
罗素·伊斯兰 (Russell Islam)
叶旭涛 (Xutao Ye)
鲁仙·出杏光 (Rosen Diankov)
Current Assignee
Individual
Original Assignee
Individual
Priority date
Filing date
Publication date
Priority claimed from US16/369,630 external-priority patent/US10399227B1/en
Priority claimed from US16/864,071 external-priority patent/US10906184B2/en
Application filed by Individual filed Critical Individual
Priority claimed from CN202010646706.0A external-priority patent/CN112677146A/en
Publication of CN111890371A publication Critical patent/CN111890371A/en
Application granted granted Critical
Publication of CN111890371B publication Critical patent/CN111890371B/en

Classifications

    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B25 - HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J - MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 - Programme-controlled manipulators
    • B25J9/16 - Programme controls
    • B25J9/1679 - Programme controls characterised by the tasks executed
    • B25J9/1692 - Calibration of manipulator
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B25 - HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J - MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J19/00 - Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J19/02 - Sensing devices
    • B25J19/021 - Optical sensing devices
    • B25J19/023 - Optical sensing devices including video camera means
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B25 - HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J - MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 - Programme-controlled manipulators
    • B25J9/16 - Programme controls
    • B25J9/1694 - Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697 - Vision controlled systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Multimedia (AREA)
  • Manipulator (AREA)

Abstract

The present disclosure relates to a method and a control system for verifying and updating calibration information for robot control. Specifically, a computing system and method for calibration verification are presented. The computing system is configured to perform a first calibration operation and to control a robotic arm to move a verification symbol to a reference position. The robot control system also receives a reference image of the verification symbol from a camera and determines reference image coordinates of the verification symbol. During an idle period, the robot control system again controls the robotic arm to move the verification symbol to the reference position, receives an additional image of the verification symbol, and determines verification image coordinates. The robot control system then determines a deviation parameter value based on the reference image coordinates and the verification image coordinates, determines whether the deviation parameter value exceeds a prescribed threshold, and, if the threshold is exceeded, performs a second calibration operation.

Description

Method for verifying and updating calibration information for robot control and control system
The present application is a divisional application of Chinese patent application No. 202010646706.0, entitled "Method and Control System for Verifying and Updating Calibration Information for Robot Control", filed in July 2020.
Cross reference to related applications
This application is a continuation-in-part of U.S. patent application No. 16/732,832, entitled "Method and Control System for Verifying and Updating Camera Calibration for Robot Control", filed on February 1, 2020; U.S. patent application No. 16/732,832 is in turn a continuation-in-part of U.S. patent application No. 16/525,004, entitled "Method and Control System for Verifying and Updating Camera Calibration for Robot Control", filed on July 29, 2019; and U.S. patent application No. 16/525,004 is in turn a continuation-in-part of U.S. patent application No. 16/369,630, entitled "Method and Control System for Verifying and Updating Camera Calibration for Robot Control", filed on March 29, 2019. The entire contents of these U.S. patent applications are incorporated herein by reference. This application also claims the benefit of U.S. provisional application No. 62/916,798, entitled "Method and Control System for Verifying Calibration for Robot Control", filed on October 18, 2019, the entire contents of which are also incorporated herein by reference.
Technical Field
The invention relates to a method and a control system for verifying and updating calibration information for robot control.
Background
As automation becomes more prevalent, robots are used in more environments, such as warehousing and manufacturing environments. For example, robots may be used to load goods onto and unload goods from pallets in a warehouse, or to pick items from a conveyor belt in a factory. The motion of the robot may be fixed, or may be based on input such as images taken by a camera in the warehouse or factory. In the latter case, a calibration operation may be performed in order to determine properties of the camera and the relationship between the camera and the environment in which the robot is located. The calibration operation may generate calibration information used to control the robot. In some implementations, the calibration operation may involve manual operation, in which a person manually controls the motion of the robot or manually controls a camera to take an image of the robot.
Disclosure of Invention
One aspect of embodiments herein relates to performing calibration verification for robot control, such as verifying camera calibration or other system calibration. Calibration verification may be performed by a robot control system including a communication interface and control circuitry. The communication interface may be configured to communicate with a robot having a base and a robotic arm with a verification symbol disposed thereon, and with a camera having a camera field of view. The control circuitry of the robot control system may be configured to perform calibration verification by: a) performing a first calibration operation (e.g., a first camera calibration) to determine calibration information (e.g., camera calibration information); b) outputting a first motion command to the communication interface, wherein the communication interface is configured to communicate the first motion command to the robot to cause the robotic arm to move the verification symbol, during or after the first calibration operation, to a position within the camera field of view, the position being one of one or more reference positions used for verification of the first calibration operation; c) receiving, via the communication interface, an image of the verification symbol from the camera, the camera being configured to take the image of the verification symbol at the reference position, the image being a reference image for the verification; d) determining reference image coordinates of the verification symbol, the reference image coordinates being the coordinates at which the verification symbol appears in the reference image; and e) outputting a second motion command, based on the calibration information, to the communication interface, wherein the communication interface is configured to communicate the second motion command to the robot to move the robotic arm to perform a robot operation.
In an embodiment, the control circuitry is further configured to perform calibration verification by: f) detecting an idle period during the robot operation; g) outputting a third motion command to the communication interface, wherein the communication interface is configured to communicate the third motion command to the robot to cause the robotic arm to move the verification symbol at least to said reference position during the idle period; h) receiving, via the communication interface, an additional image of the verification symbol from the camera, the camera being configured to take the additional image of the verification symbol at least at said reference position during the idle period, the additional image being a verification image for the verification; i) determining verification image coordinates for the verification, the verification image coordinates being the coordinates at which the verification symbol appears in the verification image; j) determining a deviation parameter value based on an amount of deviation between the reference image coordinates and the verification image coordinates, both of which are associated with the reference position, wherein the deviation parameter value is indicative of a change, since the first calibration operation, in the camera with which the communication interface is configured to communicate, or in a relationship between the camera and the robot with which the communication interface is configured to communicate; k) determining whether the deviation parameter value exceeds a prescribed threshold; and l) in response to a determination that the deviation parameter value exceeds the prescribed threshold, performing a second calibration operation (e.g., a second camera calibration operation) to determine updated calibration information (e.g., updated camera calibration information).
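Steps j) through l) amount to a simple threshold test on image-space deviations. The sketch below is an illustrative Python rendering of that test, not an implementation from the disclosure; the function name, the use of Euclidean pixel distance, and the choice of the maximum over reference positions as the deviation parameter value are assumptions made for the example.

```python
import math

def needs_recalibration(reference_coords, verification_coords, threshold):
    """Decide whether a second calibration operation is warranted.

    reference_coords / verification_coords: lists of (u, v) pixel
    coordinates of the verification symbol, one pair per reference
    position, in matching order.
    threshold: the prescribed deviation threshold, in pixels.

    The deviation parameter value is taken here as the largest
    Euclidean distance between corresponding reference and
    verification coordinates (an average would be another choice).
    """
    deviations = [
        math.dist(ref, ver)
        for ref, ver in zip(reference_coords, verification_coords)
    ]
    deviation_value = max(deviations)
    return deviation_value > threshold, deviation_value
```

For example, reference coordinates of (100, 200) and verification coordinates of (100, 204) at a single reference position give a deviation parameter value of 4.0 pixels, which exceeds a 2-pixel threshold and would trigger the second calibration operation.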
Drawings
The foregoing and other features, objects and advantages of the invention will be apparent from the following description of embodiments of the invention as illustrated in the accompanying drawings. The accompanying drawings, which are incorporated herein and form a part of the specification, further illustrate the principles of the present invention and, together with the description, further enable a person skilled in the relevant art to make and use the invention. The figures are not drawn to scale.
Fig. 1A and 1B depict block diagrams of systems in which verification of calibration information is performed, according to embodiments herein.
Fig. 1C depicts a block diagram of a robot control system configured to perform verification of calibration, according to embodiments herein.
Fig. 1D depicts a block diagram of a camera on which camera calibration is performed, according to embodiments herein.
Fig. 2 depicts a system illustrating control of a robot based on calibration information obtained from a calibration operation, according to embodiments herein.
Fig. 3 depicts a system for performing calibration operations, according to embodiments herein.
Fig. 4A and 4B provide a flow diagram illustrating a method of performing verification of calibration information, according to embodiments herein.
Fig. 5A and 5B illustrate a system in which a validation symbol is provided on a robot for performing validation of calibration information, according to embodiments herein.
Fig. 5C depicts an exemplary validation symbol, according to embodiments herein.
Figs. 6A-6D illustrate examples of reference positions at which respective images of a validation symbol are taken, according to embodiments herein.
Fig. 7A depicts an example of determining reference image coordinates, according to embodiments herein.
Fig. 7B depicts an example of determining verification image coordinates, according to embodiments herein.
Fig. 8 illustrates an exemplary timeline for verification of calibration information, according to embodiments herein.
Fig. 9 provides a flow chart illustrating an exemplary method of performing verification of calibration information, according to embodiments herein.
Fig. 10A-10C illustrate a system in which a set of validation symbols is provided on a robot for performing validation of calibration information, according to embodiments herein.
Fig. 11A-11B illustrate a system in which a set of validation symbols is provided on a robot for performing validation of calibration information, according to embodiments herein.
Fig. 11C illustrates a set of validation symbols having different respective sizes, in accordance with embodiments herein.
Fig. 12A and 12B provide a flow chart illustrating a method of performing verification of calibration information, according to embodiments herein.
Fig. 13A and 13B illustrate a reference image and a verification image, respectively, associated with a first pose of a robotic arm.
Fig. 14A and 14B illustrate a reference image and a verification image, respectively, associated with a second pose of the robotic arm.
Detailed Description
The following detailed description is merely exemplary in nature and is not intended to limit the invention or the application and uses of the invention. Furthermore, there is no intention to be bound by any expressed or implied theory presented in the preceding technical field, background, brief summary or the following detailed description.
Embodiments described herein relate to verifying and/or updating calibration information for controlling a robot, such as a robot used in a warehouse, a manufacturing plant, or some other environment. More specifically, calibration information may be used to facilitate control of a robot operating system, and may be determined by performing a calibration operation, which may be referred to as system calibration. The system calibration may include calibration of a camera (which may be referred to as camera calibration or a camera calibration operation), calibration of a robot (which may be referred to as robot calibration), calibration of another element of the robot operating system, or any combination thereof. System calibration may be performed by, for example, a robot control system (also referred to as a robot controller) to generate calibration information (e.g., camera calibration information) that facilitates the robot control system's ability to control a robot based on images captured by a camera. For example, a robot may be used to pick up packages in a warehouse, where placement of a robotic arm or another component of the robot may be based on images of the packages taken by the camera. In this case, if the calibration information includes camera calibration information, the camera calibration information may be used together with an image of a package to determine, for example, the position and orientation of the package relative to the robotic arm. If the system calibration includes camera calibration, the camera calibration may involve determining respective estimates of intrinsic parameters of the camera (which may also be referred to as internal parameters), and determining an estimate of the relationship between the camera and its external environment. An intrinsic parameter of the camera may have one or more parameter values, such as a matrix, a vector, or a scalar value.
Further, examples of the intrinsic parameters include a projection matrix and distortion parameters. In an example, camera calibration may involve determining a position of the camera relative to a fixed location in the external environment, which may be expressed as a transformation function representing the relationship between the camera and that fixed location. In some cases, camera calibration may be performed with a calibration pattern that has pattern elements disposed at defined locations on the pattern. The camera may take an image of the pattern elements of the calibration pattern (also referred to as a calibration image), and camera calibration may be performed by comparing the image of the pattern elements with the defined positions of the pattern elements. Camera calibration is discussed in more detail in U.S. patent application No. 16/295,940 (docket No. MJ0021US1), "Method and Device for Performing Automatic Camera Calibration for Robot Control", filed on March 7, 2019, the entire contents of which are incorporated herein by reference.
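To make the notion of intrinsic parameters concrete, the sketch below projects a 3D point expressed in the camera's coordinate frame to pixel coordinates using an ideal pinhole model. The numeric values in the usage note are invented for illustration, and lens distortion (also part of the intrinsics) is omitted; this is not code from the disclosure.

```python
def project_point(point_cam, fx, fy, cx, cy):
    """Project a 3D point (x, y, z) in the camera frame to pixel
    coordinates (u, v) using a pinhole projection model with focal
    lengths fx, fy (in pixels) and principal point (cx, cy).
    Distortion parameters are omitted for brevity.
    """
    x, y, z = point_cam
    u = fx * x / z + cx
    v = fy * y / z + cy
    return u, v
```

With fx = fy = 600 pixels and principal point (320, 240), a point one metre in front of the camera at (0.1, -0.05, 1.0) projects to approximately pixel (380, 210); camera calibration estimates these intrinsic values from calibration images.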
As described above, one aspect of the present disclosure relates to verifying that a camera calibration or other calibration operation performed at an earlier point in time is still accurate at a later point in time. A camera calibration performed at an earlier point in time may generate camera calibration information reflecting properties of the camera at that time, such as its intrinsic parameters or the relationship between the camera and its external environment. In some cases, an earlier camera calibration may lose accuracy over time, because the properties of the camera may change. In a first example, the intrinsic parameters of the camera may change over time. Such changes may be caused by, for example, temperature changes that alter the shape of the camera's housing and/or lens. In a second example, the relationship between the camera and its external environment may change over time. For example, the camera may shift in position or orientation relative to, say, the base of the robot or some location in a warehouse. Such changes may be caused by, for example, temperature changes that expand or contract components used to mount the camera, a person or other object bumping the camera, vibrations in the camera's external environment (e.g., a warehouse), force from the camera's own weight (i.e., gravity), or other factors. These changes may render the camera calibration information or other calibration information outdated, such that at a later point in time, positioning a robotic arm or another component of the robot using that information may introduce errors.
In other words, if the properties associated with the camera have changed over time, but the camera calibration information has not been updated to reflect such changes, the robot may operate according to outdated or incorrect camera calibration information, causing undesirable errors in the operation of the robot. To address the possibility that one or more properties of the camera may change, the robotic control system may automatically perform a verification that detects when the camera calibration information from the camera calibration is no longer sufficiently accurate (or, more generally, when the calibration information from the calibration operation is no longer sufficiently accurate). Detecting such a condition may provide an indication of a change in a property of the camera or some other element of the robot operating system. If the verification detects that the calibration information is no longer sufficiently accurate, the robot control system may perform a calibration operation to determine updated calibration information that may reflect one or more up-to-date properties of the camera or another element of the robot operating system. The updated calibration information may be used to control placement of the robotic arm or some other aspect of robot operation. Thus, automatic verification of calibration information and/or updating of calibration information is performed to ensure that the robot operates based on correct information relating to one or more properties associated with the camera or any other element of the robot operating system.
One aspect of embodiments herein relates to verifying calibration information of a camera by comparing a reference image taken by the camera with a verification image taken by the camera. In some cases, the reference image may be an image of an object taken when the object is at a particular location at an earlier point in time, and the verification image may be an image of the object taken at the same location at a later point in time. The verification may determine whether there is excessive deviation between the reference image and the verification image, such as deviation that exceeds a certain threshold. In some implementations, the object may be a verification symbol. More specifically, a robotic arm or another component of the robot may have a verification symbol disposed on it for verifying the calibration information. Both the reference image and the verification image may capture or otherwise include the verification symbol, and the robot control system may compare the two images by comparing where the verification symbol appears in the reference image with where it appears in the verification image. For example, after the robot control system performs a calibration operation that generates calibration information at a particular point in time, the robot control system may control the robotic arm (e.g., via motion commands) to move the verification symbol to a set of predetermined positions within the field of view of the camera (also referred to as the camera field of view), where these positions may be used as a set of reference positions for the verification. The camera may capture respective reference images of the verification symbol at the set of reference positions. In some cases, the reference images may be taken immediately after the calibration operation is performed.
The motion of the robotic arm, or more specifically, the motion commands for moving the robotic arm, may be based on calibration information from the calibration operation just performed, or may be independent of the calibration information. In some cases, the reference image may be taken before the robot starts the robot operation. After taking the reference image, the robot may be considered ready to start a robot operation for performing a job, and the robot control system may e.g. control the positioning of the robot arm based on the images subsequently taken by the camera.
As described above, the reference image and the verification image taken subsequently may be compared. In an embodiment, the verification image may be taken during one or more idle periods detected by the robot control system. More specifically, as robotic operations begin, the robot may begin performing robotic tasks (e.g., by interacting with packages or other objects). The robot control system may detect one or more idle periods of the robot while the robot is performing a robotic operation. In some cases, the idle period may be a period of time during which the robot is not performing robot work during robot operation. In some cases, the robot control system may schedule robot operations based on detecting or otherwise anticipating objects with which the robot needs to interact, and may detect idle periods based on detecting or otherwise anticipating the absence of objects with which the robot needs to interact.
During the idle period(s), the robot control system may control the robotic arm or other components of the robot to move (e.g., via motion commands) to the reference positions and capture (e.g., via camera commands) respective verification images at the respective reference positions. More specifically, if the verification symbol is provided on the robot, the robot control system may control the robotic arm to move the verification symbol to a reference position in order to capture a verification image. The robot control system may then determine the extent to which each verification image deviates from the corresponding reference image at each reference position. In some cases, the deviation between a verification image and the corresponding reference image may be expressed as a deviation parameter. If the value of the deviation parameter (also referred to as the deviation parameter value) exceeds a defined threshold for the deviation parameter (also referred to as the prescribed deviation threshold), the robot control system may perform an additional calibration operation (e.g., an additional camera calibration) to determine updated calibration information for the camera (e.g., updated camera calibration information). When the value of the deviation parameter exceeds the prescribed deviation threshold, this condition may indicate that use of the previously generated calibration information could result in an undesirable amount of error in robot operation. Thus, in some cases, robot operation may be paused or stopped while the additional calibration operation is performed (the pause may be considered another idle period). After the additional calibration operation is completed, a new set of reference images may be taken, and robot operation may continue with the updated calibration information.
During a subsequent idle period, a new set of verification images may be taken and the robot control system may perform verification of additional calibration operations by comparing the new set of reference images to the new set of verification images.
As described above, if the value of the deviation parameter exceeds the prescribed deviation threshold, the robot control system may perform an additional calibration operation. If the value of the deviation parameter does not exceed the threshold, robot operation may continue after the idle period without the robot control system performing an additional calibration operation. In this case, during a subsequent idle period, the camera may take a new set of verification images at the respective reference positions. When the new set of verification images is taken, the robot control system may again perform verification of the calibration operation by determining the extent to which the new verification images deviate from the corresponding reference images for the respective reference positions.
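The idle-period verification flow described above can be sketched as a small control loop. All callable names below (move_to, capture, detect) are hypothetical stand-ins for the motion commands, camera commands, and symbol-detection step; they are not an API defined by this disclosure, and the use of the maximum deviation is one illustrative choice of deviation parameter.

```python
import math

def run_verification_cycle(reference_data, move_to, capture, detect, threshold):
    """Run one verification cycle during a detected idle period.

    reference_data: list of (reference_position, reference_coords)
        pairs recorded after the first calibration operation.
    move_to(position): places the verification symbol at a reference
        position (stands in for a motion command to the robot).
    capture(): returns an image from the camera (a camera command).
    detect(image): returns the (u, v) image coordinates at which the
        verification symbol appears in the image.
    Returns True if an additional calibration operation should be
    performed.
    """
    worst = 0.0
    for position, ref_coords in reference_data:
        move_to(position)            # move symbol to the reference position
        image = capture()            # take the verification image
        ver_coords = detect(image)   # verification image coordinates
        worst = max(worst, math.dist(ref_coords, ver_coords))
    return worst > threshold         # exceeds the prescribed threshold?
```

If the deviation parameter value stays at or below the threshold, robot operation simply resumes after the idle period, and the same check is repeated in a later idle period.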
As described above, the robotic arm may have a verification symbol, such as a ring pattern, disposed on it, which may be captured or otherwise included in the reference image and the verification image. In an embodiment, the robot control system may determine a deviation between a reference image and the corresponding verification image based on the respective locations at which the verification symbol appears in the two images. For example, the robot control system may determine reference image coordinates for each reference position. The reference image coordinates of a specific reference position are the coordinates at which the verification symbol appears in the reference image taken when the verification symbol was placed at that reference position. More specifically, the reference image coordinates may be associated with a specific reference position and may refer to the image coordinates at which the verification symbol appears in the reference image captured by the camera when the verification symbol is placed at that reference position. Here, image coordinates refer to coordinates in an image, such as pixel coordinates. When the robot control system subsequently places the verification symbol at a particular reference position at a later point in time and obtains a corresponding verification image, the robot control system may determine the verification image coordinates. The verification image coordinates are also associated with a reference position and refer to the image coordinates (e.g., pixel coordinates) at which the verification symbol appears in the verification image captured by the camera when the verification symbol is placed at that reference position. The robot control system may compare the reference image coordinates associated with a particular reference position to the verification image coordinates associated with the same reference position.
Such a comparison may be made for each reference position at which the verification image and the reference image were taken.
In an example, the reference image coordinates at which the verification symbol appears may be the coordinates of the center of the verification symbol in the reference image (also referred to as the center coordinates of the verification symbol in the reference image). Similarly, the verification image coordinates at which the verification symbol appears may be the coordinates of the center of the verification symbol in the verification image (also referred to as the center coordinates of the verification symbol in the verification image). For each reference position at which the robotic arm and/or the verification symbol is located while the corresponding verification image is captured, the robot control system may determine a deviation between the reference image coordinates associated with that reference position and the verification image coordinates associated with the same reference position. If the robotic arm and/or verification symbol is placed at multiple reference positions, the robot control system may determine respective amounts of deviation between the respective reference image coordinates and the respective verification image coordinates for the multiple reference positions. The robot control system may further determine the value of the deviation parameter based on the respective amounts of deviation between the reference image coordinates and the verification image coordinates of each reference position.
In an example, the verification symbol may include a plurality of shapes that are concentric with one another, such that the respective centers of the plurality of shapes are at the same or substantially the same location. For example, the verification symbol may be a ring pattern comprising two or more concentric circles. In some cases, if the reference image coordinates of the verification symbol are the center coordinates of the verification symbol in the reference image, the robot control system may determine those center coordinates based on the respective center coordinates of the plurality of shapes in the reference image, where the center coordinates of a particular shape are the coordinates of the center of that shape. If the verification symbol is a ring pattern, the center coordinates of the ring pattern in the reference image may be determined as an average of the center coordinates of a first circle forming the ring pattern and the center coordinates of a second circle forming the ring pattern. Similarly, the center coordinates of the verification symbol in the verification image may be determined based on the respective center coordinates of the plurality of shapes forming the verification symbol in the verification image. In some cases, forming the verification symbol from multiple shapes may improve the accuracy of the verification. For example, determining the center coordinates of the verification symbol from the respective center coordinates of multiple shapes in the image may improve the robustness of the verification against image noise. More specifically, if the image of the verification symbol contains image noise, that noise may reduce the accuracy with which the robot control system detects the center coordinates of any particular shape of the verification symbol.
However, if the center coordinates of the shape are averaged with the center coordinates of another shape to determine the center coordinates of the verification symbol, the averaged center coordinates may reduce the influence of image noise. As a result, the accuracy of determining the center coordinates of the verification symbol can be improved.
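The averaging described above can be illustrated as follows; the sample coordinates and the opposing noise perturbations are hypothetical values chosen only to show the effect.

```python
def symbol_center(shape_centers):
    """Average the detected centers of the concentric shapes that
    form the verification symbol, which damps per-shape detection
    noise."""
    n = len(shape_centers)
    return (sum(c[0] for c in shape_centers) / n,
            sum(c[1] for c in shape_centers) / n)

# Detected centers of the two circles of a ring pattern; each is
# perturbed by image noise in opposite directions from the true
# center (240.0, 180.0).
inner = (240.5, 180.5)   # noisy detection of the inner circle
outer = (239.5, 179.5)   # noisy detection of the outer circle
print(symbol_center([inner, outer]))  # (240.0, 180.0)
```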
In an example, the validation symbol may include a plurality of regions having respective different colors, wherein the respective areas of the plurality of regions have an exact prescribed ratio. For example, the verification symbol may include a first region having a first color (e.g., black) and a second region having a second color (e.g., white), where the ratio of the area of the first region to the area of the second region is prescribed or otherwise known. The exact ratio may facilitate identification of the verification symbol in an image, especially if the image also captures or otherwise includes other features, such as the dots of a calibration pattern. For example, a robotic arm that moves the verification symbol may also have a calibration pattern disposed on it. The robot control system may use the ratio to distinguish the verification symbol from the dots of the calibration pattern. More specifically, because the ratio of the areas of the plurality of regions of the verification symbol is defined as an exact ratio, the robot control system can recognize the verification symbol in the image based on that prescribed ratio. During identification of the validation symbol appearing in the image, the robot control system may distinguish the validation symbol from the calibration pattern or other features based on the prescribed ratio. In some cases, the verification symbol may be identified in the image as a portion of the image that includes a plurality of regions having respective different colors, with the prescribed ratio between the respective areas of those regions.
If the robotic control system or other system or device determines that a particular portion of the image does not include multiple regions having respective different colors, or that the respective areas of the multiple regions have a ratio that is different than a prescribed ratio, the robotic control system may determine that the portion of the image is not a verification symbol.
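A minimal sketch of the ratio test described above, assuming region areas are measured as pixel counts and assuming a hypothetical 5% tolerance (neither detail is fixed by this disclosure):

```python
def matches_prescribed_ratio(area_first, area_second,
                             prescribed_ratio, tolerance=0.05):
    """Return True if the ratio of the two region areas is within
    `tolerance` of the prescribed ratio, so that the candidate
    portion of the image can be treated as the verification symbol."""
    if area_second == 0:
        return False
    return abs(area_first / area_second - prescribed_ratio) <= tolerance

# Candidate region: 800 black pixels and 400 white pixels, with a
# prescribed black-to-white area ratio of 2.0 (hypothetical values).
print(matches_prescribed_ratio(800, 400, 2.0))   # True
# A dot of the calibration pattern would not exhibit this ratio.
print(matches_prescribed_ratio(800, 150, 2.0))   # False
```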
In an example, the robot control system may perform the verification based on the temperature around the robot. For example, the robot control system may adjust the prescribed deviation threshold based on temperature (i.e., define a new value for the deviation threshold). Temperature may affect various parts of the camera and/or robot, as some materials may be temperature-sensitive and may expand or contract as the temperature changes. Changes in temperature may therefore cause changes in the intrinsic parameters of the camera and/or changes in the relationship between the camera and its external environment. In an embodiment, the deviation threshold may be set to a first value when the temperature is within a prescribed range, and to a second value lower than the first value when the temperature is outside the prescribed range. For example, when the temperature is within a prescribed normal operating temperature range (e.g., within 10 degrees of room temperature), the deviation threshold may have the first value. When the temperature is outside the normal operating temperature range, the deviation threshold may have the second, lower value. The second value may be lower than the first value so that, when the temperature is outside the normal operating range, additional calibration operations are easier to trigger, because temperatures outside the normal operating temperature range are more likely to cause changes in the camera or in the camera's relationship to its external environment, and thus more likely to cause errors when operating the robot using previously generated camera calibration information or any other calibration information.
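The temperature-dependent threshold can be sketched as follows; the numeric values (the room temperature, the 10-degree band, and the two threshold values in pixels) are illustrative assumptions only.

```python
def deviation_threshold(temperature_c, room_temperature_c=20.0,
                        normal_band_c=10.0,
                        first_value=5.0, second_value=2.0):
    """Return the deviation threshold in pixels: the first (higher)
    value inside the normal operating temperature range, and the
    second (lower) value outside it, so that out-of-range
    temperatures trigger additional calibration more easily."""
    if abs(temperature_c - room_temperature_c) <= normal_band_c:
        return first_value
    return second_value

print(deviation_threshold(22.0))  # 5.0 (within 10 degrees of room temp)
print(deviation_threshold(45.0))  # 2.0 (outside the normal range)
```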
In an embodiment, verification of the calibration information may rely on only a single reference location. Alternatively, verification of the calibration information may rely on multiple reference locations. A reference position may be any position in the field of view of the camera, or may be a specific prescribed position. For example, the reference positions may be defined as positions on the surface of at least one imaginary sphere that is concave relative to the camera. At each such reference position, the robotic arm may be controlled to position the verification symbol so that it lies tangent to the surface of the at least one imaginary sphere while facing the camera. Such positioning may better allow the verification symbol to be photographed or otherwise captured head-on by the camera (with the verification symbol directly facing the camera), so that the image of the verification symbol resembles a top view of the symbol rather than a perspective view. For example, if the verification symbol is a ring pattern, having the ring pattern lie tangent to the surface of the imaginary sphere may cause the resulting image of the pattern to be circular rather than elliptical. The resulting image may exhibit little or no perspective distortion (relative to the case where the ring pattern appears elliptical in the image). The absence of perspective distortion may facilitate accurate determination of the center coordinates of the ring pattern. In some cases, the reference positions may be divided among a plurality of imaginary spheres that are all concave relative to the camera. The plurality of imaginary spheres may share a common center and may differ in size, such that each imaginary sphere has a spherical surface at a different distance from the camera. In some cases, the camera may be the common center of all the imaginary spheres.
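The placement of reference positions on imaginary spheres centered on the camera can be sketched as follows. The spherical-coordinate sampling and the specific angles and radii are assumptions for illustration; the disclosure only requires that each position lie on a sphere surface with the symbol tangent to that surface and facing the camera.

```python
import math

def reference_positions(radius, camera_pos=(0.0, 0.0, 0.0),
                        polar_angles=(0.3, 0.6), azimuths=(0.0, 1.57)):
    """Sample reference positions on the surface of an imaginary
    sphere centered on the camera. At each position, the symbol's
    normal is the unit vector pointing back at the camera, which
    keeps the symbol tangent to the sphere while facing the camera."""
    positions = []
    for theta in polar_angles:
        for phi in azimuths:
            x = camera_pos[0] + radius * math.sin(theta) * math.cos(phi)
            y = camera_pos[1] + radius * math.sin(theta) * math.sin(phi)
            z = camera_pos[2] + radius * math.cos(theta)
            normal = ((camera_pos[0] - x) / radius,
                      (camera_pos[1] - y) / radius,
                      (camera_pos[2] - z) / radius)
            positions.append(((x, y, z), normal))
    return positions

# Two concentric spheres give reference positions at two different
# distances from the camera (radii in meters, hypothetical).
near = reference_positions(radius=0.5)
far = reference_positions(radius=1.0)
print(len(near) + len(far))  # 8
```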
Fig. 1A illustrates a block diagram of a robot operating system 100 (also referred to as system 100) for performing automatic camera calibration and automatic verification of camera calibration. The robot operating system 100 includes a robot 150, a robot control system 110 (also referred to as a robot controller), and a camera 170. Although some of the examples below discuss performing automatic camera calibration and verifying camera calibration information determined from the automatic camera calibration, the examples are more generally applicable to any type of automatic calibration operation, and to verifying any type of calibration information determined from an automatic calibration operation. In embodiments, the system 100 may be located in a warehouse, manufacturing facility, or other premises. The robot control system 110 may be configured to perform camera calibration, discussed in more detail below, to determine camera calibration information that is later used to control the robot 150 for robot operations, such as picking up packages in a warehouse. The robot control system 110 may also be configured to perform verification of the camera calibration, discussed in more detail below, to verify that the camera calibration information is still sufficiently accurate. In some cases, the robot control system 110 is configured to perform the camera calibration and to control the robot 150 to perform robot operations based on the camera calibration information. In some cases, the robot control system 110 may constitute a single device (e.g., a single console or a single computer) in communication with the robot 150 and the camera 170. In some cases, the robot control system 110 may include multiple devices.
In some cases, the robot control system 110 may be dedicated to performing camera calibration and/or verification of camera calibration, and the latest camera calibration information may be communicated to other control systems (also referred to as other controllers, not shown) that then control the robot 150 for robot operation based on the latest camera calibration information. The robot 150 may be positioned based on the images taken by the camera 170 and the camera calibration information. More specifically, in an embodiment, the robot control system 110 may be configured to generate motion commands based on the images and based on the camera calibration information, and communicate the motion commands to the robot 150 to control the motion of its robotic arm. In some cases, the robot control system 110 is configured to perform verification of camera calibration during idle periods in robot operation. In some cases, the robot control system 110 is configured to verify while performing robot operations with the robot 150.
In an embodiment, the robot control system 110 may be configured to communicate with the robot 150 and the camera 170 through wired or wireless communication. For example, the robot control system 110 may be configured to communicate with the robot 150 and/or the camera 170 via an RS-232 interface, a Universal Serial Bus (USB) interface, an Ethernet interface, a Bluetooth® interface, an IEEE 802.11 interface, or any combination thereof. In an embodiment, the robot control system 110 may be configured to communicate with the robot 150 and/or the camera 170 via a local computer bus, such as a Peripheral Component Interconnect (PCI) bus.
In an embodiment, the robot control system 110 may be separate from the robot 150 and may communicate with the robot via the wireless or wired connection described above. For example, the robot control system 110 may be a standalone computer configured to communicate with the robot 150 and the camera 170 through wired or wireless connections. In an embodiment, the robot control system 110 may be an integrated component of the robot 150 and may communicate with other components of the robot 150 via the local computer bus described above. In some cases, the robot control system 110 may be a dedicated control system (also referred to as a dedicated controller) that controls only the robot 150. In other cases, the robot control system 110 may be configured to control multiple robots, including the robot 150. In an embodiment, the robot control system 110, the robot 150, and the camera 170 are located at the same site (e.g., a warehouse). In an embodiment, the robot control system 110 may be remote from the robot 150 and the camera 170 and may be configured to communicate with the robot 150 and the camera 170 over a network connection (e.g., a Local Area Network (LAN) connection).
In an embodiment, the robot control system 110 may be configured to retrieve or otherwise receive an image of the calibration pattern 160 and/or the verification symbol 165 placed on the robot 150 (e.g., on a robotic arm of the robot) from the camera 170. In some cases, the robotic control system 110 may be configured to control the camera 170 to take such images. For example, the robot control system 110 may be configured to generate camera commands that cause the camera 170 to take images of the field of view of the camera 170 (also referred to as the camera field of view) and communicate the camera commands to the camera 170 over a wired or wireless connection. The same command may also cause the camera 170 to transfer the image to the robotic control system 110, or more generally to a storage device accessible to the robotic control system 110. Alternatively, the robot control system 110 may generate another camera command that causes the camera 170 to transmit the image captured by the camera 170 to the robot control system 110 upon receipt of the camera command. In an embodiment, the camera 170 may automatically capture images in its camera field of view periodically or in response to a prescribed trigger condition, without requiring a camera command from the robot control system 110. In such embodiments, the camera 170 may also be configured to automatically transfer images to the robotic control system 110, or more generally, to a storage device accessible to the robotic control system 110, without camera commands from the robotic control system 110.
In an embodiment, the robot control system 110 may be configured to control the motion of the robot 150 through motion commands generated by the robot control system 110 and communicated to the robot 150 via a wired or wireless connection. The robot 150 may have a calibration pattern 160 and/or a validation symbol 165 disposed on it. For example, fig. 1B depicts a robot operating system 100A in which a validation symbol 165 is disposed on the robot 150, without the calibration pattern 160 of fig. 1A. In one example, the validation symbol 165 may be part of the robot 150 and may be permanently disposed on the robot 150; for instance, it may be permanently painted on the robot 150 or may be part of a sticker or board permanently attached to the robot 150. In another example, the validation symbol 165 may be a separate component that is attachable to and detachable from the robot 150.
In an embodiment, the only image in the system 100 used to control the robot 150 may be the image captured by the camera 170. In another embodiment, the system 100 may include multiple cameras and may control the robot 150 using images from the multiple cameras.
FIG. 1B further illustrates an embodiment in which the robotic control system 110 is in communication with a user interface device 180. The user interface device 180 may be configured to interface with an operator of the robot 150, such as a staff member in a warehouse where the robot 150 is located. The user interface device 180 may include, for example, a tablet computer or desktop computer that provides a user interface that displays information related to the operation of the robot 150. As described above, the robot control system 110 may be configured to detect when a deviation parameter value exceeds a prescribed deviation threshold. In an embodiment, the user interface device 180 may provide an alarm or other alert to notify the operator that the deviation parameter value exceeds the prescribed deviation threshold.
Fig. 1C depicts a block diagram of the robot control system 110. As illustrated in the block diagram, the robot control system 110 includes control circuitry 111, a communication interface 113, and a non-transitory computer-readable medium 115 (e.g., memory). In embodiments, the control circuitry 111 may include one or more processors, Programmable Logic Circuits (PLCs) or Programmable Logic Arrays (PLAs), Field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), or any other control circuitry.
In embodiments, the communication interface 113 may include one or more components configured to communicate with the camera 170 of fig. 1A or 1B and the robot 150 of fig. 1A or 1B. For example, the communication interface 113 may include communication circuitry configured to communicate via wired or wireless protocols, such as an RS-232 port controller, a USB controller, an Ethernet controller, a Bluetooth® controller, a PCI bus controller, any other communication circuit, or a combination thereof. In an embodiment, the control circuit 111 may be configured to generate a motion command (e.g., a motor motion command) and output the motion command to the communication interface 113. In this embodiment, the communication interface 113 may be configured to communicate the motion command to the robot 150 to control the movement of the robotic arm or other components of the robot 150. In an embodiment, the control circuit 111 may be configured to generate a camera command (e.g., a take-image command) and output the camera command to the communication interface 113. In this embodiment, the communication interface 113 may be configured to communicate the camera command to the camera 170 to control the camera 170 to take or otherwise capture images of objects within the camera's field of view. In an embodiment, the communication interface 113 may be configured to receive images or other data from the camera 170, and the control circuit 111 may be configured to receive those images from the communication interface 113.
In an embodiment, the non-transitory computer readable medium 115 may include computer memory, such as Dynamic Random Access Memory (DRAM), solid state integrated memory, and/or a Hard Disk Drive (HDD). In some cases, camera calibration and verification of camera calibration may be implemented through computer-executable instructions (e.g., computer code) stored on the non-transitory computer readable medium 115. In such cases, the control circuitry 111 may include one or more processors configured to execute the computer-executable instructions to perform the verification of camera calibration (e.g., the steps illustrated in figs. 4A, 4B, and 9).
Fig. 1D depicts a block diagram of a camera 170, the camera 170 including one or more lenses 171, an image sensor 173, and a communication interface 175. The communication interface 175 may be configured to communicate with the robot control system 110 of fig. 1A, 1B, or 1C, and may be similar to the communication interface 113 of fig. 1C of the robot control system 110. In an embodiment, the one or more lenses 171 may focus light from outside the camera 170 onto the image sensor 173. In an embodiment, image sensor 173 may include a pixel array configured to represent an image by individual pixel intensity values. The image sensor 173 may include a Charge Coupled Device (CCD) sensor, a Complementary Metal Oxide Semiconductor (CMOS) sensor, a Quantum Image Sensor (QIS), or any other image sensor.
As described above, camera calibration may be performed to facilitate control of the robot based on an image taken by the camera. For example, fig. 2 depicts a robot operating system 200 (also referred to as system 200) in which images are used to control a robot 250 to perform robotic operations, such as operations to pick up an object 292 in a warehouse. More specifically, the system 200 may be an embodiment of the system 100 of fig. 1A, including a camera 270, a robot 250, and a robot control system 110. Camera 270 may be an embodiment of camera 170 of fig. 1A, 1B, or 1D, and robot 250 may be an embodiment of robot 150 of fig. 1A or 1B. The camera 270 may be configured to capture images of objects 292 (e.g., shipping packages) placed on a conveyor belt 293 in the warehouse and the robot control system 110 may be configured to control the robot 250 to pick up the objects 292. When one or more objects are on the conveyor belt 293, the robot control system 110 may be configured to schedule the movement of the robot 250 for picking up the objects. In some cases, the robot control system 110 may be configured to detect idle periods of robot operation by detecting when there are no objects on the conveyor belt 293, or when there are no objects on the conveyor belt 293 within reach of the robot 250.
In the embodiment of fig. 2, the robot 250 may have a base 252 and a robotic arm that is movable relative to the base 252. More specifically, the robotic arm may include a plurality of links 254A-254E, and a manipulator 255 attached to the link 254E. The plurality of links 254A-254E may rotate relative to each other and/or may be kinematic links that are linearly movable relative to each other. Since fig. 2 relates to a robot 250 for picking up objects, the manipulator 255 may include grippers 255A and 255B for gripping the object 292. In an embodiment, the robot control system 110 may be configured to communicate motion commands to rotate one or more of the links 254A-254E. A motion command may be a low-level command, such as a motor motion command, or a high-level command. If the motion commands from the robot control system 110 are high-level commands, the robot 250 may be configured to convert the high-level commands to low-level commands.
In an embodiment, the camera calibration information determined from the camera calibration describes a relationship between the camera 270 and the robot 250, or more specifically, a relationship between the camera 270 and a world point 294 that is stationary with respect to the base 252 of the robot 250. World point 294 may represent the world or other environment in which robot 250 is located and may be any imaginary point that is stationary relative to base 252. In other words, the camera calibration information may include information describing the relationship between the camera 270 and the world point 294. In an embodiment, the relationship may refer to the position of the camera 270 relative to the world point 294, and the orientation of the camera 270 relative to a reference orientation of the robot 250. The relationship between the camera 270 and the world point 294 described above may be referred to as a camera-to-world relationship and may be used to represent the relationship between the camera 270 and the robot 250. In some cases, the camera-to-world relationship may be used to determine a relationship between the camera 270 and the object 292 (also referred to as a camera-to-object relationship), and a relationship between the object 292 and the world point 294 (also referred to as an object-to-world relationship). The camera to object relationship and the object to world relationship can be used to control the robot 250 to pick up the object 292.
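The composition of relationships described above (camera-to-world combined with camera-to-object to obtain object-to-world) can be sketched with 4x4 homogeneous transforms; the specific matrices below are hypothetical values, not calibration results from this disclosure.

```python
def matmul4(a, b):
    """Multiply two 4x4 homogeneous transforms given as nested lists."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

# Hypothetical camera-to-world transform: the camera is 2 m above
# the world point, with its axes aligned to the world axes
# (identity rotation).
T_world_camera = [[1, 0, 0, 0],
                  [0, 1, 0, 0],
                  [0, 0, 1, 2],
                  [0, 0, 0, 1]]

# Hypothetical camera-to-object transform estimated from an image:
# the object is 0.5 m below the camera along the camera's z-axis.
T_camera_object = [[1, 0, 0, 0],
                   [0, 1, 0, 0],
                   [0, 0, 1, -0.5],
                   [0, 0, 0, 1]]

# Object-to-world relationship obtained by composing the two, which
# could then be used to plan a motion of the robot toward the object.
T_world_object = matmul4(T_world_camera, T_camera_object)
print(T_world_object[2][3])  # 1.5
```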
In an embodiment, the camera calibration information may describe intrinsic parameters of the camera 270, where the intrinsic parameters may be any parameters whose values are independent of the position and orientation of the camera 270. The intrinsic parameters may characterize properties of the camera 270, such as the focal length of the camera 270, the size of the image sensor of the camera 270, or the effect of lens distortion introduced by the camera 270.
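As an illustration of how intrinsic parameters are used, a minimal pinhole projection (ignoring lens distortion) maps a 3-D point in the camera frame to pixel coordinates; the focal lengths and principal point below are hypothetical values, not parameters of any camera described in this disclosure.

```python
def project(point_camera, fx, fy, cx, cy):
    """Project a 3-D point in the camera frame to pixel coordinates
    using a pinhole model; fx and fy are focal lengths expressed in
    pixels, and (cx, cy) is the principal point. Lens distortion,
    which the calibration information may also describe, is omitted."""
    x, y, z = point_camera
    return (fx * x / z + cx, fy * y / z + cy)

# Hypothetical intrinsic values for illustration only.
print(project((0.125, 0.25, 1.0), fx=600.0, fy=600.0, cx=320.0, cy=240.0))
# (395.0, 390.0)
```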
Fig. 3 depicts an example of a detailed structure of a robot 350, as part of a robot operating system 300 that includes the robot control system 110 in communication with a camera 370 and the robot 350. Camera 370 may be an embodiment of camera 170/270 of fig. 1A, 1B, 1D, or 2, and robot 350 may be an embodiment of robot 150/250 of fig. 1A, 1B, or 2. The camera 370 is capable of capturing images within the camera field of view 330. The robot 350 may include a base 352, and a robotic arm that is movable relative to the base 352. The robotic arm includes one or more links, such as links 354A-354E, and a robot hand 355. In an embodiment, the links 354A-354E are rotatably attached to one another. For example, link 354A is rotatably attached to the robot base 352 by joint 356A. The remaining links 354B-354E are rotatably attached to each other by joints 356B-356E. In an embodiment, the base 352 may be used to mount the robot 350 to, for example, a mounting rack or mounting surface (e.g., the floor of a warehouse). In an embodiment, the robot 350 may include a plurality of motors configured to move the robotic arm by rotating the links 354A-354E. For example, one of the motors may be configured to rotate the first link 354A relative to the joint 356A and the base 352, as indicated by the dashed arrow in fig. 3. Similarly, other motors of the plurality of motors may be configured to rotate the links 354B-354E. The plurality of motors may be controlled by the robot control system 110. Fig. 3 also depicts the robot hand 355 fixedly disposed on the fifth link 354E. The robot hand 355 may have a calibration pattern 320 on it, so that the robot control system 110 may capture an image of the calibration pattern 320 through the camera 370 and perform camera calibration based on the captured image of the calibration pattern 320.
For example, when the camera 370 is used to capture an image of the calibration pattern 320 (also referred to as a calibration image), the robot control system 110 may move the robotic arm such that the calibration pattern 320 is within the camera field of view 330 and visible to the camera 370. After camera calibration is performed, the robot hand 355 may be removed and replaced with another robot hand, such as one having a verification symbol disposed on it, as described in more detail below.
As described above, according to embodiments, calibration verification may involve comparing reference image coordinates where a verification symbol appears in a reference image with verification image coordinates where the verification symbol appears in a verification image. The comparison may determine deviations between the verification image coordinates and the reference image coordinates, which may be used to determine whether to perform additional calibration operations. The verification image may be taken during an idle period of the robot operation. Fig. 4A and 4B depict a flow diagram illustrating a method 400 of verifying camera calibration, in accordance with an embodiment. Although some of the embodiments below discuss verifying camera calibration information determined from camera calibration, the method 400 may be used to more generally verify calibration information determined from any calibration operation performed for a robotic operating system. In an embodiment, the method 400 may be performed by the control circuitry 111 of the robot control system 110. As described above, the robot control system 110 may include the communication interface 113 of fig. 1C, the communication interface 113 being configured to communicate with the robot 150 of fig. 1A or 1B, and with the camera 170 of fig. 1A, 1B, or 1D. In an embodiment, the robot may have a base (e.g., base 252 of fig. 2 or base 352 of fig. 3), and a robotic arm (e.g., robotic arm of fig. 2 or 3) having a verification symbol disposed thereon, wherein the robotic arm is movable relative to the base.
An exemplary environment in which the method 400 is performed is depicted in figs. 5A and 5B, which depict robot operating systems 500/500A, both of which include the robot control system 110 in communication with a camera 570 and a robot 550. Camera 570 may be an embodiment of camera 170/270/370 of fig. 1, 2, or 3, and robot 550 may be an embodiment of robot 150/250/350 of fig. 1A, 1B, 2, or 3. The robot 550 may include a base 552 and a robotic arm (labeled 553 in figs. 5A and 5B) movable relative to the base 552. The robotic arm includes one or more links, such as links 554A-554E, which may also be examples of arm portions of the robotic arm 553 that are movably attached to one another. In an embodiment, the links 554A-554E are rotatably attached to one another. For example, link 554A is rotatably attached to the robot base 552, and the remaining links 554B-554E are rotatably attached to one another by a plurality of joints. In an embodiment, the base 552 may be used to mount the robot 550 to, for example, a mounting rack or surface (e.g., the floor of a warehouse). The robot 550 may operate in a similar manner to the robot 350 of fig. 3. For example, the robot 550 may include a plurality of motors configured to move the robotic arm by rotating the links 554A-554E relative to one another. The robotic arm may also include a manipulator attached to the link 554E. For example, fig. 5A depicts a first robot hand 555, a second robot hand 557, and a third robot hand 559, each of which may be attached to and detached from the fifth link 554E. The robot hand 555/557/559 may include, for example, a gripper or suction device configured to pick up objects (e.g., 582A, 582B, 582C) from the conveyor 573. A robot hand 555/557/559 may be attached to and detached from the fifth link 554E manually or automatically.
In one example, the fifth link 554E may be attached to the first robot hand 555, as shown in figs. 5A and 5B. The robot control system 110 may control the robot 550 to detach the fifth link 554E from the first robot hand 555 and then attach the fifth link 554E to the second robot hand 557. In another example, the fifth link 554E may be permanently attached to a robot hand (e.g., robot hand 559).
In an embodiment, a validation symbol 530 may be provided on the robot 550. In some cases, the validation symbol 530 may be permanently disposed on the robot 550. In some cases, the validation symbol 530 may be disposed on the robotic arm of the robot 550, such as on one of the links 554A-554E, or on a robot hand. For example, fig. 5A depicts the validation symbol 530 disposed on the first robot hand 555 and the third robot hand 559, while fig. 5B depicts the validation symbol 530 disposed on the link 554E. The validation symbol 530 may be painted directly on the robot 550 or may be attached to the robot 550, such as by a sticker or board. In the example depicted in fig. 5A, the second robot hand 557 or the third robot hand 559 may be used to perform camera calibration, because both have respective calibration patterns 520/527 disposed on them, while the first robot hand 555 or the third robot hand 559 may be used to perform verification of the camera calibration, because both have respective verification symbols 530 disposed on them.
Returning to fig. 4A, in an embodiment, the method 400 begins at step 401, where the control circuitry 111 performs a first camera calibration to determine camera calibration information associated with a camera (e.g., the camera 170/270/370/570 of fig. 1, 2, 3, or 5) at step 401. More specifically, the camera calibration information may include camera calibration values for the camera. In this embodiment, the control circuit 111 may perform a first camera calibration based on an image of the calibration pattern (also referred to as a calibration image).
For example, to perform the first camera calibration, the fifth link 554E of the robot 550 of fig. 5A may be attached to the second robot hand 557 having the calibration pattern 520, or to the third robot hand 559 having the calibration pattern 527; during this step, the first robot hand 555 may be detached from the fifth link 554E. Fig. 3 depicts a similar environment in which a first camera calibration may be performed, with the calibration pattern 320 being used to perform the camera calibration. A first camera calibration may be performed before robot operation begins. For example, robot operation may begin with a robot task such as the first robot hand 555 interacting with a first object 582A on the conveyor. During the first camera calibration, the robot 550 may be equipped with the second robot hand 557. The robot control system 110 may control the robotic arm of the robot 550 through motion commands to move the calibration pattern 520 to various positions within the camera field of view 510 of the camera 570, and may capture corresponding images of the calibration pattern 520 at those positions. The robot control system 110 may perform the first camera calibration based on the captured images of the calibration pattern 520 to determine camera calibration information for the camera 570. In an example, the camera calibration information may include information describing a relationship between the camera 570 and the robot 550. In an example, the camera calibration information may describe intrinsic parameters of the camera 570. Camera calibration is discussed in more detail in U.S. application No. 16/295,940 (docket No. MJ0021US1), "METHOD AND DEVICE FOR PERFORMING AUTOMATIC CAMERA CALIBRATION FOR ROBOT CONTROL", filed on March 7, 2019, the entire contents of which are incorporated herein by reference.
Returning to fig. 4A, the method 400 may further include a step 403, in which the control circuit 111 controls the robotic arm to move the verification symbol (e.g., 530 of fig. 5) to a position within the camera field of view (e.g., 510) of the camera (e.g., 570) during or after the first camera calibration, by outputting a first motion command to the communication interface 113 of the robot control system 110. The communication interface 113 may be configured to communicate the motion command to the robot to cause the robotic arm to move the verification symbol (e.g., 530) to the position within the camera field of view (e.g., 510) during or after the first camera calibration. The motion command may also cause the robotic arm to orient the verification symbol toward the camera (e.g., 570), or more generally to make it visible to the camera. The position may be used as one of the one or more reference positions for verification of the first camera calibration. For example, as the verification process acquires images of the verification symbol over time, the control circuit 111 may control the robotic arm to consistently position the verification symbol (e.g., 530) at the same one or more locations, such that the one or more locations may be used as one or more reference positions. Further, as described below with respect to the steps beginning at step 405, an image of the verification symbol captured later may be compared against an earlier image of the verification symbol at the same reference position. The later image may be used as a verification image, and the earlier image against which it is compared may be used as a reference image.
At step 405, the control circuit 111 may receive (e.g., retrieve) an image of the verification symbol (e.g., 530) from the camera (e.g., 170/270/370/570) through the communication interface 113, where the image is a reference image for verification. The image may be captured by the camera while the verification symbol is or was at the reference position. In an embodiment, the communication interface 113 may first receive the reference image from the camera, and the control circuit 111 may then receive the reference image from the communication interface 113. In an embodiment, step 405 is performed without the control circuit 111 generating a camera command to the camera. In another embodiment, step 405 may involve the control circuit 111 generating a camera command and communicating the camera command to the camera through the communication interface 113. The camera command may control the camera to capture an image of the verification symbol at the reference position.
Figs. 5A-6B illustrate various aspects of steps 403 and 405. In the embodiment of fig. 5A, after the first camera calibration is performed using, for example, the second robot hand 557, the second robot hand 557 may be replaced with the third robot hand 559 on which the verification symbol 530 is disposed. In this example, the robot control system 110 controls a robotic arm of the robot 550 (e.g., via one or more motion commands) to move the verification symbol 530 to one or more reference positions within the camera field of view 510 of the camera 570. The one or more reference positions may include any position within the camera field of view 510, or may be a set of one or more particular positions, such as positions disposed on the surface of an imaginary sphere, as described in more detail below. In another example, in the embodiment of fig. 5B, during or after the first camera calibration, the robot control system 110 may control the robotic arm to move the verification symbol 530 to one or more reference positions within the camera field of view 510. In this example, the one or more reference positions may include any position at which the verification symbol 530 (along with the calibration pattern 520) was captured during the first camera calibration, or may be a set of one or more particular positions to which the verification symbol 530 is moved after the first camera calibration is performed. In this step, the robot control system 110 may control the motion of the robotic arm of the robot 550 with guidance from the camera 570, based on the camera calibration information obtained from the first camera calibration, or may do so without such guidance. In embodiments, the reference positions may be defined positions that are saved in a local or remote storage device and can be retrieved. They may be stored in the form of coordinates (e.g., Cartesian coordinates), as motor commands for rotating the links 554A-554E, or in some other manner.
In an embodiment, the one or more reference positions to which the robotic arm moves the verification symbol (e.g., 530) may include a plurality of reference positions, wherein each of the plurality of reference positions is disposed on the surface of an imaginary sphere that is concave relative to the camera. In such embodiments, the control circuit 111 may be further configured to control the robotic arm to move the verification symbol so that it is tangent to the surface of the imaginary sphere at each of the plurality of reference positions. For example, as illustrated in figs. 6A, 6B, 6C, and 6D, the robot control system 110 may control the robotic arm of the robot 550 to move the verification symbol 530 to the reference positions 610A-610I and control the camera 570 to capture respective reference images at the respective reference positions 610A-610I. The reference positions 610A-610I in figs. 6A and 6B may be distributed among a plurality of imaginary spheres within the camera field of view 510. The reference positions 610A and 610B may be disposed on a first spherical surface 621 of the first imaginary sphere 620, where the first spherical surface 621 is within the camera field of view 510. The reference positions 610C, 610D, and 610E may be disposed on a second spherical surface 631 of the second imaginary sphere 630, where the second spherical surface 631 is within the camera field of view 510. The reference positions 610F, 610G, 610H, and 610I may be disposed on a third spherical surface 641 of the third imaginary sphere 640, where the third spherical surface 641 is within the camera field of view 510. As illustrated in figs. 6A and 6B, the first, second, and third spherical surfaces 621, 631, and 641 are each concave relative to the camera 570. Although the examples in figs. 6A and 6B show reference positions based on three imaginary spheres, the number of different spheres on which the reference positions can be disposed may be greater or less than three.
In an embodiment, the camera 570 may be at the center of each imaginary sphere 620, 630, 640.
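The sphere-based placement above can be sketched numerically. This is a minimal illustration, not the patent's method: it assumes the camera sits at the origin with its optical axis along +Z, and models the camera field of view 510 as a cone with an assumed half angle; all parameter values are hypothetical.

```python
import numpy as np

def reference_positions(radii, n_per_sphere, fov_half_angle_deg=30.0, seed=1):
    """Sample reference positions on imaginary spheres centered at the camera
    (origin), keeping only points inside the camera's viewing cone about +Z."""
    rng = np.random.default_rng(seed)
    cos_limit = np.cos(np.radians(fov_half_angle_deg))
    positions = []
    for r in radii:
        count = 0
        while count < n_per_sphere:
            v = rng.normal(size=3)
            v /= np.linalg.norm(v)      # random direction on the unit sphere
            if v[2] >= cos_limit:       # keep directions inside the viewing cone
                positions.append(r * v)
                count += 1
    return np.array(positions)

# Three imaginary spheres, three reference positions each (cf. 610A-610I).
positions = reference_positions(radii=[0.5, 0.75, 1.0], n_per_sphere=3)
```

Each returned point lies exactly on one of the spherical surfaces and inside the assumed cone, mirroring how the reference positions 610A-610I are distributed among spheres 620, 630, and 640.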
In an embodiment, as illustrated in figs. 6A-6D, when the verification symbol 530 is moved to a reference position, the robot control system 110 may control the robotic arm 553 of the robot 550 (e.g., via a motion command) to position the verification symbol 530 tangent to the spherical surface on which the reference position is disposed. For example, figs. 6A and 6B illustrate the verification symbol 530 tangent to the second spherical surface 631 at the reference position 610D, while figs. 6C and 6D illustrate the verification symbol 530 tangent to the second spherical surface 631 at the reference position 610C. More particularly, the verification symbol 530 may be disposed on a plane (e.g., on a sticker); the plane of the verification symbol 530 may be tangent to the second spherical surface 631 at the reference position 610D in figs. 6A and 6B, and tangent to the second spherical surface 631 at the reference position 610C in figs. 6C and 6D. In an embodiment, the robotic arm 553 may be in a first pose in figs. 6A and 6B, and in a second pose in figs. 6C and 6D. The pose of the robotic arm 553 may refer to, for example, the shape, or more generally the geometry, formed by the links or other arm portions of the robotic arm 553. For example, the pose of the robotic arm 553 may refer to a particular permutation of angles or distances by which each link of the robotic arm 553 rotates or translates (e.g., extends or retracts) relative to a previous link of the robotic arm 553. As an example, the first pose depicted in fig. 6A may correspond to a first combination of angles formed between consecutive links in the series of links of the robotic arm 553, while the second pose depicted in fig. 6C may correspond to a second combination of angles between consecutive links in the series of links of the robotic arm 553. In such an example, the reference position 610D of the verification symbol 530 may be associated with the first pose of the robotic arm 553, as shown in fig. 6A, while the reference position 610C of the verification symbol 530 may be associated with the second pose of the robotic arm 553, as shown in fig. 6C.
In an embodiment, the control circuit 111 is configured to control the robotic arm to move the verification symbol (e.g., 530) to directly face the camera when the verification symbol is moved to the reference position. For example, as illustrated in fig. 6A, the robot control system 110 may control the robotic arm 553 of the robot 550 to move the verification symbol 530 to directly face the camera 570 when the verification symbol 530 is moved to the reference position 610D. In this example, the robot control system 110 may control the robot hand 555 to rotate such that the verification symbol 530 directly faces the camera 570. In some cases, the verification symbol may directly face the camera 570 by being tangent to a spherical surface within the camera field of view 510. When the verification symbol 530 directly faces the camera 570, the camera 570 can photograph the verification symbol 530 head-on, such that in the resulting image of the verification symbol 530 there is no perspective effect, or the perspective effect is reduced.
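The "directly facing" condition above can be expressed as an orientation whose plane normal points from the reference position back toward the camera, which also makes the symbol plane tangent to the imaginary sphere through that position. The sketch below is a hypothetical construction (camera at the origin; the rotation matrix's third column taken as the symbol's plane normal), not the patent's control law.

```python
import numpy as np

def facing_orientation(position):
    """Build a rotation matrix whose third column (taken here as the symbol's
    plane normal) points from `position` back toward the camera at the origin,
    so the symbol plane is tangent to the imaginary sphere through `position`."""
    z = -position / np.linalg.norm(position)   # normal aimed at the camera
    # Pick any helper axis not parallel to z to complete an orthonormal frame.
    helper = np.array([0.0, 1.0, 0.0]) if abs(z[1]) < 0.9 else np.array([1.0, 0.0, 0.0])
    x = np.cross(helper, z)
    x /= np.linalg.norm(x)
    y = np.cross(z, x)
    return np.column_stack((x, y, z))

R = facing_orientation(np.array([0.2, -0.1, 0.8]))
```

The resulting matrix is a proper rotation, and its third column is exactly the unit vector from the reference position toward the camera, which is what eliminates the perspective effect in the captured image.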
In an embodiment, the verification symbol (e.g., 530) includes a first region having a first color and a second region having a second color, where the ratio of the area of the first region to the area of the second region is prescribed and saved on a non-transitory computer-readable medium (e.g., a storage device) of the robot control system 110. In such embodiments, the control circuit 111 may be configured to identify the verification symbol in the reference image or the verification image based on the prescribed ratio. For example, as illustrated in fig. 5C, the verification symbol 530 may include a first region 531 that is annular and has a first color (e.g., a black region), and a second region 533 (e.g., a white region) that has a second color and is surrounded by the first region 531. The ratio of the area of the black first region 531 to the area of the white second region 533 in the verification symbol 530 may be an exact prescribed value. By analyzing the colors within a captured image, the robot control system 110 may identify the portion of the image corresponding to the verification symbol 530 by determining whether a portion of the image has an annular region surrounding a circular region, and whether the ratio of the area of the annular region to the area of the circular region matches the prescribed ratio. This may allow the robot control system 110 to distinguish the verification symbol 530 from other features captured in the image. For example, as illustrated in fig. 5A, the robot 550 may be fitted with the third robot hand 559, which has a combination of the calibration pattern 527 and the verification symbol 530. In this example, the reference image may display both the verification symbol 530 and the calibration pattern 527. In this example, the calibration pattern 527 may not have any annular pattern, or may include an annular pattern having a ratio different from the prescribed ratio described above.
The control circuit 111 may determine whether a portion of the reference image is the verification symbol 530 or the calibration pattern 527 by determining whether that portion of the reference image includes a first image region having the first color and a second image region having the second color, and whether the ratio of the area of the first image region to the area of the second image region is equal to the prescribed ratio.
In some cases, the robot control system 110 may determine whether a specific portion of the captured image includes a first region having the first color and a second region having the second color, and whether the ratio of the area of the first region to the area of the second region is within a prescribed range. In one example, if the prescribed ratio is 1.5, the robot control system 110 may determine that a particular region corresponds to the verification symbol 530 if the ratio in that region is within the range of 1.4-1.6. The two colors of the first and second regions are not limited to black and white, and may be any two different colors that the robot control system 110 can distinguish.
In one aspect, the verification symbol (e.g., 530) can include a first shape and a second shape that are concentric with one another, wherein the respective centers of the first shape and the second shape are substantially co-located. For example, the verification symbol may be shaped as a circular ring comprising a first circle and a second circle that are concentric with each other. More specifically, as illustrated in fig. 5C, the verification symbol 530 may include a first shape 535 (e.g., an outer circle) and a second shape 537 (e.g., an inner circle). The first shape 535 and the second shape 537 may be concentric with each other such that the center of the first shape 535 and the center of the second shape 537 are substantially co-located. For example, if the center of the first shape 535 is at coordinates (x1, y1) while the center of the second shape 537 is at coordinates (x2, y2), then the coordinates (x1, y1) and the coordinates (x2, y2) are substantially the same.
Returning to fig. 4A, the method 400 may also include a step 407, in which the control circuit 111 determines reference image coordinates of the verification symbol, the reference image coordinates being the coordinates at which the verification symbol (e.g., 530) appears in the reference image. For example, as illustrated in fig. 6A, an image of the verification symbol 530 may be captured at the reference position 610D and may be used as a reference image. The verification symbol 530 appears within the reference image at specific coordinates, which may be referred to as the reference image coordinates.
In an embodiment, as described above, the verification symbol (e.g., 530) may include a first shape and a second shape that are concentric with one another, wherein the respective centers of the first shape and the second shape are substantially co-located. In such embodiments, at step 407, the control circuit 111 may be configured to determine the reference image coordinates of such a verification symbol by determining first coordinates of the center of the first shape in the reference image, determining second coordinates of the center of the second shape in the reference image, and determining the reference image coordinates as the average of the first coordinates and the second coordinates in the reference image.
For example, fig. 7A shows a reference image 710 captured at a reference position N (where N is an integer) among the reference positions. The reference image 710 includes a verification portion 730, which is the image portion of the reference image 710 that displays the verification symbol 530 of fig. 5A, 5B, or 5C. The robot control system 110 of fig. 1A or 1B may be configured to identify, from the verification portion 730, a first shape 735 (e.g., an outer circle) that is the same as or substantially the same as the first shape 535 of the verification symbol 530 of fig. 5C. The robot control system 110 may be configured to further identify, from the verification portion 730, a second shape 737 (e.g., an inner circle) that is the same as or substantially the same as the second shape 537 of the verification symbol 530 in fig. 5C. Subsequently, for the reference position N, the robot control system 110 may determine first coordinates (u_first_N, v_first_N) of the center of the first shape 735 displayed in the reference image 710 (i.e., the center coordinates of the first shape 735), and second coordinates (u_second_N, v_second_N) of the center of the second shape 737 displayed in the reference image 710 (i.e., the center coordinates of the second shape 737). To determine the reference image coordinates (u_ref_N, v_ref_N) of the reference image 710, where the reference image 710 corresponds to the verification symbol 530 at the reference position N, the robot control system 110 may calculate the average of the first coordinates and the second coordinates in the reference image 710 as follows:

(u_ref_N, v_ref_N) = ( (u_first_N + u_second_N) / 2, (v_first_N + v_second_N) / 2 )
In an embodiment, the reference image coordinates of the verification symbol may be its center coordinates, and determining the center coordinates of the verification symbol 530 based on the respective center coordinates of the first and second shapes 735 and 737 may improve the robustness of the verification process to image noise. For example, image noise may introduce error into the determination of the center coordinates of the first shape 735, but not into the determination of the center coordinates of the second shape 737. In some cases, the second shape 737 may have, or may actually share, the same center coordinates as the first shape 735; however, image noise may cause the center coordinates of the second shape 737 to appear in the image to be different from the center coordinates of the first shape 735. In such a case, simply using the center coordinates of one shape alone as the center coordinates of the verification symbol 530 may result in an undesirable amount of error. By using the average of the center coordinates of the first shape 735 and the center coordinates of the second shape 737 as the center coordinates of the verification symbol 530, the amount of error can be reduced.
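The averaging step above can be sketched in a few lines. This is an illustrative toy (values assumed): detection noise is applied to one circle's center only, and averaging the two centers halves its effect on the symbol coordinates.

```python
import numpy as np

def symbol_coordinates(center_first, center_second):
    """Average the detected centers of the outer and inner circles
    to obtain the verification symbol's image coordinates."""
    return (np.asarray(center_first, float) + np.asarray(center_second, float)) / 2.0

true_center = np.array([320.0, 240.0])                 # assumed true symbol center
noisy_outer = true_center + np.array([2.0, -1.0])      # noise on the outer circle only
averaged = symbol_coordinates(noisy_outer, true_center)
```

The averaged coordinates end up closer to the true center than the noisy single-circle estimate, which is the robustness benefit the text describes.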
In an embodiment, the one or more reference positions may be a plurality of reference positions respectively corresponding to a plurality of reference image coordinates. In this embodiment, the reference image coordinate may be one of a plurality of reference image coordinates. For example, as illustrated in FIGS. 6A and 6B, there may be multiple reference locations, such as reference locations 610A-610I to which the validation symbol 530 is moved or otherwise placed. For each of the reference positions 610A-610I of the validation symbol 530, the robot control system 110 may retrieve or otherwise receive a respective reference image of the validation symbol 530 at that position taken by the camera 570 and may determine respective reference image coordinates that indicate where the validation symbol 530 appears in the respective reference image.
Returning to fig. 4A, the method 400 may further include a step 409, in which the control circuit 111 controls the motion of the robotic arm for robot operation based on the camera calibration information. In an embodiment, this step may involve the control circuit 111 generating a second motion command based on the camera calibration information and then outputting the second motion command to the communication interface 113. The communication interface 113 may in turn communicate the second motion command to the robot to control the motion of the robotic arm. For example, as illustrated in fig. 5A, after the first camera calibration, the robot control system 110 controls the robot 550 to perform robot operation involving robot tasks, such as picking up the objects 582A, 582B, and 582C. The motion of the robot 550 may be based on the camera calibration information obtained from the first camera calibration, as well as on images of the objects 582A, 582B, and 582C captured by the camera 570.
At step 411, the control circuit 111 detects an idle period during robot operation. In one aspect, an idle period may be a period of time during which the robot is not performing a robot task during robot operation. In some cases, if the robot operation involves picking up objects from the conveyor belt 573, the idle period may be based on the absence of objects on the conveyor belt 573. More specifically, the conveyor belt 573 is accessible by the robotic arm 553, and the control circuit 111 is configured to detect the idle period by detecting that there are no objects on the conveyor belt 573, or that the distance between the robot 550 and the nearest object on the conveyor belt 573 exceeds a prescribed distance threshold. In some cases, the control circuit 111 may receive a signal indicating an upcoming idle period, where the signal may be received from another device or component monitoring the robot operation. For example, as illustrated in fig. 5A, the robot control system 110 may detect an idle period between a robot task involving picking up the second object 582B and a robot task involving picking up the third object 582C, because there is a large distance between the second object 582B and the third object 582C. After the robot 550 picks up the second object 582B, the robot 550 may have an idle period during which it performs no robot task, because the robot 550 cannot yet reach the third object 582C. In one example, the robot control system 110 may detect an idle period when the robot 550 cannot reach any object on the conveyor belt 573, and/or when the robot control system 110 determines that the distance between the robot 550 and the nearest object (e.g., the third object 582C) upstream on the conveyor belt 573 exceeds a certain threshold.
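The idle-period test can be sketched as a simple predicate. This is a hypothetical illustration: the threshold value is assumed, and the distances would in practice come from object detection on the conveyor.

```python
# Assumed distance threshold for illustration (not specified in the text).
PRESCRIBED_DISTANCE_THRESHOLD = 1.2  # meters

def is_idle(object_distances, threshold=PRESCRIBED_DISTANCE_THRESHOLD):
    """Idle when there are no objects on the conveyor, or when the nearest
    object is farther from the robot than the prescribed threshold.

    object_distances: distances from the robot to each object on the belt."""
    if not object_distances:
        return True  # no objects on the conveyor at all
    return min(object_distances) > threshold
```

With the assumed 1.2 m threshold, an empty belt or a nearest object 1.5 m away counts as idle, while an object within reach (e.g., 0.5 m) does not.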
Returning to figs. 4A and 4B, the method 400 may further include a step 451, in which the control circuit 111 controls the robotic arm 553, during the idle period, to move the verification symbol 530 to at least the reference position used in step 403 (for capturing the reference image). In an embodiment, step 451 may involve the control circuit 111 generating a third motion command and outputting the third motion command to the communication interface 113. The communication interface 113 may be configured to then communicate the third motion command to the robot to cause the robotic arm 553 to move based on the motion command. In some cases, the third motion command may be based on a saved set of motor commands corresponding to the reference position. In some cases, the third motion command may be generated based on the camera calibration information from step 401. In other cases, the third motion command in step 451 does not depend on the camera calibration information from step 401.
At step 453, the control circuit 111 retrieves or otherwise receives an additional image of the verification symbol (e.g., 530) from the camera (e.g., 570) during the idle period, wherein the additional image is a verification image for verification and is an image of the verification symbol located at least at the reference position during the idle period. That is, the verification image for the reference position is captured while the verification symbol (e.g., 530) is or was at the reference position. In an embodiment, step 453 involves the control circuit 111 generating a camera command that controls the camera (e.g., 570) to capture the verification image. The control circuit 111 may output the camera command to the communication interface 113, and the communication interface 113 may communicate the camera command to the camera (e.g., 570). In an embodiment, step 451 may involve controlling the robotic arm to move the verification symbol to a plurality of reference positions, and step 453 may involve receiving a plurality of corresponding verification images captured by the camera. For example, as illustrated in figs. 6A and 6B, during an idle period, the robot control system 110 may control the robotic arm 553 of the robot 550 to move the verification symbol 530 to one of the reference positions 610A-610I and capture an image of the verification symbol 530 at that position as a verification image. If the idle period has not ended, or more specifically, if sufficient time remains in the idle period, the robot control system 110 may control the robotic arm 553 of the robot 550 to move the verification symbol 530 to another one of the reference positions 610A-610I and capture an image of the verification symbol 530 at that other position as another verification image. If the idle period is over, the robot control system 110 may stop capturing verification images. Thus, during each idle period, the robot control system 110 may control the robotic arm of the robot 550 to move the verification symbol 530 to one or more of the reference positions 610A-610I and to capture a verification image at each of those positions.
Returning to fig. 4B, the method 400 may further include a step 455, in which the control circuit 111 determines verification image coordinates for verification, the verification image coordinates being the coordinates at which the verification symbol appears in the verification image. If the verification symbol (e.g., 530) is moved to a plurality of reference positions (e.g., 610A-610I), the camera (e.g., 570) may capture a plurality of verification images respectively corresponding to the plurality of reference positions, and the control circuit 111 may determine a plurality of verification image coordinates respectively corresponding to the plurality of verification images and to the plurality of reference positions. The plurality of verification images may all be captured by the camera (e.g., 570) during a single idle period (e.g., if the single idle period is long enough to allow the robotic arm to move the verification symbol (e.g., 530) to all of the reference positions 610A-610I), or during several different idle periods (e.g., if no single idle period is long enough to allow the robotic arm to move the verification symbol 530 to all of the reference positions 610A-610I).
In an embodiment, the verification image coordinates may be determined in a manner similar to the reference image coordinates. For example, the verification image coordinates may be the center coordinates of the verification symbol (e.g., 530), which may be determined as the average of the center coordinates of the first shape of the verification symbol and the center coordinates of the second shape of the verification symbol in the verification image (e.g., 760). For example, fig. 7B shows a verification image 760 captured at a reference position N among the reference positions. The verification image 760 displays a verification portion 780, which is the portion of the verification image 760 that displays the verification symbol 530. The robot control system 110 may identify, from the verification portion 780, a first shape 785 that is the same as or substantially the same as the first shape 535 of the verification symbol 530 of fig. 5C. The robot control system 110 may further identify, from the verification portion 780, a second shape 787 that is the same as or substantially the same as the second shape 537 of the verification symbol 530. Further, the robot control system 110 may be configured to determine the center coordinates (u_first_N, v_first_N) of the first shape 785 displayed in the verification portion 780 of the verification image 760, and the center coordinates (u_second_N, v_second_N) of the second shape 787 displayed in the verification portion 780 of the verification image 760. The robot control system 110 may further determine the verification image coordinates (u_verify_N, v_verify_N) of the verification image 760 as the average of the center coordinates of the first shape 785 and the center coordinates of the second shape 787 in the verification image 760:

(u_verify_N, v_verify_N) = ( (u_first_N + u_second_N) / 2, (v_first_N + v_second_N) / 2 )
Returning to fig. 4B, the method 400 may further include a step 457, in which the control circuit 111 determines a deviation parameter value based on an amount of deviation between the reference image coordinates of step 407 and the verification image coordinates of step 455, where both the reference image coordinates and the verification image coordinates are associated with the reference position N. In one example, the deviation between the reference image coordinates and the verification image coordinates may be the distance between the reference image coordinates and the verification image coordinates. For example, if the reference image coordinates at the reference position N are expressed as (u_ref_N, v_ref_N) and the verification image coordinates at the reference position N are expressed as (u_verify_N, v_verify_N), then the deviation (e.g., distance) at the reference position N can be expressed as:

sqrt( (u_ref_N − u_verify_N)^2 + (v_ref_N − v_verify_N)^2 )
As described above, in aspects in which the one or more reference positions are a plurality of reference positions, the control circuit 111 may be configured to determine a plurality of verification image coordinates corresponding respectively to the plurality of reference positions, wherein the verification image coordinates are one of the plurality of verification image coordinates. In this regard, the deviation parameter value is based on respective amounts of deviation between the plurality of reference image coordinates and the plurality of verification image coordinates for the plurality of reference positions, wherein each respective amount of deviation is the amount of deviation between (a) the reference image coordinates corresponding to a respective one of the plurality of reference positions and (b) the verification image coordinates corresponding to the same reference position. The plurality of verification image coordinates may be the respective coordinates at which the verification symbol appears in the plurality of verification images, the verification image being one of the plurality of verification images. The control circuit 111 may be configured to control the camera to capture all of the plurality of verification images during one idle period, and/or may be configured to control the camera to capture the plurality of verification images during different idle periods.
For example, when there are a plurality of reference positions, as shown in figs. 6A and 6B, the robot control system 110 may determine a plurality of respective reference image coordinates corresponding to the plurality of reference positions and a plurality of respective verification image coordinates corresponding to the plurality of reference positions, and determine respective amounts of deviation between the plurality of reference image coordinates and the plurality of verification image coordinates. The deviation parameter value may be based on these respective amounts of deviation. For example, the deviation parameter may be the average of the respective amounts of deviation, as shown below:

deviation parameter = (1/M) * Σ_{N=1}^{M} sqrt( (u_ref_N − u_verify_N)^2 + (v_ref_N − v_verify_N)^2 )

In the above expression, N refers to the Nth reference position, and M refers to the total number of reference positions.
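The deviation computation of step 457 and the threshold test of step 459 can be sketched together. This is an illustrative reading of the formulas above, with assumed coordinate and threshold values.

```python
import numpy as np

def deviation_parameter(ref_coords, verify_coords):
    """Mean Euclidean distance between paired reference and verification
    image coordinates over the M reference positions."""
    ref = np.asarray(ref_coords, dtype=float)
    ver = np.asarray(verify_coords, dtype=float)
    return float(np.mean(np.linalg.norm(ref - ver, axis=1)))

def needs_recalibration(ref_coords, verify_coords, threshold):
    """Step 459/461 test: a second camera calibration is warranted when the
    deviation parameter value exceeds the prescribed threshold."""
    return deviation_parameter(ref_coords, verify_coords) > threshold

# Assumed example: two reference positions, the first drifted by 5 pixels.
ref = [[100.0, 100.0], [200.0, 150.0]]
ver = [[103.0, 104.0], [200.0, 150.0]]
```

Here the per-position deviations are 5 px and 0 px, so the deviation parameter is 2.5 px; whether that triggers a recalibration depends entirely on the prescribed threshold.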
Returning to fig. 4B, the method 400 may further include a step 459, in which the control circuit 111 determines whether the deviation parameter value exceeds a prescribed threshold (which may also be referred to as a prescribed deviation threshold). Further, at step 461, in response to a determination that the deviation parameter value exceeds the prescribed threshold, the control circuit 111 may perform a second camera calibration to determine updated camera calibration information for the camera. For example, a deviation parameter value that exceeds the prescribed threshold may indicate that the camera calibration information for the camera is out of date, and/or may cause an undesirable amount of error in robot operation. Thus, if the deviation parameter value exceeds the prescribed threshold, a second camera calibration of the camera may be performed to update the camera calibration information of the camera (e.g., 570). The second camera calibration may use the same techniques as the first camera calibration, but may be based on more recently captured images from the camera. In an example, if step 459 indicates that the deviation parameter value exceeds the prescribed threshold, robot operation may be stopped or paused, and the second camera calibration may then commence, beginning with the capture of images for the second camera calibration. After the second camera calibration is complete and the camera calibration information for the camera has been updated, the robot control system 110 may resume robot operation using the updated camera calibration information.
In an embodiment, the control circuitry 111 may be configured to control the robot to continue robot operation after an idle period without additional camera calibration (e.g., by outputting a fourth motion command to the robot via the communication interface) in response to a determination that the deviation parameter value does not exceed the prescribed threshold. This situation may indicate that the camera calibration information from step 401 is still sufficiently accurate and that robot operation may continue without an undesirable amount of error.
In an embodiment, the control circuitry 111 may be configured to determine a temperature of an environment in which the robot is located and to adjust at least one of the prescribed deviation threshold (also referred to as a defined deviation threshold) or the camera calibration information of the camera based on the measured temperature. For example, the control circuitry 111 may determine the temperature of the environment by measuring the temperature, or by receiving temperature data from another device or component. In such embodiments, the control circuitry 111 may be configured to adjust the prescribed threshold based on the measured temperature by setting the prescribed threshold to a first value when the measured temperature is outside a prescribed range, and to a second value lower than the first value when the measured temperature is within the prescribed range. For example, too high or too low a temperature may cause changes in the camera. More specifically, temperature variations may affect intrinsic parameters of the camera. For example, components in the camera may expand when the temperature increases and contract when the temperature decreases, which may affect the camera's intrinsic parameters. It may therefore be advantageous to adjust the defined deviation threshold based on the temperature or the amount of temperature change. For example, when the temperature is within a normal operating temperature range (e.g., a prescribed range based on room temperature), the prescribed deviation threshold may be lower because the temperature does not adversely affect the camera. On the other hand, when the temperature is outside the normal operating temperature range, the deviation threshold may be higher because low or high temperatures can adversely affect the camera.
In an alternative example, the deviation threshold may be specified as a lower value when the temperature is outside of the normal operating temperature range, so that additional camera calibrations are triggered more frequently. In this example, the deviation threshold may be specified to a higher value when the temperature is within the normal operating temperature range, so that additional camera calibrations are triggered less frequently.
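The temperature-dependent threshold selection described above can be sketched as follows. The sketch follows the first scheme, in which the threshold is higher outside the normal operating range; the specific temperature range and threshold values are illustrative assumptions:

```python
def adjusted_deviation_threshold(measured_temp_c,
                                 normal_range=(15.0, 30.0),
                                 in_range_value=1.5,
                                 out_of_range_value=3.0):
    """Return the prescribed deviation threshold (in pixels) as a function of
    the measured environment temperature: a lower value inside the normal
    operating range, a higher value outside it. All numbers are illustrative."""
    low, high = normal_range
    if low <= measured_temp_c <= high:
        return in_range_value   # second, lower value: camera behaves nominally
    return out_of_range_value   # first, higher value: temperature affects intrinsics
```

The alternative example above would simply swap which of the two values is larger, so that out-of-range temperatures trigger additional camera calibrations more frequently.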
FIG. 8 depicts an exemplary timeline 800 for performing camera calibration and verification of camera calibration. Although the following examples discuss verifying camera calibration, they may be more generally applicable to verifying any type of calibration operation performed for a robotic operating system. Before robot operation begins, the robot control system 110 of fig. 1A or 1B performs a first camera calibration during a calibration period 811 to determine camera calibration information for a camera (e.g., camera 570 of fig. 5A or 5B). After the first camera calibration is completed, the robot control system 110 takes reference images of the validation symbols (e.g., validation symbol 530) at respective reference positions during a reference acquisition period 813 and determines reference image coordinates where the validation symbols appear in the respective reference images (e.g., reference image 710 of fig. 7A). When the reference image coordinates are determined, robot operation may begin after the end of the reference acquisition period 813.
After robot operation begins, the robot control system 110 controls a robot (e.g., robot 550 of fig. 5A or 5B) to perform one or more robot jobs during a job period 815, during which, in an embodiment, no verification images (e.g., verification image 760 of fig. 7B) are collected. After the job period 815, the robot control system 110 detects an idle period 817 during which the robot does not perform robot jobs. Thus, during the idle period 817, the robot control system 110 captures one or more verification images of the verification symbol at a first set of one or more of the reference positions (e.g., 610A-610B), respectively. After the idle period 817 has ended, the robot control system 110 resumes controlling the robot to perform one or more robot jobs during a job period 819, during which no verification images are collected. After the job period 819, the robot control system 110 detects an idle period 821 during which the robot does not perform robot jobs. During the idle period 821, the robot control system 110 captures one or more verification images of the verification symbol at a second set of one or more of the reference positions (e.g., 610C-610E), respectively. After the idle period 821, the robot control system 110 resumes controlling the robot to perform one or more robot jobs during a job period 823, during which no verification images are collected. After the job period 823, the robot control system 110 detects an idle period 825 during which the robot does not perform robot jobs. During the idle period 825, the robot control system 110 captures one or more verification images of the verification symbol at a third set of one or more of the reference positions (e.g., 610F-610I), respectively.
The verification images (e.g., 760) taken during the idle periods 817, 821, and 825 may be taken at different respective ones of the reference locations. For example, the first, second, and third sets of one or more locations may be different from each other such that the locations may not overlap. Further, during the idle period 825, the robot control system 110 may determine that verification image capture is complete, which may indicate that a sufficient number of verification images were captured for verification of camera calibration. In one embodiment, if the verification images are captured at all of the reference positions (e.g., 610A-610I), the robot control system 110 may determine that the verification image capture is complete. In one embodiment, if the number of verification images reaches a prescribed target count, the robot control system 110 may determine that the verification image capture is complete.
When it is determined that verification image capture is complete, the robot control system 110 determines the verification image coordinates at which the verification symbol appears in each verification image. Subsequently, the robot control system 110 determines the deviation parameter value based on the respective amounts of deviation between the verification image coordinates and the reference image coordinates. If the deviation parameter value exceeds a prescribed threshold, the robot control system 110 performs an additional camera calibration. In this example, however, the deviation parameter value does not exceed the prescribed threshold, so after the idle period 825 the robot control system 110 continues robot work during a work period 827 without additional camera calibration.
FIG. 9 depicts an exemplary flow chart 900 representing a verification process associated with the timeline of FIG. 8. At step 901, the robot control system 110 of fig. 1A, 1B, or 1C performs a first camera calibration of a camera (e.g., camera 570 of fig. 5A or 5B) to determine camera calibration information for the camera. In step 903, the robot control system 110 controls a robot (e.g., robot 550 of fig. 5A or 5B) to move a validation symbol (e.g., validation symbol 530 of fig. 5A or 5B) to a reference position and capture, by a camera, respective instances of reference images (e.g., reference image 710 of fig. 7A) of the validation symbol at the respective reference positions. In step 905, the robot control system 110 starts robot operation of the robot based on the camera calibration information obtained from the first camera calibration.
In step 907, the robot control system 110 detects an idle period during robot operation. In step 909, the robot control system 110 controls the robot (e.g., robot 550 of fig. 5A or 5B) to move the validation symbol (e.g., validation symbol 530 of fig. 5A or 5B) to one or more of the reference positions and to take one or more validation images (e.g., validation image 760 of fig. 7B) at the one or more of the reference positions, respectively, by the camera during the idle period. In some cases, the robot control system 110 may control the robot to move the validation symbol to as many reference positions as the duration of the idle period allows. In step 911, the robot control system 110 determines whether the total number of the photographed verification images reaches a prescribed target count. If the total number of verification images taken does not reach the target count, then the robot control system 110 attempts to detect another subsequent idle period during robot operation by returning to step 907 to take more verification images.
If the total number of captured verification images reaches the target count, the robot control system 110 performs verification of camera calibration based on the reference image (e.g., 710) and the verification image (e.g., 760) at step 913. Verification of camera calibration yields a deviation parameter. At step 915, the robot control system 110 determines whether the deviation parameter exceeds a prescribed threshold. If the deviation parameter does not exceed the threshold, then at step 919 the robot control system 110 may reset the total number of captured verification images to 0 and continue robot operation after the idle period while attempting to detect additional idle periods to capture a new set of verification images by returning to step 907.
If the deviation parameter exceeds the threshold, the robot control system 110 may stop robot operation and perform a second camera calibration at step 917. After the second camera calibration of step 917, the robot control system 110 may reset the total number of captured verification images to 0 in step 921. After step 921, the flowchart may return to step 903, where in step 903, the robot control system 110 controls the robot (e.g., 550) to move the validation symbol (e.g., 530) to the reference positions and takes a new set of reference images (e.g., 710) of the validation symbol at the respective reference positions by the camera (e.g., 570) so that the new set of reference images may be used for validation at a later time.
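The control flow of flowchart 900 can be sketched as the following loop. The `system` object and its method names are hypothetical stand-ins for the operations performed in steps 901-921, and `max_cycles` bounds the idle-period loop for illustration:

```python
def run_verification_loop(system, target_count, deviation_threshold, max_cycles):
    """Sketch of flowchart 900. `system` is a hypothetical object bundling the
    robot/camera operations named in steps 901-921."""
    system.perform_camera_calibration()                   # step 901: first calibration
    system.capture_reference_images()                     # step 903: reference images
    system.start_robot_operation()                        # step 905
    captured = 0                                          # running verification-image count
    for _ in range(max_cycles):
        system.wait_for_idle_period()                     # step 907: detect idle period
        captured += system.capture_verification_images()  # step 909
        if captured < target_count:                       # step 911: need more images
            continue
        deviation = system.verify_calibration()           # step 913
        captured = 0                                      # steps 919/921: reset the count
        if deviation > deviation_threshold:               # step 915
            system.stop_robot_operation()
            system.perform_camera_calibration()           # step 917: second calibration
            system.capture_reference_images()             # return to step 903
            system.start_robot_operation()
```

Note that the count is reset on both branches of the threshold check, mirroring steps 919 and 921, and that a failed check forces a new set of reference images before verification resumes.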
As described above, an aspect of the present disclosure is directed to receiving a reference image (e.g., reference image 710 of fig. 7A) and a validation image (e.g., validation image 760 of fig. 7B), both of which capture or otherwise represent a common validation symbol (e.g., validation symbol 530 of fig. 6A-6D) disposed at a physical location on a robotic arm, such as robotic arm 553 of robot 550 of fig. 6A-6D. In an embodiment, the reference image (e.g., 710) may be generated at a first point in time (such as an earlier point in time), while the verification image (e.g., 760) may be a new image generated at a second point in time (such as a later point in time), for example. In some cases, the reference image may be generated based on one or more commands (also referred to as one or more instructions) generated by a computing system, such as the robotic control system 110 of fig. 1C. In some cases, the one or more commands may include a motion command and/or a camera command. The motion command may be used to cause the robotic arm to move the validation symbol to a reference position. For example, the motion command may cause the robotic arm to assume a particular pose, which results in the verification symbol being moved to a reference position associated with the particular pose of the robotic arm. The camera commands may be used to cause a camera (such as camera 570 of fig. 6A-6D) to generate a reference image when the robotic arm is in a particular pose and/or when the verification symbol is in a reference position associated with the particular pose. The reference image may thus correspond to the reference position. Similarly, the verification image may be generated based on one or more commands (e.g., motion commands and/or camera commands), which may be the same or similar to the motion commands and/or camera commands discussed above with respect to the reference image. 
The one or more commands for generating the verification image may be generated by the same computing system that caused the reference image to be generated, such as the robot control system 110, or by another computing system. In this example, the one or more commands for generating the verification image may cause the validation symbol to be moved to the reference position again. For example, the one or more commands may cause the robotic arm to again assume the particular pose associated with the reference position.
In the above example, the computing system receiving the verification image may compare the verification image to the reference image to determine whether the calibration information is still sufficiently accurate. The calibration information may be determined by a calibration operation performed for the robot operating system. The calibration operation (which may also be referred to as system calibration) may include camera calibration, robot calibration, or any other calibration for controlling one or more components of the robot operating system. In embodiments, the calibration information may include camera calibration information, robot calibration information, or any other calibration information. The comparison discussed above may involve determining a deviation parameter value, for example, based on the difference between where the validation symbol appears in the reference image and where the validation symbol appears in the verification image. If the deviation parameter value is too large, such as if it exceeds a prescribed deviation threshold (e.g., a predetermined deviation threshold), the computing system may determine that the calibration information is no longer sufficiently accurate. In such cases, the calibration information may be said to reflect or include mis-calibration or mis-alignment. When the camera calibration information includes a mis-calibration or mis-alignment, it may no longer accurately describe the intrinsic characteristics of the camera (e.g., its projection characteristics or lens distortion characteristics) and/or the relationship between the camera and its external environment (e.g., the spatial relationship between the camera and the robot base). When the robot calibration information includes a mis-calibration or mis-alignment, it may no longer be reliable for accurately moving the robotic arm or other components of the robot to a desired position and/or orientation.
In an embodiment, the method of comparing a reference image and a verification image described above may involve a plurality of verification symbols. For example, FIGS. 10A-10C depict a set of multiple validation symbols 530A-530C. More specifically, the set of validation symbols 530A-530C may be part of the robot operating system 500A, which may be an embodiment of the robot operating system 500. The robot operating system 500A may include a robot 550A (which may be an embodiment of the robot 550), a camera 570 having a camera field of view 510, and a computing system such as the robot control system 110. Like robot 550, robot 550A has a robotic arm 553 having a plurality of arm portions movably attached to each other. For example, the plurality of arm portions may include links 554A-554E and a robotic end effector (such as a robot hand 555) attached to link 554E. In some cases, as shown in fig. 10A-10C, the multiple arm portions may be connected or arranged in series from a base 552 of the robot 550A to the robotic end effector (e.g., the robot hand 555). In such a case, the series of arm portions may form a kinematic chain, wherein movement of a particular arm portion in the series may cause movement of some or all of the arm portions downstream of the particular arm portion. An arm portion downstream of the particular arm portion may refer to an arm portion following the particular arm portion in the series of arm portions. For example, links 554B-554E and the robot hand 555 may be downstream of link 554A. In other words, link 554A may be upstream of links 554B-554E and upstream of the robot hand 555. In this example, each arm portion of the series of arm portions, or a subset thereof, may be rotatable, extendable, retractable, or otherwise movable relative to the arm portion immediately preceding it in the series.
For example, the link 554C may rotate relative to the link 554B, where the link 554B may be the arm portion immediately preceding the link 554C in the series of arm portions shown in fig. 10A. In the example of FIGS. 10A-10C, links 554A-554E and the robot hand 555 may be movably attached to each other via joints 556A-556D.
As described above, a robotic arm (e.g., 553) may be moved to different poses, where a pose may refer to a shape, or more generally a geometry, formed by the arm portions (e.g., links) of the robotic arm. For example, FIGS. 10A-10C illustrate three different respective poses of robotic arm 553. In an embodiment, robot 550A in fig. 10A-10C may include one or more actuators (e.g., motors) configured to rotate, translate (e.g., extend or retract), or otherwise move links 554A-554E and the robot hand 555 relative to one another. In such embodiments, each pose in fig. 10A-10C may be associated with a particular permutation of one or more actuator outputs. For example, the permutation may describe an angular value by which a respective arm portion of robotic arm 553 has been rotated relative to the arm portion immediately preceding it, and/or a distance by which the respective arm portion has been translated relative to the arm portion immediately preceding it. For example, the poses in fig. 10A-10C may be associated with different permutations of five angular values describing, respectively, the direction and amount of rotation of link 554B relative to link 554A, of link 554C relative to link 554B, of link 554D relative to link 554C, of link 554E relative to link 554D, and of the robot hand 555 relative to link 554E.
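To illustrate how such a set of angular values determines where a validation symbol ends up, the following sketch computes the symbol's position for a planar (2D) serial chain in which each joint rotates everything downstream of it. This is a simplification of the 3D case, and all names are illustrative:

```python
import math

def planar_symbol_position(link_lengths, joint_angles, symbol_link, symbol_offset):
    """Estimate where a validation symbol mounted `symbol_offset` along link
    number `symbol_link` (0-indexed) ends up, for a planar serial chain whose
    joints rotate all downstream arm portions. Illustrative names."""
    x = y = theta = 0.0
    for i, (length, angle) in enumerate(zip(link_lengths, joint_angles)):
        theta += angle  # rotating a joint rotates every downstream arm portion
        if i == symbol_link:
            # Walk only partway along this link, to the symbol's mounting point.
            return (x + symbol_offset * math.cos(theta),
                    y + symbol_offset * math.sin(theta))
        x += length * math.cos(theta)
        y += length * math.sin(theta)
    raise ValueError("symbol_link is out of range")
```

Changing any upstream joint angle changes the accumulated rotation `theta`, which is why a different permutation of angular values places each validation symbol at a different position in the camera field of view.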
As described above, FIGS. 10A-10C depict a set of multiple validation symbols 530A-530C disposed on one or more arm portions of the robotic arm 553. More particularly, validation symbol 530A can be disposed on link 554B, while validation symbol 530B can be disposed on link 554C, and validation symbol 530C can be disposed on the robot hand 555. In some cases, a robot (e.g., robot 550A) may have any number of validation symbols, and they may be arranged on the robot in any manner. For example, validation symbols 530A-530C may be placed at arbitrary locations on robot 550A. In some cases, the number and relative placement of the validation symbols (e.g., 530A-530C) may be constrained by one or more specifications. For example, their relative placement may be subject to a constraint that requires adjacent validation symbols (e.g., 530B and 530C) to be separated by a prescribed minimum distance (also referred to as a symbol spacing), such as a prescribed minimum distance of 5 cm. Such a constraint may reduce the likelihood that a computing system (e.g., the robot control system 110) will confuse a particular validation symbol (e.g., 530B) with an adjacent validation symbol (e.g., 530C).
FIGS. 11A-11C depict another example involving a set of multiple verification symbols 1130A-1130C. More particularly, the verification symbols 1130A-1130C may be part of a robot operating system 1100, the robot operating system 1100 including a robot 1150, a camera 1170 having a camera field of view 1110, and a computing system such as the robot control system 110. The robot operating system 1100 and camera 1170 may be, for example, embodiments of the robot operating system 500 and camera 570, respectively. The robot 1150 may have a robotic arm 1153, the robotic arm 1153 including a plurality of arm portions, such as links 1154A-1154E and a manipulator 1155 (or other robotic end effector). The multiple arm portions may be movably attached to one another via, for example, joints 1156A-1156D. Fig. 11A illustrates the robotic arm 1153 in a first pose, while fig. 11B illustrates the robotic arm 1153 in a second pose. As in fig. 10A-10C, the multiple arm portions may be connected or arranged in series from a base 1152 of the robot 1150 to the manipulator 1155. The series of arm portions may form a kinematic chain, wherein movement of one arm portion of the series may propagate to the downstream arm portions in the chain. As shown in fig. 11A, verification symbol 1130A may be disposed on link 1154C, while verification symbol 1130B may be disposed on link 1154D, and verification symbol 1130C may be disposed on the manipulator 1155.
In an embodiment, one or more of the set of verification symbols (e.g., 530A-530C or 1130A-1130C) may have a circular shape. For example, as discussed above with respect to FIG. 5C, the validation symbols 530A-530C of FIGS. 10A-10C or 1130A-1130C of FIGS. 11A-11C may be a ring-shaped pattern, or more specifically a circular ring. In the example of fig. 5C, the annular pattern may have concentric circular regions (e.g., 531 and 533) or concentric circles (e.g., 535 and 537). The concentric circular regions or concentric circles may include, for example, an inner circular region and an outer circular region, or an inner circle and an outer circle. In the example shown in FIG. 11C, validation symbol 1130A may be shaped as a ring having an inner circle of radius r_{1,1130A} and an outer circle of radius r_{2,1130A}. In this example, verification symbol 1130B may be another ring having an inner circle of radius r_{1,1130B} and an outer circle of radius r_{2,1130B}. Verification symbol 1130C may likewise be shaped as a ring having an inner circle of radius r_{1,1130C} and an outer circle of radius r_{2,1130C}.
In an embodiment, the set of validation symbols (e.g., 530A-530C or 1130A-1130C) may be shaped as respective rings having different respective sizes. For example, as shown in FIG. 11C, validation symbols 1130A-1130C may have different radii for their respective outer circular regions or outer circles. That is, the outer radii r_{2,1130A}, r_{2,1130B}, and r_{2,1130C} may all be different from each other. In an embodiment, the rings used as validation symbols 1130A-1130C may have different ratios between the respective radii of their inner circular regions or inner circles and the respective radii of their outer circular regions or outer circles. That is, the ratios r_{2,1130A}/r_{1,1130A}, r_{2,1130B}/r_{1,1130B}, and r_{2,1130C}/r_{1,1130C} may all be different from each other. As discussed in more detail below, the computing system (e.g., robot control system 110) may be configured to identify verification symbol 1130A/1130B/1130C based on the size of the respective ring forming that symbol, and/or based on the respective ratio between the radius of the ring's inner circle and the radius of its outer circle.
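A minimal sketch of identifying a detected ring by the ratio of its outer radius to its inner radius follows. The ratio is convenient because, unlike the absolute radii, it does not change with the symbol's apparent size in the image; the symbol names, ratio values, and tolerance are illustrative assumptions:

```python
def identify_symbol(inner_radius, outer_radius, known_ratios, tolerance=0.05):
    """Match a detected ring to a known validation symbol by its
    outer/inner radius ratio. `known_ratios` maps symbol names to
    designed outer/inner ratios (illustrative values)."""
    detected = outer_radius / inner_radius
    best_name, best_error = None, tolerance
    for name, ratio in known_ratios.items():
        error = abs(detected - ratio)
        if error < best_error:
            best_name, best_error = name, error
    return best_name  # None when no known symbol matches within tolerance
```

For example, with `known_ratios = {"1130A": 2.0, "1130B": 1.5, "1130C": 1.25}`, a ring measured at inner radius 10 px and outer radius 20 px would be identified as symbol 1130A regardless of how far the symbol is from the camera.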
In an embodiment, some or all of the validation symbols (e.g., 530A-530C of FIGS. 10A-10C or 1130A-1130C of FIGS. 11A-11B) may be permanently attached to or otherwise disposed on the robotic arm (e.g., 553 or 1153). By way of comparison, a calibration pattern (e.g., 520 of figs. 5A and 5B) used for camera calibration may need to be large enough to accommodate a sufficiently complex pattern and/or a sufficient number of pattern elements to produce sufficiently accurate camera calibration results. Such a large size may make the calibration pattern (e.g., 520) unsuitable as a permanent or persistent part of the robotic arm, because it may interfere with normal operation of the robot. In such an example, the calibration pattern 520 may be detached from the robotic arm (e.g., 553) before normal robot operation resumes. In this embodiment, some or all of the verification symbols (e.g., 530A-530C or 1130A-1130C) may be less complex and/or smaller than the calibration pattern (e.g., 520). The small size of the validation symbols may allow them to remain on the robotic arm (e.g., 553 or 1153) during normal robot operation with little or no interference. Thus, in some cases, some or all of the validation symbols (e.g., 530A-530C or 1130A-1130C) may be permanently or persistently disposed on the robotic arm (e.g., 553 or 1153). Such an arrangement allows a robot controller or other computing system to perform calibration verification more frequently and/or more quickly, and to assess whether updated camera calibration needs to be determined.
Fig. 12A and 12B illustrate a method 1200 for verifying calibration information, such as camera calibration information, using multiple verification symbols. In an embodiment, the method 1200 may be performed by a computing system (such as the robot control system 110 of fig. 10A-10C or 11A-11B), or more particularly by control circuitry of the computing system (such as the control circuitry 111 of the robot control system 110 in fig. 1C). As shown in fig. 1C, the computing system may include a communication interface 113 configured to communicate with a camera having a camera field of view, such as camera 570 of fig. 10A or camera 1170 of fig. 11A having camera field of view 510 or 1110, respectively. The communication interface 113 may also be configured to communicate with a robot, such as robot 550A of fig. 10A-10C or robot 1150 of fig. 11A-11C. As described above, the robot 550A/1150 may include a robotic arm 553/1153 having a plurality of arm portions movably attached to one another, and may include a set of validation symbols disposed on respective ones of the plurality of arm portions. In the example of FIGS. 10A-10C, the set of validation symbols can include validation symbols 530A-530C disposed on link 554B, link 554C, and the robot hand 555, respectively. In the example of fig. 11A-11C, the set of validation symbols may include validation symbols 1130A-1130C, which may be disposed on links 1154C and 1154D and manipulator 1155, respectively.
In an embodiment, the method 1200 may include step 1201, where the control circuitry 111 of the robot control system or other computing system outputs motion commands for controlling movement of a robotic arm (e.g., robotic arm 553/1153) for robot operation. The motion commands may be based on, for example, calibration information. The calibration information (such as camera calibration information) may be determined from a first calibration operation (such as a first camera calibration). In some cases, step 1201 may be the same as or similar to step 409 of method 400, where the control circuitry 111 controls the movement of the robotic arm to perform robot operation. For example, robot operation may involve picking boxes or other objects in a warehouse. In this example, the control circuitry 111 may be configured to determine a spatial relationship between the camera and a box, and/or a spatial relationship between the robot (e.g., robot 550A/1150) and the box, based on an image of the box generated by the camera (e.g., camera 570 of fig. 10A-10C or camera 1170 of fig. 11A-11B) and based on the calibration information. In an embodiment, the method 1200 may include a step in which the control circuitry 111 performs the first calibration operation to determine the calibration information. Such a step may be similar to step 401 of method 400 of fig. 4A and may be performed prior to step 1201. For example, the first calibration operation may be a camera calibration involving determining estimates of camera calibration parameters based on calibration images generated by the camera.
In an embodiment, method 1200 may include step 1203, where control circuitry 111 or other component of the computing system determines a set of reference image coordinates. The set of reference image coordinates may be, for example, respective coordinates where the set of authentication symbols (e.g., 530A-530C/1130A-1130C) appear in the reference image, where the reference image may be an image representing the set of authentication symbols (e.g., 530A-530C/1130A-1130C). In an embodiment, the set of reference image coordinates may be used to verify calibration information.
For example, FIG. 13A depicts a reference image 1120 representing the set of verification symbols 1130A-1130C of FIG. 11A. In the example of fig. 13A, the set of reference image coordinates may include first reference image coordinates, second reference image coordinates, and third reference image coordinates. In such an example, the first reference image coordinates may identify where verification symbol 1130A appears in reference image 1120, while the second reference image coordinates may identify where verification symbol 1130B appears in reference image 1120, and the third reference image coordinates may identify where verification symbol 1130C appears in reference image 1120. In a more specific example, each of the first, second, and third reference image coordinates may be a pixel coordinate [u v]^T. More specifically, FIG. 13A depicts the three reference image coordinates of validation symbols 1130A, 1130B, and 1130C as [u_ref_1 v_ref_1]^T_1130A, [u_ref_1 v_ref_1]^T_1130B, and [u_ref_1 v_ref_1]^T_1130C, respectively. In this example, the label ref_N (e.g., ref_1) may refer to coordinates associated with a reference image generated when the robotic arm 1153 is in an Nth pose, such as the first pose (when N is 1). As discussed in more detail below, the reference image 1120 of fig. 13A may correspond to, or more generally be associated with, a first pose, which may be the example pose of the robotic arm 1153 illustrated in fig. 11A. Different poses may place the verification symbols 1130A-1130C at different sets of corresponding positions in the camera field of view 1110. For example, the pose in FIG. 11A may place verification symbols 1130A-1130C at the respective 3D positions [x_ref_1 y_ref_1 z_ref_1]^T_1130A, [x_ref_1 y_ref_1 z_ref_1]^T_1130B, and [x_ref_1 y_ref_1 z_ref_1]^T_1130C.
In such cases, these 3D positions [x_ref_1 y_ref_1 z_ref_1]^T_1130A, [x_ref_1 y_ref_1 z_ref_1]^T_1130B, and [x_ref_1 y_ref_1 z_ref_1]^T_1130C may be projected or otherwise mapped to the pixel coordinates [u_ref_1 v_ref_1]^T_1130A, [u_ref_1 v_ref_1]^T_1130B, and [u_ref_1 v_ref_1]^T_1130C in the reference image 1120. In an embodiment, each of the pixel coordinates [u_ref_1 v_ref_1]^T_1130A, [u_ref_1 v_ref_1]^T_1130B, and [u_ref_1 v_ref_1]^T_1130C may identify where the center of its respective verification symbol 1130A/1130B/1130C appears in reference image 1120.
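The mapping from a 3D position in the camera's field of view to a pixel coordinate can be sketched with a basic pinhole-camera model. The intrinsic parameter values in the example are illustrative, and lens distortion (which actual camera calibration information may also describe) is omitted:

```python
def project_to_pixel(point_3d, fx, fy, cx, cy):
    """Map a 3D point [x y z] in the camera coordinate system to a pixel
    coordinate [u v] with a pinhole model. The focal lengths (fx, fy) and
    principal point (cx, cy) would come from camera calibration information."""
    x, y, z = point_3d
    if z <= 0:
        raise ValueError("point must lie in front of the camera")
    return (fx * (x / z) + cx, fy * (y / z) + cy)
```

Under this model, the same physical symbol at two different reference positions projects to two different pixel coordinates, which is what allows the reference and verification image coordinates to be compared position by position.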
In an embodiment, step 1203 may involve the control circuitry 111 receiving a reference image (e.g., 1120) from the communication interface 113 and/or the non-transitory computer readable medium 115 of the robot control system 110 or other computing system. For example, the reference image 1120 of fig. 13A may be generated by the camera 1170 when the mechanical arm 1153 is in the pose of fig. 11A. Computing system 110 may receive reference image 1120 from camera 1170 via communication interface 113 and may store reference image 1120 in non-transitory computer-readable medium 115. In step 1203, in an example, the control circuitry 111 may retrieve or otherwise receive the reference image 1120 from the non-transitory computer readable medium 115. The non-transitory computer readable medium 115 may also store a verification image (discussed below), and the control circuitry 111 may receive the verification image from the non-transitory computer readable medium 115.
In an embodiment, the control circuitry 111 may be configured to identify at least one verification symbol (e.g., 1130A) of the set of verification symbols (e.g., 1130A-1130C) in the reference image (e.g., 1120) based on a prescribed model describing the geometry of the robotic arm (e.g., 1153). For example, the prescribed model may describe which links, robotic end effectors, or other arm portions form a robotic arm, their respective sizes (e.g., lengths) and how they are connected, and/or which arm portions have at least one validation symbol (e.g., 1130A) disposed thereon. In such embodiments, the control circuitry may be configured to determine a region within the reference image (e.g., 1120) in which the at least one verification symbol is expected to occur based on the model, and search for the at least one verification symbol (e.g., 1130A) within the region of the reference image. In an embodiment, the model may store or more generally describe the location of the validation symbol (e.g., 1130A) on the robotic arm (e.g., 1153). The position of the validation symbol (also referred to as the symbol position) may be the approximate position of the validation symbol on the robotic arm (e.g., 1153).
For example, if a reference image (e.g., 1120) is stored with one or more parameter values for a motion command that generates a pose associated with the reference image (or more specifically a pose of a robotic arm that appears in the reference image), the one or more parameter values and the model may be used to estimate the pose of the robotic arm (e.g., 1153) when moving the robotic arm in accordance with the motion command. As discussed in more detail below, in an embodiment, the one or more parameter values may pertain to one or more actuator parameters for controlling one or more actuators (e.g., one or more motors) for moving a robotic arm (e.g., 1153). The estimated pose may be used to estimate the position of the verification symbol (e.g., 1130A) on the robotic arm (e.g., 1153), and the estimated position may be used to estimate the position at which the verification symbol may appear in the reference image (e.g., 1120).
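The idea of estimating a symbol position from actuator parameter values plus the prescribed model can be illustrated with a toy forward-kinematics sketch. This is purely an assumption for illustration: a planar two-link arm, not the actual geometry of robot 1150 or its prescribed model:

```python
import math

# Toy forward-kinematics sketch (illustrative assumption, not the
# patent's model): actuator parameter values (joint angles) combined
# with the model's link lengths yield an estimated 2D position for a
# verification symbol at the end of the last link.

def symbol_position(joint_angles_deg, link_lengths):
    x = y = 0.0
    angle = 0.0
    for theta_deg, length in zip(joint_angles_deg, link_lengths):
        # Each joint angle is relative to the preceding arm portion.
        angle += math.radians(theta_deg)
        x += length * math.cos(angle)
        y += length * math.sin(angle)
    return (x, y)
```

The estimated position could then be projected into the image to predict where the verification symbol should appear.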
In the above example, the control circuit 111 may be configured to focus on the region(s) in the reference image (e.g., 1120) where one or more validation symbols are expected to appear. Such techniques may allow control circuitry 111 to avoid searching for verification symbols (e.g., 1130A-1130C) throughout a reference image (e.g., 1120), and thus identify verification symbols (e.g., 1130A-1130C) in the reference image (e.g., 1120) more quickly. The model in the above example may also be used to search for a validation symbol in a validation image (discussed below).
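Such a region-restricted search might be sketched as follows; the helper name and the margin value are illustrative assumptions, not parameters from this disclosure:

```python
# Sketch of a region-restricted search (illustrative assumption):
# rather than scanning the whole reference image, search only a window
# around the pixel location where the model predicts the verification
# symbol should appear.

def search_region(predicted_uv, image_width, image_height, margin=40):
    """Returns (u_min, v_min, u_max, v_max), clamped to the image bounds."""
    u, v = predicted_uv
    return (max(0, int(u) - margin),
            max(0, int(v) - margin),
            min(image_width, int(u) + margin),
            min(image_height, int(v) + margin))
```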
As described above, in an embodiment, the set of verification symbols (e.g., 1130A-1130C) may be shaped as respective circular rings. In such embodiments, the control circuit 111 may be configured to discern or otherwise identify a verification symbol (e.g., 1130A) in the reference image (e.g., 1120) by identifying the ring forming that verification symbol. If the set of verification symbols are shaped as respective rings having different respective sizes, such as shown above with respect to FIG. 11C, the control circuitry 111 may be configured to identify the verification symbol (e.g., 1130A) based on the size (e.g., radius r_2,1130A) of the respective ring forming the verification symbol. In some cases, if the shape of the verification symbol (e.g., 1130A) is a ring having at least a first circular region or first circle and a second circular region or second circle, the control circuitry 111 may be configured to identify the verification symbol based on a ratio between a radius of the first circular region or first circle and a radius of the second circular region or second circle. For example, the control circuit 111 may be configured to identify verification symbol 1130A based on identifying a ring and confirming that the ring has the ratio r_2,1130A / r_1,1130A.
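The ratio-based identification described above might be sketched as follows, assuming a prior circle-detection step has already produced a list of candidate circles; the helper name and tolerance values are illustrative assumptions:

```python
import math

# Illustrative sketch (not the patent's actual implementation):
# classify detected circles into a ring-shaped verification symbol by
# matching concentric pairs whose outer/inner radius ratio is close to
# a known value for that symbol.

def find_ring_symbol(circles, expected_ratio, center_tol=2.0, ratio_tol=0.05):
    """circles: list of ((u, v), radius) detected in an image.
    Returns the shared center of a concentric pair whose radius ratio
    matches expected_ratio, or None if no such pair exists."""
    for i, ((u1, v1), r1) in enumerate(circles):
        for (u2, v2), r2 in circles[i + 1:]:
            # Concentric check: the two circle centers nearly coincide.
            if math.hypot(u1 - u2, v1 - v2) > center_tol:
                continue
            outer, inner = max(r1, r2), min(r1, r2)
            if abs(outer / inner - expected_ratio) <= ratio_tol:
                # Report the midpoint of the two centers as the
                # symbol's image coordinate.
                return ((u1 + u2) / 2.0, (v1 + v2) / 2.0)
    return None
```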
As further set forth above, the reference image of step 1203 (e.g., 1120 of fig. 13A) may be generated by a camera (e.g., 1170 of fig. 11A) when a robotic arm (e.g., 1153) is in a first pose, such as the pose depicted in fig. 11A. The reference image may be generated by the camera over a first time period, or more generally at a first point in time. The first time period may refer to a period of time (e.g., a period of milliseconds, seconds, or minutes) used to generate the reference image or otherwise associated with the reference image. For example, the first time period may include a time of a camera operation for capturing the reference image, and in some cases may also include a time for the robot to move so as to position the verification symbols in a camera field of view. In some cases, the first time period during which the reference image is generated may be hours, days, or weeks before, for example, step 1201 and/or step 1203. In an embodiment, the computing system, or more specifically the control circuitry 111 performing steps 1201 and 1203, may not be involved in generating the reference image (e.g., 1120). In another embodiment, the computing system, or more specifically the control circuitry 111 performing steps 1201 and 1203, may be involved in generating the reference image (e.g., 1120). For example, in a step preceding steps 1201 and/or 1203, the computing system may output a motion command to move the robotic arm (e.g., 1153) to the first pose during the first time period. In such a case, the preceding step may be part of method 1200. The first pose may cause the set of verification symbols (e.g., 1130A-1130C) to be moved to a set of corresponding positions associated with the first pose, such as the positions [x_ref_1 y_ref_1 z_ref_1]^T_1130A, [x_ref_1 y_ref_1 z_ref_1]^T_1130B, and [x_ref_1 y_ref_1 z_ref_1]^T_1130C discussed above. In some cases, such a step may be similar or identical to step 403 in fig. 4A.
In the above example, the computing system, or more specifically the control circuitry 111, may also output a camera command in this step that causes the camera (e.g. 1170) to generate a reference image (e.g. 1120) when the robotic arm (e.g. 1153) is in the first pose. In an embodiment, the computing system may receive a reference image (e.g., 1120) from the camera and store the reference image in the non-transitory computer readable medium 115 of fig. 1C or another non-transitory computer readable medium. In some cases, the computing system may also store information that allows the robotic arm (e.g., 1153) to return to the first pose and/or to return the verification symbols (e.g., 1130A-1130C) to the first set of respective positions. For example, the computing system may store a first set of respective positions or, more specifically, their 3D coordinates, and/or may store parameter values for motion commands that move the robotic arm (e.g., 1153) to the first pose. In embodiments, these coordinates and/or motion commands may be stored in the non-transitory computer readable medium 115 in a manner that associates the stored information with the reference image (e.g., 1120) described above.
In an embodiment, the motion command for moving the robotic arm to the first pose discussed above may include one or more parameter values describing the motion of the robotic arm (e.g., 1153). As described above, in some cases, the one or more parameter values may pertain to one or more actuator parameters that control one or more actuators that generate the motion of the robotic arm (e.g., 1153). In such a case, the one or more parameter values may be referred to as one or more actuator parameter values (also referred to as robot joint values). For example, the one or more actuator parameter values may describe, for example, respective amounts of rotation of the arm portions of the robotic arm relative to each other, respective positions and/or orientations of the arm portions relative to each other, and/or joint (e.g., 1156A-1156D) positions of the link arm portions. For example, the one or more actuator parameter values of the motion command may describe respective angular values at which a motor in robot 1150 rotates the respective arm portions (e.g., links 1154A-1154E and manipulator 1155) relative to their immediately preceding arm portions.
In embodiments, the motion command discussed above may have any parameter value(s) (e.g., random parameter value(s)), and the first pose caused by the motion command may be any pose. In an embodiment, the motion command discussed above may be a motion command that causes some or all of the verification symbols (e.g., 1130A-1130C) to have a desirable appearance in the reference image (e.g., 1120). For example, as described above, some or all of the set of verification symbols (e.g., 1130A-1130C) may have a circular shape, such as a ring formed by concentric circular regions or concentric circles. In such an example, when at least one verification symbol (e.g., 1130A) is in certain orientations relative to the camera (e.g., 1170), the at least one verification symbol may appear elliptical in the resulting reference image rather than completely circular. The elliptical appearance may lead to inaccurate calibration verification. For example, if the reference image coordinates of verification symbol 1130A are the location where the center of the symbol appears in the reference image, it may be more difficult to accurately determine the reference image coordinates when verification symbol 1130A appears elliptical in the reference image. Further, if identifying verification symbol 1130A in the reference image (e.g., distinguishing verification symbol 1130A from other features in the reference image) relies on identifying an annular pattern in the reference image and on verifying that the annular pattern has a particular ratio between the radii of the concentric circles associated with the symbol (e.g., the ratio r_2,1130A / r_1,1130A associated with verification symbol 1130A), this identification may be more difficult to perform accurately when the annular pattern appears elliptical in the reference image.
Therefore, if the control circuit 111 is involved in outputting a motion command associated with generating the reference image, the control circuit 111 may attempt to generate a motion command that causes the set of verification symbols (e.g., 1130A-1130C) to be positioned in a manner such that they appear completely or substantially circular in the reference image.
For example, the control circuitry 111 may generate a motion command that causes the robotic arm (e.g., 1153) to move to a pose in which the set of verification symbols (e.g., 1130A-1130C) directly face the camera (e.g., 1170). For example, the pose may cause at least one verification symbol of the set of verification symbols (e.g., 1130A-1130C) to be tangent to a surface of an imaginary sphere that is concave relative to the camera (e.g., 1170). In fig. 11A, the illustrated pose of the robotic arm 1153 may cause the verification symbols 1130A and 1130B to have respective orientations at which they are tangent to the surface of an imaginary sphere 1121 that is concave relative to the camera (e.g., 1170), and may cause the verification symbol 1130C to have an orientation at which it is tangent to the surface of an imaginary sphere 1123 that is also concave relative to the camera (e.g., 1170). In a pose such as the first pose discussed above, the set of verification symbols (e.g., 1130A-1130C) may appear as respective circular shapes in the reference image (e.g., 1120). More specifically, the set of verification symbols (e.g., 1130A-1130C) in such an example may be positioned in such a way that they exhibit no eccentricity in the reference image, or exhibit respective amounts of eccentricity that are less than a prescribed eccentricity threshold. In the above example, the centers of the imaginary spheres 1121 and 1123 may be at the camera (e.g., 1170). In an embodiment, the control circuit 111 may be configured to generate random motion commands and search these random motion commands to find one motion command that produces the above-discussed orientations for the set of verification symbols. The found motion command may be output by the control circuit 111 to generate the reference image (e.g., 1120).
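One way to express the "directly facing" condition is that the symbol's surface normal is antiparallel to the camera-to-symbol ray, which is equivalent to the symbol being tangent to an imaginary sphere centered at the camera. A hedged sketch follows; the function name and the angular tolerance are assumptions for illustration:

```python
import math

# Illustrative check (an assumption, not the patent's algorithm): a flat
# verification symbol "directly faces" the camera when its surface
# normal is antiparallel to the ray from the camera to the symbol.

def faces_camera(symbol_pos, symbol_normal, camera_pos, max_angle_deg=5.0):
    """All arguments are 3D (x, y, z) tuples in a common frame."""
    ray = tuple(s - c for s, c in zip(symbol_pos, camera_pos))
    dot = sum(r * n for r, n in zip(ray, symbol_normal))
    norm = (math.sqrt(sum(r * r for r in ray))
            * math.sqrt(sum(n * n for n in symbol_normal)))
    # Angle between the symbol normal and the camera-to-symbol ray;
    # 180 degrees means the symbol faces the camera head-on.
    angle = math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))
    return (180.0 - angle) <= max_angle_deg
```

A pose satisfying this condition for every symbol would tend to keep the rings circular (low eccentricity) in the resulting image.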
Returning to fig. 12A-12B, in an embodiment, the method 1200 includes a step 1205 in which the control circuitry 111 outputs a motion command for controlling the robotic arm to move to a first pose (such as the first pose shown in fig. 10A or 11A), wherein the first pose is a pose in which a reference image (e.g., 1120) was generated during a first time period, as described above. In some cases, the motion command may be output to the robot (e.g., 550A/1150) via the communication interface 113. In an embodiment, the motion command in step 1205 may be referred to as an additional motion command because it is additional to the motion command of step 1201. The additional motion command may be output during a second time period, or more generally at a second point in time, after the first time period. The second time period may refer to a period of time (e.g., a period of milliseconds, seconds, or minutes) used to generate the verification image or otherwise associated with generating the verification image. For example, the second time period may include a time for the robot to move so as to place the verification symbols in the camera field of view and/or a time for the camera to operate to capture the verification image. In some cases, the second time period (or, more generally, the second point in time) may be hours, days, or weeks after the first time period (or, more generally, the first point in time). As discussed in more detail below, the additional motion command may be used to generate a verification image during the second time period. In an embodiment, the additional motion command may cause the robotic arm (e.g., 1153) to move to a pose (e.g., the first pose of fig. 11A) in which the set of verification symbols (e.g., 1130A-1130C) directly face the camera (e.g., 1170). For example, the set of verification symbols may have respective orientations that are tangent to a surface of one or more imaginary spheres that are concave relative to the camera.
In such a gesture, the set of verification symbols (e.g., 1130A-1130C) may appear as respective circular shapes in the verification image (e.g., 1160 of FIG. 13B).
In an embodiment, if the method 1200 includes the step of outputting a motion command for generating a reference image during a first time period as described above, the motion command for generating the reference image during the first time period may be a first additional motion command, and the motion command in step 1205 for generating the verification image during a second time period may be a second additional motion command. In some cases, the first additional motion command may be an earlier motion command and the second additional motion command may be a later motion command. In some cases, the first additional motion command may have one or more actuator parameter values (or robot joint values) for controlling the robotic arm to move to the first pose, and the second additional motion command may also have the one or more actuator parameter values. More particularly, the first and second additional motion commands may have the same actuator parameter values. For example, if the robot (e.g., 1150) includes multiple motors that rotate various arm portions (e.g., links 1154A-1154E and manipulator 1155) relative to one another, the one or more actuator parameter values may include multiple respective angle values that control how much rotation the multiple motors output. In this example, both the first additional motion command and the second additional motion command may include the same plurality of respective angle values that control how much rotation the plurality of motors output.
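The requirement that the second additional motion command reuse the same actuator parameter values as the first might be captured as in this sketch; the class and field names are illustrative assumptions, not data structures from this disclosure:

```python
from dataclasses import dataclass
from typing import List

# Illustrative sketch (assumed structure): a motion command carrying
# actuator parameter values, here one joint angle in degrees per motor
# that rotates an arm portion relative to the preceding arm portion.

@dataclass
class MotionCommand:
    joint_angles_deg: List[float]

    def matches(self, other: "MotionCommand", tol_deg: float = 1e-6) -> bool:
        """True if two commands share the same actuator parameter
        values, as needed for the verification image to be captured in
        the same pose as the reference image."""
        return (len(self.joint_angles_deg) == len(other.joint_angles_deg)
                and all(abs(a - b) <= tol_deg
                        for a, b in zip(self.joint_angles_deg,
                                        other.joint_angles_deg)))
```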
In an embodiment, the first additional motion command and/or the second additional motion command may be output during a corresponding idle period (such as the idle period discussed above with respect to step 411 of fig. 4A). In some cases, if the calibration information is determined by performing a calibration operation, the first additional motion command may be output immediately after performing the calibration operation or during the earliest idle period after performing the calibration operation. In some cases, the second additional motion command may be output in response to a prescribed trigger condition, such as a prescribed period of time having elapsed since the calibration operation was performed, a collision event involving the robot, any other event (e.g., a natural disaster, such as an earthquake) that may result in a possible displacement or misalignment between the camera and the robot or a portion thereof, or some other trigger condition. If generating the verification image involves the control circuit 111 outputting a camera command, in some examples the camera command may also be output in response to the prescribed trigger condition.
In an embodiment, the first pose of step 1205 of the robotic arm (e.g., 1153) may be associated with a particular set of respective reference positions of the set of verification symbols (e.g., 1130A-1130C) disposed on the robotic arm, as described above. For example, the set of respective reference positions may be the 3D positions [x_ref_1 y_ref_1 z_ref_1]^T_1130A, [x_ref_1 y_ref_1 z_ref_1]^T_1130B, and [x_ref_1 y_ref_1 z_ref_1]^T_1130C. When the reference image (e.g., 1120) was generated, the set of verification symbols (e.g., 1130A-1130C) may have been located at the set of respective reference positions associated with the first pose. In step 1205, if the additional motion command returns the robotic arm to the first pose, the set of verification symbols (e.g., 1130A-1130C) may be returned to the set of reference positions (e.g., [x_ref_1 y_ref_1 z_ref_1]^T_1130A, [x_ref_1 y_ref_1 z_ref_1]^T_1130B, and [x_ref_1 y_ref_1 z_ref_1]^T_1130C). In such an example, the set of verification symbols (e.g., 1130A-1130C) may be disposed at the set of respective reference positions when the reference image (e.g., 1120) is generated, and may again be disposed at the set of respective reference positions when the verification image (e.g., 1160) is generated.
Returning to fig. 12A-12B, method 1200 in an embodiment may include step 1207, where control circuit 111 receives a verification image, such as verification image 1160 in fig. 13B. As described above, the reference image (e.g., 1120) may be an image used to represent the set of verification symbols (e.g., 1130A-1130C). In this example, the verification image (e.g., 1160) may be an additional image that also represents the set of verification symbols (e.g., 1130A-1130C) and that may be generated when the robotic arm (e.g., 1153) is moved to the first pose (such as the first pose shown in fig. 11A) as a result of the additional motion command of step 1205.
In an embodiment, method 1200 may include step 1209, where control circuitry 111 determines a set of verification image coordinates. In this embodiment, the set of verification image coordinates may be the respective coordinates at which the set of verification symbols (e.g., 1130A-1130C) appear in the verification image (e.g., 1160 of fig. 13B). In an embodiment, the set of verification image coordinates may be used to verify calibration information, such as camera calibration information. Similar to the discussion above with respect to the reference image coordinates in step 1203, the set of verification image coordinates in this example may include first verification image coordinates, second verification image coordinates, and third verification image coordinates. In a more specific example, each of the first verification image coordinates, the second verification image coordinates, and the third verification image coordinates may be pixel coordinates that identify, for example, the center of its respective verification symbol. For example, fig. 13B depicts three verification image coordinates, or more specifically the pixel coordinates [u_verify_1 v_verify_1]^T_1130A, [u_verify_1 v_verify_1]^T_1130B, and [u_verify_1 v_verify_1]^T_1130C, at which the respective centers of verification symbols 1130A-1130C appear in verification image 1160. Similar to the discussion for the reference image coordinates, the label verify_N (e.g., verify_1) may refer to coordinates associated with a verification image generated when the robotic arm is in an Nth pose, such as the first pose (N = 1).
In an embodiment, the control circuit 111 may perform steps 1203-1209 multiple times for multiple reference images and multiple verification images (e.g., five reference images and five verification images). The plurality of reference images and the plurality of verification images may correspond to a plurality of respective poses. For example, figs. 10A-10C depict a series of three poses of robotic arm 553. In this example, the control circuitry 111 of the robot controller 110 or any other computing system may receive a first reference image and a first verification image each associated with the first pose shown in fig. 10A, receive a second reference image and a second verification image each associated with the second pose shown in fig. 10B, and receive a third reference image and a third verification image each associated with the third pose shown in fig. 10C. In another example, the reference image 1120 of fig. 13A may be a first reference image and the verification image 1160 of fig. 13B may be a first verification image, both of which may be associated with the first pose shown in fig. 11A for the robotic arm 1153. In this example, the control circuit 111 may also receive the second reference image 1122 of fig. 14A and the second verification image 1162 of fig. 14B, both of which may be associated with the second pose shown in fig. 11B. In this example, the reference image coordinates [u_ref_1 v_ref_1]^T_1130A, [u_ref_1 v_ref_1]^T_1130B, and [u_ref_1 v_ref_1]^T_1130C of fig. 13A may be a first set of reference image coordinates, and the verification image coordinates [u_verify_1 v_verify_1]^T_1130A, [u_verify_1 v_verify_1]^T_1130B, and [u_verify_1 v_verify_1]^T_1130C of fig. 13B may be a first set of verification image coordinates. The control circuit 111 in this example may also be configured to determine a second set of reference image coordinates [u_ref_2 v_ref_2]^T_1130A, [u_ref_2 v_ref_2]^T_1130B, and [u_ref_2 v_ref_2]^T_1130C for the second reference image 1122 of fig. 14A, and to determine a second set of verification image coordinates [u_verify_2 v_verify_2]^T_1130A, [u_verify_2 v_verify_2]^T_1130B, and [u_verify_2 v_verify_2]^T_1130C for the second verification image 1162 of fig. 14B. In the above example, the control circuitry 111 may in some cases be configured to output different respective motion commands for moving the robotic arm (e.g., 1153) to the plurality of respective poses with which the plurality of reference images and/or the plurality of verification images are associated.
Returning to fig. 12A-12B, method 1200 in an embodiment may include step 1211, where control circuit 111 determines a set of respective deviation parameter values based on respective amounts of deviation between the set of reference image coordinates (e.g., [u_ref_1 v_ref_1]^T_1130A, [u_ref_1 v_ref_1]^T_1130B, and [u_ref_1 v_ref_1]^T_1130C of fig. 13A) and the set of verification image coordinates (e.g., [u_verify_1 v_verify_1]^T_1130A, [u_verify_1 v_verify_1]^T_1130B, and [u_verify_1 v_verify_1]^T_1130C of fig. 13B). For example, the set of respective deviation parameter values in the context of figs. 13A and 13B may include a first deviation parameter value, a second deviation parameter value, and a third deviation parameter value. The first deviation parameter value may be based on an amount of deviation between the first reference image coordinates [u_ref_1 v_ref_1]^T_1130A of verification symbol 1130A in the reference image 1120 and the first verification image coordinates [u_verify_1 v_verify_1]^T_1130A of verification symbol 1130A in the verification image 1160. For example, the first deviation parameter value may be equal to, or more generally based on, the distance between the first reference image coordinates and the first verification image coordinates, as discussed above with respect to step 457 of fig. 4B. Similarly, the second deviation parameter value may be based on (e.g., equal to) an amount of deviation between the second reference image coordinates [u_ref_1 v_ref_1]^T_1130B of verification symbol 1130B in the reference image 1120 and the second verification image coordinates [u_verify_1 v_verify_1]^T_1130B of verification symbol 1130B in the verification image 1160. Further, the third deviation parameter value may be based on an amount of deviation between the third reference image coordinates [u_ref_1 v_ref_1]^T_1130C of verification symbol 1130C in the reference image 1120 and the third verification image coordinates [u_verify_1 v_verify_1]^T_1130C of verification symbol 1130C in the verification image 1160. In the above example, the set of respective deviation parameter values is associated with the set of verification symbols. That is, the first deviation parameter value is associated with verification symbol 1130A, the second deviation parameter value is associated with verification symbol 1130B, and the third deviation parameter value is associated with verification symbol 1130C.
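Step 1211 could be sketched as follows. This is a minimal illustration; treating each deviation parameter value as the pixel distance between reference and verification coordinates follows the "equal to the distance" example above, and the dict layout keyed by symbol id is an assumed convention:

```python
import math

# Minimal sketch of step 1211 (illustrative assumption about data
# layout): each deviation parameter value is the pixel distance between
# where a verification symbol appears in the reference image and where
# it appears in the verification image.

def deviation_parameter_values(reference_coords, verification_coords):
    """Both arguments map a symbol id (e.g., '1130A') to a (u, v)
    pixel coordinate; returns symbol id -> deviation in pixels."""
    values = {}
    for symbol, (u_ref, v_ref) in reference_coords.items():
        u_ver, v_ver = verification_coords[symbol]
        values[symbol] = math.hypot(u_ref - u_ver, v_ref - v_ver)
    return values
```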
In an embodiment, some or all of the set of respective deviation parameter values may be based on a single pair of a reference image and a verification image, both of which may be associated with a common pose of the robotic arm. For example, the first deviation parameter value discussed above may be associated with the verification symbol 1130A and may be based on the single pair of the reference image 1120 and the verification image 1160, both of which are associated with a common pose (such as the pose in fig. 11A).
In an embodiment, some or all of the set of respective deviation parameter values may be based on multiple pairs of respective reference images and respective verification images, wherein each pair is associated with a respective pose of a plurality of poses of the robotic arm. For example, the first deviation parameter value discussed above, which is associated with the verification symbol 1130A, may be based on the first pair of the reference image 1120 and the verification image 1160 (of figs. 13A and 13B) and the second pair of the reference image 1122 and the verification image 1162 (of figs. 14A and 14B). The first pair may be associated with the first pose of fig. 11A, while the second pair may be associated with the second pose of fig. 11B. More specifically, in this example, the first deviation parameter value may be based on an amount of deviation between [u_ref_1 v_ref_1]^T_1130A and [u_verify_1 v_verify_1]^T_1130A (which are associated with the first pose), and based on an amount of deviation between [u_ref_2 v_ref_2]^T_1130A and [u_verify_2 v_verify_2]^T_1130A (which are associated with the second pose). In one example, the first deviation parameter value may be equal to or based on an average of the two deviation amounts. More generally, in this example, the set of respective deviation parameter values may be based on respective amounts of deviation between the first set of reference image coordinates and the first set of verification image coordinates (which are associated with the first pose), and also based on respective amounts of deviation between the second set of reference image coordinates and the second set of verification image coordinates (which are associated with the second pose).
In some cases, the reference image coordinates in the first reference image and the reference image coordinates in the second reference image may be part of a set of reference image coordinates that indicate where a common verification symbol (e.g., 1130A) appears in a set of reference images, such as the two reference images 1120, 1122 discussed above. In this example, the set of reference images and the set of reference image coordinates may each correspond to a respective set of poses of the robotic arm (e.g., 1153), such as the two poses in figs. 11A and 11B. Similarly, the verification image coordinates in the first verification image and the verification image coordinates in the second verification image may be part of a set of verification image coordinates indicating where the verification symbol appears in a set of verification images (such as the verification images 1160, 1162 discussed above). The set of verification images and the set of verification image coordinates may also correspond to the set of poses. In such a case, the deviation parameter value associated with the verification symbol may be based on respective amounts of deviation between the set of reference image coordinates and the set of verification image coordinates.
Returning to fig. 12A-12B, in an embodiment method 1200 may include step 1213, wherein control circuit 111 determines whether at least one of the set of respective deviation parameter values exceeds a prescribed deviation threshold. In some cases, step 1213 may be similar to step 459 of fig. 4B. In an embodiment, step 1213 may involve determining whether each deviation parameter value of the set of respective deviation parameter values has exceeded a respective deviation threshold. For example, control circuitry 111 may determine whether the offset parameter value associated with verification symbol 1130A has exceeded a prescribed offset threshold, the offset parameter value associated with verification symbol 1130B has exceeded a prescribed offset threshold, and/or the offset parameter value associated with verification symbol 1130C has exceeded a prescribed offset threshold. In some cases, the respective deviation thresholds may have the same value, forming a common deviation threshold for the verification symbols, or may have different values.
In an embodiment, the method may include step 1215, wherein the control circuit 111 may, in response to determining that at least one deviation parameter value of the set of respective deviation parameter values exceeds a prescribed deviation threshold, at least one of: (a) outputting a notification that at least one of the set of respective deviation parameter values exceeds a prescribed deviation threshold, or (b) performing a calibration operation to determine updated calibration information (e.g., updated camera calibration information). For example, step 1215 can involve outputting the notification to a user interface device, such as an electronic display in communication with the robot controller 110. The electronic display may display, for example, an indication that the at least one deviation parameter value or the at least one deviation parameter value exceeds a prescribed deviation threshold. In an embodiment, performing the calibration operation in step 1215 may be similar to or the same as step 461 in FIG. 4B. In an embodiment, if the calibration information of step 1201 is determined by performing a first calibration operation, the calibration operation of step 1215 may be a second calibration operation after the first calibration operation.
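The threshold comparison of steps 1213 and 1215 might look like this sketch; the per-symbol threshold dict and the return convention are illustrative assumptions:

```python
# Hedged sketch of step 1213 (illustrative assumption): flag the
# verification symbols whose deviation parameter value exceeds its
# prescribed deviation threshold. A non-empty result would trigger a
# notification and/or a new calibration operation, as in step 1215.

def exceeded_symbols(deviation_values, thresholds):
    """deviation_values / thresholds: dicts mapping a symbol id to a
    pixel deviation and to its prescribed threshold, respectively."""
    return sorted(
        symbol for symbol, value in deviation_values.items()
        if value > thresholds[symbol]
    )
```

Using the same value for every entry of the thresholds dict corresponds to the common deviation threshold mentioned above.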
In an embodiment, at least one deviation parameter value exceeding a prescribed deviation threshold may indicate a change in the camera (e.g., 1170 of figs. 11A-11B) and/or a change in the environment of the camera or of the robot operating system (e.g., 1100). In some cases, the change in the camera may be an internal change, such as the lens or image sensor of the camera changing shape or size due to temperature changes or physical damage. In some cases, the change in the camera may include a change in the location where the camera (e.g., 1170) is mounted, such as due to vibration of the structure (e.g., ceiling) on which the camera is mounted. In some cases, the change in the environment of the camera or robot operating system may include a change in the position or orientation of the base (e.g., 1152) of the robot (e.g., 1150), such as due to vibration of the structure (e.g., floor) on which the robot is mounted. In some cases, the change in the environment of the camera or robot operating system may be a change in the relationship between arm portions of the robotic arm (e.g., between links 1154A-1154E of robotic arm 1153) or a change in the arm portions themselves. For example, one of the arm portions (e.g., link 1154D or manipulator 1155) may bend or otherwise deform or break due to an event unplanned by the computing system 110. The unplanned event may be a collision event or some other undesirable event that may result in possible changes to the robot (e.g., 1150) or to other elements of the robot operating system. The calibration verification discussed above may provide a fast and efficient technique for detecting changes in the camera (e.g., 1170) and/or changes in the environment of the camera or robot operating system 1100. For example, changes in the camera (e.g., 1170) and/or the robot (e.g., 1150) may be detected by comparing the reference image coordinate(s) and corresponding verification image coordinate(s) to determine differences therebetween.
In many cases, such comparisons may be made without placing significant demands on the computing resources of computing system 110. For example, the comparison may be accomplished with computations that only occupy a limited amount of processor execution time and/or a limited amount of memory. Thus, the comparison may facilitate accurate monitoring of the accuracy of the calibration information in a computationally efficient manner.
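The lightweight comparison described above can be illustrated with a short sketch. This is offered for illustration only: the use of Euclidean pixel distance as the deviation measure, the function name, and the threshold value of 2.5 pixels are hypothetical and not part of this disclosure.

```python
import math

def deviation_parameter_values(reference_coords, verification_coords):
    """Compute one deviation parameter value per verification symbol as the
    Euclidean pixel distance between that symbol's reference image coordinate
    and its verification image coordinate."""
    return [
        math.hypot(vx - rx, vy - ry)
        for (rx, ry), (vx, vy) in zip(reference_coords, verification_coords)
    ]

# Example: three verification symbols (e.g., 1130A-1130C)
ref = [(100.0, 200.0), (150.0, 240.0), (210.0, 260.0)]
ver = [(101.0, 200.0), (150.0, 243.0), (210.0, 260.0)]
values = deviation_parameter_values(ref, ver)
# Compare against a prescribed deviation threshold (2.5 px is an assumed value)
exceeds = [v > 2.5 for v in values]
```

Because the check reduces to a handful of subtractions and comparisons per symbol, it places almost no demand on processor time or memory, consistent with the discussion above.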
In an embodiment, at least one deviation parameter value exceeding a prescribed deviation threshold may indicate the presence of a calibration error, e.g., a camera calibration error where the camera calibration information from the first camera calibration is no longer sufficiently accurate. In an embodiment, the method 1200 may involve the control circuit 111 determining a type of calibration error (also referred to as a type of mis-alignment) that causes the at least one deviation parameter value to exceed a prescribed deviation threshold. The type of calibration error may indicate, for example, whether a loss of accuracy in the calibration information (e.g., camera calibration information) is caused by or otherwise represents a change in the robot or whether the loss of accuracy is caused by or otherwise represents a change in the camera. For example, as described above, changes to the camera (e.g., 1170) may include internal changes to the camera and/or changes to the location where the camera is mounted. The changes to the robot may include changes in the position or orientation of a base (e.g., 1152) of the robot (e.g., 1150), changes in the relationship between arm portions of a robotic arm (e.g., 1153), and/or changes in the arm portions themselves, as described above.
In an embodiment, the determination of the type of calibration error may be based on a comparison between the set of respective deviation parameter values, and more particularly on whether the set of respective deviation parameter values exceeds a prescribed deviation threshold in a substantially uniform manner. For example, if the set of deviation parameter values associated with different validation symbols (e.g., 1130A-1130C) all exceed the prescribed deviation threshold and do so in a substantially uniform manner, the control circuitry 111 may determine that the loss of accuracy is caused by the camera (e.g., 1170) or more specifically by a change in the camera. This is because the appearance of each verification symbol (e.g., 1130A-1130C) in the reference image or verification image depends on the internal properties and/or on the position of the camera (e.g., 1170). More specifically, the reference image coordinates and the verification image coordinates of the verification symbols (e.g., 1130A-1130C) in the reference image or verification image may both depend on the internal properties or location of the camera. Thus, changes in the internal properties or position of the camera (e.g., 1170) may often affect the respective deviation parameter values for all of the verification symbols (e.g., 1130A-1130C), and more particularly, may often increase all of the deviation parameter values in a substantially uniform manner. In contrast, if there is damage, faulty operation, or other change in a portion (e.g., arm portion) of a robot (e.g., 1150), it is unlikely that the other portions of the robot will all change in exactly the same manner (e.g., all sustain exactly the same damage or fault).
Thus, if a change occurs in a portion of the robot that increases at least one deviation parameter value above a prescribed deviation threshold, the other deviation parameter values in the set of deviation parameter values may still remain below the prescribed deviation threshold, or the set of deviation parameter values may all exceed the prescribed deviation threshold but do so in a non-uniform manner. Thus, in an embodiment, if at least one of the set of respective deviation parameter values associated with different verification symbols (e.g., 1130A-1130C) exceeds a prescribed deviation threshold, but the set of respective deviation parameter values do not all exceed the prescribed deviation threshold in a substantially uniform manner, the control circuitry 111 may determine that the loss of accuracy is caused by the robot (e.g., 1150), or more particularly by a change in the robot.
In an embodiment, the control circuit 111 may use a prescribed uniformity threshold to evaluate whether the set of deviation parameter values all exceed the prescribed deviation threshold in a substantially uniform manner. For example, the control circuit 111 may determine whether at least one of the deviation parameter values exceeds the prescribed deviation threshold, and also determine whether the differences among the deviation parameter values (or among their respective amounts that exceed the prescribed deviation threshold) are within the prescribed uniformity threshold. The uniformity threshold may be specified in a dynamic manner (e.g., based on current operating conditions of the robot operating system), or may be predetermined. As an example of using the prescribed uniformity threshold, if control circuit 111 determines that the respective deviation parameter values of verification symbols 1130A-1130C all exceed the prescribed deviation threshold, but that the deviation parameter value associated with verification symbol 1130C differs from the deviation parameter value associated with verification symbol 1130A by more than the prescribed uniformity threshold, and/or differs from the deviation parameter value associated with verification symbol 1130B by more than the prescribed uniformity threshold, control circuit 111 may determine that the loss of accuracy is caused by a change in robot 1150, such as at least a change in manipulator 1155 or other arm portion on which verification symbol 1130C is placed. In the above example, the control circuit 111 directly compares the deviation parameter values. In other examples, the control circuit 111 may compare the respective amounts by which the deviation parameter values exceed the prescribed deviation threshold, and determine whether those respective amounts differ by more than the prescribed uniformity threshold.
In another example, if the control circuit 111 determines that a first deviation parameter value of the set of deviation parameter values exceeds a prescribed deviation threshold, but one or more other deviation parameter values of the set do not exceed the prescribed deviation threshold, the control circuit 111 may determine that a calibration error (also referred to as mis-alignment) is due to a change in the robot (e.g., 1150), such as a change in the arm portion on which the validation symbol associated with the first deviation parameter value is disposed. In another example, if the control circuit 111 determines that the set of deviation parameter values all exceed the prescribed deviation threshold and that they differ from each other by no more than the prescribed uniformity threshold, the control circuit 111 may determine that the calibration error is caused by a change in the camera (e.g., 1170).
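The error-type determination in the two examples above can be summarized in a short sketch. This is an illustrative reading of the logic, not the claimed method itself; the function name, return values, and the use of the max-min spread as the uniformity measure are hypothetical choices.

```python
def classify_calibration_error(values, deviation_threshold, uniformity_threshold):
    """Classify the likely source of a calibration error from a set of
    deviation parameter values (one per verification symbol).

    Returns "camera" when all values exceed the deviation threshold in a
    substantially uniform manner, "robot" when the threshold is exceeded
    only by some values or exceeded non-uniformly, and None when no value
    exceeds the threshold (calibration still sufficiently accurate)."""
    if not any(v > deviation_threshold for v in values):
        return None
    if all(v > deviation_threshold for v in values):
        # Substantially uniform: the spread of the values stays within
        # the prescribed uniformity threshold.
        if max(values) - min(values) <= uniformity_threshold:
            return "camera"
    return "robot"
```

For instance, deviation values of roughly equal magnitude that all exceed the threshold point to a camera-side change, whereas a single outlier (e.g., the symbol on manipulator 1155) points to a robot-side change.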
In some cases, the control circuitry 111 of the robot controller 110 or other computing system may be in communication with a conveyor belt (such as conveyor belt 1173 of fig. 11A and 11B). In such a case, the control circuitry 111 may be configured to stop the conveyor belt 1173 in response to determining that at least one deviation parameter value exceeds a prescribed deviation threshold value. Stopping the conveyor belt 1173 may prevent the robotic arm 1153 from undesirably interacting with objects on the conveyor belt 1173 based on inaccurate calibration information.
In an embodiment, the control circuitry 111 may be configured to determine: if the calibration information is sufficiently accurate for a particular arm portion (e.g., link 1154C with verification symbol 1130A disposed thereon, link 1154D with verification symbol 1130B disposed thereon, or manipulator 1155 with verification symbol 1130C disposed thereon), the calibration information is also sufficiently accurate for one or more arm portions upstream of the particular arm portion. As described above, the plurality of arm portions may be arranged as a series of arm portions from the base of the robot to the end effector of the robotic arm. An arm portion may be upstream of another arm portion if it comes before that arm portion in the series of arm portions. For example, link 1154D in fig. 11A-11B may be upstream of link 1154E and manipulator 1155. In one example, if the control circuit 111 determines that the calibration information (e.g., camera calibration information) is sufficiently accurate for, e.g., the manipulator 1155, it may determine that the calibration information is sufficiently accurate for the upstream arm portions (such as links 1154E, 1154D, 1154C, 1154B, and 1154A). In this example, the control circuit 111 may determine that the calibration information is sufficiently accurate for an arm portion if the deviation parameter value associated with the validation symbol disposed on that arm portion is below the prescribed deviation threshold. In an embodiment, if the control circuit 111 determines that there is a calibration error for a particular arm portion such that the calibration information is not sufficiently accurate for that arm portion, the control circuit 111 may determine that there is a calibration error for some or all of the downstream arm portions.
For example, if the control circuit 111 determines that the calibration information is not sufficiently accurate for a particular arm portion (such as link 1154D), the control circuit 111 may determine that the calibration information is not sufficiently accurate for the downstream arm portions (such as link 1154E and manipulator 1155).
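The upstream/downstream propagation described above can be sketched as follows. This is an illustrative interpretation only; the function name, the dictionary representation, and the link names are hypothetical, and portions with no verification symbol and no implicated neighbor are left undetermined (None).

```python
def accuracy_by_arm_portion(arm_portions, symbol_deviations, threshold):
    """Propagate per-symbol verification results along a serial arm.

    arm_portions: ordered list of portion names from base to end effector.
    symbol_deviations: maps a portion name to the deviation parameter value
    of the verification symbol disposed on it (portions without a symbol
    are absent from the mapping).

    A symbol below the threshold marks its own portion and every upstream
    portion as accurate (unless already marked inaccurate); a symbol at or
    above the threshold marks its portion and all downstream portions as
    inaccurate."""
    accurate = {p: None for p in arm_portions}
    for i, portion in enumerate(arm_portions):
        if portion not in symbol_deviations:
            continue
        if symbol_deviations[portion] < threshold:
            for up in arm_portions[: i + 1]:
                if accurate[up] is None:
                    accurate[up] = True
        else:
            for down in arm_portions[i:]:
                accurate[down] = False
    return accurate
```

For example, a passing symbol on manipulator 1155 vouches for every upstream link, while a failing symbol on link 1154D implicates 1154E and 1155 as well.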
In an embodiment, one or more steps of the method 1200, such as steps 1203 to 1215, may be performed by the robot controller 110 or other computing system in response to a user command. For example, a user (e.g., a system operator) may manually trigger the calibration verification operation involving steps 1203 to 1215. In an embodiment, steps 1203 to 1215 may be performed during an idle period. The idle period may be, for example, a period during which no robotic operation is performed, such as picking objects from a conveyor belt or tray. In an embodiment, one or more steps of the method 1200, such as steps 1203-1215, may be performed by the robot controller 110 or other computing system in response to a specified trigger condition. As described above, the trigger condition may include, for example, an unplanned event, such as a collision involving a robot (e.g., 1150), an earthquake, or other natural disaster, which may cause changes to the robot (e.g., 1150) and/or camera (e.g., 1170). In some cases, the trigger condition may include a specific time period elapsing after an earlier calibration operation (such as the calibration operation used to determine the calibration information of step 1201). In such an example, the earlier calibration operation may be a first calibration operation, and the calibration operation of step 1215 may be a second calibration operation.
Additional discussion of various embodiments
Embodiment a1 relates to a robot control system comprising a communication interface configured to communicate with a robot having a base and a robotic arm with a validation symbol disposed thereon, and to communicate with a camera having a camera field of view. The robot control system also includes control circuitry configured to perform a first camera calibration (or, more generally, a calibration operation) to determine camera calibration information associated with the camera (or, more generally, calibration information associated with the robot control system). The control circuit is further configured to: a) controlling the robotic arm to move the validation symbol to a position within the field of view of the camera during or after the first camera calibration by outputting a first motion command to the robot via the communication interface, the position being a reference position of the one or more reference positions used for validation of the first camera calibration; b) receiving an image of the validation symbol from the camera via the communication interface, wherein the camera is configured to take an image of the validation symbol at the reference position, the image being a reference image for validation; c) determining reference image coordinates for validation, the reference image coordinates being coordinates where the validation symbol appears in the reference image; d) controlling motion of the robotic arm based on the camera calibration information to perform robot operation by outputting a second motion command based on the camera calibration information to the robot via the communication interface; e) detecting idle periods during operation of the robot; f) controlling the robotic arm to move the verification symbol to at least the reference position in an idle period by outputting a third motion command to the robot via the communication interface; g) during an idle period, receiving an additional image of the verification symbol from the camera via the communication interface, wherein the camera is configured to capture the additional image of the verification symbol at least at the reference position, the additional image being a verification image for verification; h) determining verification image coordinates for verification, the verification image coordinates being coordinates where the verification symbol appears in the verification image; i) determining a deviation parameter value based on an amount of deviation between the reference image coordinates and the verification image coordinates, both of which are associated with the reference position, wherein the deviation parameter value is indicative of a change in the camera since the first camera calibration or a change in the relationship between the camera and the robot since the first camera calibration; j) determining whether the deviation parameter value exceeds a prescribed threshold; and k) in response to a determination that the deviation parameter value exceeds the prescribed threshold, performing a second camera calibration to determine updated camera calibration information (or, more generally, performing a second calibration operation to determine updated calibration information).
Embodiment a2 includes the robot control system of embodiment a1, wherein the control circuitry is configured to control the robot to continue robot operation after the idle period without additional camera calibration by outputting a fourth motion command to the robot via the communication interface in response to a determination that the deviation parameter value does not exceed the prescribed threshold.
Embodiment A3 includes the robot control system of embodiment a1 or a2, wherein the one or more reference positions are a plurality of reference positions that respectively correspond to a plurality of reference image coordinates, the reference image coordinates being one of the plurality of reference image coordinates. In this embodiment, the control circuit is further configured to determine a plurality of verification image coordinates corresponding to the plurality of reference positions, respectively, wherein the verification image coordinates are one of the plurality of verification image coordinates, and wherein the deviation parameter value is based on respective deviation amounts between the plurality of reference image coordinates and the plurality of verification image coordinates, each of the respective deviation amounts being an amount of deviation between: (a) reference image coordinates corresponding to a respective one of the plurality of reference positions, and (b) verification image coordinates corresponding to the same reference position.
Embodiment a4 includes the robot control system of embodiment A3, wherein the plurality of verification image coordinates are respective coordinates at which a verification symbol appears in the plurality of verification images, the verification image being one of the plurality of verification images, wherein the control circuitry is configured to control the camera to capture all of the plurality of verification images during the idle period.
Embodiment a5 includes the robot control system of embodiment A3, wherein the plurality of verification image coordinates are respective coordinates at which a verification symbol appears in the plurality of verification images, the verification image being one of a plurality of verification images, wherein the control circuitry is configured to control the camera to take the plurality of verification images in different idle periods, the idle periods being one of the different idle periods.
Embodiment a6 includes the robot control system of any one of embodiments a1-a5, wherein the validation symbol includes a first region having a first color and a second region having a second color, wherein a ratio of an area of the first region to an area of the second region is defined as a prescribed ratio and stored on a storage device of the robot control system.
Embodiment a7 includes the robot control system of embodiment a6, wherein the control circuitry is configured to identify the validation symbol in the reference image or the validation image based on a specified ratio.
Embodiment A8 includes the robot control system of embodiment a7, wherein the robotic arm is provided with a calibration pattern, wherein the reference image includes a validation symbol and the calibration pattern, wherein the control circuitry is configured to determine whether a portion of the reference image is the validation symbol or the calibration pattern by determining whether the portion includes a first image region having a first color and a second image region having a second color, and whether a ratio of an area of the first image region to an area of the second image region is equal to a prescribed ratio.
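The area-ratio test of embodiments A6-A8 can be sketched as follows. This is offered purely as an illustration of the described check; the function name, the tolerance parameter, and the example areas are hypothetical and not part of the claimed subject matter.

```python
def is_validation_symbol(first_region_area, second_region_area,
                         prescribed_ratio, tolerance=0.05):
    """Decide whether a candidate image portion is the validation symbol by
    checking that the ratio of its first-color region area to its
    second-color region area matches the prescribed ratio stored at
    calibration time (within a small relative tolerance).  A portion whose
    ratio does not match (e.g., the calibration pattern) is rejected."""
    if second_region_area == 0:
        return False
    ratio = first_region_area / second_region_area
    return abs(ratio - prescribed_ratio) <= tolerance * prescribed_ratio
```

In practice the two region areas would come from color segmentation of the candidate image portion; the ratio check then distinguishes the validation symbol from the calibration pattern, as described in embodiment A8.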
Embodiment a9 includes the robot control system of any one of embodiments a1-A8, wherein the validation symbol includes a first shape and a second shape concentric with one another, wherein respective centers of the first shape and the second shape are at substantially the same location.
Embodiment a10 includes the robot control system of embodiment a9, wherein the control circuitry is configured to determine the reference image coordinates by: a) determining first coordinates of a center of a first shape in a reference image; b) determining second coordinates of a center of a second shape in the reference image; and c) determining the reference image coordinates as an average of the first coordinates and the second coordinates in the reference image. In this embodiment, the control circuit is configured to determine the verification image coordinates by: d) determining first coordinates of a center of a first shape in a verification image; e) determining second coordinates of a center of a second shape in the verification image; and f) determining the verification image coordinates as an average of the first coordinates and the second coordinates in the verification image.
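The center-averaging of embodiment A10 can be sketched in a few lines. This is an illustration only; the function name is hypothetical, and averaging the two detected centers is one way to reduce per-shape detection noise for a symbol made of two concentric shapes.

```python
def symbol_image_coordinate(center_first_shape, center_second_shape):
    """Estimate the image coordinate of a validation symbol composed of two
    concentric shapes as the average of the two detected shape centers."""
    (x1, y1), (x2, y2) = center_first_shape, center_second_shape
    return ((x1 + x2) / 2.0, (y1 + y2) / 2.0)
```

The same routine would be applied to the reference image to obtain the reference image coordinates and to the verification image to obtain the verification image coordinates.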
Embodiment a11 includes the robot control system of any one of embodiments a1-a10, wherein the control circuitry is configured to identify the verification symbol in the reference image or the verification image by identifying a circular ring, the verification symbol being shaped as a circular ring.
Embodiment a12 includes the robot control system of any one of embodiments a1-a11, wherein the control circuitry is configured to measure a temperature of an environment in which the robot is located, and to adjust at least one of the prescribed threshold or the camera calibration information based on the measured temperature.
Embodiment a13 includes the robot control system of embodiment a12, wherein the control circuitry is configured to adjust the prescribed threshold based on the temperature by: setting the prescribed threshold to have a first value when the temperature is outside a prescribed range; and setting the prescribed threshold to have a second value lower than the first value when the temperature is within the prescribed range.
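The temperature-dependent threshold selection of embodiments A12-A13 can be sketched as follows. The function name, the nominal range of 15-30 °C, and the two threshold values are hypothetical assumptions used only for illustration.

```python
def prescribed_threshold(temperature_c, nominal_range=(15.0, 30.0),
                         tight_value=2.5, loose_value=5.0):
    """Select the prescribed deviation threshold based on ambient
    temperature: a second, lower (stricter) value when the temperature is
    within the prescribed range, and a first, higher (more permissive)
    value outside it, since thermal effects alone can shift image
    coordinates."""
    low, high = nominal_range
    if low <= temperature_c <= high:
        return tight_value   # second value: lower, used inside the range
    return loose_value       # first value: used outside the prescribed range
```

Using a looser threshold outside the nominal range avoids flagging purely thermal drift as a calibration error.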
Embodiment a14 includes the robot control system of any one of embodiments a1-a13, wherein the one or more reference positions to which the control circuitry is configured to cause the verification symbol to be moved by the robotic arm includes a plurality of reference positions disposed on a surface of a sphere that is recessed relative to the camera.
Embodiment a15 includes the robot control system of embodiment a14, wherein the control circuitry is further configured to control the robotic arm to move the validation symbol tangent to the surface of the sphere at each of the plurality of reference positions.
Embodiment a16 includes the robot control system of any one of embodiments a1-a15, wherein the control circuitry is configured to control the robotic arm to move the verification symbol to face directly toward the camera when the verification symbol is moved to the reference position.
Embodiment a17 includes the robot control system of any one of embodiments a1-a16, wherein the control circuitry is configured to detect an idle period of robot operation by detecting a time period during which the robot is not performing robot work during the robot operation.
Embodiment a18 includes the robot control system of embodiment a17, wherein the control circuitry is configured to control the robotic arm to interact with objects on a conveyor belt that are accessible to the robotic arm, wherein the control circuitry is configured to detect the idle period by detecting an absence of objects on the conveyor belt, or by detecting that a distance between the robot and a nearest object on the conveyor belt exceeds a prescribed distance threshold.
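The idle-period detection of embodiment A18 can be sketched as follows. This is an illustration under simplifying assumptions: object and robot positions are reduced to scalar positions along the conveyor, and the function name is hypothetical.

```python
def conveyor_idle(object_positions, robot_position, distance_threshold):
    """Detect an idle period suitable for calibration verification: idle when
    no object is present on the conveyor belt, or when the nearest object is
    farther from the robot than a prescribed distance threshold."""
    if not object_positions:
        return True
    nearest = min(abs(p - robot_position) for p in object_positions)
    return nearest > distance_threshold
```

During such an idle period the control circuitry could safely output the motion commands that move the verification symbol back to the reference position(s).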
Embodiment B1 relates to a computing system comprising a communication interface and control circuitry, the communication interface configured to communicate with: (i) a camera having a camera field of view; (ii) a robot having a robotic arm with a plurality of arm portions movably attached to one another and with a set of validation symbols disposed on respective ones of the plurality of arm portions. When the robotic arm is within a camera field of view, the control circuitry is configured to perform a method comprising: outputting a motion command for controlling motion of the robotic arm for robotic operation, wherein the motion command is based on calibration information; determining a set of reference image coordinates, the set of reference image coordinates being respective coordinates where the set of validation symbols appear in a reference image, wherein the reference image is an image representing the set of validation symbols and is generated by the camera when the robotic arm is in a first pose during a first time period; outputting, during a second time period after the first time period, an additional motion command for controlling the robotic arm to move to the first pose; receiving a verification image, the verification image being an additional image representing the set of verification symbols and being generated by the camera when the robotic arm has been moved to the first pose as a result of an additional motion command; determining a set of validation image coordinates, the set of validation image coordinates being respective coordinates at which the set of validation symbols appear in the validation image; determining a set of respective deviation parameter values based on respective amounts of deviation between the set of reference image coordinates and the set of verification image coordinates, wherein the set of respective deviation parameter values is associated with the set of verification symbols; determining whether at least one 
deviation parameter value of the set of respective deviation parameter values exceeds a prescribed deviation threshold; and in response to determining that at least one deviation parameter value of the set of respective deviation parameter values exceeds a prescribed deviation threshold, at least one of: outputting a notification that at least one of the set of respective deviation parameter values exceeds a prescribed deviation threshold, or performing a calibration operation to determine updated calibration information. The control circuitry may perform the method by, for example, executing instructions on a non-transitory computer readable medium.
Embodiment B2 includes the computing system of embodiment B1, wherein the first gesture is associated with a first additional motion command output during a first time period, the first additional motion command having one or more actuation parameter values for controlling movement of the robotic arm to the first gesture, wherein the reference image is generated by the camera when the robotic arm is in the first gesture due to the first additional motion command. Furthermore, in this embodiment, the additional motion command output during the second time period is a second additional motion command and further comprises the one or more actuation parameter values.
Embodiment B3 includes the computing system of embodiment B2, wherein each validation symbol in the set of validation symbols has a circular shape, and wherein the one or more actuation parameter values of the first additional motion command and the second additional motion command cause the set of validation symbols to be positioned in a manner such that the set of validation symbols exhibits no eccentricity, or a corresponding amount of eccentricity less than a prescribed eccentricity threshold, in the reference image and in the validation image.
Embodiment B4 includes the computing system of any one of embodiments B1-B3, wherein the control circuitry is configured to determine one or more actuation parameter values for an additional motion command that causes each verification symbol of the set of verification symbols to be moved to face directly toward the camera.
Embodiment B5 includes the computing system of embodiment B4, wherein the one or more actuation parameter values cause the set of verification symbols to be tangent to one or more virtual spheres that are concave relative to the camera.
Embodiment B6 includes the computing system of any one of embodiments B1-B5, wherein, when at least one verification symbol of the set of verification symbols is in the shape of a circle, the control circuitry is configured to identify the at least one verification symbol in the reference image and the verification image by identifying the circle.
Embodiment B7 includes the computing system of embodiment B6, wherein, when the set of validation symbols are in the shape of respective circles of respective different sizes, the control circuitry is configured to identify the at least one validation symbol based on the size of the respective circle forming the at least one validation symbol.
Embodiment B8 includes the computing system of any one of embodiments B1-B7, wherein the control circuitry is configured to identify at least one validation symbol of the set of validation symbols in the reference image based on a prescribed model describing the geometry of the robotic arm.
Embodiment B9 includes the computing system of embodiment B8, wherein the control circuitry is configured to determine, based on the model, an area within the reference image where the at least one verification symbol is expected to occur, and to search for the at least one verification symbol within the area of the reference image.
Embodiment B10 includes the computing system of any one of embodiments B1-B9, wherein the control circuitry is configured to determine a type of calibration error that causes the at least one deviation parameter value to exceed a prescribed deviation threshold based on a comparison between the set of respective deviation parameter values, wherein the type of calibration error indicates whether a loss of precision of the calibration information represents a change in the robot or whether the loss of precision represents a change in the camera.
Embodiment B11 includes the computing system of embodiment B10, wherein the control circuitry is configured to determine whether the set of respective deviation parameter values all exceed a prescribed deviation threshold and whether the set of respective deviation parameter values differ from each other by more than a prescribed uniformity threshold. The control circuitry in this embodiment is further configured to determine that the type of the calibration error is a calibration error representative of a change in the camera in response to determining that the set of respective deviation parameter values all exceed a prescribed deviation threshold and differ from each other by no more than a prescribed uniformity threshold.
Embodiment B12 includes the computing system of embodiment B11, wherein the control circuitry is further configured to determine that the type of calibration error is a calibration error that represents a change in the robot in response to determining that one or more of the set of respective deviation parameter values do not exceed the prescribed deviation threshold, or that the set of respective deviation parameter values differ from each other by more than the prescribed uniformity threshold.
Embodiment B13 includes the computing system of any one of embodiments B1-B12. In this embodiment, the calibration information is associated with a first calibration operation, and the calibration operation used to generate the updated calibration information is a second calibration operation subsequent to the first calibration operation, wherein the control circuitry is configured to output the additional motion command and to output a camera command for receiving the verification image in response to a specified trigger condition. The specified trigger condition comprises at least one of: a prescribed period of time elapsing since the first calibration operation, or an event unplanned by the computing system that causes a change in the robot or the camera.
Embodiment B14 includes the computing system of any one of embodiments B1-B13 wherein, when the plurality of robotic arm portions are arranged as a series of arm portions from a base of the robot to the robot end effector, the control circuitry is configured to: determining whether a deviation parameter value of a first verification symbol of the set of verification symbols exceeds a prescribed deviation threshold; identifying a first arm portion from the plurality of arm portions having a first authentication symbol disposed thereon; and responsive to determining that the deviation parameter value of the first verification symbol does not exceed the prescribed deviation threshold, determining that the calibration information is accurate for the first arm portion and at least one additional arm portion of the series of arm portions preceding the first arm portion.
Embodiment B15 includes the computing system of any one of embodiments B1-B14, wherein when the computing system is in communication with a conveyor belt for robotic operation, the control circuitry is configured to stop the conveyor belt in response to determining that at least one deviation parameter value exceeds a prescribed deviation threshold.
Embodiment B16 includes the computing system of any one of embodiments B1-B15, wherein the reference image is a first reference image associated with a first pose of the robotic arm, wherein the verification image is a first verification image associated with the first pose, wherein the set of reference image coordinates is a first set of reference image coordinates associated with the first pose, and wherein the set of verification image coordinates is a first set of verification image coordinates associated with the first pose. In this embodiment, the control circuit is configured to: determine a second set of reference image coordinates, the second set of reference image coordinates being respective coordinates where the set of validation symbols appear in a second reference image, wherein the second reference image is generated by the camera when the robotic arm is in a second pose; output, after the second reference image is generated, a further motion command for controlling the robotic arm to move to the second pose; receive a second verification image, the second verification image also representing the set of verification symbols and being generated by the camera when the robotic arm has been moved to the second pose as a result of the further motion command; and determine a second set of verification image coordinates, the second set of verification image coordinates being respective coordinates where the set of verification symbols appears in the second verification image. In this embodiment, the set of deviation parameter values is further based on an amount of deviation between the second set of reference image coordinates and the second set of verification image coordinates.
While various embodiments have been described above, it should be understood that they have been presented by way of illustration and example only, and not limitation. It will be apparent to persons skilled in the relevant art that various changes in form and detail can be made therein without departing from the spirit and scope of the invention. The breadth and scope of the present invention should not be limited by any of the above-described exemplary embodiments, but should be defined only in accordance with the following claims appended hereto and their equivalents. It is also to be understood that various features of the various embodiments discussed herein, as well as various references cited herein, may be used in combination with features of any other embodiment. All patents and publications discussed herein are incorporated by reference in their entirety.

Claims (20)

1. A computing system, comprising:
a communication interface configured to communicate with: (i) a camera having a camera field of view; and (ii) a robot having a robotic arm with a plurality of arm portions movably attached to one another and with a set of validation symbols disposed on respective ones of the plurality of arm portions; and
control circuitry configured to, when the robotic arm is within the camera field of view:
outputting a motion command for controlling motion of the robotic arm for robotic operation, wherein the motion command is based on calibration information;
determining a set of reference image coordinates, the set of reference image coordinates being respective coordinates where the set of validation symbols appear in a reference image, wherein the reference image is an image representing the set of validation symbols and is generated by the camera when the robotic arm is in a first pose during a first time period;
outputting, during a second time period after the first time period, an additional motion command for controlling the robotic arm to move to the first pose;
receiving a verification image, the verification image being an additional image representing the set of verification symbols and being generated by the camera when the robotic arm has been moved to a first pose as a result of the additional motion command;
determining a set of verification image coordinates, the set of verification image coordinates being respective coordinates at which the set of verification symbols appear in the verification image;
determining a set of respective deviation parameter values based on respective amounts of deviation between the set of reference image coordinates and the set of verification image coordinates, wherein the set of respective deviation parameter values is associated with the set of verification symbols,
determining whether at least one deviation parameter value of the set of respective deviation parameter values exceeds a prescribed deviation threshold, and
determining, based on a comparison among the set of respective deviation parameter values and in response to determining that the at least one deviation parameter value exceeds the prescribed deviation threshold, a type of calibration error that causes the at least one deviation parameter value of the set of respective deviation parameter values to exceed the prescribed deviation threshold, wherein the type of calibration error indicates whether a loss of precision of the calibration information represents a change in the robot or whether the loss of precision represents a change in the camera.
2. The computing system of claim 1, wherein the control circuitry is further configured to: in response to determining that at least one deviation parameter value of the set of respective deviation parameter values exceeds a prescribed deviation threshold, at least one of: outputting a notification that at least one of the set of respective deviation parameter values exceeds a prescribed deviation threshold, or performing a calibration operation to determine updated calibration information.
3. The computing system of claim 1, wherein the first pose is associated with a first additional motion command output during the first time period, the first additional motion command having one or more actuation parameter values for controlling movement of the robotic arm to the first pose, wherein the reference image is generated by the camera when the robotic arm is in the first pose as a result of the first additional motion command, and
wherein the additional motion command output during the second time period is a second additional motion command and also includes the one or more actuation parameter values.
4. The computing system of claim 3, wherein each validation symbol of the set of validation symbols has a circular shape, and wherein the one or more actuation parameter values of the first and second additional motion commands cause the set of validation symbols to be positioned such that the set of validation symbols exhibits no eccentricity, or a corresponding amount of eccentricity less than a prescribed eccentricity threshold, in the reference image and in the verification image.
5. The computing system of claim 1, wherein the control circuitry is configured to determine one or more actuation parameter values for the additional motion command that causes each validation symbol of the set of validation symbols to be moved to directly face the camera.
6. The computing system of claim 5, wherein the one or more actuation parameter values cause the set of verification symbols to be tangent to one or more virtual spheres that are concave relative to the camera.
7. The computing system of claim 1, wherein, when at least one validation symbol of the set of validation symbols has the shape of a circle, the control circuitry is configured to identify the at least one validation symbol in the reference image and in the verification image by identifying the circle.
8. The computing system of claim 7, wherein, when the set of validation symbols are shaped as respective circles of different sizes, the control circuitry is configured to identify the at least one validation symbol based on the size of the respective circle forming the at least one validation symbol.
9. The computing system of claim 1, wherein the control circuitry is configured to identify at least one validation symbol of the set of validation symbols in the reference image based on a prescribed model describing a geometry of the robotic arm.
10. The computing system of claim 9, wherein the control circuitry is configured to:
determine, based on the model, a region within the reference image where the at least one validation symbol is expected to appear, and
search for the at least one validation symbol within the region of the reference image.
11. The computing system of claim 1, wherein the control circuitry is configured to determine whether the set of respective deviation parameter values all exceed a prescribed deviation threshold and whether the set of respective deviation parameter values differ from each other by more than a prescribed uniformity threshold, and
determining that the type of calibration error is a calibration error representative of a change in the camera in response to determining that the set of respective deviation parameter values all exceed a prescribed deviation threshold and differ from each other by more than a prescribed uniformity threshold.
12. The computing system of claim 11, wherein the control circuitry is further configured to determine that the type of the calibration error is a calibration error representative of a change in the robot in response to determining that one or more of the set of respective deviation parameter values do not exceed the prescribed deviation threshold, or that the set of respective deviation parameter values differ from each other by no more than the prescribed uniformity threshold.
13. The computing system of claim 2, wherein the calibration information is associated with a first calibration operation and the calibration operation to generate updated calibration information is a second calibration operation subsequent to the first calibration operation, wherein the control circuitry is configured to output the additional motion command and a camera command for generating the verification image in response to a specified trigger condition, wherein the specified trigger condition comprises at least one of: a prescribed period of time having elapsed since the first calibration operation, or an event not planned by the computing system that causes a change to the robot or the camera.
14. The computing system of claim 1, wherein, when the computing system is in communication with a conveyor belt for the robotic operation, the control circuitry is configured to stop the conveyor belt in response to determining that the at least one deviation parameter value exceeds a prescribed deviation threshold.
15. The computing system of claim 1, wherein the reference image is a first reference image associated with a first pose of the robotic arm, wherein the verification image is a first verification image associated with the first pose, wherein the set of reference image coordinates is a first set of reference image coordinates associated with the first pose, and the set of verification image coordinates is a first set of verification image coordinates associated with the first pose, wherein the control circuitry is configured to:
determining a second set of reference image coordinates, the second set of reference image coordinates being respective coordinates where the set of validation symbols appear in a second reference image, wherein the second reference image is generated by the camera when the robotic arm is in a second pose;
after generating the second reference image, outputting a further motion command for controlling the robotic arm to move the robotic arm to a second pose;
receiving a second verification image, the second verification image also representing the set of verification symbols and being generated by the camera when the robotic arm has been moved to a second pose as a result of the further motion command;
determining a second set of verification image coordinates, the second set of verification image coordinates being respective coordinates where the set of verification symbols appear in the second verification image,
wherein the set of respective deviation parameter values is further based on respective amounts of deviation between the second set of reference image coordinates and the second set of verification image coordinates.
16. A computing system, comprising:
a communication interface configured to communicate with: (i) a camera having a camera field of view; and (ii) a robot having a robotic arm with a plurality of arm portions movably attached to one another and arranged in a series of arm portions from a base to a robotic end effector, and with a set of validation symbols arranged on respective ones of the plurality of arm portions; and
control circuitry configured to, when the robotic arm is within the camera field of view:
outputting a motion command for controlling motion of the robotic arm for robotic operation, wherein the motion command is based on calibration information;
determining a set of reference image coordinates, the set of reference image coordinates being respective coordinates where the set of validation symbols appear in a reference image, wherein the reference image is an image representing the set of validation symbols and is generated by the camera when the robotic arm is in a first pose during a first time period;
outputting, during a second time period after the first time period, an additional motion command for controlling the robotic arm to move to the first pose;
receiving a verification image, the verification image being an additional image representing the set of verification symbols and being generated by the camera when the robotic arm has been moved to a first pose as a result of the additional motion command;
determining a set of verification image coordinates, the set of verification image coordinates being respective coordinates at which the set of verification symbols appear in the verification image;
determining a set of respective deviation parameter values based on respective amounts of deviation between the set of reference image coordinates and the set of verification image coordinates, wherein the set of respective deviation parameter values is associated with the set of verification symbols;
determining whether a deviation parameter value of a first verification symbol of the set of verification symbols exceeds a prescribed deviation threshold;
identifying a first arm portion from the plurality of arm portions having the first verification symbol disposed thereon; and
in response to determining that the deviation parameter value of the first verification symbol does not exceed the prescribed deviation threshold, determining that the calibration information is accurate for the first arm portion and at least one upstream arm portion, the at least one upstream arm portion being an additional arm portion of the series of arm portions preceding the first arm portion.
17. A non-transitory computer readable medium having stored thereon instructions that, when executed by control circuitry of a computing system, cause the control circuitry to:
outputting a motion command based on the calibration information, wherein the computing system is configured to communicate with: (i) a camera having a camera field of view; and (ii) a robot having a robotic arm comprising a plurality of arm portions movably attached to each other and comprising a set of validation symbols arranged on respective ones of the plurality of arm portions, and wherein the motion commands are for controlling motion of the robotic arm to perform robotic operations;
determining a set of reference image coordinates that are respective coordinates where the set of validation symbols appear in a reference image, wherein the reference image is an image representing the set of validation symbols and is generated by the camera when the robotic arm is in a first pose during a first time period;
outputting, during a second time period after the first time period, an additional motion command for controlling the robotic arm to move to the first pose;
receiving a verification image, the verification image being an additional image representing the set of verification symbols and being generated by the camera when the robotic arm has been moved to a first pose as a result of the additional motion command;
determining a set of verification image coordinates, the set of verification image coordinates being respective coordinates at which the set of verification symbols appear in the verification image;
determining a set of respective deviation parameter values based on respective amounts of deviation between the set of reference image coordinates and the set of verification image coordinates, wherein the set of respective deviation parameter values is associated with the set of verification symbols;
determining whether at least one deviation parameter value of the set of respective deviation parameter values exceeds a prescribed deviation threshold; and
determining, based on a comparison among the set of respective deviation parameter values and in response to determining that at least one deviation parameter value of the set of respective deviation parameter values exceeds a prescribed deviation threshold, a type of calibration error that causes the at least one deviation parameter value to exceed the prescribed deviation threshold, wherein the type of calibration error indicates whether a loss of precision of the calibration information represents a change in the robot or whether the loss of precision represents a change in the camera.
18. A method performed by a computing system, comprising:
outputting, by the computing system, a motion command based on the calibration information, wherein the computing system is configured to communicate with: (i) a camera having a camera field of view; and (ii) a robot having a robotic arm with a plurality of arm portions movably attached to each other and with a set of validation symbols disposed on respective ones of the plurality of arm portions, and wherein the motion commands are for controlling motion of the robotic arm to perform robotic operations;
determining, by the computing system, a set of reference image coordinates that are respective coordinates where the set of validation symbols appear in a reference image, wherein the reference image is an image representing the set of validation symbols and is generated by the camera when the robotic arm is in a first pose during a first time period;
outputting, by the computing system, an additional motion command to control the robotic arm to move to the first pose during a second time period after the first time period;
receiving, by the computing system, a verification image, wherein the verification image is an additional image representing the set of verification symbols and is generated by the camera when the robotic arm has been moved to a first pose as a result of the additional motion command;
determining, by the computing system, a set of verification image coordinates that are respective coordinates where the set of verification symbols appear in the verification image;
determining, by the computing system, a set of respective deviation parameter values based on respective amounts of deviation between the set of reference image coordinates and the set of verification image coordinates, wherein the set of respective deviation parameter values is associated with the set of verification symbols;
determining, by the computing system, that at least one deviation parameter value in the set of respective deviation parameter values exceeds a prescribed deviation threshold; and
determining, based on a comparison among the set of respective deviation parameter values and in response to determining that the at least one deviation parameter value exceeds the prescribed deviation threshold, a type of calibration error that causes the at least one deviation parameter value of the set of respective deviation parameter values to exceed the prescribed deviation threshold, wherein the type of calibration error indicates whether a loss of precision of the calibration information represents a change in the robot or whether the loss of precision represents a change in the camera.
19. A non-transitory computer readable medium having stored thereon instructions that, when executed by control circuitry of a computing system, cause the control circuitry to:
outputting a motion command based on the calibration information, wherein the computing system is configured to communicate with: (i) a camera having a camera field of view; and (ii) a robot having a robotic arm with a plurality of arm portions movably attached to one another and arranged in a series of arm portions from a base to a robot end effector, and with a set of validation symbols arranged on respective ones of the plurality of arm portions, and wherein the motion commands are for controlling motion of the robotic arm to perform robotic operations;
determining a set of reference image coordinates that are respective coordinates where the set of validation symbols appear in a reference image, wherein the reference image is an image representing the set of validation symbols and is generated by the camera when the robotic arm is in a first pose during a first time period;
outputting, during a second time period after the first time period, an additional motion command for controlling the robotic arm to move to the first pose;
receiving a verification image, the verification image being an additional image representing the set of verification symbols and being generated by the camera when the robotic arm has been moved to a first pose as a result of the additional motion command;
determining a set of verification image coordinates, the set of verification image coordinates being respective coordinates at which the set of verification symbols appear in the verification image;
determining a set of respective deviation parameter values based on respective amounts of deviation between the set of reference image coordinates and the set of verification image coordinates, wherein the set of respective deviation parameter values is associated with the set of verification symbols;
determining whether a deviation parameter value of a first verification symbol of the set of verification symbols exceeds a prescribed deviation threshold;
identifying a first arm portion from the plurality of arm portions having the first verification symbol disposed thereon; and
in response to determining that the deviation parameter value of the first verification symbol does not exceed the prescribed deviation threshold, determining that the calibration information is accurate for the first arm portion and at least one upstream arm portion, the at least one upstream arm portion being an additional arm portion of the series of arm portions preceding the first arm portion.
20. A method performed by a computing system, comprising:
outputting, by the computing system, a motion command based on the calibration information, wherein the computing system is configured to communicate with: (i) a camera having a camera field of view; and (ii) a robot having a robotic arm with a plurality of arm portions movably attached to one another and arranged in a series of arm portions from a base to a robot end effector, and with a set of validation symbols arranged on respective ones of the plurality of arm portions, and wherein the motion commands are for controlling motion of the robotic arm to perform robotic operations;
determining, by the computing system, a set of reference image coordinates that are respective coordinates where the set of validation symbols appear in a reference image, wherein the reference image is an image representing the set of validation symbols and is generated by the camera when the robotic arm is in a first pose during a first time period;
outputting, during a second time period after the first time period, an additional motion command for controlling the robotic arm to move to the first pose;
receiving a verification image, wherein the verification image is an additional image representing the set of verification symbols and is generated by the camera when the robotic arm has been moved to a first pose as a result of the additional motion command;
determining a set of verification image coordinates, the set of verification image coordinates being respective coordinates at which the set of verification symbols appear in the verification image;
determining a set of respective deviation parameter values based on respective amounts of deviation between the set of reference image coordinates and the set of verification image coordinates, wherein the set of respective deviation parameter values is associated with the set of verification symbols;
determining whether a deviation parameter value of a first verification symbol of the set of verification symbols exceeds a prescribed deviation threshold;
identifying a first arm portion from the plurality of arm portions having the first verification symbol disposed thereon; and
in response to determining that the deviation parameter value of the first verification symbol does not exceed the prescribed deviation threshold, determining that the calibration information is accurate for the first arm portion and at least one upstream arm portion, the at least one upstream arm portion being an additional arm portion of the series of arm portions preceding the first arm portion.
CN202010831931.1A 2019-03-29 2020-07-07 Method for verifying and updating calibration information for robot control and control system Active CN111890371B (en)

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
US16/369,630 US10399227B1 (en) 2019-03-29 2019-03-29 Method and control system for verifying and updating camera calibration for robot control
US201962916798P 2019-10-18 2019-10-18
US62/916,798 2019-10-18
US16/864,071 US10906184B2 (en) 2019-03-29 2020-04-30 Method and control system for verifying and updating camera calibration for robot control
US16/864,071 2020-04-30
CN202010646706.0A CN112677146A (en) 2019-10-18 2020-07-07 Method for verifying and updating calibration information for robot control and control system

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN202010646706.0A Division CN112677146A (en) 2019-03-29 2020-07-07 Method for verifying and updating calibration information for robot control and control system

Publications (2)

Publication Number Publication Date
CN111890371A CN111890371A (en) 2020-11-06
CN111890371B true CN111890371B (en) 2021-05-04

Family

ID=73264228

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010831931.1A Active CN111890371B (en) 2019-03-29 2020-07-07 Method for verifying and updating calibration information for robot control and control system

Country Status (1)

Country Link
CN (1) CN111890371B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113043282B (en) * 2019-12-12 2022-03-29 牧今科技 Method and system for object detection or robot interactive planning
CN116038707B (en) * 2023-01-30 2023-08-04 深圳技术大学 Intelligent fault automatic diagnosis system based on data driving

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS63163907A (en) * 1986-12-26 1988-07-07 Toyota Motor Corp Method for matching coordinate in intelligent robot
JPH0460817A (en) * 1990-06-29 1992-02-26 Fanuc Ltd Detection of camera positional deviation
CN103538061A (en) * 2012-07-11 2014-01-29 精工爱普生株式会社 Robot system, robot, robot control device, robot control method, and robot control program
JP2018001333A (en) * 2016-06-30 2018-01-11 セイコーエプソン株式会社 Calibration board, robot and detection method
CN108621125A (en) * 2017-03-22 2018-10-09 株式会社东芝 Object manipulation device and its calibration method
CN110103219A (en) * 2019-03-07 2019-08-09 牧今科技 Automatic camera calibration is executed to carry out the method and system of robot control
CN110193832A (en) * 2019-03-29 2019-09-03 牧今科技 Verifying and the method and control system for updating robot control camera calibrated
CN110253629A (en) * 2019-04-12 2019-09-20 牧今科技 For updating the method and control system that are used for the camera calibrated of robot control

Also Published As

Publication number Publication date
CN111890371A (en) 2020-11-06

Similar Documents

Publication Publication Date Title
CN111230865B (en) Method and control system for verifying and updating camera calibration for robot control
US11883964B2 (en) Method and control system for verifying and updating camera calibration for robot control
CN112677146A (en) Method for verifying and updating calibration information for robot control and control system
JP6822719B2 (en) Robot system with automatic package scanning and registration mechanism, and how it works
US11571816B2 (en) Method and control system for updating camera calibration for robot control
CN112091970B (en) Robotic system with enhanced scanning mechanism
CN111890371B (en) Method for verifying and updating calibration information for robot control and control system
TWI615691B (en) Anti-collision system and anti-collision method
US20240001557A1 (en) Robot and robot hand-eye calibrating method
TW202201946A (en) Camera system and robot system for simplifying operation of unmanned aerial vehicle carrying camera device
EP4382258A1 (en) Robot control device, robot control system, and robot control method
US20240210542A1 (en) Methods and apparatus for lidar alignment and calibration
Wijaya et al. A Visual-Based Pick and Place on 6 DoF Robot Manipulator
TW201927498A (en) Monitoring method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant