Disclosure of Invention
The invention provides a robot hand-eye calibration method, a robot hand-eye calibration device, a computer readable storage medium, a robot and a robot operation method, and aims to solve the problem that a robot cannot meet high-precision processing requirements due to hand-eye matrix errors.
According to one aspect of the embodiments of the present invention, a robot hand-eye calibration method is disclosed, which comprises:
acquiring a hand-eye matrix; the hand-eye matrix represents a transformation relationship of a camera coordinate system with respect to a tool coordinate system of the robot, the tool coordinate system being used to define a center position of the end tool and a pose of the end tool;
acquiring a calibration plate image acquired by the camera, selecting a plurality of first calibration points on the calibration plate image, and converting the first calibration points to a tool coordinate system of the robot by using the hand-eye matrix to obtain a first coordinate set;
controlling the end tool to contact a plurality of second calibration points on a calibration plate, and acquiring coordinates of the second calibration points under the tool coordinate system to acquire a second coordinate set, wherein the second calibration points and the first calibration points are a plurality of same position points on the calibration plate;
registering the first coordinate set and the second coordinate set by adopting an iterative nearest neighbor algorithm to obtain a compensation matrix;
and correcting the hand-eye matrix by adopting the compensation matrix to obtain a corrected hand-eye matrix.
In an exemplary embodiment, the acquiring the coordinates of the second calibration points in the tool coordinate system to obtain a second coordinate set includes:
recording coordinates of the second calibration points under a base coordinate system of the robot;
and according to the pose of the camera when the calibration plate image is acquired, converting the coordinates of the second calibration points under the base coordinate system into the coordinates under the tool coordinate system to obtain a second coordinate set.
In an exemplary embodiment, the converting the coordinates of the second calibration points in the base coordinate system into coordinates in the tool coordinate system according to the pose of the camera when the calibration plate image was captured, to obtain the second coordinate set, includes:
multiplying the coordinates of the second calibration points in the base coordinate system by pose data of the calibration plate image acquired by the camera;
and taking the result obtained by the multiplication operation as the second coordinate set.
In an exemplary embodiment, the modifying the hand-eye matrix with the compensation matrix to obtain a modified hand-eye matrix includes:
multiplying the compensation matrix and the hand-eye matrix;
and taking the result obtained by the multiplication operation as the corrected hand-eye matrix.
In an exemplary embodiment, the selecting a plurality of first calibration points on the calibration plate image comprises:
selecting at least three first calibration points on the calibration plate image, wherein the at least three first calibration points are not on the same straight line;
and the controlling the end tool to contact a plurality of second calibration points on the calibration plate comprises:
controlling the end tool to contact at least three second calibration points on the calibration plate, the at least three second calibration points not being on the same straight line;
wherein the at least three second calibration points correspond one-to-one to the at least three first calibration points.
In an exemplary embodiment, the selecting a plurality of first calibration points on the calibration plate image comprises:
selecting four first calibration points on the calibration plate image;
and the controlling the end tool to contact a plurality of second calibration points on the calibration plate comprises:
controlling the end tool to contact four second calibration points on the calibration plate.
In an exemplary embodiment, before acquiring the calibration board image captured by the camera, the method further includes:
controlling the robot to drive the camera to move to the position above the calibration plate;
and controlling the camera to acquire the image of the calibration plate.
In an exemplary embodiment, the acquiring the hand-eye matrix includes:
controlling the robot to drive the camera to move to the position above the calibration plate;
controlling the robot to assume at least five different postures, acquiring calibration plate images captured by the camera in each of the at least five different postures, and recording pose data of the at least five different postures in the tool coordinate system;
and calculating the hand-eye matrix by using the Zhang Zhengyou calibration method.
According to an aspect of an embodiment of the present invention, there is disclosed a robot including a robot body, a camera, an end tool, and a controller, wherein the robot body is provided with a plurality of motion axes; the camera is arranged at the end of the robot body; the end tool is arranged at the end of the robot body; and the controller is connected with the robot body, the camera, and the end tool, and is used to control the robot body, the camera, and the end tool to execute the steps of the foregoing hand-eye calibration method.
According to an aspect of an embodiment of the present invention, a robot working method is disclosed, in which an end tool and a camera are provided at the end of the robot. The working method comprises the following steps:
performing robot hand-eye calibration by using the foregoing hand-eye calibration method;
controlling the camera to acquire a work object image of the robot;
performing image processing on the work object image to obtain a work position;
and controlling the end tool to move to the work position to perform the work.
According to an aspect of the embodiments of the present invention, a robot hand-eye calibration device is disclosed, which includes:
the hand-eye matrix acquisition module is used for acquiring a hand-eye matrix;
the image coordinate acquisition module is used for acquiring a calibration plate image acquired by the camera, selecting a plurality of first calibration points on the calibration plate image, and converting the first calibration points into a tool coordinate system of the robot by adopting the hand-eye matrix to obtain a first coordinate set;
the actual coordinate acquisition module is used for controlling the end tool to contact a plurality of second calibration points on the calibration plate, and acquiring the coordinates of the second calibration points in the tool coordinate system to obtain a second coordinate set, wherein the second calibration points and the first calibration points are a plurality of same position points on the calibration plate;
the registration module is used for registering the first coordinate set and the second coordinate set by adopting an iterative nearest neighbor algorithm to obtain a compensation matrix;
and the correction module is used for correcting the hand-eye matrix by adopting the compensation matrix to obtain a corrected hand-eye matrix.
According to an aspect of the embodiments of the present invention, a robot hand-eye calibration apparatus is disclosed, the robot hand-eye calibration apparatus including:
one or more processors;
a memory for storing one or more programs that, when executed by the one or more processors, cause the robot hand-eye calibration apparatus to implement the aforementioned robot hand-eye calibration method.
According to an aspect of the embodiments of the present invention, a computer-readable storage medium is disclosed, which stores computer-readable instructions, which, when executed by a processor of a computer, cause the computer to execute the aforementioned robot hand-eye calibration method.
The technical scheme provided by the embodiment of the invention at least comprises the following beneficial effects:
according to the technical solution provided by the invention, after the hand-eye matrix is obtained, it is corrected: a plurality of calibration points are selected on the calibration plate image, and the obtained hand-eye matrix is used to convert them into the tool coordinate system of the robot, yielding the coordinate set of the image calibration points in the tool coordinate system; the end tool is controlled to contact the same calibration points on the calibration plate, yielding the coordinate set of the actual calibration points in the tool coordinate system; the two coordinate sets are then registered by the iterative nearest neighbor algorithm to obtain a compensation matrix; finally, the compensation matrix is used to correct the hand-eye matrix. The corrected hand-eye matrix has higher precision, so that the robot can meet high-precision processing requirements.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the invention, as claimed.
Detailed Description
While this invention is susceptible of embodiment in different forms, specific embodiments thereof are shown in the drawings and will herein be described in detail, with the understanding that the present description is to be considered an exemplification of the principles of the invention and is not intended to limit the invention to what is illustrated.
Furthermore, the terms "comprising," "having," and any variations thereof, as used in the description of the invention, are intended to cover non-exclusive inclusions, such that a process, method, system, article, or apparatus that comprises a list of steps or modules is not limited to only those steps or modules expressly listed, but may include other steps or modules not expressly listed or inherent to such process, method, article, or apparatus.
Furthermore, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more such features.
In the description of the present invention, the meaning of "a plurality" means two or more unless otherwise specified.
It should be noted that in the embodiments of the present invention, words such as "exemplary" or "for example" are used for example, illustration, or description. Any embodiment or design described as "exemplary" or "for example" in the embodiments of the invention is not necessarily to be construed as preferred or advantageous over other embodiments or designs. Rather, use of such words is intended to present relevant concepts in a concrete fashion.
Exemplary embodiments will be described in detail below. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the present invention. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the invention, as detailed in the summary of the invention.
FIG. 1 illustrates a robot architecture diagram in accordance with an exemplary embodiment. As shown in fig. 1, the robot 300 includes a robot body 310, a camera 320, and an end tool 330, the camera 320 and the end tool 330 being disposed at an end of the robot body 310; the robot body 310 drives the camera 320 and the end tool 330 to move, so that the camera 320 is used to capture an image of a work object at a short distance, a work position is obtained by analyzing the image of the work object, and the end tool 330 is further driven to move to the work position for work.
It can be understood that the robot 300 further includes a controller, the controller is connected to the robot body 310, the camera 320 and the end tool 330 to control the movement of the robot body 310 and control the camera 320 to capture an image of a work object, and the controller further analyzes the image of the work object to obtain a work position, so as to control the robot body 310 to move and drive the end tool 330 to move to the work position for performing a work. It is understood that the controller may be built in the robot body 310, or may be disposed outside the robot body 310.
The end tool 330 may be a welding gun, and the work object is then the object to be welded; in this case the robot 300 is a welding robot. The end tool 330 may be a glue gun, and the work object is the object to be glued; in this case the robot 300 is a gluing robot. The end tool 330 may also be a cutter, and the work object is the object to be cut; in this case the robot 300 is a cutting robot. Of course, the end tool 330 may be any other tool that can be disposed at the end of the robot 300 and driven by the robot 300 to perform work; it is not limited to a welding gun, a glue gun, a cutter, and the like.
The embodiment of the invention provides a robot hand-eye calibration method. The hand-eye calibration method comprises the following steps:
acquiring a hand-eye matrix;
acquiring a calibration plate image acquired by a camera, selecting a plurality of first calibration points on the calibration plate image, and converting the first calibration points into a tool coordinate system of a robot by adopting a hand-eye matrix to obtain a first coordinate set;
controlling the end tool to contact a plurality of second calibration points on the calibration plate, and acquiring the coordinates of the second calibration points in the tool coordinate system to obtain a second coordinate set, wherein the second calibration points and the first calibration points are a plurality of same position points on the calibration plate;
registering the first coordinate set and the second coordinate set by adopting an iterative nearest neighbor algorithm to obtain a compensation matrix;
and correcting the hand-eye matrix by adopting the compensation matrix to obtain a corrected hand-eye matrix.
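By way of a non-limiting illustration, the correction flow of the steps above can be sketched in code. All matrices and point values below are hypothetical, and `translation_only_register` is a toy stand-in for the registration of the fourth step; a real implementation would use a full iterative nearest neighbor (ICP) registration:

```python
import numpy as np

def translation_only_register(src_pts, dst_pts):
    # Toy stand-in for the registration step: returns a 4x4 compensation
    # matrix that moves src_pts onto dst_pts by their mean offset.
    # A real ICP implementation would also estimate rotation.
    t = np.mean([d[:3] - s[:3] for s, d in zip(src_pts, dst_pts)], axis=0)
    T = np.eye(4)
    T[:3, 3] = t
    return T

def correct_hand_eye(T_hand_eye, first_points_cam, second_points_tool, register):
    # First coordinate set: image calibration points mapped into the tool
    # frame with the current (error-laden) hand-eye matrix.
    first_set_tool = [T_hand_eye @ p for p in first_points_cam]
    # Compensation matrix aligning the first set onto the touched points.
    T_comp = register(first_set_tool, second_points_tool)
    # Corrected hand-eye matrix: compensation left-multiplied onto the original.
    return T_comp @ T_hand_eye
```

In this sketch, a hand-eye matrix with a small translational error is exactly repaired, because the discrepancy between the two coordinate sets is itself a pure translation.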
According to the technical scheme provided by the embodiment of the invention, after the hand-eye matrix is obtained, the hand-eye matrix is corrected, a plurality of calibration points on the calibration plate image are obtained, the obtained hand-eye matrix is adopted to convert the calibration points into a tool coordinate system of a robot, and a coordinate set of the image calibration points in the tool coordinate system is obtained; controlling the end tool to contact a plurality of calibration points on the calibration plate, and acquiring a coordinate set of the actual calibration points in a tool coordinate system; then, registering the coordinate set of the image calibration point under the tool coordinate system and the coordinate set of the actual calibration point under the tool coordinate system by adopting an iterative nearest neighbor algorithm to obtain a compensation matrix; and finally, the hand-eye matrix is corrected by adopting the compensation matrix, the corrected hand-eye matrix has higher precision, and the robot can meet the requirement of high-precision processing.
The embodiments of the present invention will be further explained in detail with reference to the drawings in the examples of the present specification.
As shown in fig. 2, an embodiment of the present invention provides a robot hand-eye calibration method, where the robot hand-eye calibration method includes a hand-eye matrix obtaining step and a hand-eye matrix correcting step, where the hand-eye matrix obtaining step includes step S101, and the hand-eye matrix correcting step includes steps S102 to S105.
S101, acquiring a hand-eye matrix.
It is understood that the hand-eye matrix represents the transformation of the camera coordinate system with respect to the tool coordinate system of the robot; it encodes both the translation and the rotation from the robot arm end to the camera. In detail, the hand-eye matrix may be a 4 × 4 homogeneous transformation matrix.
In one exemplary embodiment, as shown in fig. 3, step S101 includes:
S1011, controlling the robot to drive the camera to move above the calibration plate, so that the camera can capture a calibration plate image.
In detail, the calibration plate is a checkerboard calibration plate.
S1012, controlling the robot to assume at least five different postures, acquiring the calibration plate images captured by the camera in each of the at least five different postures, and recording the pose data of the at least five different postures in the tool coordinate system.
S1013, calculating the hand-eye matrix by using the Zhang Zhengyou calibration method.
It can be understood that the robot is controlled to obtain calibration plate images captured by the camera in at least five different postures because the Zhang Zhengyou calibration method requires them to compute the hand-eye matrix. In other embodiments, other calibration methods may be used to compute the hand-eye matrix, in which case the calibration plate images need not be captured in at least five different postures.
Given the calibration plate images captured in at least five different postures and the pose data of those postures in the tool coordinate system, how the Zhang Zhengyou calibration method computes the hand-eye matrix is known in the prior art and is not described here again.
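As a non-limiting numerical illustration of the relationship such pose-based calibration exploits, the sketch below (with a hypothetical helper `make_T` and invented pose values) checks that the relative tool motion A and the relative camera motion B induced by two robot poses observing a fixed calibration plate satisfy the classical hand-eye equation A·X = X·B, where X is the hand-eye matrix that a calibration routine solves for from multiple poses:

```python
import numpy as np

def make_T(angle_deg, t):
    # Hypothetical helper: 4x4 homogeneous transform combining a rotation
    # about the z axis with a translation t.
    a = np.deg2rad(angle_deg)
    T = np.eye(4)
    T[:2, :2] = [[np.cos(a), -np.sin(a)], [np.sin(a), np.cos(a)]]
    T[:3, 3] = t
    return T

# Assumed ground-truth hand-eye matrix X (camera frame in the tool frame).
X = make_T(10, [0.05, 0.02, 0.10])
# Fixed calibration plate pose W in the base frame, and two tool poses T1, T2.
W = make_T(0, [1.0, 0.0, 0.0])
T1 = make_T(20, [0.3, 0.1, 0.5])
T2 = make_T(-35, [0.4, -0.2, 0.6])
# Plate as seen by the camera at each pose (camera pose in base = tool pose @ X).
B1 = np.linalg.inv(T1 @ X) @ W
B2 = np.linalg.inv(T2 @ X) @ W
# Relative tool motion A and relative camera motion B satisfy A @ X = X @ B.
A = np.linalg.inv(T2) @ T1
B = B2 @ np.linalg.inv(B1)
```

With five or more distinct postures, equations of this form constrain X enough for the calibration to solve for it.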
S102, acquiring a calibration plate image acquired by a camera, selecting a plurality of first calibration points on the calibration plate image, and converting the first calibration points into a tool coordinate system of the robot by using a hand-eye matrix to obtain a first coordinate set.
As can be appreciated, the tool coordinate system is used to define the center position of the end tool and the pose of the end tool.
In one exemplary embodiment, selecting a plurality of first calibration points on the calibration plate image includes selecting at least three first calibration points on the calibration plate image, the at least three first calibration points not being on the same straight line. When at least three first calibration points are selected, the obtained first coordinate set contains enough coordinate points for higher accuracy during the registration with the iterative nearest neighbor algorithm in the subsequent step S104; and because the at least three first calibration points are not on the same straight line, the registration can recover both the rotational and the translational components.
In one exemplary embodiment, selecting a plurality of first calibration points on the calibration plate image includes selecting four first calibration points on the calibration plate image. With four first calibration points, the first coordinate set contains more coordinate points, which further improves accuracy during the registration with the iterative nearest neighbor algorithm in the subsequent step S104, while the amount of computation remains small enough not to affect efficiency.
In an exemplary embodiment, as shown in fig. 4, step S102 includes:
S1021, controlling the robot to drive the camera to move above the calibration plate.
S1022, controlling the camera to capture the calibration plate image and recording the current pose of the robot in the tool coordinate system.
S1023, acquiring the calibration plate image captured by the camera, and selecting a plurality of first calibration points on the calibration plate image.
S1024, converting the first calibration points into the tool coordinate system of the robot by using the hand-eye matrix to obtain the first coordinate set.
For example, as shown in fig. 5, four first calibration points P0, P1, P2, and P3 are selected, and their coordinates in the camera coordinate system are denoted P_i^cam (i = 0, 1, 2, 3).
In an exemplary embodiment, in step S1024 the first calibration points are converted into the tool coordinate system of the robot by using the hand-eye matrix. Denoting the coordinates of the four first calibration points P0, P1, P2, P3 in the tool coordinate system as P_i^tool, the conversion can be expressed as:
P_i^tool = T_he · P_i^cam, i = 0, 1, 2, 3,
where P_i^tool represents the coordinates of a first calibration point in the tool coordinate system, T_he represents the hand-eye matrix, and P_i^cam represents the coordinates of the same first calibration point in the camera coordinate system.
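As a non-limiting illustration, the conversion of step S1024 can be sketched with hypothetical values; the hand-eye matrix is taken as a pure translation purely for readability:

```python
import numpy as np

# Hypothetical 4x4 hand-eye matrix (camera frame -> tool frame); a pure
# translation here so the expected result is easy to read off.
T_he = np.eye(4)
T_he[:3, 3] = [0.05, 0.00, 0.10]

# Four first calibration points in the camera coordinate system (homogeneous).
P_cam = np.array([[0.0, 0.0, 0.4, 1.0],
                  [0.1, 0.0, 0.4, 1.0],
                  [0.0, 0.1, 0.4, 1.0],
                  [0.1, 0.1, 0.4, 1.0]])

# First coordinate set: each point left-multiplied by the hand-eye matrix.
P_tool = (T_he @ P_cam.T).T
```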
S103, controlling the end tool to contact a plurality of second calibration points on the calibration plate, and acquiring coordinates of the second calibration points in a tool coordinate system to acquire a second coordinate set. The second calibration points and the first calibration points are the same position points on the calibration plate.
In one exemplary embodiment, selecting a plurality of first calibration points on the calibration plate image includes selecting at least three first calibration points on the calibration plate image, the at least three first calibration points not being on the same straight line. Correspondingly, controlling the end tool to contact a plurality of second calibration points on the calibration plate includes controlling the end tool to contact at least three second calibration points on the calibration plate, the at least three second calibration points not being on the same straight line and corresponding one-to-one to the at least three first calibration points. When at least three first calibration points and at least three second calibration points are selected, the obtained first and second coordinate sets contain enough coordinate points for higher accuracy during the registration with the iterative nearest neighbor algorithm in the subsequent step S104; and because the points are not collinear, the registration can recover both the rotational and the translational components.
In one exemplary embodiment, selecting a plurality of first calibration points on the calibration plate image includes selecting four first calibration points on the calibration plate image. Correspondingly, controlling the end tool to contact a plurality of second calibration points on the calibration plate includes controlling the end tool to contact four second calibration points on the calibration plate. With four first calibration points and four second calibration points, the first and second coordinate sets contain more coordinate points, which further improves accuracy during the registration in step S104, while the amount of computation remains small enough not to affect efficiency.
In an exemplary embodiment, as shown in fig. 6, step S103 includes:
S1031, controlling the end tool to contact a plurality of second calibration points on the calibration plate.
S1032, recording the coordinates of the plurality of second calibration points in the base coordinate system of the robot.
S1033, converting the coordinates of the second calibration points in the base coordinate system into coordinates in the tool coordinate system according to the pose of the camera when the calibration plate image was captured, to obtain the second coordinate set.
In an exemplary embodiment, as shown in fig. 7, step S1033 includes:
S10331, multiplying the coordinates of the second calibration points in the base coordinate system by the inverse of the pose matrix recorded when the camera captured the calibration plate image in step S102.
S10332, taking the result of the multiplication as the second coordinate set.
For example, as shown in fig. 8, the robot is moved in sequence so that the tip of the end tool 330 contacts four second calibration points C0, C1, C2, and C3 on the calibration plate; the four second calibration points C0, C1, C2, and C3 correspond one-to-one to the four first calibration points P0, P1, P2, and P3, and their coordinates in the base coordinate system of the robot are denoted C_i^base (i = 0, 1, 2, 3).
In an exemplary embodiment, in step S1033 the coordinates of the second calibration points in the base coordinate system are converted into the tool coordinate system according to the pose of the camera when the calibration plate image was acquired in step S102. Denoting the coordinates of the four second calibration points C0, C1, C2, C3 in the tool coordinate system as C_i^tool, the conversion can be expressed as:
C_i^tool = T_pose^(-1) · C_i^base, i = 0, 1, 2, 3,
where C_i^tool represents the coordinates of a second calibration point in the tool coordinate system, C_i^base represents the coordinates of the same second calibration point in the base coordinate system, and T_pose^(-1) represents the inverse of the pose matrix recorded when the camera acquired the calibration plate image in step S102.
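As a non-limiting illustration of step S1033 with hypothetical values (the recorded pose is taken as a pure translation purely for readability):

```python
import numpy as np

# Hypothetical tool pose in the base frame, recorded when the calibration
# plate image was captured; a pure translation here for readability.
T_pose = np.eye(4)
T_pose[:3, 3] = [0.3, 0.1, 0.5]

# Second calibration points touched by the end tool, in the base frame.
C_base = np.array([[0.35, 0.10, 0.50, 1.0],
                   [0.45, 0.10, 0.50, 1.0]])

# Second coordinate set: left-multiply by the inverse of the recorded pose
# to express the touched points in the tool coordinate system.
C_tool = (np.linalg.inv(T_pose) @ C_base.T).T
```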
And S104, registering the first coordinate set and the second coordinate set by adopting an iterative nearest neighbor algorithm to obtain a compensation matrix.
In detail, the second coordinate set is used as the source data and the first coordinate set as the data to be registered, and registration yields the compensation matrix. It can be understood that the iterative nearest neighbor algorithm, better known as the Iterative Closest Point (ICP) algorithm for point cloud registration, takes two point clouds (the source data and the data to be registered) as input and outputs a transformation (the compensation matrix) that brings the two into coincidence as closely as possible. The transformation, which may or may not be rigid, comprises rotation and translation. How registration is performed with the iterative nearest neighbor algorithm is known in the prior art and is not described here again.
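The alignment step that such registration iterates can be sketched as follows. Since the second calibration points correspond one-to-one to the first calibration points, a single SVD-based (Kabsch) rigid alignment of the kind ICP repeats already solves the noise-free case; this is a sketch under that assumption, not the claimed method:

```python
import numpy as np

def rigid_registration(src, dst):
    # One ICP-style alignment step with known correspondences: find the
    # rigid transform (rotation + translation) mapping src onto dst via
    # the SVD-based (Kabsch) solution.
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)   # cross-covariance of centered sets
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:              # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = dst_c - R @ src_c
    T = np.eye(4)                         # assemble the compensation matrix
    T[:3, :3], T[:3, 3] = R, t
    return T
```

Feeding the first coordinate set as `src` and the second coordinate set as `dst` yields a 4 × 4 compensation matrix of the form used in step S105.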
And S105, correcting the hand-eye matrix by adopting the compensation matrix to obtain a corrected hand-eye matrix.
In an exemplary embodiment, as shown in fig. 9, step S105 includes:
S1051, multiplying the compensation matrix by the hand-eye matrix.
S1052, taking the result of the multiplication as the corrected hand-eye matrix.
For example, the compensation matrix obtained in step S104 is denoted T1, and the hand-eye matrix obtained in step S101 is denoted T. The corrected hand-eye matrix T' can then be obtained by the following relational expression:
T' = T1 · T.
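As a non-limiting illustration with hypothetical values, the correction is a single matrix product, with the compensation matrix on the left:

```python
import numpy as np

# Hypothetical compensation matrix T1 from the registration of step S104
# (a small translational correction here) and hand-eye matrix T from S101.
T1 = np.eye(4)
T1[:3, 3] = [0.002, -0.001, 0.0]
T = np.eye(4)
T[:3, 3] = [0.05, 0.0, 0.10]

# Corrected hand-eye matrix: the compensation matrix left-multiplies T.
T_corrected = T1 @ T
```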
It should be noted that, in the foregoing embodiment, the sequential execution of steps S101 to S105 is merely exemplary, and in other embodiments, the execution order of partial steps may be adjusted. For example, in one embodiment, step S103 is performed first, then steps S101 and S102 are performed in sequence, and then steps S104 and S105 are performed in sequence.
In addition, in the foregoing embodiment, after step S101 is executed, the robot is controlled again in step S102 to drive the camera above the calibration plate, the camera is controlled to capture a calibration plate image, and the pose of the robot in the tool coordinate system at that moment is recorded; the captured calibration plate image is then acquired and a plurality of first calibration points are selected on it. It should be understood that, in an embodiment, the first calibration points may instead be extracted directly from one of the calibration plate images obtained in step S1012, using the pose data recorded in step S1012 for that image; in that case, steps S1021 to S1023 are not executed in step S102.
Referring to fig. 1, in order to implement the method for calibrating a hand and an eye of a robot according to an embodiment of the present invention, the embodiment of the present invention provides a robot 300, where the robot 300 includes a robot body 310, a camera 320, an end tool 330, and a controller (not shown). The robot body 310 has a plurality of motion axes, and the camera 320 and the tip tool 330 are disposed at the tip of the robot body 310. The controller is connected to the robot body 310, the camera 320 and the end tool 330, and is configured to control the robot body 310, the camera 320 and the end tool 330, so that the robot 300 can perform all or part of the steps of the robot hand-eye calibration method shown in any one of fig. 2 to 4, 6 to 7 and 9.
For example, the robot body 310 has six axes of motion, i.e., the robot 300 is a six-axis robot.
By way of example, the end tool 330 is a welding gun.
FIG. 10 illustrates a flow chart of a robot working method according to an exemplary embodiment. As shown in fig. 10, the robot working method includes the following steps:
S401, performing robot hand-eye calibration by using the foregoing hand-eye calibration method.
S402, controlling the camera 320 to capture a work object image of the robot.
S403, performing image processing on the work object image to obtain a work position.
S404, controlling the end tool 330 to move to the work position to perform the work.
Referring next to fig. 11, fig. 11 is a block diagram illustrating a robot hand-eye calibration apparatus 200 according to an exemplary embodiment, where the robot hand-eye calibration apparatus 200 may be applied to a robot to perform all or part of the steps of the robot hand-eye calibration method shown in any one of fig. 2 to 4, 6 to 7, and 9. As shown in fig. 11, the robot hand-eye calibration device 200 includes, but is not limited to: a hand-eye matrix acquisition module 210, an image coordinate acquisition module 220, an actual coordinate acquisition module 230, a registration module 240, and a correction module 250.
The hand-eye matrix obtaining module 210 is configured to obtain a hand-eye matrix.
The image coordinate obtaining module 220 is configured to obtain a calibration plate image collected by the camera, select a plurality of first calibration points on the calibration plate image, and convert the first calibration points into a tool coordinate system of the robot by using a hand-eye matrix, so as to obtain a first coordinate set.
The actual coordinate obtaining module 230 is configured to control the end tool to contact a plurality of second calibration points on the calibration board, obtain coordinates of the plurality of second calibration points in the tool coordinate system, and obtain a second coordinate set. The second calibration points and the first calibration points are the same position points on the calibration plate.
The registration module 240 is configured to perform registration on the first coordinate set and the second coordinate set by using an iterative nearest neighbor algorithm to obtain a compensation matrix.
The correcting module 250 is configured to correct the hand-eye matrix by using the compensation matrix to obtain a corrected hand-eye matrix.
The implementation process of the functions and actions of each module in the robot hand-eye calibration device 200 is described in detail in the corresponding steps of the robot hand-eye calibration method and is not repeated here.
In order to implement the robot hand-eye calibration method provided by the embodiment of the invention, an embodiment of the invention provides a robot hand-eye calibration device, which can be integrated into a robot. The robot hand-eye calibration device comprises a processor and a memory, wherein the memory is used for storing one or more programs, and when the one or more programs are executed by the processor, the robot hand-eye calibration device is enabled to realize the robot hand-eye calibration method provided by the embodiment.
The specific manner in which the processor of the robot hand-eye calibration apparatus performs the operation in this embodiment has been described in detail in the embodiments related to the robot hand-eye calibration method, and will not be described in detail herein.
In detail, the processor may comprise one or more processing units, for example two CPUs. As an example, the robot hand-eye calibration apparatus may comprise a plurality of processors, for example two processors. Each of these processors may be a single-core processor or a multi-core processor. It will be understood that a processor may refer to one or more devices, circuits, and/or processing cores configured to process data (e.g., computer program instructions).
The memory may be, but is not limited to, a read-only memory (ROM) or other type of static storage device that can store static information and instructions, a random access memory (RAM) or other type of dynamic storage device that can store information and instructions, an electrically erasable programmable read-only memory (EEPROM), optical disc storage (including compact disc, laser disc, digital versatile disc, Blu-ray disc, etc.), a magnetic disk storage medium or other magnetic storage device, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer.
It will be appreciated that the memory may exist independently and be connected to the processor through a bus. The memory may also be integrated with the processor.
In an exemplary embodiment, the memory is used for storing computer-executable instructions corresponding to the software program of the present invention. The processor may implement various functions of the robot hand-eye calibration device by running or executing the software program data stored in the memory.
An embodiment of the present invention also provides a computer-readable storage medium, which may be a transitory or a non-transitory computer-readable storage medium, including instructions. The storage medium stores computer-readable instructions which, when executed by a processor of a computer, cause the computer to perform the above-described robot hand-eye calibration method.
An embodiment of the present invention further provides a computer program product, which can be directly loaded into a memory and contains software code; when loaded and executed by a computer, the computer program product implements the robot hand-eye calibration method provided in the foregoing embodiments.
Those skilled in the art will recognize that, in one or more of the examples described above, the functions described in this invention may be implemented in hardware, software, firmware, or any combination thereof. When implemented in software, the functions may be stored on a computer-readable storage medium or transmitted as one or more instructions or code over such a medium.
Through the above description of the embodiments, it is clear to those skilled in the art that, for convenience and simplicity of description, the foregoing division of the functional modules is merely used as an example, and in practical applications, the above function distribution may be completed by different functional modules according to needs, that is, the internal structure of the device may be divided into different functional modules to complete all or part of the above described functions.
In the embodiments provided in the present invention, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative: the division into modules is merely a logical function division, and other divisions may be adopted in practice. For example, various elements or components may be combined or may be integrated into another device, or some features may be omitted or not implemented.
It will be understood that the invention is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the invention is limited only by the appended claims.