Disclosure of Invention
An object of the embodiments of the present disclosure is to provide an object grasping method, an object grasping apparatus, an electronic device, and a computer-readable storage medium, which can combine dexterous, accurately positioned user control with partially automated grasping while preventing the grasped object from falling.
According to a first aspect of the present disclosure, there is provided an object grasping method including:
acquiring an expected contact point preset in a controlled manipulator, and acquiring motion data of a human hand in real time;
calculating an included angle between the five fingers of the human hand and the back of the hand based on the motion data;
mapping the included angle to a target angle when the controlled manipulator grabs an object, and controlling the controlled manipulator to reach the target angle;
and controlling and adjusting the controlled manipulator until all of the expected contact points are detected to be in contact with the object.
In an exemplary embodiment of the present disclosure, a data glove is worn on the human hand, and sensors are provided at the tips of the five fingers and at the palm of the human hand, respectively;
the acquiring motion data of the human hand in real time comprises:
periodically reading each sensor to acquire the motion data, wherein the motion data comprises the three-axis acceleration, the three-axis angular velocity, and the quaternion of the motion of the human hand.
In an exemplary embodiment of the present disclosure, the calculating an included angle between the five fingers of the human hand and the back of the hand based on the motion data includes:
normalizing the collected quaternions of the five fingers and the back of the hand to obtain corresponding attitude quaternions;
and calculating the included angle between each finger and the back of the hand by using the attitude quaternions.
In an exemplary embodiment of the present disclosure, a proportional-derivative controller is disposed on the controlled manipulator; the mapping the included angle to a target angle when the controlled manipulator grabs the object includes:
acquiring the included angle between the five fingers of the hand and the back of the hand, and obtaining the corresponding target angle through mapping by the proportional-derivative controller.
In an exemplary embodiment of the present disclosure, the expected contact point is set according to the shape, the size, and the number of the objects.
In an exemplary embodiment of the present disclosure, a tactile sensor is disposed at each finger of the controlled manipulator;
the controlling and adjusting the controlled manipulator until the object contacts all the expected contact points of the controlled manipulator comprises:
adjusting the joint torque of the controlled manipulator until the object contacts all the expected contact points of the controlled manipulator.
In an exemplary embodiment of the present disclosure, the adjusting the joint torque of the controlled manipulator includes:
adjusting the joint torque of the controlled manipulator so that each finger of the controlled manipulator moves in the direction that increases its contact area with the object.
According to a second aspect of the present disclosure, there is provided an object grasping apparatus comprising:
the acquisition module is used for acquiring an expected contact point preset in the controlled manipulator and acquiring motion data of the human hand in real time;
the calculation module is used for calculating the included angle between the five fingers of the human hand and the back of the hand based on the motion data;
the mapping module is used for mapping the included angle to a target angle when the controlled manipulator grabs an object and controlling the controlled manipulator to reach the target angle;
and the grabbing module is used for adjusting the controlled manipulator until the object contacts all the expected contact points of the controlled manipulator.
According to a third aspect of the present disclosure, there is provided an electronic device comprising: a processor; and a memory for storing executable instructions of the processor; wherein the processor is configured to perform the method of any one of the above via execution of the executable instructions.
According to a fourth aspect of the present disclosure, there is provided a computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements the method of any one of the above.
Exemplary embodiments of the present disclosure may have some or all of the following benefits:
in the object grabbing method provided by the embodiments of the present disclosure, firstly, an expected contact point preset in a controlled manipulator is acquired, and motion data of a human hand is collected in real time; then, the included angle between the five fingers of the hand and the back of the hand is calculated based on the collected motion data, the calculated included angle is mapped to a target angle at which the controlled manipulator grabs the object, the controlled manipulator is controlled to reach the target angle, and the controlled manipulator is adjusted until all of the preset expected contact points are detected to be in contact with the object to be grabbed. On the one hand, in the object grabbing method provided by the exemplary embodiment, the included angle between the five fingers of the human hand and the back of the hand is calculated from the collected data of the motion of the human hand, the target angle required for the controlled manipulator to grab the object is obtained based on the included angle, and the controlled manipulator is then controlled to grab the object according to the target angle. On the other hand, after the manipulator is controlled to contact the object according to the target angle, the manipulator is adjusted until all of the expected contact points preset in the manipulator contact the object to be grabbed; setting the expected contact points increases the contact area between the manipulator and the object and improves grasping stability. Meanwhile, by adjusting the positions and the number of the expected contact points, the manipulator can be controlled to adjust its own gripping capability according to different task requirements.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the disclosure. One skilled in the relevant art will recognize, however, that the subject matter of the present disclosure can be practiced without one or more of the specific details, or with other methods, components, devices, steps, and the like. In other instances, well-known technical solutions have not been shown or described in detail to avoid obscuring aspects of the present disclosure.
Furthermore, the drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus their repetitive description will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities. These functional entities may be implemented in the form of software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor devices and/or microcontroller devices.
Fig. 1 is a schematic diagram illustrating a system architecture of an exemplary application environment to which an object grasping method and apparatus according to an embodiment of the present disclosure may be applied.
As shown in fig. 1, the system architecture 100 may include one or more of terminal devices 101, 102, 103, a network 104, and a server 105. The network 104 serves as a medium for providing communication links between the terminal devices 101, 102, 103 and the server 105. Network 104 may include various connection types, such as wired, wireless communication links, or fiber optic cables, to name a few. The terminal devices 101, 102, 103 may be various electronic devices having a display screen, including but not limited to desktop computers, portable computers, smart phones, tablet computers, and the like. It should be understood that the number of terminal devices, networks, and servers in fig. 1 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation. For example, server 105 may be a server cluster comprised of multiple servers, or the like.
The object grabbing method provided by the embodiments of the present disclosure may be executed by the terminal devices 101, 102, and 103, and correspondingly, the object grabbing apparatus may also be disposed in the terminal devices 101, 102, and 103. The object grabbing method provided by the embodiments of the present disclosure may also be executed jointly by the terminal devices 101, 102, and 103 and the server 105, and accordingly, the object grabbing apparatus may be disposed in the terminal devices 101, 102, and 103 and the server 105. In addition, the object grabbing method provided by the present disclosure may also be executed by the server 105, and accordingly, the object grabbing apparatus may be disposed in the server 105, which is not particularly limited in this exemplary embodiment.
For example, in the present exemplary embodiment, the above-described object grasping method may be performed by the terminal devices 101, 102, 103. Firstly, an expected contact point is set in the controlled manipulator according to the grabbing task; the terminal devices 101, 102, and 103 acquire the expected contact point preset in the controlled manipulator and collect motion data of a human hand in real time. Then, the terminal device calculates the included angle between the five fingers of the hand and the back of the hand based on the acquired motion data and sends the included angle to a proportional-derivative controller arranged on the controlled manipulator, which maps the included angle to a target angle at which the controlled manipulator grabs an object and controls the controlled manipulator to reach the target angle. Finally, the controlled manipulator is adjusted until all expected contact points contact the object to be grasped.
FIG. 2 illustrates a schematic structural diagram of a computer system suitable for use in implementing the electronic device of an embodiment of the present disclosure.
It should be noted that the computer system 200 of the electronic device shown in fig. 2 is only an example, and should not bring any limitation to the functions and the scope of the application of the embodiments of the present disclosure.
As shown in fig. 2, the computer system 200 includes a Central Processing Unit (CPU) 201 that can perform various appropriate actions and processes in accordance with a program stored in a Read Only Memory (ROM) 202 or a program loaded from a storage section 208 into a Random Access Memory (RAM) 203. In the RAM 203, various programs and data necessary for system operation are also stored. The CPU 201, the ROM 202, and the RAM 203 are connected to each other via a bus 204. An input/output (I/O) interface 205 is also connected to the bus 204.
The following components are connected to the I/O interface 205: an input portion 206 including a keyboard, a mouse, and the like; an output section 207 including a display such as a Cathode Ray Tube (CRT), a Liquid Crystal Display (LCD), and the like, and a speaker; a storage section 208 including a hard disk and the like; and a communication section 209 including a network interface card such as a LAN card, a modem, or the like. The communication section 209 performs communication processing via a network such as the internet. A drive 210 is also connected to the I/O interface 205 as needed. A removable medium 211 such as a magnetic disk, an optical disk, a magneto-optical disk, a semiconductor memory, or the like is mounted on the drive 210 as necessary, so that a computer program read out therefrom is mounted into the storage section 208 as necessary.
Driven by the demands of daily life and production, manipulators often need to operate in high-risk environments and to perform handling tasks, and they have good application prospects in deep-sea exploration, battlefield mine clearance, nuclear material handling, aerospace equipment maintenance, and the like.
In order to control the manipulator to work in highly complex and unpredictable environments, the inventors tentatively propose a manipulator master-slave control method, the main idea of which is to let human intelligence make decisions about the various complex environments encountered by the manipulator, so as to remotely control the manipulator to complete a target operation task. The implementation process is as follows: motion data of the finger-joint motion of the human hand controlling the manipulator is acquired, and the motion angles of the human fingers are mapped onto the controlled manipulator based on the acquired motion data, so as to control the manipulator to complete the target task. In this process, the motion angles of the human finger joints can be measured with an exoskeleton mechanism, obtained by measuring the surface electromyographic signals of the moving hand, or, more accurately, obtained with a data glove.
However, practice has shown that, while this method can use human intelligence to remotely drive the manipulator through the target operation task, it has the following problems. Because of the structural differences between the human hand and the manipulator, the hand motion obtained with an exoskeleton mechanism, surface electromyographic signals, or a data glove must be mapped onto the manipulator in a way that preserves the precision, real-time performance, and stability of the manipulator motion. In the master-slave control method, taking master-slave control with finger-joint angles obtained through a data glove as an example, the sensed angle of each finger is not sufficient to adapt the manipulator to the shapes of different objects; the manipulator is also prevented from adjusting its gripping capability according to different task requirements, and local automation of the controlled manipulator in a changing environment cannot be realized. The single master-slave relationship of this method therefore reduces the gripping success rate, lengthens the gripping time of the manipulator, and carries the risk of the object falling.
In order to solve the problems in the above method, in the present exemplary embodiment the inventors further propose a new technical solution that uses human intelligence to remotely control the manipulator through the target operation task. The technical solution of the embodiments of the present disclosure is elaborated below:
the present exemplary embodiment first provides an object grasping method. Referring to fig. 3, the object grabbing method specifically includes the following steps:
step S310: acquiring an expected contact point preset in a controlled manipulator, and acquiring motion data of a human hand in real time;
step S320: calculating an included angle between the five fingers of the human hand and the back of the hand based on the motion data;
step S330: mapping the included angle to a target angle when the controlled manipulator grabs an object, and controlling the controlled manipulator to reach the target angle;
step S340: controlling and adjusting the controlled manipulator until all of the expected contact points are detected to be in contact with the object.
In the object grabbing method provided by the exemplary embodiment of the present disclosure, on the one hand, the included angle between the five fingers of the human hand and the back of the hand is calculated from the collected data of the motion of the human hand, a target angle required when the controlled manipulator grabs the object is obtained based on the included angle, and the controlled manipulator is then controlled to grab the object according to the target angle. On the other hand, after the manipulator is controlled to contact the object according to the target angle, the manipulator is adjusted until all of the expected contact points preset in the manipulator contact the object to be grabbed; setting the expected contact points increases the contact area between the manipulator and the object and improves grasping stability. Meanwhile, by adjusting the positions and the number of the expected contact points, the manipulator can be controlled to adjust its own gripping capability according to different task requirements.
Next, in another embodiment, the above steps are explained in more detail.
In step S310, a desired contact point preset in the controlled manipulator is acquired, and the motion data of the human hand is collected in real time.
The object grabbing method provided by this embodiment can use human intelligence to make decisions about the working environment faced by the manipulator and remotely control the manipulator to complete a target operation task. In the embodiments of the present disclosure, the controlled manipulator is the medium that performs the target operation in the task environment under human control; the human hand controls the manipulator remotely, so that the controlled manipulator completes the target operation in a complex, high-risk environment, thereby ensuring the safety of the operation. The controlled manipulator may take different forms depending on the target operation task; for example, it may be a five-finger manipulator, a three-finger manipulator, or another form, which is not particularly limited in this exemplary embodiment.
In the present exemplary embodiment, the expected contact point is a contact point set in advance in the controlled manipulator in order to increase the contact area between the controlled manipulator and the object to be grasped and to improve grasping stability in the grasping task. The expected contact points may be set and adjusted depending on the grasping task. For example, the number and positions of the expected contact points may be determined according to one or more of the shape, the size, and the number of the objects to be grasped, and may also take the shape of the controlled manipulator into account, which is not particularly limited in this exemplary embodiment.
In the object grabbing method provided by the exemplary embodiment, because the human hand and the controlled manipulator differ structurally, the motion of the human hand needs to be mapped onto the controlled manipulator so as to preserve the precision, real-time performance, and stability of the motion of the dexterous hand. Preferably, the motion data of the human hand is acquired through a data glove, which captures the motion angles of the human finger joints more accurately than other methods. Alternatively, an exoskeleton mechanism may be used to measure the motion angles of the human finger joints, or the motion data may be obtained by measuring the surface electromyographic signals of the hand during motion.
Taking the acquisition of the motion data of the human hand through the data glove as an example, the motion data comprises the three-axis acceleration, the three-axis angular velocity, and the quaternion of the motion of the human hand. The real-time collection of the motion data can be implemented as follows: an inertial data glove is worn on the human hand, with sensors arranged at the tips of the five fingers and at the palm, respectively; each sensor is then read periodically to acquire the motion data.
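Under the assumption that the glove exposes one inertial sensor per fingertip plus one at the palm, the periodic read-out described above can be sketched as follows; all class, field, and sensor names here are hypothetical stand-ins, not part of the disclosure:

```python
from dataclasses import dataclass

@dataclass
class ImuSample:
    """One reading from an inertial sensor on the data glove."""
    accel: tuple  # three-axis acceleration (ax, ay, az)
    gyro: tuple   # three-axis angular velocity (wx, wy, wz)
    quat: tuple   # motion quaternion (q0, q1, q2, q3)

class FakeSensor:
    """Stand-in for real glove hardware; returns a resting pose."""
    def read(self):
        return ImuSample(accel=(0.0, 0.0, 9.8),
                         gyro=(0.0, 0.0, 0.0),
                         quat=(1.0, 0.0, 0.0, 0.0))

def read_glove(sensors):
    """Poll every fingertip/palm sensor once; one call per sampling period."""
    return {name: sensor.read() for name, sensor in sensors.items()}

sensors = {name: FakeSensor() for name in
           ("thumb", "index", "middle", "ring", "little", "palm")}
frame = read_glove(sensors)  # one frame of motion data for all six sensors
```

In a real system, `read_glove` would be invoked on a fixed timer so that each frame corresponds to one sampling instant T.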
It should be noted that the above scenario is only an exemplary illustration, and the scope of protection of the exemplary embodiment is not limited thereto.
In step S320, an included angle between the five fingers of the human hand and the back of the hand is calculated based on the motion data.
In the present exemplary embodiment, after the motion data is acquired in step S310, an included angle between the five fingers of the human hand and the back of the hand can be calculated by using the acquired motion data, so that the motion of the human hand can be mapped to the controlled manipulator according to the included angle.
The angle of the included angle may be calculated, for example, as follows: the quaternions of the five fingers and the back of the hand acquired by the data glove in step S310 are normalized to obtain the corresponding attitude quaternions; the included angle between each finger and the back of the hand is then calculated using the attitude quaternions.
In the following, the calculation process is described in detail by taking the calculation of the angle between the index finger and the back of the hand at time T as an example:
firstly, the quaternion q_finger of the index fingertip and the quaternion q_palm of the palm are respectively normalized to obtain nq_finger and nq_palm.
Taking q_finger = [q0, q1, q2, q3] as an example, the normalization process is as follows:
q0(T)_n = q0(T) / norm(T)
q1(T)_n = q1(T) / norm(T)
q2(T)_n = q2(T) / norm(T)
q3(T)_n = q3(T) / norm(T)
wherein norm(T) is the modulus of the quaternion at time T; q0(T)_n, q1(T)_n, q2(T)_n, and q3(T)_n are the components of the attitude quaternion at time T obtained after normalization, and the normalized index-finger quaternion is written as nq_finger = [q0_n, q1_n, q2_n, q3_n];
finally, the angular difference between the two attitude quaternions can be calculated using their dot product; that is, the included angle between the index finger and the back of the hand is obtained by computing the angle between nq_palm and nq_finger:
angleA = arccos(θ′) * 2
where θ′ is the dot product of nq_palm and nq_finger.
in this exemplary embodiment, the angles of the included angles between the other four fingers and the palm can be calculated according to the above calculation process for calculating the index finger and the palm, which is not described herein again. It should be noted that the above scenario is only an exemplary illustration, and the scope of protection of the exemplary embodiment is not limited thereto.
In step S330, the included angle is mapped to a target angle when the controlled manipulator grabs the object, and the controlled manipulator is controlled to reach the target angle.
In the present exemplary embodiment, the object is the target object to be grasped by the controlled manipulator under human control. Because the human hand and the controlled manipulator differ structurally, the calculated included angle needs to be mapped to a target angle used to control the controlled manipulator, so that the controlled manipulator can be driven by human intelligence.
In the present exemplary embodiment, the target angle may be obtained, for example, as follows: a proportional-derivative controller is arranged on the controlled manipulator; the calculated included angle between the five fingers of the human hand and the back of the hand is acquired, and the corresponding target angle is obtained through mapping by the proportional-derivative controller.
The proportional-derivative controller is configured to map the calculated included angle between each finger and the back of the hand to the target angle. For a finger that is not yet in contact with the object, the proportional-derivative controller arranged in the controlled manipulator still converts the calculated included angle into a target angle and drives the controlled manipulator to reach it, so that each finger comes into contact with the object to be grabbed. It should be noted that the above scenario is only an exemplary illustration, and the scope of protection of the exemplary embodiment is not limited thereto.
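A minimal proportional-derivative mapping of the hand angle to a manipulator joint can be sketched as follows. The gains, sampling period, and the linear angle mapping are illustrative assumptions rather than values from the disclosure:

```python
class PDJointController:
    """Drives one manipulator joint toward the target angle mapped
    from the human-hand included angle."""

    def __init__(self, kp=5.0, kd=0.5, scale=1.0, dt=0.01):
        self.kp, self.kd = kp, kd  # proportional and derivative gains
        self.scale = scale         # linear hand-angle -> target-angle mapping
        self.dt = dt               # control period in seconds
        self.prev_err = 0.0

    def target_angle(self, hand_angle):
        """Map the finger/back-of-hand included angle to a joint target."""
        return self.scale * hand_angle

    def torque(self, hand_angle, joint_angle):
        """One PD step: torque from tracking error and its rate of change."""
        err = self.target_angle(hand_angle) - joint_angle
        d_err = (err - self.prev_err) / self.dt
        self.prev_err = err
        return self.kp * err + self.kd * d_err

ctrl = PDJointController()
tau = ctrl.torque(hand_angle=1.0, joint_angle=0.2)  # positive: finger closes
```

Calling `torque` once per sampling period yields the joint torque that moves each finger joint of the controlled manipulator toward the glove-commanded angle.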
In step S340, the controlled manipulator is controlled and adjusted until all of the expected contact points are detected to be in contact with the object.
In the object grabbing method provided by the present exemplary embodiment, after the controlled manipulator contacts the object in step S330, shared control enables the controlled manipulator to grab the object stably, bridging the gap between human intention and the execution of the expected task.
The above process may be implemented, for example, as follows: a tactile sensor is arranged at each finger of the controlled manipulator, and the joint torque of the controlled manipulator is adjusted until the object contacts all the expected contact points of the controlled manipulator.
In the above method, the joint torque may be adjusted as follows: after the controlled manipulator contacts the object in step S330, a shared controller processes the information from the tactile sensors placed on the fingers of the controlled manipulator to finely adjust the fingers, applying a predefined force that adjusts the joint torque in both magnitude and direction. That is, each finger may slide along the surface of the object to seek contact at the expected contact points that have not yet been reached. Meanwhile, the proportional-derivative controller continuously calculates the joint torque required by the joint angles commanded by the data glove. Through this shared control, when the controlled manipulator contacts the object to be grabbed, the tactile sensors arranged on the controlled manipulator, combined with the direction of finger movement, automatically increase the bending angle of the fingers, maximizing the contact area between the dexterous hand and the object and grasping the object stably. The tactile sensors may be replaced by other force-feedback devices that achieve the same function, which is not limited in this exemplary embodiment.
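One way to sketch this fine adjustment is to add a small predefined closing bias to a finger's commanded torque while any of its expected contact points remains free; the bias magnitude and the simple additive blend are assumptions used only to illustrate the shared-control idea:

```python
CLOSING_BIAS = 0.05  # assumed predefined force term (arbitrary units)

def adjusted_torque(pd_torque, contact_flags):
    """Return the commanded joint torque for one finger.

    contact_flags: one boolean per expected contact point of the finger,
    True once the tactile sensor there reports contact with the object.
    While any expected contact is missing, a closing bias makes the
    finger slide along the object surface toward more contact.
    """
    if all(contact_flags):
        return pd_torque              # all expected contacts made: hold
    return pd_torque + CLOSING_BIAS   # keep closing toward unmet contacts

tau_partial = adjusted_torque(1.0, [True, False, False])  # still closing
tau_full = adjusted_torque(1.0, [True, True, True])       # stable grasp
```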
It should be noted that the above scenario is only an exemplary illustration, and the scope of protection of the exemplary embodiment is not limited thereto.
Next, taking a specific example of the present exemplary embodiment as an example, the object grabbing method is further described, and as shown in fig. 4, the specific example includes the following steps:
step S410: and collecting data.
In the step, an inertial data glove is worn on a human hand playing a control role, and an inertial sensor is respectively arranged at the tip of the five fingers and the palm; and periodically reading the information of each inertial sensor at the T moment, wherein the information of the inertial sensor at the T moment comprises triaxial acceleration information, triaxial angular velocity data and a quaternion at the T moment.
Step S420: attitude solution.
In this step, the included angle between each finger of the human hand and the back of the hand at time T is calculated by quaternion operations. The calculation process has already been described in detail in step S320 and is therefore not repeated here.
Step S430: shared control.
In this step, expected contact points are first defined in the controlled manipulator, and the controlled manipulator is adjusted until it touches all of the preset expected contact points, which increases the contact area between the controlled manipulator and the object. The type and number of expected contact points can be customized for each grasping task. The manipulator used in this embodiment has five fingers, with tactile sensors provided on the palm side of each finger and on the side and top contact surfaces. Each finger has three phalanges, and the joint between adjacent phalanges can be torque-controlled independently. In the controlled manipulator, one expected contact point is defined for each phalanx of every finger except the thumb, and two expected contact points are defined for the thumb.
The controlled manipulator is provided with a proportional-derivative controller and a shared controller, which together realize shared control. The shared control comprises the following two parts:
(1) When the controlled manipulator is not in contact with any object, the proportional-derivative controller of the controlled manipulator maps the included angle between each finger of the human hand and the back of the hand, calculated in step S420, to the target angle of the controlled manipulator, and adjusts the joint torque so that the finger joints of the controlled manipulator reach the target angle.
(2) Under shared control, once a tactile sensor at a fingertip of the controlled manipulator touches the object, the shared controller applies joint torque in the direction of the expected contact points, so that more expected contact points of the controlled manipulator come into contact with the object; the contact area between the controlled manipulator and the object increases until all the expected contact points arranged in the controlled manipulator are in contact with the object, achieving a stable grasp.
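The two parts above can be combined into a single per-cycle rule for each finger joint; the proportional gain and bias magnitude below are illustrative assumptions, not values from the disclosure:

```python
def shared_step(target_angle, joint_angle, contact_hits, kp=4.0, bias=0.1):
    """One shared-control cycle for a finger joint.

    contact_hits: booleans for the finger's expected contact points.
    Part (1): track the glove-commanded target angle.
    Part (2): while expected contacts remain unmet, bias the joint
    torque toward the expected contact points.
    """
    tau = kp * (target_angle - joint_angle)   # part (1): angle tracking
    if contact_hits and not all(contact_hits):
        tau += bias                           # part (2): seek more contact
    return tau

# Free motion before any contact: pure tracking torque.
tau_free = shared_step(0.8, 0.5, contact_hits=[])
# All expected contact points touching: the bias vanishes and the grasp holds.
tau_hold = shared_step(0.8, 0.8, contact_hits=[True, True])
```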
In this embodiment, on the one hand, the included angle between the five fingers of the hand and the back of the hand is calculated from the collected data of the hand movement, a target angle required when the controlled manipulator grabs the object is obtained based on the included angle, and the controlled manipulator is then controlled to grab the object according to the target angle. On the other hand, after the manipulator is controlled to contact the object according to the target angle, the manipulator is adjusted until all of the expected contact points preset in the manipulator contact the object to be grabbed; setting the expected contact points increases the contact area between the manipulator and the object and improves grasping stability. Meanwhile, by adjusting the positions and the number of the expected contact points, the manipulator can be controlled to adjust its own gripping capability according to different task requirements.
It should be noted that although the various steps of the methods of the present disclosure are depicted in the drawings in a particular order, this does not require or imply that these steps must be performed in this particular order, or that all of the depicted steps must be performed, to achieve desirable results. Additionally or alternatively, certain steps may be omitted, multiple steps combined into one step execution, and/or one step broken down into multiple step executions, etc.
Further, in the present exemplary embodiment, an object grasping apparatus is also provided. As shown in fig. 5, the object grasping apparatus 500 may include an acquisition module 510, a calculation module 520, a mapping module 530, and a grasping module 540. Wherein:
the acquisition module 510 may be configured to acquire a desired contact point preset in the controlled manipulator and acquire motion data of the human hand in real time;
the calculation module 520 may be configured to calculate an included angle between the five fingers of the human hand and the back of the hand based on the motion data;
the mapping module 530 may be configured to map the included angle to a target angle when the controlled manipulator grabs an object, and control the controlled manipulator to reach the target angle;
the grasping module 540 may be configured to control and adjust the controlled manipulator until all of the expected contact points of the controlled manipulator are detected to be in contact with the object.
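One way to picture the division of apparatus 500 into these four modules is the sketch below, in which each module is an injected callable wired together in the order of the method steps; the class name and parameter names are illustrative assumptions, not from the disclosure:

```python
class ObjectGraspingApparatus:
    """Sketch of apparatus 500: four modules mirroring the method steps."""

    def __init__(self, acquisition, calculation, mapping, grasping):
        self.acquisition_module = acquisition  # 510: contact points + hand motion data
        self.calculation_module = calculation  # 520: finger/back-of-hand angles
        self.mapping_module = mapping          # 530: angle -> manipulator target angle
        self.grasping_module = grasping        # 540: adjust until all points contact

    def grasp(self):
        contact_points, motion_data = self.acquisition_module()
        angles = self.calculation_module(motion_data)
        self.mapping_module(angles)            # drive manipulator to target angles
        return self.grasping_module(contact_points)
```

Embodying two modules in one unit, or splitting one across several, stays within this structure, consistent with the note on module division below.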
The specific details of each module or unit in the object grasping apparatus have been described in detail in the corresponding object grasping method, and therefore are not repeated here.
It should be noted that although in the above detailed description several modules or units of the device for action execution are mentioned, such a division is not mandatory. Indeed, according to embodiments of the present disclosure, the features and functionality of two or more modules or units described above may be embodied in one module or unit. Conversely, the features and functions of one module or unit described above may be further divided so as to be embodied by a plurality of modules or units.
As another aspect, the present application also provides a computer-readable medium, which may be contained in the electronic device described in the above embodiments, or may exist separately without being assembled into the electronic device. The computer-readable medium carries one or more programs which, when executed by an electronic device, cause the electronic device to implement the method as described in the above embodiments. For example, the electronic device may implement the steps shown in fig. 3 to fig. 4, and the like.
It should be noted that the computer readable media shown in the present disclosure may be computer readable signal media or computer readable storage media or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In contrast, in the present disclosure, a computer-readable signal medium may include a propagated data signal with computer-readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.