CN114578720A - Control method and control system - Google Patents

Control method and control system

Info

Publication number
CN114578720A
Authority
CN
China
Prior art keywords
parameter
control
action
robot
controller
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202011386127.3A
Other languages
Chinese (zh)
Other versions
CN114578720B (en)
Inventor
Wang Yi (王毅)
Wang Songbai (王松柏)
He Fengguang (何烽光)
Wang Guangyan (王广炎)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hefei Sineva Intelligent Machine Co Ltd
Original Assignee
Hefei Sineva Intelligent Machine Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hefei Sineva Intelligent Machine Co Ltd filed Critical Hefei Sineva Intelligent Machine Co Ltd
Priority to CN202011386127.3A priority Critical patent/CN114578720B/en
Publication of CN114578720A publication Critical patent/CN114578720A/en
Application granted granted Critical
Publication of CN114578720B publication Critical patent/CN114578720B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05B: CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00: Programme-control systems
    • G05B19/02: Programme-control systems electric
    • G05B19/04: Programme control other than numerical control, i.e. in sequence controllers or logic controllers
    • G05B19/042: Programme control other than numerical control, i.e. in sequence controllers or logic controllers using digital processors
    • G05B19/0423: Input/output
    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05B: CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00: Program-control systems
    • G05B2219/20: Pc systems
    • G05B2219/25: Pc structure of the system
    • G05B2219/25257: Microcontroller

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Manipulator (AREA)

Abstract

The application discloses a control method and a control system, relates to the technical field of carrying equipment, and can reduce the difficulty for a robot device to execute actions. The control method includes the following steps: the robot device receives a first control instruction from a controller, where the first control instruction includes at least one control parameter and a first parameter value of the at least one control parameter, the first control instruction is used to instruct the robot device to perform a first action on a first object, the first parameter value of the at least one control parameter is used to indicate the pose in which the robot device performs the first action, and the at least one control parameter is the same for different actions. The robot device performs the first action on the first object according to the first control instruction. The embodiments of the application are applied to the process of carrying objects.

Description

Control method and control system
Technical Field
The present application relates to the field of transportation equipment technologies, and in particular, to a control method and a control system.
Background
Robotic devices refer to automated machinery that carries objects from one location to another. For example, the robot apparatus may perform various transport motions on the object in accordance with a control instruction of the controller.
In the prior art, the control command sent by the controller to the robot device generally takes the form of an address plus data content (data length + data value). The address can be used to identify the action to be executed, and the data content can be used to describe the pose in which the action is executed. For example, the control instruction may include a plurality of characters (e.g., bits). Different arrangements of these characters correspond to different actions and different action poses. However, when the robot needs to perform multiple actions, the control command sent by the controller needs to use many characters (for example, at least 43 bits may be used), and the characters must be arranged in the correct order. Only then can the robot device execute the corresponding action according to the control command.
Moreover, because a different set of characters and character values corresponds to each different action, the robot device must correctly parse every character and the value of every character to execute the action indicated by the control instruction, which increases the difficulty of executing actions.
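For illustration only, the following Python sketch shows how such a position-dependent frame might be parsed. The field names and widths are hypothetical (chosen so that they add up to the 43-bit example above) and are not taken from any actual protocol.

PRIOR_ART_LAYOUT = [
    ("address", 8),      # identifies the action to execute
    ("data_length", 8),  # length of the data value that follows
    ("data_value", 27),  # pose data; its meaning depends on the address
]

def parse_positional_frame(bits):
    # Every field must be read at exactly the right offset and in the right
    # order; a single misplaced character changes the decoded action.
    fields, offset = {}, 0
    for name, width in PRIOR_ART_LAYOUT:
        fields[name] = int(bits[offset:offset + width], 2)
        offset += width
    return fields

print(parse_positional_frame("0" * 43))  # 8 + 8 + 27 = 43 bits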
Disclosure of Invention
The application provides a control method and a control system, which can reduce the difficulty for a robot device to execute actions.
To achieve this purpose, the following technical solutions are adopted:
In a first aspect, the present application provides a control method applied to a robot device. The method includes: the robot device receives a first control instruction from a controller, where the first control instruction includes at least one control parameter and a first parameter value of the at least one control parameter, the first control instruction is used to instruct the robot device to perform a first action on a first object, the first parameter value of the at least one control parameter is used to indicate the pose in which the robot device performs the first action, and the at least one control parameter is the same for different actions. The robot device performs the first action on the first object according to the first control instruction.
Based on the technical solution of this application, the control instruction sent by the controller to the robot device can include at least one control parameter and a parameter value of the at least one control parameter. The control instruction may be used to instruct the robot device to perform a first action, and the parameter value of the at least one control parameter may be used to indicate the pose in which the robot device performs the first action. The at least one control parameter is the same for the control instructions corresponding to different actions. This means that the robot device only has to parse the parameter value(s) of the at least one control parameter in the control instruction to determine the action that needs to be performed. In the prior art, the robot device has to parse the arrangement order of each character and the value of each character in the control command; in the technical solution of this application, the robot device does not need to parse the characters corresponding to the control parameters, so the difficulty of executing actions by the robot device can be reduced.
In one possible implementation, the at least one control parameter comprises a first type of parameter, a parameter value of which is indicative of information of the first object, and a second type of parameter, a parameter value of which is indicative of an actuator of the robotic device performing the first action.
Based on the possible implementation manner, the robot device can accurately determine the position of the first object and the actuator to perform the first action according to the parameter value of the first type parameter and the parameter value of the second type parameter in the control command.
In a possible implementation manner, the first control instruction further includes a first numerical value, and the first numerical value is used for reflecting the action amplitude of the first action executed by the robot device.
Based on the possible implementation mode, the robot device can accurately determine the size, the strength, the torque and the like of the executed first action, and accurately control the action of the actuator.
In one possible implementation, the robotic device sends a first response message to the controller, the first response message indicating that the first action execution is complete.
Based on the possible implementation manner, after the robot device executes the first action, a response message of the completion of the execution of the first action may be sent to the controller, so that the controller accurately determines that the robot device executes the first action.
In one possible implementation manner, the robot device receives a second control instruction from the controller, where the second control instruction includes the at least one control parameter and a second parameter value of the at least one control parameter, the second control instruction is used to instruct the robot device to perform a second action on a second object, and the second parameter value is different from the first parameter value.
Based on the possible implementation mode, the robot equipment can determine the action corresponding to the control instruction only by analyzing the parameter values of the control parameters in different control instructions, and the complexity of analyzing the instruction by the robot equipment is reduced.
In a second aspect, a control device is provided, which may be a robot apparatus or may be a chip or a system on a chip of a robot apparatus, and which may include a communication unit and a processing unit.
The communication unit is used for receiving a first control instruction from the controller, the first control instruction comprises at least one control parameter and a first parameter value of the at least one control parameter, the first control instruction is used for instructing the robot device to execute a first action on a first object, the first parameter value of the at least one control parameter is used for instructing the robot device to execute a pose of the first action, and the at least one control parameter corresponding to different actions is the same.
And the processing unit is used for executing a first action on the first object according to the first control instruction.
In a third aspect, a control apparatus is provided, the control apparatus comprising a processor, a memory, and a communication interface; the communication interface is used for communication between the control device and other equipment; the memory is used for storing one or more programs, the one or more programs include computer-executable instructions, and when the control device runs, the processor executes the computer-executable instructions stored in the memory, so that the control device executes any one of the possible implementation manners of the first aspect and the first aspect.
In a fourth aspect, there is provided a computer readable storage medium having stored therein instructions which, when executed, implement the method of the first aspect.
In a fifth aspect, there is provided a computer program product comprising at least one instruction which, when run on a computer, causes the computer to perform the method of the first aspect.
In a sixth aspect, a chip is provided, the chip comprising at least one processor and a communication interface, the communication interface being coupled to the at least one processor, the at least one processor being configured to execute computer programs or instructions to implement the method of the first aspect.
In a seventh aspect, a control method is provided, which is applied to a controller, and includes: the controller generates a first control instruction, the first control instruction comprises at least one control parameter and a first parameter value of the at least one control parameter, the first control instruction is used for instructing the robot device to execute a first action on the first object, the first parameter value of the at least one control parameter is used for instructing the robot device to execute a pose of the first action, and the at least one control parameter corresponding to different actions is the same. The controller sends a first control instruction to the robotic device.
Based on the technical solution of the embodiments of the application, after the controller generates the control instruction for instructing the robot device to execute the first action, the controller may send the control instruction to the robot device, so that the robot device can perform the action corresponding to the control instruction on the object according to the control instruction. The parameter value of the at least one control parameter in the control instruction may be used to indicate the pose of the robot device. The at least one control parameter is the same for the control instructions corresponding to different actions. This means that the robot device only has to parse the parameter value(s) in the control instruction to determine the action that needs to be performed. In the prior art, the robot device has to parse the arrangement order of each character and the value of each character in the control command; in the technical solution of this application, the robot device does not need to parse the characters corresponding to the control parameters, so the difficulty of executing actions by the robot device can be reduced.
In one possible implementation, the at least one control parameter comprises a first type of parameter, a parameter value of which is indicative of information of the first object, and a second type of parameter, a parameter value of which is indicative of an actuator of the robotic device performing the first action.
Based on this possible implementation manner, the robot device can accurately determine the position of the first object and the actuator that is to perform the first action according to the parameter value of the first type parameter and the parameter value of the second type parameter in the control command.
In one possible implementation, before the controller sends the first control instruction to the robot device, the controller determines that the robot device is located at an initial position and the robot device is in an idle state.
Based on this possible implementation manner, the controller sends the control instruction to the robot device only after determining that the robot device is located at the initial position and is in the idle state, so that the robot device is in a suitable state before executing the first action.
In a possible implementation manner, the controller sends a second control instruction to the robot device, where the second control instruction includes at least one control parameter and a second parameter value of the at least one control parameter, the second control instruction is used to instruct the robot device to perform a second action on a second object, the second parameter value of the at least one control parameter is used to instruct the robot device to perform a pose of the second action, and the first parameter value is different from the second parameter value.
Based on this possible implementation manner, the robot device can determine the action corresponding to each control instruction simply by parsing the parameter values of the control parameters in different control instructions, which reduces the complexity of parsing instructions.
In an eighth aspect, a control device is provided, which may be a controller or may be a chip or a system on a chip of a controller, and which may include a communication unit and a processing unit.
The processing unit is configured to generate a first control instruction, where the first control instruction includes at least one control parameter and a first parameter value of the at least one control parameter, the first control instruction is used to instruct the robot device to perform a first action on a first object, the first parameter value of the at least one control parameter is used to indicate the pose in which the robot device performs the first action, and the at least one control parameter is the same for different actions.
And the communication unit is used for sending the first control instruction to the robot equipment.
In a ninth aspect, there is provided a control apparatus comprising a processor, a memory, and a communication interface; the communication interface is used for communication between the control device and other equipment; the memory is used for storing one or more programs, the one or more programs include computer-executable instructions, and when the control device runs, the processor executes the computer-executable instructions stored in the memory, so that the control device executes any one of the possible implementation manners of the eighth aspect and the eighth aspect.
A tenth aspect provides a computer-readable storage medium having stored thereon instructions which, when executed, implement the method of the eighth aspect.
In an eleventh aspect, there is provided a computer program product comprising at least one instruction which, when run on a computer, causes the computer to perform the method of the eighth aspect.
In a twelfth aspect, a chip is provided, where the chip includes at least one processor and a communication interface, the communication interface is coupled to the at least one processor, and the at least one processor is configured to execute a computer program or instructions to implement the method of the eighth aspect.
In a thirteenth aspect, a control system is provided. The control system includes a robotic device and a controller. The robot apparatus is configured to execute any one of the possible control methods according to the first aspect and the first aspect, and the controller is configured to execute any one of the possible control methods according to the eighth aspect and the eighth aspect.
The control apparatus, the computer-readable storage medium, the computer program product, the chip, or the control system provided above are all configured to execute the corresponding methods provided above. Therefore, for the beneficial effects they can achieve, reference may be made to the beneficial effects of the corresponding solutions in the corresponding methods provided above, which are not described herein again.
Drawings
Fig. 1 is a schematic diagram of a control system according to an embodiment of the present disclosure;
fig. 2 is a schematic diagram of a control device 200 according to an embodiment of the present disclosure;
fig. 3 is a schematic flowchart of a control method according to an embodiment of the present application;
fig. 4 is a schematic flowchart of another control method provided in the embodiment of the present application;
fig. 5 is a timing diagram of information interaction between a controller and a robot device according to an embodiment of the present disclosure;
FIG. 6 is a timing diagram illustrating another example of information interaction between a controller and a robotic device according to an embodiment of the present disclosure;
fig. 7 is a schematic structural diagram of another control device 70 according to an embodiment of the present disclosure;
fig. 8 is a schematic structural diagram of another control device 80 according to an embodiment of the present application;
fig. 9 is a schematic structural diagram of a control system according to an embodiment of the present application.
Detailed Description
With the development of robotics, owing to the operability and sustainability of robots, more and more industries have introduced robots to replace manual labor. For example, in the transportation field, transporting an object with a robot is safer and more stable than transporting it manually.
In one example, fig. 1 illustrates a control system provided in an embodiment of the present application, where the control system includes a controller and a robot device. The controller is in communication with the robot device. For example, the communication connection may be wired (e.g., Universal Serial Bus (USB), Type-C, a communication bus, or Ethernet) or wireless (e.g., Bluetooth or wireless fidelity (Wi-Fi)). Of course, the controller and the robot device may also be connected through other forms of communication, for example, a fifth-generation (5G) communication network or other evolved communication networks, and the like, without limitation.
Wherein the controller may be configured to generate and send to the robotic device control instructions for instructing the robotic device to perform a certain action. For example, the controller may be a PC, or may be a PLC or a touch screen external to the robot apparatus. The controller may also be referred to as a robot controller, a robot operating handle, or the like, without limitation. The controller may generate the control instruction in response to an operation instruction of a worker.
The robot device can be used for receiving a control command from the controller and executing an action corresponding to the control command and the pose of the action. For example, the robotic device may be a transfer robot that may be used to transfer flat sheet materials within the cassette.
The action performed by the robot device may refer to an action performed on an object. The material may be a plate material (e.g., a substrate), a cassette, etc., without limitation. The action may include translation, lifting, laying, pushing, etc.
The pose of the action may refer to a position and a posture of the robot device when executing the action corresponding to the control instruction.
For example, a robot device may include one or more actuators. The one or more actuators may perform the action corresponding to the control command. For example, the one or more actuators may be pallets. When the control instruction is used to instruct moving the flat plate material from position A to position B, the one or more actuators may execute the following steps in response to the control instruction: move to A → lift the plate material → move to B → place the plate material. Here the action executed by the robot device is translation, and the pose of the action includes the lifting and the placing.
In one example, a robotic device may be provided with a plurality of sensors and a plurality of motors. The plurality of sensors may detect whether the object is in a balanced state when the robot device carries the object using the two pallets. If the object is not in the balanced state, the motors can be controlled to work to adjust the position of the supporting plate so as to enable the object to be in the balanced state.
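As a rough illustration of this balancing behaviour, the following Python sketch uses hypothetical sensor and motor interfaces; TiltSensor, PalletMotor, and their methods are placeholders, not interfaces defined by this application.

class TiltSensor:
    def read_tilt_deg(self):
        # Stub sensor: returns a fixed tilt value for the demonstration.
        return 2.0

class PalletMotor:
    def adjust_height(self, delta_mm):
        # Stub motor: just reports the adjustment it was asked to make.
        print(f"adjust pallet height by {delta_mm:.1f} mm")

def keep_object_balanced(tilt_sensors, pallet_motors, tolerance_deg=1.0):
    # Read the tilt reported by each sensor and, if the carried object is not
    # level within tolerance, command the matching pallet motor to correct it.
    for sensor, motor in zip(tilt_sensors, pallet_motors):
        tilt = sensor.read_tilt_deg()
        if abs(tilt) > tolerance_deg:
            motor.adjust_height(-tilt * 0.5)  # simple proportional correction

keep_object_balanced([TiltSensor(), TiltSensor()], [PalletMotor(), PalletMotor()])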
Usually, the controller and the robot device exchange information in the form of an address plus data content (data length + data value). Different addresses identify different actions, and the data content of each action represents the pose in which the action is specifically performed. However, when the pose of an action is complex, the data content corresponding to the pose is also long; for example, it may occupy 43 bits. In addition, the bits in the data content must be arranged in the correct order so that the robot device can correctly parse and execute the action. As a result, the information exchanged between the controller and the robot device is complex, which increases the difficulty for the robot to execute actions.
In some scenarios, for example when the robot device has limited computing capability, if the control command sent by the controller is too complex, the robot device may need a long time to parse it, and in more serious cases may parse it incorrectly, causing an accident.
In addition, if the information exchanged between the controller and the robot device is to be expanded, for example if more actions are to be performed by the robot device, new addresses and the data content corresponding to those addresses must be added, which further increases the difficulty of expansion.
In view of this, an embodiment of the present application provides a control method, where the method includes: the controller generates a first control instruction, the first control instruction is used for instructing the robot to execute a first action on the first object, the first control instruction can comprise at least one control parameter and a parameter value of the at least one control parameter, and the at least one control parameter of different control instructions corresponds to different parameter values. After receiving the first control instruction, the robot may execute a first action according to the first control instruction.
Based on the technical solution provided by the application, after the controller generates the control instruction for instructing the robot device to execute the first action, the controller can send the control instruction to the robot device, so that the robot device can perform the action corresponding to the control instruction on the object according to the control instruction. The parameter value of the at least one control parameter in the control instruction may be used to indicate the pose of the robot device. The at least one control parameter is the same for the control instructions corresponding to different actions. This means that the robot device only needs to parse the parameter value(s) in the control command to determine the action to be performed, without parsing the order of each character in the control command. Therefore, the technical solution provided by the embodiments of the application can reduce the difficulty for the robot device to execute actions.
Furthermore, if the set of actions the robot can execute is to be expanded, only new parameter values of the control parameters need to be added; no new control parameters are required, which facilitates data expansion. Meanwhile, a worker can select and use only part of the at least one control parameter, which is flexible and convenient.
A control method provided in an embodiment of the present application will be described in detail below with reference to the accompanying drawings.
Fig. 2 is a schematic composition diagram of a control apparatus 200 according to an embodiment of the present disclosure. The control apparatus 200 may be a controller, or a chip or system-on-chip in the controller; alternatively, it may be a robot device, or a chip or system-on-chip in the robot device. As shown in fig. 2, the control apparatus 200 includes a processor 201, a communication interface 202, and a communication line 203.
Further, the control device 200 may further include a memory 204. The processor 201, the memory 204 and the communication interface 202 may be connected via a communication line 203.
The processor 201 may be a central processing unit (CPU), a general-purpose processor, a network processor (NP), a digital signal processor (DSP), a microprocessor, a microcontroller, a programmable logic device (PLD), or any combination thereof. The processor 201 may also be another device with a processing function, such as a circuit, a device, or a software module, without limitation.
A communication interface 202 for communicating with other devices or other apparatuses. The communication interface 202 may be a module, a circuit, a communication interface, or any device capable of enabling communication.
A communication line 203 for transmitting information between the components included in the control apparatus 200.
A memory 204 for storing instructions. Wherein the instructions may be a computer program.
The memory 204 may be a read-only memory (ROM) or other types of static storage devices that can store static information and/or instructions, a Random Access Memory (RAM) or other types of dynamic storage devices that can store information and/or instructions, an electrically erasable programmable read-only memory (EEPROM), a compact disc read-only memory (CD-ROM) or other optical disc storage, optical disc storage (including compact disc, laser disc, optical disc, digital versatile disc, blu-ray disc, etc.), a magnetic disc storage medium or other magnetic storage devices, and the like, without limitation.
It is noted that the memory 204 may exist separately from the processor 201 or may be integrated with the processor 201. The memory 204 may be used for storing instructions, program code, some data, etc. The memory 204 may be located inside the control apparatus 200 or outside the control apparatus 200, which is not limited. The processor 201 is configured to execute the instructions stored in the memory 204 to implement the control method provided by the following embodiments of the present application.
In one example, processor 201 may include one or more CPUs, such as CPU0 and CPU1 in fig. 2.
As an alternative implementation, the control apparatus 200 may include a plurality of processors; for example, in addition to the processor 201 in fig. 2, it may further include a processor 207.
As an alternative implementation, the control apparatus 200 further comprises an output device 205 and an input device 206. Illustratively, the input device 206 is a keyboard or the like, and the output device 205 is a display or the like.
It is noted that the structure shown in fig. 2 does not constitute a limitation of the control apparatus. In addition to the components shown in fig. 2, the control apparatus may include more or fewer components, may combine some components, or may arrange the components differently.
In the embodiment of the present application, the chip system may be composed of a chip, and may also include a chip and other discrete devices.
In addition, acts, terms, and the like referred to between the embodiments of the present application may be mutually referenced and are not limited. In the embodiment of the present application, the name of the message exchanged between the devices or the name of the parameter in the message, etc. are only an example, and other names may also be used in the specific implementation, which is not limited.
In the embodiments of the present application, terms such as "first" and "second" are used to distinguish items that are substantially the same or similar in function and effect. For example, a first parameter value and a second parameter value are only used to distinguish different parameter values, and their order is not limited. Those skilled in the art will appreciate that the terms "first", "second", etc. do not denote any order, quantity, or importance.
It is noted that, in the present application, words such as "exemplary" or "for example" are used to mean exemplary, illustrative, or descriptive. Any embodiment or design described herein as "exemplary" or "e.g.," is not necessarily to be construed as preferred or advantageous over other embodiments or designs. Rather, use of the word "exemplary" or "such as" is intended to present concepts related in a concrete fashion.
In the present application, "at least one" means one or more, "a plurality" means two or more. "and/or" describes the association relationship of the associated objects, meaning that there may be three relationships, e.g., a and/or B, which may mean: a exists alone, A and B exist simultaneously, and B exists alone, wherein A and B can be singular or plural. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship. "at least one of the following" or similar expressions refer to any combination of these items, including any combination of the singular or plural items. For example, at least one (one) of a, b, or c, may represent: a, b, c, a-b, a-c, b-c, or a-b-c, wherein a, b, c may be single or multiple.
The following describes a control method provided in an embodiment of the present application with reference to the control system shown in fig. 1. The control apparatus described in the following embodiments may include the components shown in fig. 2, and details are not repeated. In this application, the actions, terms, and the like involved in the embodiments may refer to one another and are not limited. In the embodiments of the present application, the names of the messages exchanged between devices or the names of parameters in the messages are only examples; other names may also be used in specific implementations, which is not limited. The actions described in the embodiments are also only examples, and other names may be used in specific implementations; for example, "including" in the embodiments of the present application may also be replaced by "carrying".
As shown in fig. 3, the control method provided in the embodiment of the present application may include:
step 301, the controller generates a first control command.
Wherein the controller may be the controller in fig. 1.
Wherein the first control instruction may be for instructing the robotic device to perform a first action on the first object. For example, the first control instruction may be used to instruct the robotic device to carry the flat material from a to B.
For example, the first control instruction may include at least one control parameter (which may also be referred to as a parameter address) and a first parameter value of the at least one control parameter. The at least one control parameter may include, for example, an action name and an object. The parameter value of the at least one control parameter may be used to indicate the pose in which the robot device performs the first action on the first object.
In one example, the at least one control parameter may include a first type of parameter (which may also be referred to as a generic parameter) and a second type of parameter (which may also be referred to as a specific parameter). The parameter value of the first type parameter may be used to indicate information of the first object; the information may include the type, location, size, the actuator to move the object, etc. The parameter value of the second type parameter may be used to indicate information of the actuator of the robot device that performs the first action. For example, the actuator may be a sensor, and the information of the sensor may include its number, location, type, whether it is on, etc.
For example, the first type parameter and the second type parameter may be character strings or numbers, or may be a combination of characters and numbers. Without limitation. For example, the first type of parameter may be 1-xxx and the second type of parameter may be 2-xx.
In connection with the above example, when the first control instruction is used to instruct the robot device to move the flat material from A to B, the first type parameter includes information of the flat material, such as its initial position (A), the number of the cassette where it is located, its size and quantity, its orientation relative to the robot device, the destination position (B), the distance to be moved, and the like. The second type parameter may include whether the robot needs to use a driver when handling the flat material, which position of the driver needs to be used, the type and number of the drivers used, and the like.
The first type parameter and the second type parameter may be parameters in units of bits. In this way, seamless and safe communication between the controller and the robot device can be ensured.
Further, the first control instruction may further include a first value, and the first value may be used to indicate a motion amplitude of the robot device performing the first motion.
The motion amplitude may refer to the magnitude of the motion, for example, a motor torque value used by the driver, a moving speed, a moving distance, a translation height, a rotation angle, a force for lifting/grabbing an object, a time for keeping the motion, and the like.
The first value may be a value in units of a word. In this way, data transmission between the robot device and the controller can be facilitated.
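As a minimal Python sketch, the instruction described above might be represented as follows. The field names and the exact grouping into general parameters, specific parameters, and an amplitude value are illustrative assumptions, not the parameter names used by this application.

from dataclasses import dataclass, field

@dataclass
class ControlInstruction:
    # First-type (general) parameters: information about the first object.
    object_type: int = 0          # e.g. single-layer vs multi-layer cassette
    cassette_number: int = 0
    object_size_code: int = 0
    source_position: str = "A"
    target_position: str = "B"
    # Second-type (specific) parameters: information about the driver/actuator.
    driver_on: bool = False
    driver_type: int = 0
    driver_position: int = 0
    # First numerical value: motion amplitude expressed in word-sized values.
    amplitude: dict = field(default_factory=dict)

cmd = ControlInstruction(object_type=1, cassette_number=3, driver_on=True,
                         amplitude={"speed": 200, "lift_height": 40})
print(cmd)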
In one possible implementation, the controller may generate the first control instruction in response to an input operation by a worker.
For example, the worker may input, through a touch screen or physical keys of the controller, an instruction for instructing the robot device to perform a first action on a first object. For example, the entered instruction may include information of the first object and the number of the action to perform. In response to the instruction, the controller may determine the first parameter value of the at least one control parameter in the first control instruction according to a preset correspondence. The preset correspondence may include the number of the first action, the information of the first object, and the information of the driver. The number of the first action may be consistent with, or correspond one-to-one to, the parameter values of the instruction parameter in table 1 below.
The preset corresponding relationship may be preset in the controller. Each of the number information of the first action, the information of the first object, and the information of the driver in the preset correspondence may be represented by one or more bits. The value of one or more bits for different actions is different. One or more bits of information of different objects are different. One or more bits of information are different for different drives. Specifically, it can be shown in table 1 below, and is not described herein again.
Step 302, the controller sends a first control instruction to the robotic device. Accordingly, the robot device receives a first control instruction from the controller.
Wherein the robot device may be the robot device in fig. 1.
In one possible implementation, the controller may send the first control instruction to the robot device according to a preset address range. In other words, the first control instruction sent by the controller to the robot device may belong to a preset address range, i.e., the address carried by the first control instruction belongs to the preset address range. The addresses corresponding to different control instructions are different. Accordingly, after receiving the first control instruction from the controller, the robot device may verify whether the first control instruction is within the preset address range. If the first control instruction is within the preset address range, the robot device can parse it; if the first control instruction is not within the preset address range, the robot device may discard it. In this way, the robot device can be prevented from being illegally controlled by other controllers.
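A minimal sketch of this address-range check follows. The concrete range is borrowed from the X1000 to X10FF example given later in this description; everything else is an assumption for illustration.

PRESET_ADDRESS_RANGE = range(0x1000, 0x1100)  # example preset range

def parse_instruction(payload):
    # Placeholder for parsing the control parameters and parameter values.
    return {"raw": payload.hex()}

def handle_control_instruction(address, payload):
    if address not in PRESET_ADDRESS_RANGE:
        # Outside the preset range: treat it as coming from an unauthorized
        # controller and discard the instruction.
        return None
    return parse_instruction(payload)

print(handle_control_instruction(0x1003, b"\x02\x03"))  # parsed
print(handle_control_instruction(0x2000, b"\x02\x03"))  # discarded -> None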
Specifically, the preset address range of the controller and the preset address range of the robot device may refer to the following description in table 2, which is not repeated herein.
Step 303, the robot device executes a first action on the first object according to the first control instruction.
The first object may be a cassette or a substrate in the cassette. The first action may include grasping, handling, translating, and the like. Without limitation.
In a possible implementation manner, after receiving the first control instruction, the robot device may determine the first action according to the correspondence between control instructions and actions. This correspondence can be referred to in table 1 below and is not repeated herein. After determining the parameter value of each sub-parameter in the first control instruction, the robot device first determines the information of the first object, for example, its position, number, and orientation. The robot device then determines the driver to perform the first action. In this way, the robot device may perform the first action on the first object by controlling the driver. The specific process by which the robot executes the action may refer to the prior art and is not described in detail.
Further, before the robot device performs the first action on the first object according to the first control instruction, the robot device may further interact with the controller to verify whether the control parameter in the first control instruction received by the robot device is correct.
Specifically, after receiving the first control instruction, the robot device parses it to obtain the at least one control parameter and the corresponding parameter value carried in the first control instruction. The robot device then sends a first verification message to the controller, where the first verification message is used to verify whether the at least one control parameter and the corresponding parameter value received by the robot are consistent with the at least one control parameter and the corresponding parameter value in the first control instruction sent by the controller. If they are consistent, the controller can send a first indication message to the robot device, so that the robot device performs the first action on the first object according to the first control instruction; if they are not consistent, the controller may send a second indication message to the robot device to make the robot device stop performing the first action on the first object. The controller can also directly output alarm information.
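The verification exchange might look roughly like the following sketch. The message names and the Link transport are assumptions for illustration, not messages defined by this application.

class Link:
    # Stub transport standing in for the connection to the controller.
    def send(self, msg):
        self.last_sent = msg
    def receive(self):
        return {"type": "first_indication"}  # pretend the controller confirmed

def robot_verify_with_controller(received_params, link):
    # Echo the parsed control parameters and parameter values back to the
    # controller so it can check them against what it actually sent.
    link.send({"type": "verify", "params": received_params})
    reply = link.receive()
    if reply.get("type") == "first_indication":
        return True   # parameters match: proceed with the first action
    return False      # second indication (or alarm): stop performing the action

print(robot_verify_with_controller({"instruction": 2, "sub2": 3}, Link()))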
In the process of information interaction (for example, remote information interaction) between the robot device and the controller, in order to prevent the robot device from being attacked by other devices or prevent control instructions from being tampered with, the robot device and the controller may perform information interaction in an encrypted manner. For example, encryption may be performed in the form of a public and private key. The public and private keys can refer to the prior art and are not described in detail.
Based on the solution of fig. 3, the control instruction sent by the controller to the robot device may include at least one control parameter and a parameter value of the at least one control parameter. The control instruction may be used to instruct the robot device to perform a first action, and the parameter value of the at least one control parameter may be used to indicate the pose in which the robot device performs the first action. The at least one control parameter is the same for the control instructions corresponding to different actions. This means that the robot device only has to parse the parameter value(s) in the control instruction to determine the action that needs to be performed. In the prior art, the robot device has to parse the arrangement order of each character and the value of each character in the control command; in the technical solution of this application, the robot device does not need to parse the characters corresponding to the control parameters, so the difficulty of executing actions by the robot device can be reduced.
In a possible implementation manner, each of the first type parameter and the second type parameter in the first control instruction may include a plurality of sub-parameters, and each sub-parameter may be configured with one or more bits.
For example, the first type of parameter may include an instruction parameter and sub-parameters 1 to 9. The instruction parameter and each sub-parameter may be configured with one or more bits. For example, the instruction parameter may be configured with 6 bits, sub-parameter 1 with 1 bit, sub-parameter 2 with 5 bits, sub-parameter 3 with 6 bits, sub-parameter 4 with 5 bits, sub-parameter 5 with 2 bits, sub-parameter 6 with 1 bit, sub-parameter 7 with 4 bits, sub-parameter 8 with 6 bits, and sub-parameter 9 with 3 bits. The second type of parameter may include sub-parameters 10 to 13, each of which may be configured with 1 bit.
A parameter value of the instruction parameter may be used to represent an action, and the parameter values of the instruction parameter corresponding to different actions are different. For example, with the instruction parameter configured with 6 bits, the robot device may perform 63 actions, and each action has a unique bit pattern. For example, the parameter value of the instruction parameter corresponding to action 1 may be 000001, that corresponding to action 2 may be 000010, and that corresponding to action 63 may be 111111.
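A minimal sketch of this 6-bit instruction-parameter encoding, assuming values 000001 through 111111 map to actions 1 through 63:

def encode_action_number(action_number):
    # Encode an action number as the 6-bit instruction parameter.
    if not 1 <= action_number <= 63:
        raise ValueError("a 6-bit instruction parameter encodes actions 1 to 63")
    return format(action_number, "06b")

assert encode_action_number(1) == "000001"
assert encode_action_number(2) == "000010"
assert encode_action_number(63) == "111111"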
In one example, the preset correspondence may be configured in the form of a table between the controller and the robot device. Of course, the controller and the robot device may be configured in other forms, for example, in an array form, without limitation. For example, the preset correspondence relationship may be as shown in table 1.
TABLE 1
(Table 1 is reproduced as an image in the original publication. It lists, for each action number, the value of the instruction parameter and the values or value ranges of sub-parameters 1 to 13.)
In table 1, the parameter values of sub-parameters 1 to 13 may be the same, partially the same, or different for different actions. For example, both action 2 and action 3 may be actions in which the robot device grasps the substrate, and the parameter values of sub-parameters 1 to 13 corresponding to action 2 and action 3 are the same. However, the instruction parameters of action 2 and action 3 are different, which means the robot device performs them differently. For example, action 2 may represent that the robot device directly grasps the substrate, while action 3 may represent that the robot device prepares to grasp the substrate. For example, upon receiving a control instruction instructing action 3 to be performed on the substrate, the robot device may move to the substrate and confirm that the substrate is ready to be grasped. After a preset time, or after receiving the next command from the controller, the robot device may then perform the grasping action.
In table 1, the parameter values of sub-parameters 6 to 9 for action 2 and action 3 are "x", which means that these values may be 0, that the controller does not need to assign values to sub-parameters 6 to 9 when sending instructions for action 2 or action 3, or that these values may be randomly generated. Correspondingly, when receiving a control command instructing action 2 or action 3, the robot device does not need to parse sub-parameters 6 to 9 or their parameter values.
It should be noted that the value range of each sub-parameter in table 1 means that the parameter value of the sub-parameter may be any value within that range. For example, if the value range of sub-parameter 2 corresponding to action 1 is 1 to 15, the parameter value of sub-parameter 2 corresponding to action 1 may be any value (in binary) from 1 to 15, for example 10 (01010). In other words, the sub-parameter corresponding to action 1 can take any binary value from 1 to 15.
The instruction parameter, sub-parameters 1 to 9 of the first type parameter, and sub-parameters 10 to 13 of the second type parameter in table 1 are described below with reference to specific examples.
1. The instruction parameter (occupying 6 bits) can be used to represent the action issued by the controller to the robot. A 6-bit address provides 2^6 = 64 values, so the robot device may be instructed to perform up to 63 actions. Of course, if the number of actions the robot needs to perform is larger, more bits can be allocated, such as 7 bits or 8 bits, without limitation.
2. Sub-parameter 1 (occupying 1 bit) may be used to indicate whether the first object is a single-layer cassette or a multi-layer cassette. One bit covers the two cases (e.g., 0 for a single-layer cassette and 1 for a multi-layer cassette).
3. Sub-parameter 2 (occupying 5 bits) may be used to indicate the number of the cassette; different cassettes correspond to different numbers. For example, 00001 denotes cassette number 1, 00010 denotes cassette number 2, and 00011 denotes cassette number 3. In this way, many differently numbered cassettes can be indicated. If 5 bits cannot cover all cassette numbers, more bits can be allocated, without limitation.
4. Sub-parameter 3 (occupying 6 bits) may be used to indicate the number of layers of the cassette. For example, 001010 denotes that the cassette has 10 layers, and 110010 denotes that it has 50 layers. If 6 bits cannot represent the number of layers of the cassette, more bits can be allocated, without limitation.
5. Sub-parameter 4 (occupying 5 bits) can be used to indicate the size (length and width) of the substrate placed in the cassette. Cassettes holding substrates of different sizes have different numbers. For example, if the length and width of the substrate are 700 mm × 800 mm, the number of the corresponding cassette is 1 and sub-parameter 4 is 00001; if the length and width of the substrate are 800 mm × 900 mm, the number of the corresponding cassette is 2 and sub-parameter 4 is 00010.
6. The sub-parameter 5 (occupying 2 bits) may be used to represent an actuator used by the robotic device to perform a certain action on the first object. For example, the robot apparatus has two actuators (actuator 1, actuator 2, respectively). If the robot device needs the actuator 1 and does not need the actuator 2 when performing a certain action on the first object, the parameter value of the sub-parameter 5 may be 01; if the robot device does not need the actuator 1 and needs the actuator 2 when performing the action on the first object, the parameter value of the sub-parameter 5 may be 10; if the robot needs the actuator 1 and the actuator 2 to perform the motion on the first object, the parameter value of the sub-parameter 5 may be 11.
7. A sub-parameter 6 (of 1bit) may be used to indicate whether or not an action is to be performed immediately. For example, if an action requires immediate execution of the action, the parameter value of the sub-parameter 6 is 1; if the action does not need to be performed immediately, the parameter value of the sub-parameter 6 is 0.
8. Sub-parameter 7 (occupying 4 bits) may be used to indicate the direction of the first object relative to the robot device. For example, if the parameter value of sub-parameter 7 is 1000, the first object is located in a first direction (e.g., in front) of the robot; if it is 0100, the first object is located in a second direction (e.g., behind); if it is 0010, the first object is located in a third direction (e.g., to the right) of the robot device; if it is 0001, the first object is located in a fourth direction (e.g., to the left) of the robot device.
9. Sub-parameter 8 (occupying 6 bits) may be used to represent the number of first objects. For example, 000001 indicates that there is 1 first object; 000100 indicates that there are 4 first objects.
10. A sub-parameter 9 (of 3 bits) may be used to indicate the distance the first object needs to be moved. For example, 100 means that the first object needs to move 4 meters (m); 010 means that the first object needs to be moved by 2 m. Or it may be used to indicate the destination position of the first object movement. For example, the target position may include a direction of movement and a distance moved in that direction.
11. A sub-parameter 10 (1 bit) may be used to indicate whether the drive (e.g. sensor, motor) is on or off. If the parameter value of the sub-parameter 10 is 1, it indicates to turn on the driver; if the sub-parameter 10 is 0, it means that the driver does not need to be turned on.
12. A sub-parameter 11 (of 1bit) may be used to indicate whether the robot needs to adjust the position. If the parameter value of the sub-parameter 11 is 1, indicating that the position needs to be adjusted; if the sub-parameter 11 is 0, it indicates that the position adjustment is not required.
13. The sub-parameter 12 (accounting for 1bit) may be used to indicate the type of drive used by the robotic device. If the parameter value of the sub-parameter 12 is 1, it indicates that the type of the driver is the first type (e.g. gravity sensor); if the parameter value of the sub-parameter 12 is 0, it indicates that the type of the driver is the second type (e.g. temperature sensor).
14. A sub-parameter 13 (accounting for 1bit) may be used to indicate which position of the drive the robot uses. If the parameter value of the sub-parameter 13 is 1, it indicates that the driver in the first position is used; if the parameter value of the sub-parameter 13 is 0, it indicates that the driver in the second position is used.
The above sub-parameters are only exemplary; the meaning of each sub-parameter, the value range of its parameter value, and the number of bits configured for it may be set as required, without limitation. A packing and unpacking sketch based on this example layout is given below.
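Building on the example bit widths above, the following sketch packs and unpacks such an instruction. Only the widths come from the description; the ordering of the fields within the frame and the example values are assumptions.

FIELD_WIDTHS = [
    ("instruction", 6),
    ("sub1", 1), ("sub2", 5), ("sub3", 6), ("sub4", 5), ("sub5", 2),
    ("sub6", 1), ("sub7", 4), ("sub8", 6), ("sub9", 3),
    ("sub10", 1), ("sub11", 1), ("sub12", 1), ("sub13", 1),
]

def pack(values):
    # Concatenate every field into one bit string; missing fields default to 0.
    return "".join(format(values.get(name, 0), f"0{width}b")
                   for name, width in FIELD_WIDTHS)

def unpack(bits):
    # Read each field back at its fixed width.
    out, offset = {}, 0
    for name, width in FIELD_WIDTHS:
        out[name] = int(bits[offset:offset + width], 2)
        offset += width
    return out

# Example: action 2 (grasping a substrate) on cassette number 3 using actuator 1.
frame = pack({"instruction": 2, "sub2": 3, "sub5": 1})
assert unpack(frame)["instruction"] == 2
print(frame)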
Based on the control commands in table 1, if an action needs to be added for the robot device, only a new instruction-parameter number needs to be added, together with the sub-parameters corresponding to that number and their parameter values. The control commands corresponding to other actions are not affected.
In another possible implementation manner, in order to ensure that the action indicated by the control command sent by the controller is consistent with the action the robot executes according to that command, the address range of the controller's output signals corresponds one-to-one to the address range of the signals received by the robot device. The address of the controller can uniquely identify the controller, and the address of the robot device can uniquely identify the robot device.
In one example, when the controller sends a control instruction to the robot device, the output signal of the controller (the signal sent by the controller to the robot device) may be configured with an address range of X1000 to X10FF (hexadecimal), which is 256 points; the input signal of the robot device (the signal received by the robot from the controller) can be configured in an address range of 2000-2255 (decimal), and the number of the input signals is 256.
In still another example, when the robot device sends a signal of the execution state to the controller, the output signal of the robot device (the signal sent by the robot to the controller) may be configured with an address range of 1000 to 1255 (decimal) for a total of 256 points; the input signals to the controller (from the robotic device received by the controller) may be configured to have an address range of Y1000 to Y10FF (hexadecimal) for a total of 256 points.
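A minimal sketch of these example address ranges follows. The simple offset-based mapping between controller output addresses and robot input addresses is an assumption; only the ranges themselves come from the text.

CONTROLLER_OUTPUT_RANGE = range(0x1000, 0x1100)  # X1000-X10FF, 256 points
ROBOT_INPUT_RANGE = range(2000, 2256)            # 2000-2255, 256 points
ROBOT_OUTPUT_RANGE = range(1000, 1256)           # 1000-1255, 256 points
CONTROLLER_INPUT_RANGE = range(0x1000, 0x1100)   # Y1000-Y10FF, 256 points

def map_controller_output_to_robot_input(controller_addr):
    # One-to-one mapping between a controller output address and the
    # corresponding robot-device input address.
    if controller_addr not in CONTROLLER_OUTPUT_RANGE:
        raise ValueError("address outside the controller's output range")
    return ROBOT_INPUT_RANGE.start + (controller_addr - CONTROLLER_OUTPUT_RANGE.start)

assert map_controller_output_to_robot_input(0x1000) == 2000
assert map_controller_output_to_robot_input(0x10FF) == 2255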
For example, the correspondence relationship between the signal address of the controller and the signal address of the robot apparatus may be as shown in table 2.
TABLE 2
(Table 2 is provided as an image in the original publication; it lists the correspondence between each controller signal address and the corresponding robot device signal address.)
In table 2, the valid state of the signal output by the controller may be used to indicate whether the robot device needs to transmit feedback information to the controller after analyzing the control instruction. The feedback information includes the analysis result of the control instruction. For example, if the valid state is on, it indicates that the robot device needs to feed back the analysis result; if the valid state is off, the robot device does not need to feed back the analysis result.
In table 2, the valid state of the signal output by the robot device may be used to indicate whether the controller needs to send a determination indication after receiving the feedback information from the robot device. For example, if the valid state is on, it indicates that the controller needs to send a determination indication before the robot device executes the action corresponding to the control instruction; if the valid state is off, the controller does not need to send a determination indication, and the robot device may directly execute the action corresponding to the control instruction.
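A minimal Python sketch of how these two valid states could gate the interaction is given below; the function name, flag names, and the textual trace are invented for this illustration.

def handle_control_instruction(controller_valid_on: bool, robot_valid_on: bool) -> list:
    """Return the interaction steps implied by the two valid-state flags."""
    steps = ["controller sends the control instruction"]
    if controller_valid_on:
        # Valid state of the controller's output signal is on: the robot device
        # must feed back its analysis result.
        steps.append("robot device feeds back the analysis result")
        if robot_valid_on:
            # Valid state of the robot device's output signal is on: the
            # controller must confirm with a determination indication.
            steps.append("controller sends the determination indication")
    steps.append("robot device executes the action")
    return steps

# With both valid states on, the full three-step handshake is used.
assert handle_control_instruction(True, True) == [
    "controller sends the control instruction",
    "robot device feeds back the analysis result",
    "controller sends the determination indication",
    "robot device executes the action",
]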
In a possible implementation manner, as shown in fig. 4, the control method provided in the embodiment of the present application may further include:
step 401, the controller determines that the robot device is in an initial position and in an idle state.
The initial position may refer to a safe position where the robot device is located when the robot device is in an idle state. For example, the initial position may be a charging position of the robot device, or may be a preset safety position, without limitation.
In one example, the robot device may be configured with a positioning device that detects the position information of the robot device's current location. If the current position of the robot device is the initial position (that is, the coordinates of the current position are consistent with the coordinates of the initial position), it is determined that the robot device is at the initial position; otherwise, the robot device is not at the initial position.
The idle state of the robot apparatus may refer to a state in which the robot apparatus does not receive a control command from the controller and does not need to perform any action.
In one possible implementation manner, the controller and the robot device may determine whether the robot device is in an idle state through an identification bit in the interactive signal.
For example, as shown in fig. 5, after both the robot device and the controller are powered on, each may send and receive signals. In fig. 5, the "power on" signal going high indicates that the robot device is in the powered-on state. At this time, the robot device may transmit a first signal to the controller, and the first signal may include a first identification bit. If the robot device is at the initial position, the first identification bit is 1; if the robot device is not at the initial position, the first identification bit is 0. After receiving the first signal from the robot device, the controller may determine whether the robot device is at the initial position according to the first identification bit in the first signal. If the controller determines that the first identification bit in the first signal is 0, the controller may send first indication information to the robot device, where the first indication information is used to instruct the robot device to return to the initial position.
After the robot device has returned to the initial position, it may set the first identification bit to 1 and send a second signal to the controller, where the first identification bit in the second signal is 1.
The following describes an information interaction process of the robot device returning to the initial position with reference to fig. 5:
S1. The robot device is connected to a power supply (power supply start), ensuring that the robot device and its external devices are powered and that no error is present.
S2. Controller → robot device: the controller determines whether the robot device is in an action state (refer to the above description). If not, the controller sends a motion command (return to the initial position) to the robot device.
Note that the initial position is a safe position, and the action of returning the robot device to the initial position is not one of the actions in table 1. The action of returning to the initial position takes priority over any of the actions in table 1. That is, if the robot device is performing another action and receives a control instruction to return to the initial position, the robot device may end the current action and return to the initial position.
S3. After the robot device receives the command to return to the initial position, it returns to the initial position, and the [robot device returning to the initial position] signal and the [robot device in action state] signal both go high, indicating that the robot device is returning to the initial position and is executing that action.
S4. After returning to the initial position, the robot device stops moving; the [robot device at the initial position] signal goes high, while the [robot device returning to the initial position] signal and the [robot device in action state] signal go low.
S5. The controller determines that the robot device is at the initial position and in an idle state, and may send a [robot device action ready] signal to the robot device. The [robot device action ready] signal is used to indicate that the robot device is in a ready state to receive control instructions. At this time, the robot device may raise its motion-state signal; the [robot device returning to the initial position] signal is low, the robot device motion-preparation signal is low, and the robot device is in a ready state to receive a motion command.
It should be noted that in fig. 5, a signal going high may indicate that the flag of the signal is a first value (e.g., 1), and a signal going low may indicate that the flag of the signal is a second value (e.g., 0).
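The start-up interaction of S1 to S5 can be traced, purely as an illustration, by the following Python sketch; the message wording and the function are assumptions of this example, and the real exchange uses the signals configured per table 2.

def startup_handshake(robot_at_initial: bool) -> list:
    """Trace of the messages exchanged until the robot device is ready (S1-S5)."""
    bit = 1 if robot_at_initial else 0
    trace = [f"robot device: power on, send first signal (first identification bit = {bit})"]
    if not robot_at_initial:
        trace += [
            "controller: first identification bit is 0, send first indication (return to initial position)",
            "robot device: action-state signal high, move back to the initial position",
            "robot device: action-state signal low, send second signal (first identification bit = 1)",
        ]
    trace += [
        "controller: robot device at initial position and idle, send [robot device action ready]",
        "robot device: ready to receive motion commands",
    ]
    return trace

for step in startup_handshake(robot_at_initial=False):
    print(step)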
Step 401 may be located before step 301, or may be between step 301 and step 302.
In another possible implementation manner, the control method provided in this embodiment of the present application may further include:
step 402, the robotic device sends a first response message to the controller after completing the first action. Accordingly, the controller receives a first response message from the robotic device.
The first response message may be used to indicate that the robot device has completed performing the first action on the first object.
The robot device may complete the first action, that is, the robot device moves the first object to the destination position according to the first control instruction. For example, the destination location may be determined according to a sub-parameter (such as sub-parameter 9 in table 1) in the first control instruction for representing the destination location.
The following describes a process of the robot performing the first action on the first object with reference to fig. 6:
S6. The robot device receives and analyzes a control instruction from the controller, and feeds the analysis result back to the controller. For the analysis result, refer to the description above.
S7. The controller receives the analysis result from the robot device and determines whether the parameter values of the instruction parameter and the sub-parameters in the robot device's feedback are consistent with the parameter values of the instruction parameter and the sub-parameters that the controller sent to the robot device. If they are consistent, the controller may send a [sub-parameters consistent] determination message to the robot device, and the robot device may perform S8 after receiving it. If they are not consistent, the robot device may output an error, for example by emitting an alarm sound or sending alarm information to the controller. In this way, it is ensured that the robot device has accurately received the instruction parameter and sub-parameter values sent by the controller, which guarantees the safety of the robot device when executing actions.
Referring to fig. 6, when the robot receives the first control command, the "action start" stage is entered. If the robot is in the ready-to-normal state, the "action start" signal is raised, and correspondingly, the "ready-to-normal state" signal is lowered, indicating that the robot has started to perform an action. The robot starts reading the sub-parameters (signal goes high) and performs sub-parameter consistency verification.
S8. After receiving the [sub-parameters consistent] determination message from the controller, the robot device may determine whether its current state is an idle state or a busy state. If the robot device is in the idle state, S9 is performed; if the robot device is in a busy state, this indicates that the robot device is performing another action or has an internal error, and error/exception handling is required.
The error/exception handling may be that, after the robot device outputs alarm information, a worker adjusts the robot device.
S9. The robot device executes the corresponding action according to the control instruction.
With reference to fig. 6, once the action starts, the robot device performs the corresponding action according to the parameter values of the plurality of parameters.
Further, when the robot device executes the action, it may be checked whether interference occurs as the robot device enters the cassette. The interference may be that there is no object in the cassette, or that the state of the object in the cassette is abnormal (for example, the position of the flat object is not consistent with the position given in the control instruction), so that the robot device cannot perform the action on the object. In this case, the robot device may output alarm information. After the interference has been handled, the robot device may continue to perform the action until the action is completed.
S10. After the robot device finishes the current action, it feeds back an action-completion signal to the controller and waits for the controller's next control instruction.
As shown in fig. 6, the robot apparatus resumes the "ready-to-normal state" and is in an idle state, waiting for the next control instruction.
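From the controller's point of view, the S6 to S10 cycle can be condensed into the following illustrative Python sketch; the dictionary representation of the instruction and the returned strings are assumptions of this example.

def run_action_cycle(sent_params: dict, echoed_params: dict, robot_busy: bool) -> str:
    """One action cycle: parse feedback, consistency check, idle check, execute, complete."""
    # S6-S7: the robot device has echoed its analysis result; verify consistency.
    if echoed_params != sent_params:
        return "error: fed-back parameters do not match the instruction sent"
    # The controller replies with a [sub-parameters consistent] determination message.
    # S8: the robot device checks whether it is idle before acting.
    if robot_busy:
        return "error: robot device busy or faulted, error/exception handling required"
    # S9: the robot device performs the action described by the parameter values.
    # S10: the robot device feeds back an action-completion signal and waits again.
    return "action completed"

instruction = {"instruction_parameter": 1, "distance_code": 0b100, "driver_on": 1}
print(run_action_cycle(instruction, dict(instruction), robot_busy=False))  # action completed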
It should be noted that the controller may send one control command at a time to the robotic device. Alternatively, the controller may send a plurality of control commands to the robot at a time. The robot device may sequentially execute the action corresponding to each control instruction according to the time when the control instruction is received. Alternatively, the robot apparatus may execute the operation corresponding to each control command according to the operation priority corresponding to the control command.
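The two scheduling options just mentioned (execution in arrival order versus execution by action priority) could be realized, for example, with a small instruction queue such as the Python sketch below; the priority value attached to each instruction is an assumption of this example.

import heapq
from itertools import count

class InstructionQueue:
    """Queue of pending control instructions, ordered by arrival time or by priority."""

    def __init__(self, by_priority: bool = False):
        self._by_priority = by_priority
        self._heap = []
        self._arrival = count()  # unique, increasing arrival index

    def push(self, instruction: dict, priority: int = 0) -> None:
        # Higher priority pops first when by_priority is set; otherwise pure FIFO.
        key = (-priority if self._by_priority else 0, next(self._arrival))
        heapq.heappush(self._heap, (key, instruction))

    def pop(self) -> dict:
        return heapq.heappop(self._heap)[1]

q = InstructionQueue(by_priority=True)
q.push({"action": "move"}, priority=1)
q.push({"action": "return to initial position"}, priority=9)  # highest priority here
assert q.pop()["action"] == "return to initial position"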
In another possible implementation manner, the method provided in the embodiment of the present application may further include:
and step 403, the controller sends a second control instruction to the robot equipment. Accordingly, the robot apparatus receives a second control instruction from the controller.
The second control instruction is used for instructing the robot equipment to execute a second action on a second object. Wherein the second control instruction may include at least one control parameter and a second parameter value of the at least one control parameter. The second parameter value is different from the first parameter value.
For the second control instruction and the second object, refer to the descriptions of the first control instruction and the first object. The second action may be any of the actions in table 1.
Step 403 may refer to the description of step 302, which is not repeated herein.
And step 404, the robot device executes a second action on the second object according to the second control instruction.
Step 404 may refer to the description of step 303, which is not repeated herein.
Based on the above possible implementations, the robot device can determine the action to be executed according to the parameter values of the at least one control parameter in different control instructions, which is both accurate and fast.
All the schemes in the above embodiments of the present application can be combined without contradiction.
In the embodiments of the present application, the control device may be divided into functional modules or functional units according to the above method examples. For example, each functional module or functional unit may correspond to one function, or two or more functions may be integrated into one processing module. The integrated module may be implemented in the form of hardware, or in the form of a software functional module or functional unit. The division of the modules or units in the embodiments of the present application is schematic and is only a logical function division; there may be other division manners in actual implementation.
Fig. 7 and 8 are schematic structural diagrams of a possible control device provided by an embodiment of the present application. The control devices can realize the functions of the controller or the robot equipment in the method embodiment, and therefore, the beneficial effects of the method embodiment can also be realized. In the embodiment of the present application, the control device may be a controller or a robot apparatus as shown in fig. 1, or may be a module (e.g., a chip) applied to the controller or the robot apparatus.
As shown in fig. 7, the control device 70 includes a transceiver module 701 and a processing module 702. The control device 70 may be used to implement the functions of the controller or the robot apparatus in the above-described method embodiments shown in fig. 3 and 4.
When the control device 70 is used to implement the functions of the robot apparatus in the embodiment of the method illustrated in fig. 3:
the transceiver module 701 is configured to receive a first control instruction from the controller, where the first control instruction includes at least one control parameter and a first parameter value of the at least one control parameter, the first control instruction is used to instruct the robot apparatus to perform a first action on a first object, the first parameter value of the at least one control parameter is used to instruct the robot apparatus to perform a pose of the first action, and at least one control parameter corresponding to different actions is the same.
The processing module 702 is configured to execute a first action on the first object according to the first control instruction.
In one possible implementation, the at least one control parameter comprises a first type of parameter, a parameter value of which is indicative of the first object, and a second type of parameter, a parameter value of which is indicative of an actuator of the robotic device performing the first action.
In a possible implementation manner, the first control instruction further includes a first numerical value, and the first numerical value is used for reflecting the action amplitude of the first action executed by the robot device.
In a possible implementation manner, the transceiver module 701 is further configured to send a first response message to the controller, where the first response message is used to indicate that execution of the first action is completed.
In a possible implementation manner, the transceiver module 701 is further configured to receive a second control instruction from the controller, where the second control instruction includes the at least one control parameter and a second parameter value of the at least one control parameter, the second control instruction is used to instruct the robot device to perform a second action on a second object, and the second parameter value is different from the first parameter value.
When the control device 70 is used to implement the functions of the controller in the method embodiments described in fig. 3 or fig. 4: the processing module 702 is configured to generate a first control instruction, where the first control instruction includes at least one control parameter and a first parameter value of the at least one control parameter, the first control instruction is used to instruct the robot device to perform a first action on a first object, the first parameter value of the at least one control parameter is used to indicate the pose in which the robot device performs the first action, and the at least one control parameter corresponding to different actions is the same. The transceiver module 701 is configured to send the first control instruction to the robot device.
In one possible implementation, the at least one control parameter comprises a first type of parameter, a parameter value of which is indicative of the first object, and a second type of parameter, a parameter value of which is indicative of an actuator of the robotic device performing the first action.
In a possible implementation manner, the processing module 702 is further configured to determine that the robot device is located at the initial position, and the robot device is in an idle state.
In a possible implementation manner, the transceiver module 701 is further configured to send a second control instruction to the robot device, where the second control instruction includes the at least one control parameter and a second parameter value of the at least one control parameter, the second control instruction is used to instruct the robot device to perform a second action on a second object, the second parameter value of the at least one control parameter is used to indicate the pose in which the robot device performs the second action, and the second parameter value is different from the first parameter value.
When the control device 70 is used to implement the functionality of the robot apparatus in the method embodiments described in fig. 3 or fig. 4: the transceiver module 701 is configured to receive a control instruction from a controller. The transceiving module 701 is further configured to send a response message to the controller.
When the control device 70 is used to implement the functions of the controller in the method embodiments described in fig. 3 or fig. 4: the transceiver module 701 is configured to receive a response message from the robot device. The processing module 702 is configured to generate a control instruction. The transceiver module 701 is further configured to send the control instruction to the robot device.
For a more detailed description of the transceiver module 701 and the processing module 702, reference may be made to the description of the above method embodiments, and no further description is provided here.
As shown in fig. 8, the control device 80 includes a processor 810 and an interface circuit 820. Processor 810 and interface circuit 820 are coupled to each other. It will be appreciated that the interface circuit 820 may be a transceiver or an input-output interface. Optionally, the control device 80 may further include a memory 830 for storing instructions executed by the processor 810 or for storing input data required by the processor 810 to execute the instructions or for storing data generated by the processor 810 after executing the instructions.
When the control device 80 is configured to implement the method in the above method embodiment, the processor 810 is configured to perform the functions of the processing module 702, and the interface circuit 820 is configured to perform the functions of the transceiver module 701.
When the control device is a chip applied to the robot device, the chip of the robot device implements the functions of the robot device in the above method embodiments. The chip of the robot device receives information from other modules (such as a radio frequency module or an antenna) in the robot device, the information having been sent to the robot device by the controller; alternatively, the chip of the robot device sends information to other modules (such as a radio frequency module or an antenna) in the robot device, the information to be sent by the robot device to the controller.
When the control device is a chip applied to the controller, the chip of the controller implements the functions of the controller in the above method embodiments. The chip of the controller receives information from other modules (such as a radio frequency module or an antenna) in the controller, the information having been sent to the controller by the robot device; alternatively, the chip of the controller sends information to other modules (such as a radio frequency module or an antenna) in the controller, the information to be sent by the controller to the robot device.
It is understood that the processor in the embodiments of the present application may be a Central Processing Unit (CPU), other general purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, a transistor logic device, a hardware component, or any combination thereof. The general purpose processor may be a microprocessor, but may be any conventional processor.
The method steps in the embodiments of the present application may be implemented by hardware, or by software instructions executed by a processor. The software instructions may consist of corresponding software modules that may be stored in Random Access Memory (RAM), flash memory, Read-Only Memory (ROM), programmable ROM, Erasable PROM (EPROM), Electrically EPROM (EEPROM), registers, a hard disk, a removable hard disk, a CD-ROM, or any other form of storage medium known in the art. An exemplary storage medium is coupled to the processor such that the processor can read information from, and write information to, the storage medium. Of course, the storage medium may also be integral to the processor. The processor and the storage medium may reside in an ASIC. In addition, the ASIC may reside in an access network device or a terminal device. Of course, the processor and the storage medium may also reside as discrete components in an access network device or a terminal device.
In the above embodiments, the implementation may be wholly or partially realized by software, hardware, firmware, or any combination thereof. When implemented in software, the embodiments may be implemented wholly or partially in the form of a computer program product. The computer program product includes one or more computer programs or instructions. When the computer program or instructions are loaded and executed on a computer, the processes or functions described in the embodiments of the present application are performed in whole or in part. The computer may be a general purpose computer, a special purpose computer, a computer network, or another programmable device. The computer program or instructions may be stored in or transmitted over a computer-readable storage medium. The computer-readable storage medium may be any available medium that can be accessed by a computer, or a data storage device such as a server that integrates one or more available media. The usable medium may be a magnetic medium, such as a floppy disk, a hard disk, or a magnetic tape; an optical medium, such as a DVD; or a semiconductor medium, such as a Solid State Disk (SSD).
The modules in fig. 7 and 8 may also be referred to as units, for example, a processing module may be referred to as a processing unit.
As shown in fig. 9, fig. 9 is a diagram illustrating an example of a control system provided in an embodiment of the present application, and includes a robot apparatus 11 and a controller 12.
The robot apparatus 11 is used to perform the actions performed by the robot apparatus in the above-described embodiments. For example, for performing step 303 in fig. 3, and steps 402, 404 in fig. 4.
The controller 12 is configured to perform the actions performed by the above embodiments, for example, the controller 12 is configured to perform steps 301 and 302 in fig. 3, and steps 401 and 403 in fig. 4.
In implementation, the steps of the method provided by this embodiment may be implemented by hardware integrated logic circuits in a processor or instructions in the form of software. The steps of a method disclosed in connection with the embodiments of the present application may be directly implemented by a hardware processor, or may be implemented by a combination of hardware and software modules in a processor.
Embodiments of the present application also provide a computer-readable storage medium, which includes instructions that, when executed on a computer, cause the computer to perform any of the above methods.
Embodiments of the present application also provide a computer program product containing instructions which, when run on a computer, cause the computer to perform any of the methods described above.
An embodiment of the present application further provides a communication system, including the controller and the robot device.
Embodiments of the present application further provide a chip, where the chip includes a processor and an interface circuit, where the interface circuit is coupled to the processor, the processor is configured to execute a computer program or instructions to implement the method, and the interface circuit is configured to communicate with other modules outside the chip.
In the embodiments of the present application, unless otherwise specified or conflicting with respect to logic, the terms and/or descriptions in different embodiments have consistency and may be mutually cited, and technical features in different embodiments may be combined to form a new embodiment according to their inherent logic relationship.
In this application, "at least one" means one or more, "a plurality" means two or more. "and/or" describes the association relationship of the associated objects, meaning that there may be three relationships, e.g., a and/or B, which may mean: a exists alone, A and B exist simultaneously, and B exists alone, wherein A and B can be singular or plural. In the description of the text of the present application, the character "/" generally indicates that the former and latter associated objects are in an "or" relationship; in the formula of the present application, the character "/" indicates that the preceding and following associated objects are in a "division" relationship.
It is to be understood that the various numerical references referred to in the embodiments of the present application are merely for descriptive convenience and are not intended to limit the scope of the embodiments of the present application. The sequence numbers of the above-mentioned processes do not mean the execution sequence, and the execution sequence of the processes should be determined by their functions and inherent logic.

Claims (10)

1. A control method, applied to a robot apparatus, the method comprising:
the robot device receives a first control instruction from a controller, wherein the first control instruction comprises at least one control parameter and a first parameter value of the at least one control parameter, the first control instruction is used for instructing the robot device to execute a first action on a first object, the first parameter value of the at least one control parameter is used for instructing the robot device to execute the pose of the first action, and the at least one control parameter corresponding to different actions is the same;
and the robot equipment executes the first action on the first object according to the first control instruction.
2. The method of claim 1, wherein the at least one control parameter comprises a first type of parameter, a second type of parameter, a parameter value of the first type of parameter being indicative of information of the first object, a parameter value of the second type of parameter being indicative of a driver of the robotic device performing the first action.
3. The method of claim 1 or 2, wherein the first control instruction further comprises a first value reflecting an amplitude of the action of the robotic device to perform the first action.
4. The method of claim 1, further comprising:
the robot device sends a first response message to the controller, the first response message being used to indicate that the first action execution is complete.
5. The method of claim 4, further comprising:
the robot device receives a second control instruction from the controller, wherein the second control instruction comprises the at least one control parameter and a second parameter value of the at least one control parameter, the second control instruction is used for instructing the robot device to execute a second action on a second object, the second parameter value is different from the first parameter value, and the second parameter value of the at least one control parameter is used for indicating information of the second object;
and the robot equipment executes the second action on the second object according to the second control instruction.
6. A control method is applied to a controller, and the method comprises the following steps:
the controller generates a first control instruction, wherein the first control instruction is used for instructing a robot device to execute a first action on a first object, the first control instruction comprises at least one control parameter and a first parameter value of the at least one control parameter, the first parameter value of the at least one control parameter is used for instructing the robot device to execute the pose of the first action, and the at least one control parameter corresponding to different actions is the same;
the controller sends a first control instruction to the robotic device.
7. The method according to claim 6, characterized in that the at least one control parameter comprises a first type of parameter, a second type of parameter, a parameter value of the first type of parameter being used for indicating information of the first object, a parameter value of the second type of parameter being used for indicating a driver of the robotic device to perform the first action.
8. The method of claim 6 or 7, wherein prior to the controller sending the first control instruction to the robotic device, the method further comprises: the controller determines that the robotic device is in an initial position and the robotic device is in an idle state.
9. The method of claim 8, further comprising:
the controller sends a second control instruction to the robot device, the second control instruction comprises the at least one control parameter and a second parameter value of the at least one control parameter, the second control instruction is used for instructing the robot device to execute a second action on a second object, the second parameter value of the at least one control parameter is used for instructing the robot device to execute the pose of the second action, and the second parameter value is different from the first parameter value.
10. A control system, comprising a robotic device communicatively coupled to a controller, the robotic device configured to perform the method of any of claims 1-5, and the controller configured to perform the method of any of claims 6-9.
CN202011386127.3A 2020-12-01 2020-12-01 Control method and control system Active CN114578720B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011386127.3A CN114578720B (en) 2020-12-01 2020-12-01 Control method and control system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011386127.3A CN114578720B (en) 2020-12-01 2020-12-01 Control method and control system

Publications (2)

Publication Number Publication Date
CN114578720A true CN114578720A (en) 2022-06-03
CN114578720B CN114578720B (en) 2023-11-07

Family

ID=81767232

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011386127.3A Active CN114578720B (en) 2020-12-01 2020-12-01 Control method and control system

Country Status (1)

Country Link
CN (1) CN114578720B (en)

Citations (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107077651A (en) * 2016-12-29 2017-08-18 深圳前海达闼云端智能科技有限公司 Robot cooperation method, device, robot and computer program product
US20170320210A1 (en) * 2016-05-06 2017-11-09 Kindred Systems Inc. Systems, devices, articles, and methods for using trained robots
CN107517238A (en) * 2016-06-17 2017-12-26 阿里巴巴集团控股有限公司 A kind of smart machine control method, device and equipment for Internet of Things
WO2017220199A1 (en) * 2016-06-21 2017-12-28 Kuka Roboter Gmbh Configuring and/or controlling a robot arrangement
CN107765838A (en) * 2016-08-18 2018-03-06 北京北信源软件股份有限公司 Man-machine interaction householder method and device
CN108115678A (en) * 2016-11-28 2018-06-05 深圳光启合众科技有限公司 Robot and its method of controlling operation and device
US10093019B1 (en) * 2014-12-29 2018-10-09 Boston Dynamics, Inc. Determination of robot behavior
US20180348730A1 (en) * 2017-06-01 2018-12-06 X Development Llc Automatic Generation of Toolpaths
CN109035740A (en) * 2018-09-27 2018-12-18 上海节卡机器人科技有限公司 Control method, device and the tele-control system of robot
CN109732597A (en) * 2018-12-29 2019-05-10 深圳市越疆科技有限公司 A kind of remote debugging method based on robot, device and controller
WO2019192244A1 (en) * 2018-04-03 2019-10-10 京东方科技集团股份有限公司 Parameter configuration method and device, and display device
EP3566823A1 (en) * 2018-05-11 2019-11-13 Siemens Aktiengesellschaft Method, apparatus and system for robotic programming
EP3576065A1 (en) * 2018-06-01 2019-12-04 Johnson Controls Fire Protection LP Systems and methods of alarm controls and directed audio evacuation
CN111195909A (en) * 2019-12-27 2020-05-26 深圳市优必选科技股份有限公司 Steering engine control method and device for robot, terminal and computer storage medium
CN111267111A (en) * 2020-03-31 2020-06-12 烟台艾睿光电科技有限公司 Robot control method, device and system
CN111267087A (en) * 2018-12-04 2020-06-12 北京猎户星空科技有限公司 Method, device, equipment and storage medium for generating and executing action molecules
CN111328306A (en) * 2017-10-10 2020-06-23 奥瑞斯健康公司 Surgical robot arm admittance control
WO2020155849A1 (en) * 2019-01-31 2020-08-06 华为技术有限公司 Method and apparatus for sending and receiving instructions
WO2020221311A1 (en) * 2019-04-30 2020-11-05 齐鲁工业大学 Wearable device-based mobile robot control system and control method
CN112584990A (en) * 2018-11-30 2021-03-30 欧姆龙株式会社 Control device, control method, and control program

Patent Citations (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10093019B1 (en) * 2014-12-29 2018-10-09 Boston Dynamics, Inc. Determination of robot behavior
US20170320210A1 (en) * 2016-05-06 2017-11-09 Kindred Systems Inc. Systems, devices, articles, and methods for using trained robots
CN107517238A (en) * 2016-06-17 2017-12-26 阿里巴巴集团控股有限公司 A kind of smart machine control method, device and equipment for Internet of Things
WO2017220199A1 (en) * 2016-06-21 2017-12-28 Kuka Roboter Gmbh Configuring and/or controlling a robot arrangement
CN107765838A (en) * 2016-08-18 2018-03-06 北京北信源软件股份有限公司 Man-machine interaction householder method and device
CN108115678A (en) * 2016-11-28 2018-06-05 深圳光启合众科技有限公司 Robot and its method of controlling operation and device
CN107077651A (en) * 2016-12-29 2017-08-18 深圳前海达闼云端智能科技有限公司 Robot cooperation method, device, robot and computer program product
US20180348730A1 (en) * 2017-06-01 2018-12-06 X Development Llc Automatic Generation of Toolpaths
CN111328306A (en) * 2017-10-10 2020-06-23 奥瑞斯健康公司 Surgical robot arm admittance control
WO2019192244A1 (en) * 2018-04-03 2019-10-10 京东方科技集团股份有限公司 Parameter configuration method and device, and display device
EP3566823A1 (en) * 2018-05-11 2019-11-13 Siemens Aktiengesellschaft Method, apparatus and system for robotic programming
EP3576065A1 (en) * 2018-06-01 2019-12-04 Johnson Controls Fire Protection LP Systems and methods of alarm controls and directed audio evacuation
CN109035740A (en) * 2018-09-27 2018-12-18 上海节卡机器人科技有限公司 Control method, device and the tele-control system of robot
CN112584990A (en) * 2018-11-30 2021-03-30 欧姆龙株式会社 Control device, control method, and control program
CN111267087A (en) * 2018-12-04 2020-06-12 北京猎户星空科技有限公司 Method, device, equipment and storage medium for generating and executing action molecules
CN109732597A (en) * 2018-12-29 2019-05-10 深圳市越疆科技有限公司 A kind of remote debugging method based on robot, device and controller
WO2020155849A1 (en) * 2019-01-31 2020-08-06 华为技术有限公司 Method and apparatus for sending and receiving instructions
WO2020221311A1 (en) * 2019-04-30 2020-11-05 齐鲁工业大学 Wearable device-based mobile robot control system and control method
CN111195909A (en) * 2019-12-27 2020-05-26 深圳市优必选科技股份有限公司 Steering engine control method and device for robot, terminal and computer storage medium
CN111267111A (en) * 2020-03-31 2020-06-12 烟台艾睿光电科技有限公司 Robot control method, device and system

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
XU Jianyu; CHEN Weidong: "Design of an intelligent behavior control system for a fire-fighting reconnaissance robot based on object recognition", Fire Science and Technology, no. 06 *
CHEN Yanyu; TIAN Dongzhuang; XU Cuihua; ZHOU Chun; HUANG Huang: "Design of a PLC-based Cartesian-coordinate robot control system", Process Automation Instrumentation, no. 04 *

Also Published As

Publication number Publication date
CN114578720B (en) 2023-11-07

Similar Documents

Publication Publication Date Title
JP4302160B2 (en) Robot programming device for palletizing work by robot
CN102648442B (en) Improved pick and place
US11325263B2 (en) System and method for real-time robotic control
CN102298373B (en) Monitoring method and monitoring system of programmable logic controller (PLC)
CN108527403B (en) Data setting system and method for robot, computer readable recording medium
JP2013197872A (en) Control device and control method
US11858134B2 (en) Reconfigurable robotic manufacturing cells
CN103926928A (en) Robot controller with modules dynamically dispatched
CN114578720B (en) Control method and control system
JPH09326808A (en) Electronic wiring system by cyclic automatic communication
US4694232A (en) Robot controlling system
KR102544350B1 (en) System and method for managing cargo storing and rehandling processes using artificial intellegience
JPS59223807A (en) Coupling system of numerical controller
US20220402121A1 (en) Control and monitoring of a machine arrangement
JP2013163247A (en) Robot system, robot, robot controller, and robot control method
CN113159611A (en) Elevator dispatching method, device and equipment based on prediction model and storage medium
Khasasi et al. Development of an automated storage and retrieval system in dynamic industrial environment
KR101264023B1 (en) Apparatus for controlling of conveyer module
JP5102090B2 (en) Control device and control method of control device
JP2000015595A (en) Object collision detecting method and its device
JPS61155127A (en) Stowage design assisting device
JP6829292B2 (en) Control devices, control systems, control methods and control programs
CN116692322A (en) Automatic bin changing method and device for robot, electronic equipment and medium
JP2011183498A (en) Controller of robot
CN115511018A (en) Communication method and cargo access system

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant