CN112356030B - Robot control method, device, computer equipment and storage medium - Google Patents


Info

Publication number
CN112356030B
CN112356030B (application CN202011220420.2A)
Authority
CN
China
Prior art keywords
order
robot
scene
cooperative robot
production
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202011220420.2A
Other languages
Chinese (zh)
Other versions
CN112356030A (en)
Inventor
Tobias Alexander Hartwig Arndt
Zhang Ningning
Ma Jing
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Kalu Production Technology Research Institute Of Suzhou Industrial Park
Original Assignee
Kalu Production Technology Research Institute Of Suzhou Industrial Park
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Kalu Production Technology Research Institute Of Suzhou Industrial Park filed Critical Kalu Production Technology Research Institute Of Suzhou Industrial Park
Priority claimed from application CN202011220420.2A
Publication of CN112356030A (application publication)
Application granted
Publication of CN112356030B (granted publication)
Legal status: Active

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 9/00 Programme-controlled manipulators
    • B25J 9/16 Programme controls
    • B25J 9/1602 Programme controls characterised by the control system, structure, architecture
    • B25J 9/161 Hardware, e.g. neural networks, fuzzy logic, interfaces, processor
    • B25J 9/1679 Programme controls characterised by the tasks executed
    • B25J 13/00 Controls for manipulators
    • B25J 13/006 Controls for manipulators by means of a wireless system for controlling one or several manipulators

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Physics & Mathematics (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • Fuzzy Systems (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Manipulator (AREA)

Abstract

The application relates to a robot control method, apparatus, computer device, and storage medium. An order selection instruction and a scene selection instruction for an order to be produced are obtained; a corresponding page is displayed through an online work instruction book system according to the two instructions; in response to a trigger operation on the displayed page, an action instruction for the cooperative robot is acquired, and the cooperative robot is controlled, through its control system, to enter the corresponding production scene according to the scene address and scene quantity value carried by the action instruction. The cooperative robot can thus switch between different production scenes, improving order production efficiency. Further, production efficiency under different production scenes can be compared, so that the scene with the higher efficiency can be identified.

Description

Robot control method, device, computer equipment and storage medium
Technical Field
The present application relates to the field of intelligent manufacturing control technologies, and in particular, to a method and an apparatus for controlling a robot, a computer device, and a storage medium.
Background
With the development of technologies such as the internet, the internet of things, and cloud manufacturing, manufacturing and information technology have become deeply integrated, and intelligent production-line systems featuring human-machine cooperation have appeared. Such systems mainly adopt unit control, system control, development-platform, and network interconnection technologies. They are based on actual production scenes and demonstrate the potential and feasibility of human-machine cooperation in modern manufacturing through extensible automatic assembly stations, cooperative robots, and the like.
However, in the conventional technology, the order system of a manufacturer and the cooperative robot exist independently of each other. When executing tasks, the cooperative robot therefore cannot automatically switch according to the order tasks in the order system, resulting in low production efficiency.
Disclosure of Invention
In view of the above, it is desirable to provide a robot control method, apparatus, computer device, and storage medium capable of improving production efficiency.
A method of controlling a robot, the method comprising:
acquiring an order selection instruction and a scene selection instruction of an order to be produced;
displaying a corresponding page through an online operation instruction book system according to the order selection instruction and the scene selection instruction, wherein the online operation instruction book system is an auxiliary system for human-computer interaction with an operator;
responding to a trigger operation generated on a displayed page, and acquiring an action instruction of the cooperative robot, wherein the action instruction carries a scene address and a scene quantity value;
and controlling the cooperative robot to enter a corresponding production scene through a control system of the cooperative robot according to the scene address and the scene quantity value.
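The four claimed steps can be sketched as a minimal control flow. All function names, the page-path format, and the in-memory "instruction" dictionaries below are illustrative assumptions for exposition, not the patented implementation:

```python
# Illustrative sketch of the four-step method; all names and data shapes
# are assumptions, not the patented implementation.

def acquire_selection(order_id, scene_id):
    """Step 1: obtain the order selection and scene selection instructions."""
    return {"order": order_id, "scene": scene_id}

def display_page(selection):
    """Step 2: the online work instruction book system shows the page
    matching the selected order and scene (path format is hypothetical)."""
    return f"instruction-page/order-{selection['order']}/scene-{selection['scene']}"

def on_trigger(selection):
    """Step 3: a trigger operation on the page yields an action instruction
    carrying a scene address and a scene quantity value."""
    return {"scene_address": 0, "scene_value": selection["scene"]}

def enter_scene(action):
    """Step 4: the robot's control system enters the production scene
    identified by the address/value pair."""
    return f"scene {action['scene_value']}"

selection = acquire_selection(order_id=42, scene_id=2)
page = display_page(selection)
action = on_trigger(selection)
assert enter_scene(action) == "scene 2"
```

The point of the sketch is the data flow: the page selection produces an action instruction, and only the address/value pair in that instruction reaches the robot's control system.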
In one embodiment, the controlling, by the control system of the cooperative robot, the cooperative robot to enter the corresponding production scenario according to the scenario address and the scenario quantity value includes:
analyzing the scene address and the scene quantity value, and sending an analysis result to a control system of the cooperative robot;
and controlling the cooperative robot to enter a corresponding production scene according to the analysis result through the control system of the cooperative robot.
In one embodiment, the order selection instruction carries order information of an order to be produced; after the controlling system of the cooperative robot controls the cooperative robot to enter the corresponding production scenario according to the analysis result, the method further includes:
and in the production scene, executing a production task corresponding to the order to be produced according to the order information of the order to be produced by the cooperative robot.
In one embodiment, the order information includes material information; before the executing, by the cooperative robot, the production task corresponding to the order to be produced according to the order information of the order to be produced, the method further includes:
determining the position of a material box to be grabbed according to the material information, and sending the position of the material box to be grabbed to a control system of the cooperative robot;
converting the position of the material box to be grabbed into coordinate information corresponding to the material box to be grabbed according to the corresponding relation between the position of the material box and the coordinates of the material box by using a control system of the cooperative robot;
the executing, by the cooperative robot, the production task corresponding to the order to be produced according to the order information of the order to be produced includes:
and grabbing, by the cooperative robot, the material box to be grabbed from its storage position according to the coordinate information of the material box to be grabbed, and placing the material box at the corresponding position of the assembling material frame.
In one embodiment, an RFID tag is arranged at a position corresponding to the assembling rack, and the order information further includes an order identifier; after the material box to be grabbed is grabbed from the position of the material box to be grabbed according to the coordinate information of the material box to be grabbed, the method further comprises the following steps:
acquiring material information stored in the RFID label;
acquiring material information required by the order to be produced according to the order identifier;
and comparing the material information stored in the RFID label with the material information required by the order to be produced, and judging whether the grabbed material box is correct or not according to a comparison result.
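The RFID comparison described above amounts to an equality check between two material records. A minimal sketch, assuming the material information takes the form of type/color/quantity fields (the field names are assumptions; the patent does not fix a record format):

```python
def bin_is_correct(rfid_material, order_material):
    """Compare the material info read from the RFID tag with the material
    info the order to be produced requires; equality on every field means
    the grabbed material box is correct. Field names are illustrative."""
    keys = ("type", "color", "quantity")
    return all(rfid_material.get(k) == order_material.get(k) for k in keys)

# Correct bin: tag contents match the order's requirement.
assert bin_is_correct(
    {"type": "pen cap", "color": "blue", "quantity": 50},
    {"type": "pen cap", "color": "blue", "quantity": 50},
)
# Wrong bin: color mismatch is detected before assembly proceeds.
assert not bin_is_correct(
    {"type": "pen cap", "color": "red", "quantity": 50},
    {"type": "pen cap", "color": "blue", "quantity": 50},
)
```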
In one embodiment, the order information includes a production quantity; before the executing, by the cooperative robot, the production task corresponding to the order to be produced according to the order information of the order to be produced, the method further includes:
carrying out format conversion processing on the production quantity to obtain target data corresponding to the production quantity, and sending the target data corresponding to the production quantity to a control system of the cooperative robot;
converting the target data corresponding to the production quantity into coordinate information of the components to be grabbed according to the corresponding relation between the target data and the component coordinates through a control system of the cooperative robot;
the executing, by the cooperative robot, the production task corresponding to the order to be produced according to the order information of the order to be produced includes:
and grabbing the component to be grabbed from the position corresponding to the coordinate information of the component to be grabbed through the cooperative robot.
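The quantity-to-coordinate conversion above can be sketched as a two-stage lookup. The patent only states that a format conversion and a target-data-to-coordinate correspondence exist; the conversion (a plain integer cast) and the coordinate table below are assumptions:

```python
# Hypothetical correspondence between target data (the converted production
# quantity) and component coordinates, as stored in the robot controller.
TARGET_TO_COORD = {
    1: (120.0, 40.0, 15.0),   # target 1 -> first component slot (mm)
    2: (120.0, 80.0, 15.0),
    3: (120.0, 120.0, 15.0),
}

def quantity_to_target(production_quantity):
    """Format-convert the order's production quantity into target data.
    The real conversion is unspecified; an int cast stands in for it."""
    return int(production_quantity)

def component_coordinates(target):
    """Resolve target data to the coordinates of the component to grab."""
    return TARGET_TO_COORD[target]

# An order quantity arriving as the string "2" resolves to the second slot.
assert component_coordinates(quantity_to_target("2")) == (120.0, 80.0, 15.0)
```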
In one embodiment, the order information includes a packaging quantity of the finished packaged product; before the executing, by the cooperative robot, the production task corresponding to the order to be produced according to the order information of the order to be produced, the method further includes:
detecting whether a photoelectric sensor signal is received;
monitoring whether a stacking instruction corresponding to a packaging procedure is received;
if the photoelectric sensor signal and the stacking instruction are received, activating the cooperative robot;
converting the packaging quantity of the packaged products into coordinate information of the next packaged and to-be-stacked product according to the corresponding relation between the packaging quantity and the stacking coordinate through a control system of the cooperative robot;
the executing, by the cooperative robot, the production task corresponding to the order to be produced according to the order information of the order to be produced includes:
and placing, by the cooperative robot, the next packaged product to be palletized at the position corresponding to its coordinate information.
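The packaging-quantity-to-stacking-coordinate correspondence in this embodiment can be sketched as a grid lookup. The pallet layout (3 x 2 slots per layer), the pitches, and the package height are illustrative assumptions; the patent only states that such a correspondence exists:

```python
def pallet_slot(packed_count, cols=3, rows=2, pitch_x=200.0, pitch_y=150.0):
    """Map the running count of finished packages to the coordinate (mm) of
    the next stacking slot on a cols x rows pallet layer. Layout values
    are assumptions for illustration."""
    slot = packed_count % (cols * rows)    # next free slot on current layer
    layer = packed_count // (cols * rows)  # completed layers so far
    x = (slot % cols) * pitch_x
    y = (slot // cols) * pitch_y
    z = layer * 100.0                      # assumed package height in mm
    return (x, y, z)

assert pallet_slot(0) == (0.0, 0.0, 0.0)       # first package, first slot
assert pallet_slot(4) == (200.0, 150.0, 0.0)   # fifth package, second row
assert pallet_slot(6) == (0.0, 0.0, 100.0)     # seventh starts a new layer
```

Gating this lookup on both the photoelectric sensor signal and the stacking instruction, as the embodiment requires, ensures the robot only activates when a package is physically present and the packaging procedure has requested stacking.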
A control apparatus of a robot, the apparatus comprising:
the first acquisition module is used for acquiring an order selection instruction and a scene selection instruction of an order to be produced;
the page display module is used for displaying a corresponding page through an online operation instruction book system according to the order selection instruction and the scene selection instruction, wherein the online operation instruction book system is an auxiliary system for human-computer interaction with an operator;
the second acquisition module is used for responding to the trigger operation generated on the displayed page and acquiring the action instruction of the cooperative robot, wherein the action instruction carries a scene address and a scene quantity value;
and the robot control module is used for controlling the cooperative robot to enter a corresponding production scene through the control system of the cooperative robot according to the scene address and the scene quantity value.
A computer device comprising a memory and a processor, the memory storing a computer program, the processor implementing the method steps in the above embodiments when executing the computer program.
A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the method steps of the above-mentioned embodiments.
According to the robot control method, apparatus, computer device, and storage medium, an order selection instruction and a scene selection instruction for the order to be produced are obtained; a corresponding page is displayed through the online work instruction book system according to the two instructions; and in response to a trigger operation on the displayed page, an action instruction for the cooperative robot is acquired, and the cooperative robot is controlled, through its control system, to enter the corresponding production scene according to the scene address and scene quantity value carried by the action instruction. The cooperative robot can thus switch between different production scenes, improving order production efficiency. Further, production efficiency under different scenes can be compared to determine which scene yields higher efficiency, and hence which human-machine cooperative production mode better suits actual production. Production efficiency can also serve as one criterion for whether the cooperation mode needs further adjustment, and factors such as whether efficiency improves allow the feasibility and necessity of human-machine cooperation to be judged.
Drawings
FIG. 1 is a diagram of an exemplary control method for a robot;
FIG. 2a is a schematic flow chart illustrating a method for controlling a robot according to one embodiment;
FIG. 2b is a schematic diagram of an embodiment of an order selection page;
FIG. 2c is a diagram of a scene selection page in one embodiment;
FIG. 3 is a flowchart illustrating step S240 according to an embodiment;
fig. 4 is a flowchart illustrating a control method of the robot in another embodiment;
fig. 5 is a flowchart illustrating a control method of the robot in another embodiment;
fig. 6 is a flowchart illustrating a control method of the robot in another embodiment;
fig. 7 is a flowchart illustrating a control method of the robot in another embodiment;
FIG. 8a is a schematic flow chart illustrating a method for controlling a robot according to another embodiment;
FIGS. 8b-8c are schematic views of a work instruction sheet in one embodiment;
FIG. 8d is a schematic illustration of Franka robot APP Programming in one embodiment;
FIG. 8e is a block diagram of the Modbus architecture in one embodiment;
FIG. 8f is a definition of the Modbus Coil signal in one embodiment;
FIG. 8g is a definition of the Modbus Holding Register signal in one embodiment;
fig. 9 is a block diagram showing a control device of the robot according to one embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
The robot control method provided by the application can be applied to the application environment shown in fig. 1, which includes: a cooperative robot 110, a PLC controller 120, an information system 130, and a server 140. The information system 130 includes an online work instruction book system 131 and a database system 132, which are communicatively connected. The online work instruction book system 131 provides an order selection page, a production scene selection page, a production instruction page, and a quality inspection page. It first displays the order selection page, through which an operator can trigger an order selection instruction and select any order to produce; it then displays the scene selection page, through which the operator can trigger a scene selection instruction and select the corresponding scene for production. The cooperative robot 110 has a control system. The cooperative robot 110, the PLC controller 120, and the information system 130 are integrated in the same LAN and define respective communication ports (e.g., Modbus communication ports).
In this environment, the method is executed as follows: an order selection instruction and a scene selection instruction for the order to be produced are acquired; a corresponding page is displayed through the online work instruction book system (an auxiliary system for human-computer interaction with the operator) according to the two instructions; in response to a trigger operation on the displayed page, an action instruction for the cooperative robot is acquired, the instruction carrying a scene address and a scene quantity value; and the cooperative robot is controlled, through its control system, to enter the corresponding production scene according to the scene address and the scene quantity value.
The information system 130 further includes an ordering system and a performance kanban system. The ordering system displays selectable options for each product component with the corresponding inventory, allows selection of the order quantity, and provides an order confirmation instruction. The performance kanban system can show production profiles, the current order, takt time, material consumption, and so on. The cooperative robot 110 may be a Panda model from Franka Emika, a cooperative robot system designed to assist humans; an operator can easily teach the Panda by hand-guiding (dragging). It features a modular, ultra-lightweight structure with a highly integrated mechatronic design, sensitive torque sensors in all joints, human-arm-like kinematics, and rapid, flexible deployment. Based on human-inspired "flexible robotic control", the Panda can detect and handle the slightest touch, reacting within milliseconds using its artificial reflex system. The robot supports three human-machine interaction modes: voice interaction (the robot understands voice commands and responds), force-feedback control (the robot acts through force applied in a specific direction), and position control (the robot acts by recognizing the operator's position, combined with signal interaction from a safety carpet and a scanner).
In one embodiment, as shown in fig. 2a, a robot control method is provided. The method is described by taking as an example its application to the server 140 in fig. 1, where the server 140 is a ModBus server, and includes the following steps:
step S210, obtaining an order selection instruction and a scene selection instruction of the order to be produced.
And step S220, displaying a corresponding page through the online operation instruction book system according to the order selection instruction and the scene selection instruction.
Wherein the order to be produced is an order for which production is not yet completed. The order selection instruction is sent to the ModBus server when the operator triggers a button on the order selection page, and carries information about the order the operator selected. The scene selection instruction is sent to the ModBus server when the operator triggers a button on the scene selection page, and carries information about the production scene the operator selected; production of the selected order can be completed in the selected production scene. The online work instruction book system is an auxiliary system for human-computer interaction with the operator; it presents the work as discrete steps and, using text, pictures, or video, dynamically guides the operator in real time.
Specifically, the order selection instruction and the scene selection instruction for the order to be produced are triggered through the online work instruction book system and sent to the ModBus server, after which the PLC controller receives them. When these instructions are triggered, the online work instruction book system displays the corresponding order selection page and scene selection page. As shown in fig. 2b and 2c, the order selection page shows the orders to be produced, and the operator can select any order for production, for example by clicking the corresponding order on the screen and then clicking the "select" button. The scene selection page shows the production scenes for the operator to choose from; for example, the operator clicks the button of the corresponding scene to complete scene selection. Furthermore, after the scene button is clicked, the online work instruction book system can issue a safety prompt reminding the operator to ensure that all equipment is in place.
Step S230, in response to the trigger operation occurring on the displayed page, acquiring an action instruction of the cooperative robot.
And step S240, controlling the cooperative robot to enter a corresponding production scene through the control system of the cooperative robot according to the scene address and the scene quantity value.
The action command carries a scene address and a scene quantity value. The scene address is a protocol address for storing production scene information, and the scene quantity value is a numerical value for representing the production scene information. The production scene selected by the operator can be entered through the scene address and the scene quantity value.
Specifically, after a production scene is selected, a trigger button is provided on the page displayed by the online work instruction book system for starting the robot's production task. When the operator performs a trigger operation, such as pressing a trigger button (e.g., a "next" button) on the displayed page, the online work instruction book system sends to the ModBus server an action instruction indicating that the robot may execute the production task; that is, in response to the trigger operation on the displayed page, the ModBus server receives the action instruction. The action instruction carries the scene address and the scene quantity value, through which the production scene selected by the operator can be entered. The ModBus server parses the instruction according to the scene address and the scene quantity value, determines the production scene the cooperative robot should enter, and sends a corresponding signal to the cooperative robot, so that the control system of the cooperative robot controls it to enter that scene.
In the robot control method, an order selection instruction and a scene selection instruction for the order to be produced are obtained; a corresponding page is displayed through the online work instruction book system according to the two instructions; and in response to a trigger operation on the displayed page, an action instruction for the cooperative robot is acquired, and the cooperative robot is controlled, through its control system, to enter the corresponding production scene according to the scene address and scene quantity value carried by the instruction. The cooperative robot can thus switch between different production scenes, improving order production efficiency. Further, production efficiency under different scenes can be compared to determine which scene yields higher efficiency, and hence which human-machine cooperative production mode better suits actual production; production efficiency can serve as one criterion for whether the cooperation mode needs further adjustment, and factors such as whether efficiency improves allow the feasibility and necessity of human-machine cooperation to be judged.
In one embodiment, as shown in fig. 3, in step S240, controlling the cooperative robot to enter the corresponding production scenario through the control system of the cooperative robot according to the scenario address and the scenario magnitude includes:
and step S310, analyzing the scene address and the scene quantity value, and sending an analysis result to a control system of the cooperative robot.
And step S320, controlling the cooperative robot to enter a corresponding production scene according to the analysis result through the control system of the cooperative robot.
Specifically, the online operation instruction system sends an action instruction that the robot can execute the production task to the ModBus server, and the ModBus server receives the action instruction that the robot can execute the production task. The action command carries a scene address and a scene quantity value, the ModBus server analyzes the scene address and the scene quantity value according to a communication protocol, and an analysis result is sent to a control system of the cooperative robot. And the control system of the cooperative robot receives the analysis result, the corresponding relation between the analysis result and the production scene is prestored in the control system, so that the control system determines the corresponding production scene according to the analysis result, and the control system controls the cooperative robot to enter the production scene corresponding to the analysis result.
Illustratively, there may be three production scenes, with Modbus Holding Register signals defined using the Modbus protocol; see the table below.
Scene address    Scene quantity value    Scene
0                1                       Scene 1
0                2                       Scene 2
0                3                       Scene 3
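The mapping in the table above can be decoded with a plain lookup once the holding-register contents are available; this sketch is independent of any Modbus library, and the dictionary standing in for the register bank is an assumption:

```python
# The Modbus Holding Register mapping from the table above: the register at
# scene address 0 carries the scene quantity value 1..3.
SCENES = {1: "Scene 1", 2: "Scene 2", 3: "Scene 3"}

def parse_scene(holding_registers, scene_address=0):
    """Read the scene quantity value at the scene address and resolve the
    production scene; raise on a value outside the defined table."""
    value = holding_registers[scene_address]
    try:
        return SCENES[value]
    except KeyError:
        raise ValueError(f"undefined scene value {value!r}") from None

registers = {0: 2}  # e.g. the operator chose Scene 2 on the selection page
assert parse_scene(registers) == "Scene 2"
```

In a real deployment the register bank would be read over Modbus TCP from the server; the parse step itself is exactly this table lookup.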
In this embodiment, the scene address and the scene quantity value are parsed, the parsing result is sent to the control system of the cooperative robot, and the control system controls the cooperative robot to enter the corresponding production scene according to the result, thereby improving order production efficiency.
In one embodiment, the order selection instruction carries order information of the order to be produced; after the cooperative robot is controlled by the control system of the cooperative robot to enter the corresponding production scene according to the analysis result, the method further comprises the following steps: in a production scene, executing a production task corresponding to the order to be produced according to the order information of the order to be produced by the cooperative robot.
Specifically, the ModBus server sends the scene address and the analysis result of the scene quantity value to the control system of the cooperative robot. And the control system of the cooperative robot controls the cooperative robot to enter a corresponding production scene according to the analysis result. And when entering a corresponding production scene, according to the order information of the order to be produced, the cooperative robot starts to execute the production task corresponding to the order to be produced.
In this embodiment, in the production scene, the cooperative robot executes the production task corresponding to the order to be produced according to its order information. This improves the correctness of human-machine cooperation at the quality level, achieves mistake-proofing (poka-yoke), and improves order production efficiency.
In one embodiment, the order information includes material information. As shown in fig. 4, before executing the production task corresponding to the order to be produced according to the order information of the order to be produced by the cooperative robot, the method further includes the following steps:
and S410, determining the position of the material box to be grabbed according to the material information, and sending the position of the material box to be grabbed to a control system of the cooperative robot.
The material information may include the type of material, the quantity of material, the color of material, etc. required to produce an order. For example, when assembling the pen, the pen includes a pen point, a pen holder, a pen cap and a pen bag, and the material information may be information such as an order number, a pen point color, a pen holder color, a pen cap color and a pen bag type.
Specifically, when an order production task is executed, production materials need to be obtained first, the production materials are contained in material boxes, the material boxes are stored in a material warehouse, and the material boxes have corresponding position information, namely warehouse positions, in the material warehouse. The order selection instruction carries material information of an order to be produced, a corresponding relation between the material information and a material box position is stored in the ModBus server, after the ModBus server receives the material information, the corresponding material box position is determined according to the received material information, and materials in the material box on the position are materials required for executing an order production task. After the position of the material box to be grabbed is determined, the ModBus server sends the position of the material box to be grabbed to a control system of the cooperative robot, so that the control system of the cooperative robot controls a mechanical arm of the cooperative robot to grab the corresponding material box to be grabbed from the corresponding position.
Step S420, converting the position of the material box to be grabbed into coordinate information corresponding to the material box to be grabbed according to the corresponding relationship between the position of the material box and the coordinates of the material box by using the control system of the cooperative robot.
Specifically, before the cooperative robot can grasp a material box, it needs the box's position information. Because the correspondence between material box bin positions and material coordinates is stored in the robot's control system, once the control system receives the bin position of the box to be grasped, it converts that bin position into the coordinate information corresponding to the box according to the stored correspondence.
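A minimal sketch of this bin-position-to-coordinate conversion, assuming a pre-taught lookup table; the coordinate values are invented for illustration:

```python
# Illustrative sketch: the robot control system converts a stored bin
# position into Cartesian grasp coordinates via a pre-taught lookup
# table. The table values below are invented.

BIN_TO_COORDS = {
    "A-01": (0.42, -0.15, 0.30),  # (x, y, z) in metres, taught offline
    "B-03": (0.42, 0.05, 0.30),
    "C-02": (0.42, 0.25, 0.30),
}

def bin_to_coordinates(bin_pos):
    """Map a warehouse bin position to the box's grasp coordinates."""
    if bin_pos not in BIN_TO_COORDS:
        raise ValueError("bin %r has no taught coordinates" % bin_pos)
    return BIN_TO_COORDS[bin_pos]
```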
Executing a production task corresponding to the order to be produced according to the order information of the order to be produced by the cooperative robot, wherein the production task comprises the following steps:
and step S430, grabbing the material box to be grabbed from the storage position of the material box to be grabbed according to the coordinate information of the material box to be grabbed through the cooperative robot, and placing the material box to be grabbed at the corresponding position of the assembling rack.
Specifically, after the control system of the cooperative robot determines the coordinate information corresponding to the material box to be grabbed, it directs the robot's mechanical arm to move to the box's position according to those coordinates, and the cooperative robot then grabs the box from that position. To facilitate assembly by the operator, the cooperative robot places the grabbed material box at the corresponding position on the assembly rack.
In this embodiment, the position of the material box to be grabbed is determined from the material information and sent to the control system of the cooperative robot; the control system converts the bin position into the corresponding coordinate information according to the correspondence between bin positions and box coordinates; and the cooperative robot then grabs the box from its bin position according to the coordinates and places it at the corresponding position on the assembly rack. Accurate grasping of the material by the cooperative robot is thus achieved. Furthermore, because a whole material box is grabbed, several materials are picked up in a single grasp, which improves picking efficiency.
In one embodiment, an RFID tag is disposed at a corresponding position of the assembling rack, and the order information further includes an order identifier. As shown in fig. 5, after the material box to be grasped is grasped from the position of the material box to be grasped according to the coordinate information of the material box to be grasped, the method further includes the following steps:
and step S510, acquiring the material information stored in the RFID label.
And step S520, acquiring the material information required by the order to be produced according to the order identifier.
And step S530, comparing the material information stored in the RFID label with the material information required by the order to be produced, and judging whether the grabbed material box is correct or not according to the comparison result.
In the above embodiment, the cooperative robot grabs the material box to be grabbed from its bin position according to the box's coordinate information and places it at the corresponding position on the assembly rack. To further ensure material accuracy, the grabbed materials are verified. Specifically, RFID tags are arranged at the corresponding positions of the assembly rack; each tag stores material information, namely the information of the materials actually grabbed. The database stores the correspondence between order identifiers and the material information each order requires, so the material information required by the order to be produced is looked up in the database according to the order identifier and sent to the PLC controller, which thereby obtains the material information required by the order. The material information stored in the RFID tag is then compared with the material information required by the order to be produced; if the two are consistent, the grabbed material box is judged to be correct.
If the two are inconsistent, the grabbed material box is judged to be incorrect, and it is further determined whether a wrong material is present. If so, the cooperative robot is notified to move the wrong material from the assembly rack back to its bin position on the stock rack and to grab the correct material and place it on the assembly rack; if not, a material is missing, and the cooperative robot is notified to grab the missing material and place it on the assembly rack. In this way, the cooperative robot can effectively correct the material situation on the assembly rack and ensure that the user assembles with the correct materials. Furthermore, an Andon lamp can be configured to display the material state through its color. Other information can also be shown through the Andon lamp: for example, when assembling a pen, the lamp is white if a recyclable pen bag is selected for assembly and blue if a disposable pen bag is selected. Further, the start and end signals of the cooperative robot in the production scene are defined by Modbus Coil signals: the start signal starts execution of the production task in the production scene, the end signal marks completion of that task, and the production task executed in this embodiment is a production task associated with the robot.
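The verification-and-correction decision described above can be sketched as a simple set comparison; the material encoding used here is an assumption for illustration:

```python
# Illustrative sketch of the verification step: compare the material
# information read from the rack's RFID tag against what the order
# requires, and report what (if anything) must be corrected.

def check_materials(rfid_materials, order_materials):
    """Return (correct, wrong_items, missing_items) for the grabbed box.

    wrong_items: on the rack but not needed (return to stock rack).
    missing_items: required by the order but absent (fetch from stock).
    """
    rfid = set(rfid_materials)
    required = set(order_materials)
    wrong = rfid - required
    missing = required - rfid
    return (not wrong and not missing, sorted(wrong), sorted(missing))
```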
In one embodiment, the order information includes a production quantity. The production quantity may be the quantity of the order to be produced or the quantity of the completed production order. As shown in fig. 6, before executing, by the collaboration robot, the production task corresponding to the order to be produced according to the order information of the order to be produced, the method further includes:
and step S610, performing format conversion processing on the production quantity to obtain target data corresponding to the production quantity, and sending the target data corresponding to the production quantity to a control system of the cooperative robot.
Specifically, so that the control system of the cooperative robot can recognize the production quantity, the production quantity must undergo format conversion. The ModBus server performs the format conversion on the production quantity to obtain the corresponding target data, which the robot's control system can recognize, and then sends this target data to the control system of the cooperative robot.
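A minimal sketch of this format conversion, under the assumption (ours, not stated in the text) that the target data is a single unsigned 16-bit Modbus holding-register value:

```python
# Illustrative sketch: encode the production quantity as data a Modbus
# holding register can carry. The 16-bit unsigned format is an assumed
# example, not the patent's specified encoding.

def to_register_word(production_quantity):
    """Encode a production quantity as one 16-bit Modbus register value."""
    if not 0 <= production_quantity <= 0xFFFF:
        raise ValueError("quantity does not fit one 16-bit register")
    return production_quantity & 0xFFFF
```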
And S620, converting the target data corresponding to the production quantity into coordinate information of the components to be grabbed according to the corresponding relation between the target data and the component coordinates through a control system of the cooperative robot.
A component is a part required to complete the order task; when assembling a pen it may be any one or more of the pen point, pen holder, pen cap and pen bag. After the material box has been placed at the corresponding position on the assembly rack, the corresponding component must be taken out of the box, so the component's position information must be known. Specifically, the control system of the cooperative robot stores the correspondence between target data and component coordinates, and converts the target data corresponding to the production quantity into the coordinate information of the component to be grabbed according to that correspondence. For example, a fixed number of components are arranged in the material box, each at a specific position; when the first product is assembled, the control system of the cooperative robot maps the first production count to a first preset position and fetches the corresponding component from that position in the box.
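Under the stated assumption that components sit at fixed slots in the box, the mapping from production count to component coordinates can be sketched as a grid lookup; the grid dimensions and origin below are invented:

```python
# Illustrative sketch: components occupy fixed slots inside the material
# box, so the Nth product's component coordinates follow from the
# production count. All layout values are invented for the example.

SLOTS_PER_ROW = 4
SLOT_PITCH = 0.05          # metres between adjacent slots
BOX_ORIGIN = (0.10, 0.20)  # (x, y) of slot 0 in the robot frame

def component_coordinates(production_count):
    """Coordinates of the component for the production_count-th product
    (1-based), walking the box row by row."""
    slot = production_count - 1
    row, col = divmod(slot, SLOTS_PER_ROW)
    x = BOX_ORIGIN[0] + col * SLOT_PITCH
    y = BOX_ORIGIN[1] + row * SLOT_PITCH
    return (round(x, 3), round(y, 3))
```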
Executing a production task corresponding to the order to be produced according to the order information of the order to be produced by the cooperative robot, wherein the production task comprises the following steps:
and step S630, grabbing the component to be grabbed from the position corresponding to the coordinate information of the component to be grabbed through the cooperative robot.
Specifically, after the control system of the cooperative robot determines the coordinate information of the component to be grabbed, the mechanical arm controlling the cooperative robot moves to the position of the component to be grabbed according to the coordinate information of the component to be grabbed, and then the cooperative robot grabs the component to be grabbed from the position.
In the embodiment, the cooperative robot grabs the component to be grabbed from the corresponding position of the coordinate information of the component to be grabbed and places the component to be grabbed at the corresponding position of the assembly tool, so that an operator can conveniently take the component from the assembly tool in sequence, man-machine cooperation of mutual matching between the robot and the operator is realized, and the production efficiency of orders is improved.
In one embodiment, the order information includes a packaging quantity of the finished packaged product; as shown in fig. 7, before executing, by the collaboration robot, the production task corresponding to the order to be produced according to the order information of the order to be produced, the method further includes:
Step S710, detecting whether a photoelectric sensor signal is received.
And S720, monitoring whether a stacking instruction corresponding to the packaging process is received.
And step S730, if the photoelectric sensor signal and the stacking instruction are received, activating the cooperative robot.
And step S740, converting the packaging quantity of the packaged products into the coordinate information of the next packaged and to-be-palletized product according to the corresponding relation between the packaging quantity and the palletizing coordinate through the control system of the cooperative robot.
Executing a production task corresponding to the order to be produced according to the order information of the order to be produced by the cooperative robot, wherein the production task comprises the following steps:
and S750, placing the next product which is packaged and is to be stacked at the position corresponding to the coordinate information of the next product which is packaged and is to be stacked through the cooperative robot.
After the products are assembled, they are packaged, and the packaged products then need to be stacked. A finished packaged product is placed on the packaging tool, which is equipped with a photoelectric sensor; when a packaged product is placed on the tool, the ModBus server receives the photoelectric sensor signal. Specifically, the ModBus server detects whether the photoelectric sensor signal is received; if it is, a packaged product has been placed on the packaging tool. At the same time, the ModBus server continuously monitors whether a stacking instruction corresponding to the packaging process is received. If both the photoelectric sensor signal and the stacking instruction are received, the packaged products need to be stacked by the cooperative robot, so the cooperative robot is activated to execute the stacking task. Each packaged product has its own sequence number, i.e. the order information includes the packaging quantity of the finished packaged products. From the packaging quantity, the placement position of the finished packaged product, that is, the coordinate information of the next packaged product to be stacked (the stacking coordinate), can be determined. The control system of the cooperative robot stores the correspondence between packaging quantity and stacking coordinates; once the packaging quantity of the packaged products is known, the control system converts it into the stacking coordinate of the packaged product according to that correspondence.
After the control system of the cooperative robot determines the stacking coordinate of the packaged product, it directs the robot's mechanical arm to place the packaged product on the packaging tool at the position corresponding to that coordinate.
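The trigger condition and stacking-coordinate lookup described above can be sketched as follows; the pallet layout values are invented for illustration:

```python
# Illustrative sketch: the robot is activated only when both the
# photoelectric-sensor signal and the stacking instruction arrive, and
# the package count indexes the next stacking position. Layout values
# below are invented.

PALLET_COLS = 3
PALLET_PITCH = 0.12            # metres between stacking positions
PALLET_ORIGIN = (0.60, -0.30)  # (x, y) of the first position

def should_activate(sensor_signal, stacking_cmd):
    """Activate the cooperative robot only when both conditions hold."""
    return bool(sensor_signal) and bool(stacking_cmd)

def stacking_coordinates(package_count):
    """Position for the package_count-th packaged product (1-based)."""
    row, col = divmod(package_count - 1, PALLET_COLS)
    x = PALLET_ORIGIN[0] + col * PALLET_PITCH
    y = PALLET_ORIGIN[1] + row * PALLET_PITCH
    return (round(x, 3), round(y, 3))
```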
In this embodiment, the cooperative robot places the next packaged product to be stacked at the position corresponding to its coordinate information. After a product has been assembled manually, the operator places the package on the packaging tool and the cooperative robot performs the stacking, so the production tasks are reasonably divided between operator and cooperative robot. Effective human-machine cooperation is realized in the stacking process, improving the production efficiency of the order.
In one embodiment, the present application provides a method of controlling a robot, as shown in fig. 8a, the method comprising the steps of:
step S810, obtaining an order selection instruction and a scene selection instruction of the order to be produced.
The order selection instruction carries order information of the order to be produced;
step S820, displaying a corresponding page through the online work instruction book system according to the order selection instruction and the scene selection instruction.
The online operation instruction book system is an auxiliary system for human-computer interaction with an operator.
Step S830, in response to the trigger operation occurring on the displayed page, acquiring an action instruction of the cooperative robot.
The action command carries a scene address and a scene quantity value.
Step S840, analyzing the scene address and the scene quantity value, and sending the analysis result to the control system of the cooperative robot;
and step S850, controlling the cooperative robot to enter a corresponding production scene according to the analysis result through the control system of the cooperative robot.
And step S860, in a production scene, executing a production task corresponding to the order to be produced through the cooperative robot according to the order information of the order to be produced.
The number of the production scenes is three, namely a scene one, a scene two and a scene three. The following describes the production process in scenario one, scenario two, and scenario three in detail.
In a first scenario: the order information comprises material information, the position of the material box to be grabbed is determined according to the material information, and the position of the material box to be grabbed is sent to a control system of the cooperative robot; converting the position of the material box to be grabbed into coordinate information corresponding to the material box to be grabbed according to the corresponding relation between the position of the material box and the coordinates of the material box through a control system of the cooperative robot; and grabbing the material box to be grabbed from the position of the material box to be grabbed according to the coordinate information of the material box to be grabbed by the cooperative robot, and placing the material box to be grabbed at the corresponding position of the assembling material frame.
Furthermore, an RFID tag is arranged at the corresponding position of the assembly rack, and the order information further includes an order identifier. After the material box to be grabbed has been grabbed from its position according to its coordinate information, the material information stored in the RFID tag can further be acquired; the material information required by the order to be produced is acquired according to the order identifier; and the material information stored in the RFID tag is compared with the material information required by the order to be produced, and whether the grabbed material box is correct is judged according to the comparison result.
In scenario two: the order information includes a production quantity. Carrying out format conversion processing on the production quantity to obtain target data corresponding to the production quantity, and sending the target data corresponding to the production quantity to a control system of the cooperative robot; converting target data corresponding to the production quantity into coordinate information of the components to be grabbed according to the corresponding relation between the target data and the component coordinates through a control system of the cooperative robot; and grabbing the component to be grabbed from the corresponding position of the coordinate information of the component to be grabbed through the cooperative robot.
In the third scenario: the order information includes the number of packages of the finished packaged product. Detecting whether a photoelectric sensor signal is received; monitoring whether a stacking instruction corresponding to a packaging procedure is received; if the photoelectric sensor signal and the stacking instruction are received, activating the cooperative robot; converting the packaging quantity of the packaged products into coordinate information of the next packaged and to-be-stacked product according to the corresponding relation between the packaging quantity and the stacking coordinate through a control system of the cooperative robot; and placing the next product which is packaged and is to be palletized at the position corresponding to the coordinate information of the next product which is packaged and is to be palletized by the cooperation robot.
In one embodiment, the online work instruction book system has a plurality of different work instruction pages, which are switched according to the order selection instruction and scene selection instruction triggered by the operator. A work instruction page may display the production information of the current order, the number of outstanding orders, the takt time of the order, and a dynamic work instruction box. The dynamic work instruction box shows a dynamic, step-by-step work guidance process customized to the production flow, in the form of photos and videos. As shown in fig. 8b and 8c, the work instruction page provides two buttons and a safety warning text. The two buttons are a "next" button and an "end order" button. Wherein:
The "next" button: upon entering the production flow, the page displays the work instruction for the first production step, and the operator clicks "next" to proceed to the next production step. However, while the cooperative robot is moving, for example transporting a tool or grabbing a pen component, the "next" button cannot be triggered: it is still displayed on the page, but in a semi-transparent, inactive form; the work instruction page is locked, and the user cannot activate the next work instruction. Only when the cooperative robot completes its work task is the "next" button re-activated so the operator can click it again. Here, the signal interaction between the work instruction page and the cooperative robot is transmitted via Modbus signals, specifically: clicking "next" activates the robot's tool-moving operation, and the button locks as soon as the robot receives the instruction and starts moving; when the robot completes the movement, it sends a completion signal, and the work instruction page re-activates the "next" button on receiving it. This locking of the "next" button keeps the work instructions synchronized with the actual production operation; synchronizing the dynamic work instructions with the producer's actual actions safeguards the process quality of the product and helps improve the production takt. By clicking "next", the operator always sees the currently required work guidance in real time, ensuring the continuity and accuracy of the production flow.
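The button interlock described above can be modeled as a small state machine; this is an illustrative sketch, not the actual page logic or Modbus code:

```python
# Illustrative sketch of the "next" button interlock: clicking "next"
# starts a robot move and locks the button; the robot's completion
# signal unlocks it again. Names are hypothetical.

class NextButtonInterlock:
    def __init__(self):
        self.locked = False
        self.robot_busy = False

    def click_next(self):
        """Operator clicks 'next': ignored while locked; otherwise the
        robot move is triggered and the button locks."""
        if self.locked:
            return False
        self.robot_busy = True
        self.locked = True   # lock until the robot reports completion
        return True

    def robot_done(self):
        """The robot's completion signal re-activates the button."""
        self.robot_busy = False
        self.locked = False
```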
Safety warning text: when the cooperative robot is moving, for example transporting a tool or grabbing a pen component, the page shows the prompt "Robot moving, please keep a safe distance or stay away" to ensure production safety. The safety reminder is displayed as brightly colored text and pictures, warning the operator more intuitively and clearly. The yellow safety warning disappears only when the cooperative robot completes its work task.
The "end order" button: to keep the display time controllable, the "end order" button can be used to finish the production process early and enter the quality inspection process, but it can only be activated and clicked after at least one product has been completed. This button further embodies the idea of human-machine cooperative operation in the product assembly process.
In one embodiment, the cooperative robot is a Franka robot, whose APP programming interface provides two applications: a "move to contact" application and a "force" application. "Move to contact" means that after the robot grabs a material, it first moves the material to the contact surface at a preset position; "force" means finely adjusting the position of the mechanical arm so that the object is placed accurately under a certain force control. Taking pen assembly as an example, conventional robots show some deviation in motion, i.e. in path and positioning repeatability during production. When an item must be placed precisely, deviations occur while the robot picks and moves it, so the item cannot be placed at the exact position. For example, when the pen cap must be put into a small hole whose shape matches the projection of the cap bottom and is only slightly larger, any offset of the robot often causes the cap to jam against the edge of the hole instead of entering it. To solve this problem, the Franka robot's move-to-contact and force applications are used: first, the pen cap is moved obliquely until it stops at the contact surface of the preset position via the move-to-contact application; second, the position of the robot's gripper is fine-tuned via the force application (e.g. a horizontal push and a downward press), so that the item is placed precisely under a certain force control. Note that with conventional methods, when the material touches a surface the robot may judge the contact as a collision fault and stop running.
The purpose of move to contact is to guarantee surface contact within a tiny force-control range: the robot does not mistake the touch for a collision and stop. Instead, only when contact occurs and a slight feedback force is received is the task defined as complete, i.e. the material has reached the contact surface.
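A toy simulation of the two-stage placement (move to contact, then force-controlled fine adjustment); this is a simplified model of the behavior, not the Franka APP or ROS API:

```python
# Illustrative simulation: "move to contact" lowers the part until a
# small feedback force is felt (rather than treating the touch as a
# collision), then a force-limited fine adjustment nudges it into place.
# The physics here is a toy model with invented values.

CONTACT_FORCE_N = 2.0   # tiny threshold: contact, not a collision

def move_to_contact(start_z, surface_z, step=0.01):
    """Descend until the (simulated) surface produces a feedback force;
    the task is complete at first contact."""
    z = start_z
    while z > surface_z:          # above the surface: no force yet
        z = round(z - step, 3)
    return z, CONTACT_FORCE_N     # stopped at the contact surface

def force_fine_adjust(xy, target_xy, max_push=0.005):
    """Nudge toward the target in one small force-limited step."""
    x, y = xy
    tx, ty = target_xy
    dx = max(-max_push, min(max_push, tx - x))
    dy = max(-max_push, min(max_push, ty - y))
    return (round(x + dx, 4), round(y + dy, 4))
```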
Further, the Franka Robot provides object-oriented programming (i.e. APP programming, by dragging and dropping function applications); in addition, robots can be programmed via ROS (Robot Operating System). In this embodiment, both APP programming and ROS programming may be used. As shown in fig. 8d, for simple programming and debugging of the Franka robot, the built-in APP programming allows programming by drag-and-drop teaching, with no programming experience required of the operator, covering functions such as moving, grasping, force feedback and Modbus signal processing. When using the Franka robot, the APPs are dragged into the work area and ordered according to the robot's workflow, and then configured step by step according to the functional requirements of each APP, which is very convenient. Alternatively, programming via ROS controls the robot's motion in the form of code.
Both programming modes have advantages and disadvantages: ROS programming is more precise, and the robot runs more smoothly and stably, but it requires the operator to be able to program in ROS; APP programming is more modular and can be operated by drag and drop, which is simple and convenient, but its running repeatability is poorer. This embodiment combines the two programming modes and establishes communication between the ModBus server and the cooperative robot.
In one embodiment, take pen assembly as an example: the components used include a nib, a barrel, a cap and a pen bag, where the nib, barrel and cap each offer four color choices (black, white, red and blue) and the pen bag offers two types, disposable and recyclable. The PLC controller, the information system and the cooperative robot are connected via the Modbus TCP protocol to facilitate information exchange: for example, the order information of the information system is sent to the PLC controller and the cooperative robot, sensor information is sent to the cooperative robot through the PLC controller, and the execution status of the cooperative robot is fed back to the information system, as in the overall Modbus architecture diagram shown in fig. 8e. Specifically, the PLC controller, the information system and the cooperative robot are first integrated into the same LAN; as shown in the figure, the three must be in the same subnet (the 192.168.3. segment). During pen assembly, the cooperative robot can complete three scene tasks, and the signal is stored in the zero position of a Modbus Holding Register (hereinafter ModbusHR). In scene one, the corresponding materials are fetched according to the order information; in scene two, the cooperative robot supplies single parts to the operator; in scene three, the cooperative robot picks up and stacks the packaged products.
In scene one, after an order is placed, the order information is transmitted through ModbusHR; positions 1 to 3 define the colors of the three parts. Positions 12 to 14 define the material information on the assembly rack, which is read by the RFID on the rack and converted into a ModbusHR signal by the PLC controller. In addition, the product fed manually is defined at position 7 of ModbusHR and indicated by a light: for example, the light is white when the order uses a recyclable pen bag and blue when it uses a disposable pen bag. Positions 0 and 1 of the Modbus Coil define the robot start and end signals.
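The scene-one register layout described above can be captured as constants plus a decoder; the register positions follow the text, while the color codes themselves are invented for the example:

```python
# Illustrative register map for scene one, following the positions the
# text describes (colors at 1-3, pen-bag type at 7, rack materials at
# 12-14). The numeric color codes are invented assumptions.

HR_COLOR_NIB, HR_COLOR_BARREL, HR_COLOR_CAP = 1, 2, 3
HR_BAG_TYPE = 7
HR_RACK_MATERIALS = (12, 13, 14)
COLORS = {0: "black", 1: "white", 2: "red", 3: "blue"}

def decode_order(holding_registers):
    """Decode the scene-one order fields from a holding-register array."""
    return {
        "nib": COLORS[holding_registers[HR_COLOR_NIB]],
        "barrel": COLORS[holding_registers[HR_COLOR_BARREL]],
        "cap": COLORS[holding_registers[HR_COLOR_CAP]],
        "recyclable_bag": holding_registers[HR_BAG_TYPE] == 0,
    }
```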
In scene two, in order to record production progress, two further ModbusHR signals are defined, at positions 4 and 5; this information indicates the order production progress and is shown on the performance information display board. Position 2 of the Modbus Coil defines the start signal of the robot's interactive collaborative production task in scene two, and position 3 its end signal; position 4 defines the start signal of the robot's interactive collaborative production task in scene three, and position 5 its end signal.
In scene three, when a packaged product is placed on the packaging tool, a signal is sent to the cooperative robot, which is activated to act. This signal is defined at position 11 of ModbusHR: it is 3 when there is no product and 1 when there is a product on the tool.
In addition, the action mode of the cooperative robot is defined at position 10 of ModbusHR. When there is no operator on the safety carpet, the cooperative robot may run at full speed, and the signal is 1. When the operator and the cooperative robot each work in their own area, the signal is 3; the two then share a small overlapping work area and may come into contact, so the robot's running speed is reduced, for example by 50%. When the operator has completely entered the robot's work area, the signal is 7; the operator is then very likely to make direct contact with the cooperative robot, so the speed can be set to 0 or very slow, e.g. 10%, to fully ensure personnel safety.
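The speed policy at position 10 can be sketched as a simple mapping, with an assumed safe fallback of 0 for unknown signals:

```python
# Illustrative sketch of the safety-carpet speed policy described in
# the text: full speed at signal 1, half speed at signal 3, near-stop
# at signal 7. The fallback of 0.0 for unknown signals is our addition.

SPEED_BY_SIGNAL = {1: 1.0, 3: 0.5, 7: 0.1}

def speed_factor(carpet_signal):
    """Robot speed scale for the current safety-carpet signal; unknown
    signals fall back to a full stop for safety."""
    return SPEED_BY_SIGNAL.get(carpet_signal, 0.0)
```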
It should be understood that, although the steps in the above flowcharts are shown in the order indicated by the arrows, they are not necessarily performed in that order. Unless explicitly stated herein, these steps are not performed in a strict order and may be performed in other orders. Moreover, at least some of the steps in the above flowcharts may include several sub-steps or stages, which are not necessarily performed at the same moment but may be performed at different moments, and which are not necessarily performed in sequence but may be performed in turn or alternately with other steps or with sub-steps or stages of other steps.
In one embodiment, as shown in fig. 9, there is provided a control apparatus of a robot, including: a first obtaining module 910, a page presentation module 920, a second obtaining module 930, and a robot control module 940, wherein:
a first obtaining module 910, configured to obtain an order selection instruction and a scene selection instruction of an order to be produced;
a page display module 920, configured to display a corresponding page through an online work instruction book system according to the order selection instruction and the scene selection instruction, where the online work instruction book system is an auxiliary system for performing human-computer interaction with an operator;
a second obtaining module 930, configured to obtain an action instruction of the cooperative robot in response to a trigger operation occurring on the displayed page, where the action instruction carries a scene address and a scene quantity value;
and a robot control module 940, configured to control, according to the scene address and the scene quantity value, the cooperative robot to enter a corresponding production scene through a control system of the cooperative robot.
In one embodiment, the robot control module 940 is further configured to analyze the scene address and the scene quantity value and send the analysis result to the control system of the cooperative robot; the control system of the cooperative robot then controls the cooperative robot to enter the corresponding production scene according to the analysis result.
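The analyze-and-dispatch step can be sketched as a lookup keyed by the (scene address, scene value) pair. The function and scene names below are made up for illustration; the patent does not specify the concrete form of the analysis result.

```python
def enter_production_scene(scene_address: str, scene_value: int, scenes: dict):
    """Look up the production scene registered under the analyzed
    (address, value) pair and run its handler, standing in for the
    cooperative robot's control system."""
    handler = scenes.get((scene_address, scene_value))
    if handler is None:
        raise KeyError(f"no production scene for {scene_address!r}/{scene_value}")
    return handler()

# Usage: handlers registered by the control system (names are hypothetical).
scenes = {("assembly", 2): lambda: "entered assembly scene 2"}
result = enter_production_scene("assembly", 2, scenes)
```

A table-driven dispatch like this keeps the mapping between action instructions and production scenes in one place, which matches how the module forwards an analysis result rather than scene-specific commands.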
In one embodiment, the order selection instruction carries order information of the order to be produced. The device further comprises a production task execution module, configured to execute, through the cooperative robot in the production scene, the production task corresponding to the order to be produced according to the order information of the order to be produced.
In one embodiment, the order information includes material information. The device further comprises a material box coordinate determining module, configured to determine the position of the material box to be grabbed according to the material information and send that position to the control system of the cooperative robot; the control system of the cooperative robot converts the position of the material box to be grabbed into the corresponding coordinate information according to the correspondence between material box positions and material box coordinates.
The production task execution module is further configured to grab, through the cooperative robot, the material box to be grabbed from its storage position according to the coordinate information of the material box, and to place it at the corresponding position of the assembling rack.
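The position-to-coordinate conversion can be sketched as a static lookup table. The storage-position names and the (x, y, z) values below are purely illustrative assumptions; the patent only states that a position/coordinate correspondence exists in the robot's control system.

```python
# Hypothetical correspondence between material-box storage positions and
# robot grab coordinates (x, y, z in mm); all values are illustrative.
BOX_COORDINATES = {
    "shelf-A1": (120.0, 340.0, 50.0),
    "shelf-A2": (120.0, 480.0, 50.0),
}

def box_position_to_coordinates(position: str):
    """Convert a storage position into grab coordinates, as the robot
    control system does from the position/coordinate correspondence."""
    try:
        return BOX_COORDINATES[position]
    except KeyError:
        raise KeyError(f"no coordinates registered for position {position!r}")
```

The cooperative robot would then move to the returned coordinates, grab the box, and place it at the assembling rack.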
In one embodiment, an RFID tag is disposed at the corresponding position of the assembling rack, and the order information further includes an order identifier. The device further comprises a material box verification module, configured to acquire the material information stored in the RFID tag; acquire the material information required by the order to be produced according to the order identifier; and compare the material information stored in the RFID tag with the material information required by the order to be produced, judging from the comparison result whether the grabbed material box is correct.
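The verification step reduces to a set comparison between what the RFID tag records and what the order requires. This is a minimal sketch; the material identifiers are made up, and the patent does not specify the comparison granularity.

```python
def verify_grabbed_box(tag_materials: set, order_materials: set) -> bool:
    """True when the material information read from the rack's RFID tag
    matches the material information required by the order to be produced;
    False means the wrong material box was grabbed."""
    return tag_materials == order_materials

# Usage with hypothetical material identifiers:
ok = verify_grabbed_box({"bolt-M4", "nut-M4"}, {"bolt-M4", "nut-M4"})
```

On a mismatch the system could alert the operator through the online work instruction book system rather than continue the assembly task.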
In one embodiment, the order information includes a production quantity. The device further comprises a component coordinate determining module, configured to perform format conversion processing on the production quantity to obtain target data corresponding to the production quantity and send that target data to the control system of the cooperative robot; the control system of the cooperative robot converts the target data corresponding to the production quantity into coordinate information of the component to be grabbed according to the correspondence between target data and component coordinates.
The production task execution module is further configured to grab, through the cooperative robot, the component to be grabbed from the position corresponding to its coordinate information.
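One natural form of the target-data-to-coordinate correspondence is a grid: the running production count selects the next component slot. The grid layout (origin, pitch, column count) below is an assumption for illustration; the patent only states that target data derived from the production quantity maps to component coordinates.

```python
def component_coordinates(produced_so_far: int, origin=(0.0, 0.0),
                          pitch=(40.0, 60.0), columns=5):
    """Map a production count onto the grid coordinate of the next
    component to grab. Components are assumed to be laid out row by
    row, `columns` per row, `pitch` mm apart; all numbers illustrative."""
    row, col = divmod(produced_so_far, columns)
    return (origin[0] + col * pitch[0], origin[1] + row * pitch[1])
```

With this scheme the format conversion on the production quantity amounts to computing an index, and the control system turns that index into a physical pick position.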
In one embodiment, the order information includes a packaging quantity of finished packaged products. The device further comprises a package coordinate determining module, configured to detect whether a photoelectric sensor signal is received; monitor whether a stacking instruction corresponding to the packaging procedure is received; activate the cooperative robot if both the photoelectric sensor signal and the stacking instruction are received; and convert, through the control system of the cooperative robot, the packaging quantity of packaged products into coordinate information of the next packaged product to be palletized according to the correspondence between packaging quantity and stacking coordinates.
The production task execution module is further configured to place, through the cooperative robot, the next packaged product to be palletized at the position corresponding to that coordinate information.
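The packaging-quantity-to-stacking-coordinate correspondence can likewise be sketched as a layered grid: the count of already-palletized products determines the layer and the slot within it. The 2x2 layer pattern and all dimensions below are illustrative assumptions, not values from the patent.

```python
def pallet_slot(packed_count: int, per_layer=4,
                slot=(200.0, 200.0), layer_height=120.0):
    """Return the (x, y, z) drop-off coordinate for the next packaged
    product, derived from how many products are already on the pallet.
    Assumes a 2x2 pattern per layer; all dimensions are illustrative."""
    layer, index = divmod(packed_count, per_layer)
    row, col = divmod(index, 2)
    return (col * slot[0], row * slot[1], layer * layer_height)
```

After each placement the packaging quantity is incremented, so the next call automatically yields the next free slot or starts a new layer.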
For specific limitations of the control device of the robot, reference may be made to the above limitations of the control method of the robot, which are not repeated here. The modules in the control device of the robot may be implemented entirely or partially by software, hardware, or a combination thereof. Each module may be embedded, in hardware form, in or be independent of the processor of the computer device, or may be stored, in software form, in the memory of the computer device, so that the processor can invoke and execute the operations corresponding to each module.
In one embodiment, a controller is provided, comprising a memory in which a computer program is stored and a processor which, when executing the computer program, implements the steps of the robot control method in the above embodiments.
In one embodiment, a computer-readable storage medium is provided, on which a computer program is stored; when executed by a processor, the computer program implements the steps of the robot control method in the above embodiments.
It will be understood by those skilled in the art that all or part of the processes of the methods of the above embodiments may be implemented by a computer program instructing the relevant hardware; the computer program may be stored in a non-volatile computer-readable storage medium and, when executed, may include the processes of the above method embodiments. Any reference to memory, storage, a database, or another medium used in the embodiments provided herein may include at least one of non-volatile and volatile memory. Non-volatile memory may include read-only memory (ROM), magnetic tape, floppy disk, flash memory, or optical storage. Volatile memory may include random access memory (RAM) or external cache memory. By way of illustration and not limitation, RAM may take many forms, such as static random access memory (SRAM) or dynamic random access memory (DRAM).
The technical features of the above embodiments may be combined arbitrarily. For brevity, not all possible combinations of these technical features are described; nevertheless, any combination of them that contains no contradiction should be considered within the scope of this specification.
The above embodiments express only several implementations of the present application; their description is specific and detailed, but it is not to be construed as limiting the scope of the invention patent. It should be noted that a person skilled in the art can make several variations and improvements without departing from the concept of the present application, all of which fall within the scope of protection of the present application. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (7)

1. A method of controlling a robot, the method comprising:
acquiring an order selection instruction and a scene selection instruction of an order to be produced, wherein the order selection instruction carries order information of the order to be produced, and the order information comprises production quantity;
displaying a corresponding page through an online operation instruction book system according to the order selection instruction and the scene selection instruction, wherein the online operation instruction book system is an auxiliary system for human-computer interaction with an operator;
responding to a trigger operation generated on a displayed page, and acquiring an action instruction of the cooperative robot, wherein the action instruction carries a scene address and a scene quantity value;
analyzing the scene address and the scene quantity value, and sending an analysis result to a control system of the cooperative robot;
controlling the cooperative robot to enter a corresponding production scene according to the analysis result through a control system of the cooperative robot;
carrying out format conversion processing on the production quantity to obtain target data corresponding to the production quantity, and sending the target data corresponding to the production quantity to a control system of the cooperative robot;
converting the target data corresponding to the production quantity into coordinate information of the components to be grabbed according to the corresponding relation between the target data and the component coordinates through a control system of the cooperative robot;
in the production scene, executing a production task corresponding to the order to be produced according to the order information of the order to be produced by the cooperative robot;
the executing, by the cooperative robot, the production task corresponding to the order to be produced according to the order information of the order to be produced includes:
and grabbing the component to be grabbed from the position corresponding to the coordinate information of the component to be grabbed through the cooperative robot.
2. The method of claim 1, wherein the order information comprises material information; before the executing, by the cooperative robot, the production task corresponding to the order to be produced according to the order information of the order to be produced, the method further includes:
determining the position of a material box to be grabbed according to the material information, and sending the position of the material box to be grabbed to a control system of the cooperative robot;
converting the position of the material box to be grabbed into coordinate information corresponding to the material box to be grabbed according to the corresponding relation between the position of the material box and the coordinates of the material box by using a control system of the cooperative robot;
the executing, by the cooperative robot, the production task corresponding to the order to be produced according to the order information of the order to be produced includes:
grabbing, by the cooperative robot, the material box to be grabbed from its storage position according to the coordinate information of the material box to be grabbed, and placing the material box to be grabbed at the corresponding position of the assembling rack.
3. The method according to claim 2, wherein an RFID tag is provided at a corresponding position of the assembling rack, and the order information further includes an order identifier; after the material box to be grabbed is grabbed from the position of the material box to be grabbed according to the coordinate information of the material box to be grabbed, the method further comprises the following steps:
acquiring material information stored in the RFID label;
acquiring material information required by the order to be produced according to the order identifier;
and comparing the material information stored in the RFID label with the material information required by the order to be produced, and judging whether the grabbed material box is correct or not according to a comparison result.
4. The method of claim 1, wherein the order information includes a number of packages of completed packaged products; before the executing, by the cooperative robot, the production task corresponding to the order to be produced according to the order information of the order to be produced, the method further includes:
detecting whether a photoelectric sensor signal is received;
monitoring whether a stacking instruction corresponding to a packaging procedure is received;
if the photoelectric sensor signal and the stacking instruction are received, activating the cooperative robot;
determining the placement position of the next finished packaged product to be stacked according to the corresponding relation between the packaging serial number and the placement position and the packaging serial number of the finished packaged product by the control system of the cooperative robot;
the executing, by the cooperative robot, the production task corresponding to the order to be produced according to the order information of the order to be produced includes:
placing, by the cooperative robot, the next finished packaged product to be palletized at the placement position.
5. A control apparatus of a robot, characterized in that the apparatus comprises:
the system comprises a first acquisition module, a second acquisition module and a third acquisition module, wherein the first acquisition module is used for acquiring an order selection instruction and a scene selection instruction of an order to be produced, the order selection instruction carries order information of the order to be produced, and the order information comprises production quantity;
the page display module is used for displaying a corresponding page through an online operation instruction book system according to the order selection instruction and the scene selection instruction, wherein the online operation instruction book system is an auxiliary system for human-computer interaction with an operator;
the second acquisition module is used for responding to the trigger operation generated on the displayed page and acquiring the action instruction of the cooperative robot, wherein the action instruction carries a scene address and a scene quantity value;
the robot control module is used for analyzing the scene address and the scene quantity value and sending an analysis result to a control system of the cooperative robot; controlling the cooperative robot to enter a corresponding production scene according to the analysis result through a control system of the cooperative robot;
the production task execution module is used for executing a production task corresponding to the order to be produced through the cooperative robot according to the order information of the order to be produced in the production scene;
the component coordinate determination module is used for carrying out format conversion processing on the production quantity to obtain target data corresponding to the production quantity and sending the target data corresponding to the production quantity to a control system of the cooperative robot; converting the target data corresponding to the production quantity into coordinate information of the components to be grabbed according to the corresponding relation between the target data and the component coordinates through a control system of the cooperative robot;
the production task execution module is further used for grabbing the component to be grabbed from the position corresponding to the coordinate information of the component to be grabbed through the cooperative robot.
6. A controller comprising a memory and a processor, the memory storing a computer program, characterized in that the processor, when executing the computer program, implements the steps of the method of any of claims 1 to 4.
7. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the method of any one of claims 1 to 4.
CN202011220420.2A 2020-11-05 2020-11-05 Robot control method, device, computer equipment and storage medium Active CN112356030B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011220420.2A CN112356030B (en) 2020-11-05 2020-11-05 Robot control method, device, computer equipment and storage medium


Publications (2)

Publication Number Publication Date
CN112356030A CN112356030A (en) 2021-02-12
CN112356030B true CN112356030B (en) 2022-04-22

Family

ID=74514278

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011220420.2A Active CN112356030B (en) 2020-11-05 2020-11-05 Robot control method, device, computer equipment and storage medium

Country Status (1)

Country Link
CN (1) CN112356030B (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH1139524A (en) * 1997-07-15 1999-02-12 Honda Motor Co Ltd Operation helping device
CN103249368A (en) * 2010-11-11 2013-08-14 约翰霍普金斯大学 Human-machine collaborative robotic systems
CN106874092A (en) * 2017-02-10 2017-06-20 深圳市笨笨机器人有限公司 Robot task trustship method and system
CN107976918A (en) * 2016-10-24 2018-05-01 菜鸟智能物流控股有限公司 Task switching method and related device
CN108877462A (en) * 2018-06-27 2018-11-23 上海慧程工程技术服务有限公司 A kind of intelligence manufacture simulation training system based on cooperation robot
JP2019051571A (en) * 2017-09-14 2019-04-04 Dgshape株式会社 Processing device
CN111515673A (en) * 2020-04-27 2020-08-11 宁波舜宇智能科技有限公司 Electromechanical equipment assembling system based on man-machine cooperation and assembling method thereof
CN111784153A (en) * 2020-06-30 2020-10-16 宁波舜宇智能科技有限公司 Intelligent flexible assembly execution system, method, computer device and storage medium

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9678505B2 (en) * 2013-10-14 2017-06-13 Invensys Systems, Inc. Line management in manufacturing execution system
JP6510436B2 (en) * 2016-02-12 2019-05-08 株式会社日立製作所 Article conveying system, conveying apparatus and article conveying method
CN109308057A (en) * 2018-10-18 2019-02-05 首瑞(北京)投资管理集团有限公司 Intelligent plant management method and system


Also Published As

Publication number Publication date
CN112356030A (en) 2021-02-12

Similar Documents

Publication Publication Date Title
US11059076B2 (en) Sorting support methods, sorting systems, and flatbed machine tools
US20200101613A1 (en) Information processing apparatus, information processing method, and system
US20190202058A1 (en) Method of programming an industrial robot
US20210046644A1 (en) Automated personalized feedback for interactive learning applications
US11648670B2 (en) Machine tool system
US20230219223A1 (en) Programming device
US20230251631A1 (en) A method for manufacturing construction components, a portable manufacturing unit, a software application executable on a machine tool system for controlling a tool, the machine tool system, and a method of machining the workpiece using the tool
US20210023710A1 (en) System and method for robotic bin picking using advanced scanning techniques
US10471592B2 (en) Programming method of a robot arm
Wassermann et al. Intuitive robot programming through environment perception, augmented reality simulation and automated program verification
Gradmann et al. Augmented reality robot operation interface with google tango
US11389954B2 (en) Robot control device
CN112356030B (en) Robot control method, device, computer equipment and storage medium
US11478932B2 (en) Handling assembly comprising a handling device for carrying out at least one work step, method, and computer program
KR20210112031A (en) Smart logistic warehouse system for automated product inspection and packaging
CN111899629B (en) Flexible robot teaching system and method
Peng et al. Intention recognition-based human–machine interaction for mixed flow assembly
EP3703915B1 (en) Method of performing assembling of an object, and assembly system
EP3662341B1 (en) Electronic device and controlling method thereof
US20230330847A1 (en) Method and System for Training a Robot
JP2016221602A (en) Robot, control device and program
KR20210112030A (en) Automated logistic management system based on smart factory
CN111476840A (en) Target positioning method, device, equipment and computer readable storage medium
WO2023080230A1 (en) Management device, management system, management method, and management program
EP4177838A1 (en) Marker detection apparatus and robot teaching system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant