CN111368720A - Automatic carrying and goods taking system and method - Google Patents


Info

Publication number
CN111368720A
Authority
CN
China
Prior art keywords
target
manipulator
lower computer
product
operation instruction
Prior art date
Legal status
Pending
Application number
CN202010139568.7A
Other languages
Chinese (zh)
Inventor
张鑫
王剑华
Current Assignee
East Automotive Electronic Co ltd
Original Assignee
East Automotive Electronic Co ltd
Priority date
Filing date
Publication date
Application filed by East Automotive Electronic Co ltd filed Critical East Automotive Electronic Co ltd
Priority to CN202010139568.7A
Publication of CN111368720A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00: Scenes; Scene-specific elements
    • G06V20/10: Terrestrial scenes
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J11/00: Manipulators not otherwise provided for
    • B25J11/008: Manipulators for service tasks
    • B25J9/00: Programme-controlled manipulators
    • B25J9/16: Programme controls
    • B25J9/1602: Programme controls characterised by the control system, structure, architecture
    • B25J9/1612: Programme controls characterised by the hand, wrist, grip control
    • B25J9/1694: Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697: Vision controlled systems
    • G06V40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20: Movements or behaviour, e.g. gesture recognition

Abstract

The invention provides an automatic carrying and goods taking system and method. The system comprises an upper computer, a lower computer and a plurality of manipulators, the lower computer being connected to the upper computer and to the plurality of manipulators. The upper computer acquires user operation information and sends a corresponding operation instruction to the lower computer according to the user operation information. When the operation instruction is automatic operation, the lower computer identifies the product type in the operation instruction and calls a corresponding target manipulator to execute the operation according to the product type, wherein the target manipulator is any one or more of the plurality of manipulators. Because the lower computer operates the target manipulator to execute the operation, products are carried and picked without manual participation and efficiency is improved.

Description

Automatic carrying and goods taking system and method
Technical Field
The invention relates to the field of robots, in particular to an automatic carrying and goods taking system and method.
Background
At many posts in domestic factories, workers carry, pick and place products and push buttons so that the equipment performs assembly or test work. This work is low in skill and highly repetitive; after working for long periods, workers become less efficient and misoperation of the equipment can cause damage and loss, while domestic labor costs keep rising.
Disclosure of Invention
The invention aims to provide an automatic carrying and goods taking system and method in which a lower computer operates a target manipulator to execute operations, so that products are carried and picked without manual participation and efficiency is improved.
The technical scheme provided by the invention is as follows:
The invention provides an automatic carrying and goods taking system, which comprises an upper computer, a lower computer and a plurality of manipulators, wherein the lower computer is connected to the upper computer and to the plurality of manipulators:
the upper computer acquires user operation information and sends a corresponding operation instruction to the lower computer according to the user operation information;
and when the operation instruction is automatic operation, the lower computer identifies the product type in the operation instruction and calls a corresponding target manipulator to execute the operation according to the product type, wherein the target manipulator is any one or more of the plurality of manipulators.
Further, the system is also configured as follows:
when the operation instruction is manual debugging, the lower computer identifies a debugging object and debugging parameters in the operation instruction, and calls the corresponding target manipulator according to the debugging object to execute the operation with the debugging parameters;
and when the operation instruction is parameter configuration, the lower computer identifies the parameter information in the operation instruction to create or modify product configuration parameters.
Further, the system also includes a 3D position detection camera, and the 3D position detection camera is connected with the lower computer:
the 3D position detection camera collects product state information according to an information collection instruction sent by the lower computer and sends the product state information to the lower computer;
the lower computer analyzes the product state from the product state information and, combining the product state with the manipulator positions, determines the target manipulator and the grabbing pose with which the target manipulator grabs the product;
and the target manipulator grabs the product according to the grabbing pose sent by the lower computer.
Further, the system is also configured as follows:
the lower computer creates a corresponding semaphore for every manipulator;
when a target manipulator is allocated to a target station, the lower computer locks the target semaphore corresponding to that manipulator and at the same time assigns the right to unlock the target semaphore to the target station;
and the target manipulator receives the allocation instruction of the lower computer, and after it reaches the target station and completes the target operation, the target station unlocks the target semaphore.
Further, the system is also configured as follows:
the lower computer creates corresponding elements for the beginning and the end of each action in the process flow, forming an adjacent element pair, wherein each element comprises the identifier of the executing station and the action type;
and the lower computer assembles all the element pairs into a master control queue according to the steps of the process flow, and controls the corresponding stations to execute operations in the order of the master control queue.
The invention also provides an automatic carrying and goods taking method, which comprises the following steps:
acquiring user operation information through an upper computer to generate a corresponding operation instruction;
receiving the operation instruction;
when the operation instruction is automatic operation, identifying the product type in the operation instruction;
and calling a corresponding target manipulator according to the product type to execute operation, wherein the target manipulator is any one or more of the plurality of manipulators.
Further, after receiving the operation instruction, the method further includes:
when the operation instruction is manual debugging, identifying a debugging object and debugging parameters in the operation instruction, and calling the corresponding target manipulator according to the debugging object to execute the operation with the debugging parameters;
and when the operation instruction is parameter configuration, identifying the parameter information in the operation instruction to create or modify product configuration parameters.
Further, after receiving the operation instruction, the method further includes:
sending an information acquisition instruction to a 3D position detection camera;
collecting product state information according to the information collection instruction through the 3D position detection camera;
analyzing the product state from the product state information, and determining, by combining the product state with the manipulator positions, the target manipulator and the grabbing pose with which the target manipulator grabs the product;
and grabbing the product according to the grabbing pose through the target manipulator.
Further, the method also comprises the following steps:
creating a corresponding semaphore for every manipulator;
when a target manipulator is allocated to a target station, locking the target semaphore corresponding to that manipulator and at the same time assigning the right to unlock the target semaphore to the target station;
and after the target manipulator reaches the target station and completes the target operation, the target station unlocks the target semaphore.
Further, the method also comprises the following steps:
creating corresponding elements for the beginning and the end of each action in the process flow to form an adjacent element pair, wherein each element comprises the identifier of the executing station and the action type;
and assembling all the element pairs into a master control queue according to the steps of the process flow, and controlling the corresponding stations to execute operations in the order of the master control queue.
According to the automatic carrying and goods taking system and method provided by the invention, the lower computer operates the target manipulator to execute the operation, so that carrying and picking of products is completed without manual participation and efficiency is improved.
Drawings
The above features, technical features, advantages and implementations of the automatic carrying and goods taking system and method will be further described below, in a clearly understandable manner, with reference to the accompanying drawings and preferred embodiments.
FIG. 1 is a schematic diagram of an embodiment of an automatic carrying and goods taking system according to the invention;
FIG. 2 is a flow chart of the automatic operation of an automatic carrying and goods taking system according to the invention;
FIG. 3 is a flow chart of the error handling of an automatic carrying and goods taking system according to the invention;
FIG. 4 is a flow chart of the manual debugging of an automatic carrying and goods taking system according to the invention;
FIG. 5 is a flow chart of an embodiment of an automatic carrying and goods taking method according to the invention.
Description of the drawings:
100 automatic carrying and goods taking system
110 upper computer
120 lower computer
130 manipulator
131 target manipulator
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. However, it will be apparent to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
For the sake of simplicity, the drawings only schematically show the parts relevant to the present invention, and they do not represent the actual structure as a product. In addition, in order to make the drawings concise and understandable, components having the same structure or function in some of the drawings are only schematically depicted, or only one of them is labeled. In this document, "one" means not only "only one" but also a case of "more than one".
It should be further understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
In particular implementations, the terminal devices described in embodiments of the present application include, but are not limited to, portable devices such as mobile phones, laptop computers, home computers, or tablet computers having touch-sensitive surfaces (e.g., touch screen displays and/or touch pads). It should also be understood that in some embodiments the terminal device is not a portable communication device, but is a desktop computer having a touch-sensitive surface (e.g., a touch screen display and/or touchpad).
In the discussion that follows, a terminal device that includes a display and a touch-sensitive surface is described. However, it should be understood that the terminal device may include one or more other physical user interface devices such as a physical keyboard, mouse, and/or joystick.
The terminal device supports various applications, such as one or more of the following: a drawing application, a presentation application, a network creation application, a word processing application, a disc burning application, a spreadsheet application, a gaming application, a telephone application, a video conferencing application, an email application, an instant messaging application, an exercise support application, a photo management application, a digital camera application, a digital video camera application, a Web browsing application, a digital music player application, and/or a digital video player application.
Various applications that may be executed on the terminal device may use at least one common physical user interface device, such as a touch-sensitive surface. One or more functions of the touch-sensitive surface and corresponding information displayed on the terminal can be adjusted and/or changed between applications and/or within respective applications. In this way, a common physical architecture (e.g., touch-sensitive surface) of the terminal can support various applications with user interfaces that are intuitive and transparent to the user.
In addition, in the description of the present application, the terms "first", "second", and the like are used only for distinguishing the description, and are not intended to indicate or imply relative importance.
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the following description will be made with reference to the accompanying drawings. It is obvious that the drawings in the following description are only some examples of the invention, and that for a person skilled in the art, other drawings and embodiments can be derived from them without inventive effort.
In an embodiment of the present invention, as shown in fig. 1, an automatic carrying and picking system 100 includes an upper computer 110, a lower computer 120 and a plurality of manipulators 130, wherein the lower computer 120 is connected to the upper computer 110 and the plurality of manipulators 130 respectively;
the upper computer 110 acquires user operation information and sends a corresponding operation instruction to the lower computer 120 according to the user operation information;
and when the operation instruction is automatic operation, the lower computer 120 identifies the product type in the operation instruction and calls a corresponding target manipulator 131 to execute the operation according to the product type, wherein the target manipulator 131 is any one or more of the plurality of manipulators 130.
When the operation instruction is manual debugging, the lower computer 120 identifies a debugging object and debugging parameters in the operation instruction, and calls a corresponding target manipulator 131 according to the debugging object to execute operation according to the debugging parameters;
and when the operation instruction is a configuration parameter, the lower computer 120 identifies parameter information in the operation instruction to create or modify a product configuration parameter.
Specifically, in this embodiment the system hardware consists of an upper computer, a lower computer, manipulators, actuators on the manipulator flanges (including a 3D position measurement device, a 2D camera, a cylinder and an electric cylinder), and peripheral controllable equipment. By function, the manipulators fall into two types: feeding manipulators and assembling manipulators.
The system software is divided into the upper computer and the lower computer (a cRIO lower computer). The upper computer implements configuration and display, while the cRIO lower computer runs the main program. Connected under the cRIO lower computer are CVS vision (Ethernet TCP/IP), 3D position detection (Ethernet TCP/IP), FANUC manipulators (EtherNet/IP or PROFIBUS DP), WAGO distributed I/O (EtherNet/IP), SMC electric cylinders (RS485), and so on.
Here, cRIO is an embedded controller suited to advanced control and monitoring applications; it carries a real-time processor and an FPGA and provides various ports, including Ethernet, USB and serial. TCP/IP is a protocol suite that enables information transfer across multiple different networks. EtherNet/IP is a network suited to industrial environments and to applications with strict timing requirements. PROFIBUS DP offers high speed at low cost and is used for communication between device-level control systems and distributed I/O.
The start interface of the upper computer program contains three buttons, for parameter configuration, manual debugging and automatic operation, plus a pull-down menu for selecting the product type for automatic operation. The upper computer obtains the user operation information, i.e., the area the user clicks on the start interface, generates the corresponding operation instruction and sends it to the lower computer.
When the operation instruction is automatic operation, the user operation information also includes the product type the user selected in the pull-down menu. The lower computer therefore identifies the product type in the operation instruction and calls the corresponding target manipulator to execute the operation according to the product type, the target manipulator being any one or more of the plurality of manipulators. Different product types require different manipulators, stations and other resources. The process flows of all product types in the pull-down menu are stored in the lower computer; after receiving the corresponding operation instruction, the lower computer calls the corresponding resources and executes the corresponding flow.
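As an illustration of this dispatch, the following Python sketch (not part of the patent; all names such as `Instruction` and `PROCESS_FLOWS` are illustrative assumptions) shows how a lower computer might branch on the operation instruction type and look up a stored process flow by product type:

```python
from dataclasses import dataclass, field

@dataclass
class Instruction:
    mode: str                      # "auto", "manual_debug", or "configure"
    product_type: str = ""         # set when mode == "auto"
    payload: dict = field(default_factory=dict)

# Each product type maps to the resources/steps of its stored process flow.
PROCESS_FLOWS = {
    "product_A": ["feed_robot_pick", "station_1_assemble", "station_2_test"],
    "product_B": ["feed_robot_pick", "station_3_assemble"],
}

def dispatch(instr: Instruction) -> list:
    """Return the list of steps the lower computer would execute."""
    if instr.mode == "auto":
        return PROCESS_FLOWS[instr.product_type]
    if instr.mode == "manual_debug":
        return ["jog:" + instr.payload["target"]]
    if instr.mode == "configure":
        return ["save_config:" + str(instr.payload["param_id"])]
    raise ValueError("unknown mode: " + instr.mode)
```

In this sketch the flows are plain lists; in the real system they would be the process flow database records held in the cRIO's memory.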
The automatic operation interface displays the current step and provides start, pause and stop buttons. The interface runs four loops: the first captures interface messages, the second processes messages, the third sends configuration information and control commands to the cRIO, and the fourth receives and processes feedback data from the cRIO.
When the operation instruction is manual debugging, the user operation information also includes the debugging object and debugging parameters selected by the user, so the corresponding target manipulator is called according to the debugging object to execute the operation with the debugging parameters. The manual debugging interface controls each piece of hardware manually; the hardware includes the cRIO on-board DIO, the CVS camera, 3D position detection, the FANUC manipulator, the WAGO distributed I/O and the SMC electric cylinder.
When the operation instruction is parameter configuration, the configuration interface can open any existing product configuration for editing and saving, and also supports new, save and save-as operations. The parameters are organized in tables, one table per station (including the feeding manipulator stations); each table row contains a step name (a unique identifier), a step description, the VI to call (selectable from a drop-down list), a parameter ID, input parameters (configured in a parameter configuration dialog opened by double-clicking, associated with the called VI), a Pass jump and a Fail jump. A parameter ID stores a group of input parameters, which simplifies parameter description and later reuse; parameter IDs can be added but not deleted.
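The station configuration table just described can be modelled roughly as below. The field names mirror the listed columns (step name, description, called VI, parameter ID, Pass jump, Fail jump), but the Python structure itself and the sample rows are illustrative assumptions, not the patent's implementation:

```python
from dataclasses import dataclass

@dataclass
class ConfigStep:
    step_name: str      # unique identifier of the step
    description: str
    call_vi: str        # VI selected from a drop-down list
    param_id: int       # identifies a stored group of input parameters
    pass_jump: str      # next step name on success
    fail_jump: str      # next step name on failure

# One table per station; rows here are invented examples.
station_table = [
    ConfigStep("S1", "pick product",  "Pick.vi",  1, "S2",  "ERR"),
    ConfigStep("S2", "place product", "Place.vi", 2, "END", "ERR"),
]

def next_step(table, current, passed):
    """Follow the Pass/Fail jump from the step named `current`."""
    step = next(s for s in table if s.step_name == current)
    return step.pass_jump if passed else step.fail_jump
```

The Pass/Fail jump columns make each table a small state machine, which matches the jump-driven execution described later for the process flow database.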
According to the invention, communication among the upper computer, the lower computer and the manipulators is implemented in the LabVIEW programming environment, so that the robots can complete picking, placing and carrying work among different stations. The scheme replaces manual work; it is efficient, has a low error rate, applies widely, and reduces labor cost.
Preferably, in a further embodiment of the present invention, a 3D position detection camera is further included, and the 3D position detection camera is connected to the lower computer;
the 3D position detection camera acquires product state information according to the information acquisition instruction sent by the lower computer and sends the product state information to the lower computer;
the lower computer analyzes the product state from the product state information and, combining the product state with the manipulator positions, determines the target manipulator and the grabbing pose with which the target manipulator grabs the product;
and the target manipulator grabs the product according to the grabbing pose sent by the lower computer.
Specifically, the system further comprises a 3D position detection camera connected with the lower computer. When a product needs to be grabbed, the lower computer sends an information collection instruction to the 3D position detection camera, which collects product state information accordingly and feeds it back to the lower computer. The lower computer analyzes the product state from that information, selects the target manipulator best suited for grabbing according to the product state and the manipulator positions, and determines the grabbing pose with which the target manipulator will grab the product; the target manipulator then grabs the product according to that pose.
In this embodiment, the 3D position detection data is used to analyze different shapes and positions of products to obtain an optimal pose for the robot to grasp the products.
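As a hedged sketch of this selection step: the patent does not specify how the optimal manipulator and pose are computed, so the nearest-manipulator rule and the straight-down approach pose below are purely illustrative assumptions:

```python
import math

def choose_manipulator(product_xyz, manipulators):
    """Pick a manipulator for a detected product.

    product_xyz:  (x, y, z) of the product from 3D position detection.
    manipulators: {name: (x, y, z)} base positions of the manipulators.
    Returns (chosen_name, grab_pose).
    """
    # Assumed rule: the manipulator whose base is closest to the product.
    name = min(manipulators,
               key=lambda m: math.dist(product_xyz, manipulators[m]))
    # Assumed pose: approach straight down over the detected position
    # (x, y, z, roll, pitch, yaw).
    pose = (product_xyz[0], product_xyz[1], product_xyz[2], 180.0, 0.0, 0.0)
    return name, pose
```

A real system would also weigh reachability, the product's orientation from the 3D data, and which manipulators are currently locked by the semaphore scheme described later.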
In addition, as shown in fig. 2, during production the lower computer receives the contents of the process flow database and loads them into memory. During equipment operation, it first obtains the database record number of the current step and queries the database; it obtains the recorded action for that record number and executes it, then obtains the keyword of the next action, queries the database again with that keyword to obtain the corresponding record number, and decides whether to execute the next instruction in the database or jump to another instruction.
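The record-number/keyword loop described above can be approximated as follows; the in-memory "database" rows and the keywords are invented for illustration:

```python
# Process flow records loaded into memory: record number -> action + next key.
FLOW_DB = {
    1: {"action": "pick",  "next_key": "place"},
    2: {"action": "place", "next_key": "test"},
    3: {"action": "test",  "next_key": None},
}
# Index to resolve the keyword of the next action back to a record number.
KEY_TO_RECORD = {"pick": 1, "place": 2, "test": 3}

def run_flow(start_record):
    """Execute records by following each record's next-action keyword."""
    executed = []
    record = start_record
    while record is not None:
        row = FLOW_DB[record]
        executed.append(row["action"])      # execute the recorded action
        key = row["next_key"]               # keyword of the next action
        record = KEY_TO_RECORD.get(key)     # query the "database" again, or stop
    return executed
```

Because the next record is resolved by keyword rather than by position, a record can jump anywhere in the flow, matching the Pass/Fail jump columns of the configuration tables.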
In addition, as shown in fig. 3, during automatic operation, when a fault occurs while a manipulator or another resource performs an action, the error handling branch is entered: an error record is uploaded to the upper computer, and the upper computer changes the execution state so that all actions at the corresponding station stop, avoiding secondary damage to the equipment. After the operator confirms that the fault is cleared and restarts the station, execution begins again from the first step of the station's process flow.
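A minimal sketch of this error-handling behaviour, under the assumption that a station simply halts on the first fault and, after operator confirmation, is re-run from its first step:

```python
def run_station(steps, execute):
    """Run one station's steps; `execute(step)` returns True on success.

    Returns ("done", []) if all steps succeed, or ("stopped", errors) on the
    first fault, after which the whole call is repeated from step one
    (modelling the operator restarting the station).
    """
    errors = []
    for step in steps:
        if not execute(step):
            errors.append("fault at " + step)  # record uploaded to upper computer
            return "stopped", errors           # all actions at this station halt
    return "done", errors
```

The "restart from the first step" rule is modelled simply by calling `run_station` again; the real system would also hold the station until the operator confirms the fault is cleared.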
In addition, as shown in fig. 4, during manual debugging (i.e., hardware jog mode), the data downloaded from the upper computer is identified as a jog instruction and an action instruction is sent directly to the execution module; analysis yields the specific action and its parameters, and the parameters are analyzed a second time according to the actual action. The execution result is bound with the preceding action instruction into a specific format and uploaded to the communication module of the lower computer, and the lower computer obtains the jog instruction and its result and uploads the result to the upper computer.
Another embodiment of the present invention is an optimization of the above embodiment. As shown in fig. 1, compared with the first embodiment, its main improvement is that it further includes:
the lower computer 120 creates a corresponding semaphore for every manipulator;
when a target manipulator 131 is allocated to a target station, the lower computer 120 locks the target semaphore corresponding to the target manipulator 131 and at the same time assigns the right to unlock the target semaphore to the target station;
the target manipulator 131 receives the allocation instruction of the lower computer 120, and after it reaches the target station and completes the target operation, the target station unlocks the target semaphore.
Specifically, in this embodiment the lower computer creates a corresponding semaphore for every manipulator, and may also create semaphores for any other available resource, such as the various actuators. When the lower computer schedules resources and allocates a target manipulator to a target station, the target semaphore corresponding to that manipulator is locked so that the manipulator can no longer be allocated; at the same time, the right to unlock the target semaphore is assigned to the target station, and only the target station can unlock it so that the manipulator becomes allocatable again. Thus the target manipulator receives the allocation instruction from the lower computer, and after it reaches the target station and completes the target operation, the target station unlocks the target semaphore.
In addition, because some process flows may need to schedule multiple resources at once and combine several actions, the semaphores of the resources needed by a flow can be locked one by one and then unlocked one by one as the operations are executed in order.
The invention supports multiple manipulators working simultaneously, where hardware resource conflicts can occur; locking and unlocking the minimum action unit with semaphores restricts access to hardware resources and resolves the conflicts.
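The per-manipulator semaphore scheme can be sketched as below, using Python's `threading.Lock` in place of the lower computer's semaphores; the class and method names are illustrative assumptions:

```python
import threading

class ResourcePool:
    """One semaphore per manipulator; only the allocated station may unlock."""

    def __init__(self, names):
        self.sems = {n: threading.Lock() for n in names}
        self.unlock_owner = {}              # manipulator -> station holding unlock right

    def allocate(self, manipulator, station):
        # Lock the target semaphore; fails if the manipulator is already allocated.
        ok = self.sems[manipulator].acquire(blocking=False)
        if ok:
            self.unlock_owner[manipulator] = station
        return ok

    def release(self, manipulator, station):
        # Only the station that was assigned the unlock right may release.
        if self.unlock_owner.get(manipulator) != station:
            raise PermissionError("station does not hold the unlock right")
        del self.unlock_owner[manipulator]
        self.sems[manipulator].release()
```

A flow that needs several resources would call `allocate` for each of them in turn, then `release` them one by one as its operations complete, matching the paragraph above.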
Preferably, in a further embodiment of the present invention, further comprising:
the lower computer creates corresponding elements for the beginning and the end of each action in the process flow to form an adjacent element pair, and the elements comprise executed station identifiers and action types;
and the lower computer forms all the element pairs into a master control queue according to the steps of the process flow, and sequentially controls the corresponding stations to execute the operation according to the master control queue.
Specifically, the process flow segments are scheduled through a master control queue. Elements are pushed onto the queue in pairs, and the two elements of a pair are adjacent. Before scheduling, the queue is queried for the element; once it is found, the head node is deleted and the corresponding process flow is entered. After the flow completes, the node is queried and deleted again, and only then can the actions of other flows continue.
The lower computer creates corresponding elements for the start and the end of each action in any process flow, forming an adjacent element pair. Each element comprises the identifier of the executing station and the action type, where the action type comprises the action to be executed by the target resource and the element kind (action start or end). The lower computer assembles all the element pairs into a master control queue according to the steps of the process flow, and controls the corresponding stations to execute operations in the order of the master control queue.
In this embodiment, the active areas of multiple manipulators can overlap. During free scheduling of actions among stations, the actions are managed through the master controller's queue, so that use of the overlapping area is isolated in time and spatial interference is avoided.
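The master control queue of paired begin/end elements might look like this; the element fields (station, action, kind) follow the description above, while everything else is an illustrative assumption:

```python
from collections import deque

def build_queue(flow):
    """flow: [(station, action), ...] -> queue of adjacent begin/end pairs."""
    q = deque()
    for station, action in flow:
        q.append((station, action, "begin"))
        q.append((station, action, "end"))   # paired, adjacent elements
    return q

def run(q, do_action):
    """Consume the master queue in order, executing each action between
    its begin and end elements; later flows cannot start early."""
    log = []
    while q:
        station, action, kind = q[0]         # query the head node first
        q.popleft()                          # delete it, then proceed
        if kind == "begin":
            do_action(station, action)       # enter the corresponding flow
            log.append((station, action))
        # on "end": the flow is complete; other flows may now continue
    return log
```

Because every action's end element must be consumed before the next pair is reached, time in a shared (overlapping) area is serialized by the queue order, which is the interference-avoidance property described above.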
In an embodiment of the present invention, as shown in fig. 5, an automatic carrying and goods taking method includes:
acquiring user operation information through an upper computer to generate a corresponding operation instruction;
receiving the operation instruction;
when the operation instruction is automatic operation, identifying the product type in the operation instruction;
and calling a corresponding target manipulator according to the product type to execute the operation, wherein the target manipulator is any one or more of the plurality of manipulators.
After receiving the operation instruction, the method further comprises the following steps:
when the operation instruction is manual debugging, identifying the debugging object and the debugging parameters in the operation instruction, and calling, according to the debugging object, the corresponding target manipulator to execute the operation according to the debugging parameters;
and when the operation instruction is a configuration parameter, identifying parameter information in the operation instruction to create or modify a product configuration parameter.
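As a rough sketch of how the lower computer might branch on the three instruction kinds above (automatic operation, manual debugging, parameter configuration) — the instruction fields, manipulator records, and return values here are illustrative assumptions, not the patent's actual protocol:

```python
def handle_instruction(instruction, manipulators, config_store):
    """Hypothetical lower-computer dispatch over the three instruction kinds."""
    kind = instruction["kind"]
    if kind == "automatic":
        # Identify the product type and call the matching target manipulator(s).
        product_type = instruction["product_type"]
        targets = [m for m in manipulators if product_type in m["handles"]]
        return ("run", [m["id"] for m in targets])
    elif kind == "manual_debug":
        # Identify the debugging object, then drive it with the debug parameters.
        return ("debug", instruction["debug_object"], instruction["debug_params"])
    elif kind == "configure":
        # Create or modify a product configuration parameter.
        config_store[instruction["param_name"]] = instruction["param_value"]
        return ("configured", instruction["param_name"])
    raise ValueError(f"unknown instruction kind: {kind}")

manipulators = [{"id": "arm1", "handles": {"sensor"}},
                {"id": "arm2", "handles": {"sensor", "ecu"}}]
config = {}
assert handle_instruction({"kind": "automatic", "product_type": "ecu"},
                          manipulators, config) == ("run", ["arm2"])
assert handle_instruction({"kind": "configure", "param_name": "grip_force",
                           "param_value": 12.5}, manipulators, config)[0] == "configured"
assert config["grip_force"] == 12.5
```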
After receiving the operation instruction, the method further comprises the following steps:
sending an information acquisition instruction to a 3D position detection camera;
collecting product state information through the 3D position detection camera according to the information acquisition instruction;
analyzing the product state according to the product state information, and determining, from the product state and the manipulator positions, the target manipulator and the grabbing pose with which the target manipulator grabs the product;
and grabbing the product according to the grabbing pose through the target manipulator.
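One way the grabbing-pose analysis could look, as an illustrative sketch only: the product state from the 3D position detection camera is assumed to carry a 3D position and a yaw angle, the nearest manipulator is selected as the target, and an approach pose above the product is derived. The data format and the nearest-arm heuristic are assumptions, not the patent's method.

```python
import math

def plan_grasp(product_state, arm_positions):
    """Hypothetical sketch: choose a target arm and derive a grabbing pose."""
    px, py, pz = product_state["position"]
    # Choose the target manipulator as the one closest to the product.
    target = min(arm_positions,
                 key=lambda a: math.dist(arm_positions[a], (px, py, pz)))
    # Grabbing pose: approach 10 cm above the product, aligned to its yaw.
    grasp_pose = {"xyz": (px, py, pz + 0.10), "yaw": product_state["yaw"]}
    return target, grasp_pose

state = {"position": (1.0, 0.5, 0.2), "yaw": 0.3}
arms = {"arm1": (0.0, 0.0, 0.5), "arm2": (1.2, 0.4, 0.5)}
target, pose = plan_grasp(state, arms)
assert target == "arm2"                      # arm2 is nearest to the product
assert abs(pose["xyz"][2] - 0.3) < 1e-9      # approach height above the part
```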
Further comprising:
creating a corresponding semaphore for each manipulator;
when a target manipulator is allocated to a target station, locking the target semaphore corresponding to the target manipulator and, at the same time, delegating the unlocking of the target semaphore to the target station;
and after the target manipulator reaches the target station and completes the target operation, unlocking the target semaphore upon the target station's release.
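The semaphore scheme above — lock on allocation, delegate the unlock to the target station, release on completion — can be sketched with one binary semaphore per manipulator. The `ManipulatorPool` API is hypothetical:

```python
import threading

class ManipulatorPool:
    """Hypothetical sketch of per-manipulator semaphore allocation."""

    def __init__(self, names):
        # One binary semaphore per manipulator.
        self._sems = {n: threading.Semaphore(1) for n in names}
        self._release_owner = {}  # which station may unlock each semaphore

    def assign(self, manipulator, station):
        # Lock the target semaphore and delegate its unlock to the station.
        if not self._sems[manipulator].acquire(blocking=False):
            return False  # the arm is already locked by another station
        self._release_owner[manipulator] = station
        return True

    def complete(self, manipulator, station):
        # Only the station the unlock was delegated to may release the arm.
        if self._release_owner.get(manipulator) != station:
            raise PermissionError("station does not own this unlock")
        del self._release_owner[manipulator]
        self._sems[manipulator].release()

pool = ManipulatorPool(["arm1", "arm2"])
assert pool.assign("arm1", "station_A") is True
assert pool.assign("arm1", "station_B") is False  # arm1 locked until released
pool.complete("arm1", "station_A")                # station A unlocks after the job
assert pool.assign("arm1", "station_B") is True
```

Binding the release to the allocated station means no other flow can free an arm early, which matches the exclusive-use guarantee the text describes.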
Further comprising:
creating corresponding elements for the beginning and the end of each action in the process flow, forming adjacent element pairs, each element comprising the identifier of the executing station and the action type;
and assembling all the element pairs into a master control queue according to the steps of the process flow, and controlling the corresponding stations to execute their operations in queue order.
The specific operation of each step in this embodiment has been described in detail in the corresponding system embodiment and is therefore not repeated here.
An embodiment of the invention provides a computer-readable storage medium on which a computer program is stored which, when executed by a processor, carries out all or part of the method steps of the first embodiment.
All or part of the flow of the method according to the embodiments of the present invention may be implemented by a computer program, which may be stored in a computer-readable storage medium and executed by a processor to implement the steps of the method embodiments. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file or some intermediate form. The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB disk, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and the like. It should be noted that the content contained in the computer-readable medium may be appropriately increased or decreased as required by legislation and patent practice in a given jurisdiction; for example, in some jurisdictions, computer-readable media do not include electrical carrier signals and telecommunications signals.
An embodiment of the present invention further provides an electronic device, which includes a memory and a processor, the memory storing a computer program that runs on the processor; the processor executes the computer program to implement all or part of the method steps of the first embodiment.
The processor may be a central processing unit (CPU), another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, etc. The general-purpose processor may be a microprocessor, or the processor may be any conventional processor. The processor is the control center of the computer device and connects the various parts of the entire computer device using various interfaces and lines.
The memory may be used to store the computer programs and/or modules, and the processor implements the various functions of the computer device by running or executing the computer programs and/or modules stored in the memory and invoking the data stored in the memory. The memory may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system and an application program required by at least one function (such as a sound playing function or an image playing function), and the data storage area may store data created according to the use of the device (such as audio data or video data). In addition, the memory may include a high-speed random access memory, and may also include a non-volatile memory such as a hard disk, a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, a flash memory card, at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
It should be noted that the above embodiments can be freely combined as needed. The foregoing is only a preferred embodiment of the present invention; for those skilled in the art, various modifications and improvements can be made without departing from the principle of the present invention, and such modifications and improvements should also be regarded as falling within the protection scope of the present invention.

Claims (10)

1. An automatic carrying and goods-taking system, characterized by comprising an upper computer, a lower computer and a plurality of manipulators, the lower computer being respectively connected with the upper computer and the plurality of manipulators, wherein:
the upper computer acquires user operation information and sends a corresponding operation instruction to the lower computer according to the user operation information;
and when the operation instruction is automatic operation, the lower computer identifies the product type in the operation instruction and calls a corresponding target manipulator to execute the operation according to the product type, wherein the target manipulator is any one or more of the plurality of manipulators.
2. The automatic carrying and goods-taking system of claim 1, wherein:
the lower computer, when the operation instruction is manual debugging, identifies the debugging object and the debugging parameters in the operation instruction, and calls, according to the debugging object, the corresponding target manipulator to execute the operation according to the debugging parameters;
and when the operation instruction is a configuration parameter, the lower computer identifies parameter information in the operation instruction to create or modify a product configuration parameter.
3. The automatic carrying and goods-taking system of claim 1, further comprising a 3D position detection camera connected to the lower computer, wherein:
the 3D position detection camera acquires product state information according to an information acquisition instruction sent by the lower computer and sends the product state information to the lower computer;
the lower computer analyzes the product state according to the product state information and determines, from the product state and the manipulator positions, the target manipulator and the grabbing pose with which the target manipulator grabs the product;
and the target manipulator grabs the product according to the grabbing pose sent by the lower computer.
4. The automatic carrying and goods-taking system of claim 1, wherein:
the lower computer creates a corresponding semaphore for each manipulator;
when a target manipulator is allocated to a target station, the lower computer locks the target semaphore corresponding to the target manipulator and, at the same time, delegates the unlocking of the target semaphore to the target station;
and the target manipulator receives the allocation instruction of the lower computer, and after the target manipulator reaches the target station and completes the target operation, the target semaphore is unlocked upon the target station's release.
5. The automatic carrying and goods-taking system of claim 1, wherein:
the lower computer creates corresponding elements for the beginning and the end of each action in the process flow, forming adjacent element pairs, each element comprising the identifier of the executing station and the action type;
and the lower computer assembles all the element pairs into a master control queue according to the steps of the process flow, and controls the corresponding stations to execute their operations in queue order.
6. An automatic carrying and goods-taking method, characterized by comprising:
acquiring user operation information through an upper computer to generate a corresponding operation instruction;
receiving the operation instruction;
when the operation instruction is automatic operation, identifying the product type in the operation instruction;
and calling a corresponding target manipulator according to the product type to execute the operation, wherein the target manipulator is any one or more of a plurality of manipulators.
7. The automatic carrying and goods-taking method of claim 6, further comprising, after receiving the operation instruction:
when the operation instruction is manual debugging, identifying the debugging object and the debugging parameters in the operation instruction, and calling, according to the debugging object, the corresponding target manipulator to execute the operation according to the debugging parameters;
and when the operation instruction is a configuration parameter, identifying parameter information in the operation instruction to create or modify a product configuration parameter.
8. The automatic carrying and goods-taking method of claim 6, further comprising, after receiving the operation instruction:
sending an information acquisition instruction to a 3D position detection camera;
collecting product state information through the 3D position detection camera according to the information acquisition instruction;
analyzing the product state according to the product state information, and determining, from the product state and the manipulator positions, the target manipulator and the grabbing pose with which the target manipulator grabs the product;
and grabbing the product according to the grabbing pose through the target manipulator.
9. The automatic carrying and goods-taking method of claim 6, further comprising:
creating a corresponding semaphore for each manipulator;
when a target manipulator is allocated to a target station, locking the target semaphore corresponding to the target manipulator and, at the same time, delegating the unlocking of the target semaphore to the target station;
and after the target manipulator reaches the target station and completes the target operation, unlocking the target semaphore upon the target station's release.
10. The automatic carrying and goods-taking method of claim 6, further comprising:
creating corresponding elements for the beginning and the end of each action in the process flow, forming adjacent element pairs, each element comprising the identifier of the executing station and the action type;
and assembling all the element pairs into a master control queue according to the steps of the process flow, and controlling the corresponding stations to execute their operations in queue order.
CN202010139568.7A 2020-03-03 2020-03-03 Automatic carrying and goods taking system and method Pending CN111368720A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010139568.7A CN111368720A (en) 2020-03-03 2020-03-03 Automatic carrying and goods taking system and method


Publications (1)

Publication Number Publication Date
CN111368720A 2020-07-03

Family

ID=71211206

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010139568.7A Pending CN111368720A (en) 2020-03-03 2020-03-03 Automatic carrying and goods taking system and method

Country Status (1)

Country Link
CN (1) CN111368720A (en)


Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2010129599A1 (en) * 2009-05-04 2010-11-11 Oblong Industries, Inc. Gesture-based control systems including the representation, manipulation, and exchange of data
CN107443376A (en) * 2017-07-22 2017-12-08 深圳市萨斯智能科技有限公司 Processing method and robot of a kind of robot to teleinstruction
CN109531567A (en) * 2018-11-23 2019-03-29 南京工程学院 Remote operating underactuated manipulator control system based on ROS
CN109910010A (en) * 2019-03-23 2019-06-21 广东石油化工学院 A kind of system and method for efficient control robot


Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
冀伟; 朱建江: "A vision guidance system for robot part grasping based on sub-pixel accuracy" *
易润泽; 李会军; 宋爱国: "A multi-sensor based human-machine interaction system for robot teleoperation" *
殷苏民; 郑昌俊; 徐启祥; 孙骏; 邹浩; 胡泽黎: "Design of a Delta robot control system based on the Sysmac automation platform" *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113172622A (en) * 2021-04-22 2021-07-27 深圳市商汤科技有限公司 ROS-based mechanical arm grabbing and assembling management method and system and related equipment
CN113172622B (en) * 2021-04-22 2024-04-16 深圳市商汤科技有限公司 ROS-based mechanical arm grabbing and assembling management method, system and related equipment
CN114637266A (en) * 2022-03-16 2022-06-17 北京半导体专用设备研究所(中国电子科技集团公司第四十五研究所) Control method and device based on micro-service

Similar Documents

Publication Publication Date Title
US11712800B2 (en) Generation of robotic user interface responsive to connection of peripherals to robot
US10850393B2 (en) Method for extending end user programming of an industrial robot with third party contributions
CN107807815B (en) Method and device for processing tasks in distributed mode
CN106874189B (en) Method for realizing automatic test system of power grid real-time database system
CN108009258A (en) It is a kind of can Configuration Online data collection and analysis platform
CN108416866B (en) Inspection task processing method and equipment
CN111368720A (en) Automatic carrying and goods taking system and method
CN111507674B (en) Task information processing method, device and system
CN104765641A (en) Job scheduling method and system
CN110071855A (en) Equipment linkage control method, device, system, gateway and storage medium
CN110704044A (en) Visual programming system
CN110705891A (en) Data processing method based on high-allocable changeability
CN213213488U (en) Automatic test system
CN112633850A (en) Method for realizing service flow automation by managing, calling and monitoring RPA robot
CN105579920A (en) Programmable controller and control method for programmable controller
CN113172622B (en) ROS-based mechanical arm grabbing and assembling management method, system and related equipment
CN115543824A (en) Software testing device, system and method
CN111844021A (en) Mechanical arm cooperative control method, device, equipment and storage medium
CN109710605B (en) Automatic equipment information acquisition device and method
CN110297642A (en) A kind of system and method for the concurrent programming in multiple terminals, configuration and test
CN110069042A (en) Control method, device, software systems and the control system of production procedure process
CN114055464B (en) Execution system for intelligent scheduling of manipulator work and work method thereof
CN108345263B (en) Method and system for realizing sequential control of process device
CN106709578B (en) Operation and maintenance monitoring device and operation and maintenance monitoring method
KR20230015626A (en) Different types of multi-rpa integrated management systems and methods

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication (application publication date: 20200703)