CN113254006A - Method, system, device, electronic equipment and storage medium for robot interaction


Info

Publication number
CN113254006A
Authority
CN
China
Prior art keywords
program
view
state
block
component
Legal status
Pending
Application number
CN202110464834.8A
Other languages
Chinese (zh)
Inventor
张�杰
石金博
温梓萱
杨庆华
蔡明杨
周力
陈成
罗伟杰
范国华
Current Assignee
Dongguan Liqun Automation Technology Co ltd
Original Assignee
Dongguan Liqun Automation Technology Co ltd
Application filed by Dongguan Liqun Automation Technology Co ltd
Priority to CN202110464834.8A
Publication of CN113254006A

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F8/00Arrangements for software engineering
    • G06F8/30Creation or generation of source code
    • G06F8/34Graphical or visual programming
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls

Abstract

The application discloses a method, a system, a device, an electronic device and a storage medium for robot interaction, relating to the technical field of robot programming. The method comprises: receiving a view request triggered by a user; acquiring view data according to the view request; switching the current view to a preset program tile editing view or to a preset state view according to the view data and the view request; displaying a program logic diagram on the program tile editing view or a state execution diagram on the state view; receiving an editing request from the user; and updating the view data according to the editing request, and updating the program logic diagram or the state execution diagram. The system, the device, the electronic device and the storage medium apply the method. By the method, the efficiency of robot interactive programming can be improved.

Description

Method, system, device, electronic equipment and storage medium for robot interaction
Technical Field
The present invention relates to the field of robot programming technologies, and in particular, to a method, a system, an apparatus, an electronic device, and a storage medium for robot interaction.
Background
Currently, in the field of robot programming technology, in order to adapt a robot to more scenarios and implement complex functions, a basic instruction set is generally defined for the robot, and a service interface for receiving instructions is provided on the robot. A user can then create a business logic diagram based on the basic instruction set according to his or her business scenario, so that the robot receives the operation instructions sent from the business logic diagram through the service interface and operates accordingly, thereby realizing the business scenario. Meanwhile, in order to further simplify the creation of the business logic diagram, a graphical programming platform is generally adopted, in which the business logic diagram is quickly created by dragging; in this dragging mode, however, a separate UI (user interface) needs to be established for displaying the state diagram. In actual operation, the business logic diagram needs to be changed many times according to the actual debugging and testing results to meet the business scenario, so the state diagram and the graphical programming interface need to be switched frequently, and the interaction efficiency of robot programming is low.
Disclosure of Invention
The present application is directed to solving at least one of the problems in the prior art. Therefore, a method, a system, a device, an electronic device and a storage medium for robot interaction are provided, which can improve the efficiency of robot interactive programming.
A method of robotic interaction according to an embodiment of the first aspect of the application, the method comprising:
receiving a view request triggered by a user;
acquiring view data according to the view request;
according to the view data and the view request, switching the current view to a preset program tile editing view or to a preset state view;
displaying a program logic diagram on the program tile editing view or a state execution diagram on the state view;
receiving an editing request of the user;
and updating the view data according to the editing request, and updating the program logic diagram or the state execution diagram.
A system for robotic interaction in accordance with an embodiment of the second aspect of the present application, the system comprising:
the interactive platform, which executes the method of robot interaction of the first aspect to obtain a program logic diagram and a state execution diagram, and sends corresponding execution instructions from the program logic diagram or the state execution diagram; and
the robot terminal, which receives the execution instruction sent by the interactive platform and sends the execution result of the execution instruction to the interactive platform.
An apparatus for robotic interaction according to an embodiment of a third aspect of the present application, comprising:
the interactive module is used for acquiring an operation request of a user, wherein the operation request comprises a view request and an editing request;
a display module for displaying a program tile edit view or a status view;
a storage module to store view data displayed on the program tile edit view and the status view;
a processing module for responding to the view request to switch a current view to the program tile editing view or the state view; the processing module is also used for responding to the editing request, updating the view data, and updating the program logic diagram in the program tile editing view or the state execution diagram in the state view.
An electronic device according to an embodiment of a fourth aspect of the present application includes:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executed by the at least one processor, so that the at least one processor, when executing the instructions, implements the method of robot interaction of the first aspect.
A storage medium according to an embodiment of the fifth aspect of the present application includes computer-executable instructions stored thereon for performing the method for robot interaction of the first aspect of the present application.
According to the above embodiments of the present application, at least the following advantages are provided: the program tile editing view and the state view share the same view data, so that a state execution diagram modified in the state view is synchronized to the program logic diagram of the program tile editing view; switching from the state view to the program tile editing view then yields the corresponding program logic diagram. During debugging, simple operations such as adjusting a state position or deleting a state can be performed directly in the state view, while complex operations can be performed in the program tile editing view by switching views, thereby improving the programming efficiency of the robot.
Additional features and advantages of the application will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by the practice of the application. The objectives and other advantages of the application may be realized and attained by the structure particularly pointed out in the written description and claims hereof as well as the appended drawings.
Drawings
The above and/or additional aspects and advantages of the present application will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
FIG. 1 is a schematic diagram of a robotic interaction system of the present application;
FIG. 2 is a schematic structural diagram of an interaction device according to the present application;
FIG. 3 is a primary flow diagram of a method of robotic interaction of the present application;
FIG. 4 is a schematic diagram of transitions among the robot, the state view and the program tile editing view of the present application;
FIG. 5 is a schematic diagram illustrating a state view, program tile edit view creation process of the present application;
FIG. 6 is a schematic diagram of a program tile edit view creation process of the present application;
FIG. 7 is a diagram of a program drawing component area of the present application;
FIG. 8 is a state view generation diagram of the present application;
FIG. 9 is a diagram illustrating the operation of a status view responding to an edit request in accordance with the present application;
FIG. 10 is a schematic diagram illustrating operation of a program tile edit view edit request of the present application;
FIG. 11 is a flow diagram illustrating creation of a program tile from external input in the program tile editing view of the present application;
FIG. 12 is a flowchart illustrating a program tile edit view sort operation of the present application;
FIG. 13 is a schematic view of a program tile edit view sort display according to the present application;
FIG. 14 is a flow chart illustrating the process of setting transparency for editing view of program tiles according to the present application.
Detailed Description
Reference will now be made in detail to embodiments of the present application, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the drawings are exemplary only for the purpose of explaining the present application and are not to be construed as limiting the present application.
In the description of the present application, if the terms "first" and "second" are used, they are only for distinguishing technical features and are not to be understood as indicating or implying relative importance, the number of the indicated technical features, or the precedence of the indicated technical features.
Currently, in the field of robot programming technology, in order to adapt a robot to more scenarios and implement complex functions, a basic instruction set is defined for the robot, and a service interface for receiving instructions is arranged on the robot. A user can then create a business logic diagram based on the basic instruction set according to his or her business scenario, so that the robot receives the operation instructions sent from the business logic diagram through the service interface and operates accordingly, thereby realizing the business scenario. Meanwhile, in order to further simplify the creation of the business logic diagram, a graphical programming platform is generally adopted, in which the business logic diagram is quickly created by dragging; in this dragging mode, however, a separate UI (user interface) needs to be established for displaying the state diagram. In actual operation, the business logic diagram needs to be changed many times according to the actual debugging and testing results to meet the business scenario, so the state diagram and the graphical programming interface need to be switched frequently, and the interaction efficiency of robot programming is low.
For this purpose, the present application provides a system for robot interaction, as shown in fig. 1, the system for robot interaction includes an interaction platform 100 and a robot terminal 200, the interaction platform 100 responds to a user's operation request, such as a view request and an edit request, to generate a program logic diagram satisfying a business scenario and a state execution diagram displaying an execution state of the robot terminal 200. At this time, the user may send an execution instruction to the robot terminal 200 in the program logic diagram or the state execution diagram through the interactive platform 100 to observe the execution of the robot terminal 200, thereby obtaining a program logic diagram or a state execution diagram for industrial operation. The robot terminal 200 receives and executes the execution instruction, and transmits an execution result of the execution instruction to the interactive platform 100 to display a current execution state.
It can be understood that, in order to further improve the convenience of the robot interactive system, as shown in fig. 1, the interactive platform 100 is divided into a resource library 110 and an interactive terminal 120 according to its functionality, wherein the interactive terminal 120 is configured to obtain a view request and an edit request of a user to obtain corresponding resource information from the resource library 110 to generate a program logic diagram or a state execution diagram. In some embodiments, the repository 110 is in remote communication with the interactive terminal 120 (for example, the interactive terminal 120 is a web terminal, and remotely connects to the repository through WiFi, and loads a program logic diagram or a state execution diagram to be displayed); in other embodiments, the resource library 110 is disposed on the interactive terminal 120 (for example, the interactive terminal 120 is a computer terminal, the resource library 110 is stored on a hard disk of the computer terminal, and the computer terminal is remotely connected to the robot for sending the execution instruction and receiving the execution result).
It should be noted that the interactive terminal 120 may be a computer terminal, a web terminal, a mobile phone terminal, or the like capable of performing interaction. The resource library 110 is used for storing information such as a program logic diagram, a program for displaying a state execution diagram, a program for converting the program logic diagram and the state execution diagram into a robot execution code, and a picture resource displayed on the corresponding program logic diagram and the state execution diagram. The repository 110 may be a server or an external memory.
It can be understood that the present application also provides an interaction device, as shown in fig. 2, comprising:
the interaction module 310, the interaction module 310 is configured to obtain an operation request of a user, where the operation request includes a view request and an editing request;
a display module 320, the display module 320 being configured to display a program tile edit view or a status view;
the storage module 330, the storage module 330 is used for storing view data displayed on the program tile editing view and the state view;
the processing module 340, the processing module 340 is used for responding to the view request to switch the current view to the program tile editing view or the state view; the processing module is also used for responding to the editing request, so as to update the view data and update the program logic diagram or the state execution diagram in the current view.
It should be noted that the interactive device may be a stand-alone device, such as a computer, a mobile phone, or a device with a designated function. The interaction module 310 may obtain an operation request of a user through an interaction device such as a mouse or a touch screen. The display module 320 displays through a display terminal such as a display screen. The storage module 330 may be an external storage module (hard disk, USB, etc.), or may be an internal storage device, and is configured to perform program storage and view data storage, so that the processing module 340 may run a program according to the stored information and then display the program in the display module, thereby implementing interaction between the user and the robot.
It is understood that the present application also provides an electronic device comprising at least one processor, and a memory communicatively coupled to the at least one processor; the memory stores instructions executed by the at least one processor, so that when executing the instructions the at least one processor can switch between the program tile editing view and the state view and synchronously modify the program tile editing view and the state view.
It can be understood that the present application also provides a storage medium which stores computer-executable instructions used for switching between the program tile editing view and the state view and for synchronously modifying the program tile editing view and the state view.
It is noted that the term storage medium includes volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules or other data. Storage media include, but are not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by a computer.
It can be understood that, as shown in fig. 3, the application provides a method for robot interaction, which is applied to an interaction platform, and implements mutual switching between a program tile editing view and a state view, and implements synchronous modification of the program tile editing view and the state view through the interaction platform. The method comprises the following steps:
and step S100, receiving a view request triggered by a user.
It should be noted that the view request may be triggered directly by clicking in the view. For example, in some embodiments, a view-switch option is provided on the current view, and the view request is obtained by detecting the user's selection of that option; the view request may also be triggered by selecting a hidden view. The trigger may also be a key input: for example, in some embodiments, the view request is obtained through a shortcut, such as a mouse button (e.g., multiple clicks) or a keyboard key combination.
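As a minimal, purely illustrative sketch of such triggering, assuming a browser-based interactive terminal (the element ID 'view-switch', the handler name requestView and the Ctrl+Tab combination are hypothetical):

```typescript
// Hypothetical sketch: raising a view request from a view-switch option or a shortcut key.
type ViewKind = 'programTileEdit' | 'state';

function requestView(view: ViewKind): void {
  // Hand the request to the interactive platform (handler name is assumed).
  console.log(`view request: ${view}`);
}

// Option 1: a view-switch option rendered on the current view.
document.getElementById('view-switch')?.addEventListener('click', () => {
  requestView('state');
});

// Option 2: a keyboard shortcut (here Ctrl+Tab, purely illustrative).
document.addEventListener('keydown', (e: KeyboardEvent) => {
  if (e.ctrlKey && e.key === 'Tab') {
    e.preventDefault();
    requestView('programTileEdit');
  }
});
```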
And step S200, acquiring view data according to the view request.
It should be noted that the view data records the composition of the program tile editing view and of the tiles that display the business logic in the state view. The view data may be an XML file or another script file, so that the interactive platform can render the graphics by reading the information in the XML or script file.
Step S300, according to the view data and the view request, switching the current view into a preset program tile editing view or switching the current view into a preset state view.
It should be noted that the program tile editing view is a view for editing program tiles; in it, a user can implement a segment of robot behavior by dragging the corresponding program tiles. The state view is a view for displaying and editing the execution state flow of the robot; it integrates state (flow) rendering with visualization of running or debugging process information.
And step S400, displaying a program logic diagram on the program tile editing view or displaying a state execution diagram on the state view.
It should be noted that, if the current view is a program tile editing view, a program logic diagram is displayed in the current view; and if the current view is the state view, displaying the state execution diagram in the current view.
And step S500, receiving an editing request of a user.
Step S600, updating the view data according to the editing request, and updating the program logic diagram or the state execution diagram.
It should be noted that the program logic diagram and the state execution diagram share the same view data. The state execution diagram presents the states (such as speed, position, etc.) of the execution nodes in the program logic diagram; therefore, once the state-related information is marked in the program logic diagram, converting to the state diagram only requires linked modification of the associated parameters, which achieves mutual conversion and synchronous updating between the program logic diagram and the state execution diagram. For example, in some embodiments, as shown in FIG. 4, an XML file is employed as the intermediate bridge for the conversion. On the basis of the XML description file required for generating the program logic diagram, the parameters of the corresponding elements in the state execution diagram (such as the position, name, type and description of the connection line) are added to the corresponding elements of that XML description file in a form that does not affect the data required for generating the program logic diagram (i.e., attributes parsed by the pages of the different views are added inside the element tags). When the drawing information of the state execution diagram is read, the data corresponding to the state execution diagram is parsed from the XML description file and the current state execution diagram can be loaded; when the program logic diagram is read, the corresponding program logic data is parsed from the XML description file and the diagram information of the current program logic diagram can be loaded.
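A minimal sketch of this shared-XML idea, assuming a browser environment; the element names and the state_* attributes are illustrative, not the patent's actual schema:

```typescript
// Hypothetical XML view data: block structure for the program logic diagram,
// plus extra attributes (state_x, state_y, state_name) consumed only by the state view.
const viewDataXml = `
<blocks>
  <block type="state" id="S1" state_x="40" state_y="80" state_name="Pick">
    <block type="move_linear" id="M1"/>
  </block>
  <link from="S1" to="S2" state_path="40,80 200,80"/>
</blocks>`;

const doc = new DOMParser().parseFromString(viewDataXml, 'text/xml');

// Loading the state execution diagram: read only the state-view attributes.
const stateNodes = Array.from(doc.querySelectorAll('block[type="state"]')).map(el => ({
  id: el.getAttribute('id'),
  x: Number(el.getAttribute('state_x')),
  y: Number(el.getAttribute('state_y')),
  name: el.getAttribute('state_name'),
}));

// Loading the program logic diagram: the same elements, state_* attributes ignored.
const programBlocks = Array.from(doc.querySelectorAll('block')).map(el => ({
  id: el.getAttribute('id'),
  type: el.getAttribute('type'),
}));

console.log(stateNodes, programBlocks);
```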
Therefore, the above embodiments of the present application provide the following beneficial effects: the program tile editing view and the state view share the same view data, so that a state execution diagram modified in the state view is synchronized to the program logic diagram of the program tile editing view; switching from the state view to the program tile editing view then yields the corresponding program logic diagram. During debugging, simple operations such as adjusting a state position or deleting a state can be performed directly in the state view, while complex operations can be performed in the program tile editing view by switching views, thereby improving the programming efficiency of the robot.
It can be understood that, to further improve the convenience of synchronous update of shared view data, as shown in fig. 5, the method further includes:
step S710, creating a program drawing component in the program picture block editing view; the program drawing component is used for generating or updating a program logic diagram.
Step S720, creating a state drawing component in the state view; wherein, the state drawing component is used for generating or updating the state execution diagram.
And step S730, carrying out association setting on the program drawing component and the state drawing component.
It should be noted that the program drawing component has a plurality of member blocks, and the state drawing component also has a plurality of member blocks. Associating the program drawing component with the state drawing component means associating one or more member blocks in the program drawing component with member blocks in the state drawing component and identifying the state of the corresponding member block, so that it can be converted into the corresponding member block in the other view (the conversion relationship being fixed). For example, if the program drawing component has a member block that sets the program start, the state drawing component correspondingly has a member block for the state start; the two are associated, and when the view is switched from the program tile editing view to the state view and the program-start member block is identified, it is converted into the state-start member block. Similarly, other state classes can establish association relationships in the same way through execution order, logic and the like, so that the program logic diagram and the state execution diagram can be converted into each other and synchronized according to these association relationships. Since the same view data is shared, the program drawing component and the state drawing component can be stored together with their associated member blocks, which makes operation more convenient: when the state execution diagram is modified, the program logic diagram can be modified more quickly, and likewise, when the program logic diagram is modified, the state execution diagram can be modified more quickly. It should also be noted that the state execution diagram and the program logic diagram may use different languages for the conversion. For example, when an XML file is used as the intermediate conversion bridge, the state drawing component stores its data in JSON format; when the state drawing component changes, its JSON data is converted into XML and the corresponding XML content is updated. Similarly, when displaying, the data corresponding to the state drawing component is read from the XML, converted into JSON format, and displayed.
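A sketch of that JSON-to-XML round trip under the same assumptions (the field and attribute names are illustrative):

```typescript
// Hypothetical state-component data kept in JSON by the state drawing component.
interface StateComponentJson {
  id: string;
  name: string;
  x: number;
  y: number;
}

// When the state drawing component changes, write its JSON back into the shared XML element.
function writeStateToXml(el: Element, s: StateComponentJson): void {
  el.setAttribute('state_name', s.name);
  el.setAttribute('state_x', String(s.x));
  el.setAttribute('state_y', String(s.y));
}

// When the state view is displayed, read the shared XML element back into JSON.
function readStateFromXml(el: Element): StateComponentJson {
  return {
    id: el.getAttribute('id') ?? '',
    name: el.getAttribute('state_name') ?? '',
    x: Number(el.getAttribute('state_x') ?? 0),
    y: Number(el.getAttribute('state_y') ?? 0),
  };
}
```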
It can be understood that, in order to improve the convenience of operating the program tile editing view, the program drawing components are classified, designed and displayed according to their functions. Therefore, as shown in fig. 6, step S710 includes:
step S711, creating a basic behavior graph block component and a logic graph block editing component; the basic behavior diagram block component and the logic diagram block editing component are both program drawing components.
It should be noted that the basic behavior tile component and the logic tile editing component each include a plurality of member blocks for implementing a specific function; for example, the basic behavior tile component represents packaged blocks for robot motion. As shown in fig. 7, the basic behavior tile component includes:
a thread block: a block for adding a thread to a program, including creation, configuration and priority setting of the thread;
a point block: a block for creating point locations for a specified robot;
a speed block: a block for specifying a movement speed for a specified robot; the set speed can be modified and the current speed can be obtained;
an I/O block: a block for setting an I/O signal for a specified robot, acquiring the I/O signal, and controlling the switching of the I/O signal;
a robot block: a block for setting robot parameters (e.g., the robot number), containing tool coordinate system block options;
a motion block: a block for establishing the various motion types of the robot and setting the conditions that trigger robot motion, including linear motion and gate motion options;
a tool coordinate block: a block for establishing a robot tool coordinate system (including the three-dimensional coordinates and the rotation angles in the three-dimensional coordinate system);
a system block: a block for setting various parameters related to the robot system (e.g., system time, IDN).
The logic tile editing component represents the user's own created and packaged program logic blocks, i.e., it sets the traversal and combination relationships, and the like, among the blocks of the basic behavior tile component. The logic tile editing component comprises:
a logic block: a block for adding a logical branch to a program;
a loop block: a block for adding loop logic to a program;
a math/text/list block: a block for calculating and processing numeric/text/list variables;
a variable block: a block for adding user-defined variables to a program;
a function block: a block for adding user-defined functions to a program; a function block may contain logic blocks, loop blocks, math/text/list blocks and variable blocks (in this case, the operation of the function block can be simplified through a Class block).
It should be noted that, by combining and editing the tiles in the logic tile editing component, the user can implement content equivalent to that of the basic behavior tile components.
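For illustration only, a member block such as the speed block might be registered with the standard Blockly block API roughly as follows; the block name 'robot_speed', its fields and its type checks are assumptions, not the patent's definitions:

```typescript
import * as Blockly from 'blockly';

// Hypothetical "speed block": sets the movement speed for a specified robot.
Blockly.Blocks['robot_speed'] = {
  init: function () {
    this.appendDummyInput().appendField('set speed of robot');
    this.appendValueInput('ROBOT').setCheck('Robot');
    this.appendValueInput('SPEED').setCheck('Number').appendField('to (mm/s)');
    this.setPreviousStatement(true);   // can follow another statement block
    this.setNextStatement(true);       // another statement block can follow it
    this.setColour(210);
    this.setTooltip('Set the movement speed for the specified robot.');
  },
};
```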
Step S712, displaying the logic tile editing component and the basic behavior tile component in partitioned areas.
It should be noted that, through the partitioned display, the user can quickly identify which components are customizable and which can only be combined and assembled to obtain a new function. As shown in fig. 7, the display list is divided by a partition line into a basic behavior tile component area 400 and a logic tile editing component area 500, where the logic tile editing components are displayed in the logic tile editing component area 500 and the basic behavior tile components are displayed in the basic behavior tile component area 400. In other embodiments, the logic tile editing components and the basic behavior tile components may be placed and displayed by class in a single list.
Step S713, creating a constraint relationship between the basic behavior tile component and the logic tile editing component; wherein the constraint relationship comprises a combination constraint.
Step S714, creating constraint relationships among the tiles in the basic behavior tile component and constraint relationships among the tiles in the logic tile editing component.
It is understood that the basic behavior tile component comprises a set of independent behavior blocks; wherein the set of independent behavior blocks comprises: a point block, a speed block, a tool coordinate block, an I/O block, a system block, and a thread block.
It should be noted that one or more elements in the independent behavior block set may be combined with one or more blocks in the logic tile editing component to obtain blocks satisfying a robot motion. For example, a linear motion of the robot involves point location, speed and end conditions; the linear motion can therefore be created by dragging the point block and the speed block into the program tile editing view and then adding the logical relationship between them with a logic block from the logic tile editing component.
It is understood that the basic behavior tile component also comprises a behavior combination block set; wherein the behavior combination block set comprises elements of at least one independent behavior block set; the behavior combination block set comprises a motion block and a robot block.
It should be understood that, taking the motion block as an example, the motion block provides a linear motion block, a gate motion block, and a trigger-type block. Point location and speed information are displayed in the linear motion block, and the user can select the point block of the corresponding point location to fill in the information, thereby reducing the combinational-logic operations required and further improving convenience.
It can be understood that the basic behavior tile component further comprises a state block, and the state block is arranged corresponding to the state drawing component; the state block is combined with elements of at least one independent behavior block set and/or elements of the behavior combination block set and the logic tile editing component to form the program logic diagram.
It should be noted that the state block includes sub-blocks such as a start block, a decision block (i.e., which operation is performed when a certain condition is satisfied) and a robot-state block (e.g., point location, speed, etc.), and these sub-blocks correspond to the state components and connection components of the state drawing component in the state view. The program tiles edited in the program tile editing view can thus be assembled into the state block to form the completed program logic diagram; when converting to the state execution diagram, the execution states and logic in the state block can be acquired, judged and converted into the corresponding state components and connection components. The state block therefore further improves the convenience of converting between the program logic diagram and the state execution diagram, while the motion of the robot is realized through the combination of the independent behavior block sets. The thread blocks correspond to threads in the state diagram.
It can be understood that, if the current view is the program tile editing view, the view data and the program logic diagram are updated through a Blockly editor.
It should be noted that the Blockly editor uses graphical programming, and it contains a toolkit similar to a language converter that can convert the graphical programming language into code in several programming languages. The Blockly editor therefore expresses multiple programming languages in a graphical form: a common logic structure is turned into a tile, each tile can be converted into the code structure of several different languages, the syntax of the language is hidden, and programming becomes concise and intuitive. As shown in fig. 4, in some embodiments, when the state execution diagram and the program logic diagram use an XML file as the intermediate bridge for conversion, the execution operations corresponding to the XML file can be converted through Blockly into a language recognized by the robot; the conversion is implemented by defining generators.
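A sketch of such a generator using the current Blockly JavaScript-generator API; the block name, the emitted setSpeed(...) instruction and the default values are assumptions about what the robot's service interface might accept:

```typescript
import * as Blockly from 'blockly';
import { javascriptGenerator, Order } from 'blockly/javascript';

// Hypothetical generator for the "robot_speed" block: it emits a textual robot
// instruction that the robot terminal's service interface is assumed to accept.
javascriptGenerator.forBlock['robot_speed'] = function (block, generator) {
  const robot = generator.valueToCode(block, 'ROBOT', Order.ATOMIC) || 'robot1';
  const speed = generator.valueToCode(block, 'SPEED', Order.ATOMIC) || '100';
  return `setSpeed(${robot}, ${speed});\n`;
};

// Converting the whole workspace (i.e. the program logic diagram) into code.
function generateRobotProgram(workspace: Blockly.Workspace): string {
  return javascriptGenerator.workspaceToCode(workspace);
}
```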
It is noted that the state view includes a state drawing component area and a first canvas area, and the program tile editing view includes a program drawing component area and a second canvas area; the program drawing components are located in the program drawing component area. The interactive platform monitors, on the first canvas area, the drawing component that the user selects from the state drawing component area, so as to draw the state execution diagram; it likewise monitors, on the second canvas area, the program tiles that the user selects from the program drawing component area, so as to draw the program logic diagram (i.e., the combination of tiles).
It can be understood that if the current view is a state view, a state execution diagram is displayed in the state view; at this time, as shown in fig. 8, step S400 includes:
step S410, generating a state image block set according to the view data; wherein, the state pattern block set comprises one of a state component, a connection component and a thread component.
It should be noted that, program blocks corresponding to the state component, the link component and the thread component are arranged in the program block editing view, so as to realize free conversion between the program logic diagram and the state execution diagram.
Step S420, arranging each tile in the state tile set within the state view through an automatic layout algorithm to obtain the state execution diagram.
It should be noted that, through the automatic layout algorithm, the arrangement of each state component within the current state view can be adjusted automatically, which improves the efficiency of converting between the state execution diagram and the program logic diagram. The position of a drawing component on the state execution diagram can still be adjusted manually on top of the automatic conversion.
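One possible automatic layout is a simple layered placement: each state's column is its distance from the start state and its row is its index within that column. The sketch below assumes this scheme; the spacing constants and data shapes are illustrative:

```typescript
interface StateNode { id: string; x: number; y: number; }
interface Link { from: string; to: string; }

// Minimal layered auto-layout sketch: column = BFS depth from the start state,
// row = index within that column. Spacing constants are arbitrary.
function autoLayout(states: StateNode[], links: Link[], startId: string): void {
  const depth = new Map<string, number>([[startId, 0]]);
  const queue = [startId];
  while (queue.length > 0) {
    const cur = queue.shift()!;
    for (const l of links.filter(l => l.from === cur)) {
      if (!depth.has(l.to)) {
        depth.set(l.to, depth.get(cur)! + 1);
        queue.push(l.to);
      }
    }
  }
  const rows = new Map<number, number>(); // next free row per column
  for (const s of states) {
    const col = depth.get(s.id) ?? 0;
    const row = rows.get(col) ?? 0;
    rows.set(col, row + 1);
    s.x = 60 + col * 180;
    s.y = 60 + row * 120;
  }
}
```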
It can be understood that, if the current view is the state view, the view data and the state execution diagram are updated according to the editing request; in this case, as shown in fig. 9, step S500 includes:
step S510, monitoring a first moving track of the selected state drawing component in the state drawing component area in a first canvas area; the state drawing component comprises at least one of a state component, a connecting line component and a thread component.
It should be noted that the state component, the link component, and the thread component are drawing components for drawing the state execution diagram.
And step S520, obtaining an updated state execution diagram according to the first movement track.
It should be noted that the positions of the state components, the connection components, and the thread components in the first canvas area and the connection relationships with other state components, connection components, and thread components in the first canvas area can be obtained through the first movement track.
Step S530, updating state information corresponding to the state component or updating connection information corresponding to the connection component or updating thread information corresponding to the thread component in the view data according to the first moving track; the state information comprises position information, a sub-state set and a trigger event set; the connection information comprises path information and state component connection information; the thread information comprises connection component set information and state component set information.
It should be noted that the status information further includes a status name and a status ID, where the status name is used to distinguish different statuses, and the status ID can be used as an index of the status component. The set of trigger events includes a trigger event that enters the state component and a trigger event that ends the state component. The position information represents coordinate information of the status component in the status view.
It should be noted that, in the link information, the path information records position information of a start point and an end point of the link component, and the state component connection information records state information of connection of the start point and the end point of the link component.
It should be noted that the link component set information and the state component set information in the thread information respectively indicate information of a set of link components and information of state component combinations executed by the thread.
Therefore, by storing the thread information, the connection information and the state information, the layout of the state components, connection components and thread components in the state view can be obtained, and the state execution diagram can be restored from the stored information.
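The stored information described above might be modelled roughly as follows; the interface and field names are illustrative, not the patent's data format:

```typescript
// Illustrative view-data model mirroring the state, connection and thread
// information described above.
interface StateInfo {
  stateId: string;                     // index of the state component
  stateName: string;                   // distinguishes different states
  position: { x: number; y: number };
  subStates: StateInfo[];              // sub-state set
  triggerEvents: { onEnter: string; onExit: string };  // trigger event set
}

interface ConnectionInfo {
  path: { x: number; y: number }[];    // start and end points of the connection line
  fromStateId: string;                 // state connected at the start point
  toStateId: string;                   // state connected at the end point
}

interface ThreadInfo {
  connections: ConnectionInfo[];       // connection component set executed by the thread
  states: StateInfo[];                 // state component set executed by the thread
}
```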
It can be understood that Blockly graphical programming is based on the Google framework, but Google itself does not provide a Class syntax block (a syntax module for object-oriented programming) and does not provide local variables. Functional components are therefore provided in the program drawing component area; these functional components apply the principle of Class syntax blocks and unify methods and properties for convenient user programming. As shown in fig. 10, if the current view is the program tile editing view, step S500 updates the view data and the program logic diagram according to the editing request, and includes:
and S540, monitoring a second moving track of the selected functional component in the second canvas area to obtain a functional program image block.
And step S550, displaying a selection list of the functional program image block in the second canvas area according to the selected state of the functional program image block.
Step S560, updating the functional program blocks according to the selection status of the selection list, so as to update the program logic diagram and the view data.
At this time, after the user puts the functional component into the second canvas area in a dragging mode, the corresponding method can be input in a list selection mode, and therefore graphical programming is simplified.
It should be noted that the functional components are divided according to the functions performed by the robot; each may represent all the actions needed to complete one function. Taking robot motion as an example, the motion process of the robot includes setting the speed, position, motion type, and so on. A motion block (a functional component) can therefore be provided in the program drawing component area. After the user drags the motion block (i.e., the functional component) into the second canvas area, selecting it brings up the motion type options (i.e., the selection list), such as gate motion and linear motion; after the user selects gate motion, the functional program tile is updated and the motion parameters corresponding to gate motion appear (a parameter list corresponding to a point block or a speed). The user then only needs to add basic point blocks or speed blocks to the motion block to provide the specific motion parameter values. Using the class syntax block in this way simplifies the generation of the motion block and prevents it from containing wrong motion parameters. Similarly, the creation of the list options for each motion parameter can be simplified with reference to step S550.
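A sketch of such a selection-list-driven functional block using Blockly's dropdown field; the block name, option values and parameter inputs are assumptions:

```typescript
import * as Blockly from 'blockly';

// Hypothetical "motion" functional block: a dropdown (the selection list) chooses the
// motion type, and a validator adds or removes the extra parameter input it needs.
Blockly.Blocks['robot_motion'] = {
  init: function () {
    this.appendDummyInput()
      .appendField('motion type')
      .appendField(new Blockly.FieldDropdown(
        [['linear motion', 'LINEAR'], ['gate motion', 'GATE']]), 'TYPE');
    this.appendValueInput('POINT').setCheck('Point').appendField('to point');
    this.appendValueInput('SPEED').setCheck('Number').appendField('at speed');
    this.setPreviousStatement(true);
    this.setNextStatement(true);
    this.setColour(160);
    // When the selection list changes, add or remove the gate-height parameter input.
    this.getField('TYPE')?.setValidator((option: string) => {
      if (option === 'GATE' && !this.getInput('HEIGHT')) {
        this.appendValueInput('HEIGHT').setCheck('Number').appendField('gate height');
      } else if (option !== 'GATE' && this.getInput('HEIGHT')) {
        this.removeInput('HEIGHT');
      }
      return option;
    });
  },
};
```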
It can be understood that, to further improve the convenience of graphical editing for the user, as shown in fig. 11, if the current view is a program tile editing view, the step S500 further includes:
and step S570, monitoring the selected external editing component to obtain an external input advanced block.
Note that the external editing component is provided as a high-level block as shown in fig. 7.
Step S580, receiving program information input by an external input high-level block, where the program information includes at least one of path information and code segments.
Since the path information is the path where a program file is located, importing an existing program allows a more flexible program logic diagram to be built for the robot.
And step S590, updating the view data and the program logic diagram according to the program information.
It can be understood that in the graphical interface provided by Google Blockly, blocks are piled together without order, and during editing a user can only find a particular block from memory. Therefore, as shown in fig. 12, to facilitate editing of the program logic diagram, the method further comprises:
step S810, acquiring a program tile set forming a program logic diagram in the program tile editing view.
Step S820, classify each program tile in the program tile set to obtain at least one classified program tile set, where each classified program tile set is a set of program tiles in the same class.
It should be noted that the same class represents the same kind of operation with only different parameter values; for example, program tiles that all represent the position and speed of the robot all represent a robot state and can be classified as State.
And step S830, displaying the program tile sets in a classified mode according to the classified tile sets.
It should be noted that, as shown in fig. 13, taking the robot state as an example, the first tile A and the second tile B generated in the second canvas area represent different states of the robot (for example, the speeds and positions differ between the states corresponding to the first tile A and the second tile B). A list header for the defined states is added, and the first tile A and the second tile B are classified under it and displayed in list form. The program tile sets in the second canvas area can thus be managed in an orderly way through this list classification.
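A minimal grouping sketch for this list classification (the category names and tile fields are illustrative):

```typescript
interface ProgramTile { id: string; category: string; label: string; }

// Group the program tiles on the canvas by category so that one list per class
// can be rendered, e.g. all robot-state tiles under a "Defined states" header.
function classifyTiles(tiles: ProgramTile[]): Map<string, ProgramTile[]> {
  const groups = new Map<string, ProgramTile[]>();
  for (const tile of tiles) {
    const bucket = groups.get(tile.category) ?? [];
    bucket.push(tile);
    groups.set(tile.category, bucket);
  }
  return groups;
}

const tiles: ProgramTile[] = [
  { id: 'A', category: 'State', label: 'first tile A' },
  { id: 'B', category: 'State', label: 'second tile B' },
  { id: 'T1', category: 'Thread', label: 'thread tile' },
];

for (const [category, members] of classifyTiles(tiles)) {
  console.log(category, members.map(t => t.label)); // State -> [A, B], Thread -> [T1]
}
```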
It will be appreciated that, to further facilitate the user's handling of the classified program tile sets, as shown in FIG. 14, a transparency setting may be applied to the program tiles in a classified tile set so that each program tile or classified tile set remains visible to the user. Thus, the method further comprises:
step 840, receiving a classified tile editing request.
Step S850, expanding or folding the first program tile in the selected classified tile set according to the classified tile editing request.
Step S860, if the first program tile is expanded and the first program tile overlaps a second program tile, setting transparency for the overlapping first program tile and second program tile; the second program tile is any program tile, other than the first program tile, in the selected classified tile set or in an unselected classified tile set.
It should be noted that the second canvas area may contain several classified tile sets, such as a state classified tile set and a thread classified tile set. Taking the state classified tile set as an example, as shown in fig. 13, the classified list of defined states includes a first tile A and a second tile B, where the first tile A is folded and the second tile B is expanded. After the first tile A (i.e., the first program tile) is selected in steps S840 and S850, it can be expanded; the first tile A then overlaps the second tile B (i.e., a second program tile), and part of its area also overlaps program tiles of other classes in the second canvas area (e.g., the thread classified tile set, not shown in the drawings), which are likewise second program tiles. In step S860, different degrees of transparency are then set for the overlapping first tile A, second tile B and thread classified tile set, so that their overlapping regions can be clearly seen in the second canvas area.
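A sketch of the overlap test and transparency setting (the bounding-box fields and opacity values are illustrative):

```typescript
interface TileBox { id: string; x: number; y: number; w: number; h: number; el: HTMLElement; }

// Axis-aligned overlap test between two tile bounding boxes.
function overlaps(a: TileBox, b: TileBox): boolean {
  return a.x < b.x + b.w && b.x < a.x + a.w && a.y < b.y + b.h && b.y < a.y + a.h;
}

// After the first program tile is expanded, make it and any tile it covers
// semi-transparent so both remain visible in the canvas area.
function applyOverlapTransparency(expanded: TileBox, others: TileBox[]): void {
  for (const other of others) {
    if (overlaps(expanded, other)) {
      expanded.el.style.opacity = '0.6';  // transparency values are arbitrary
      other.el.style.opacity = '0.8';
    }
  }
}
```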
It can be understood that two adjacent program tiles in each classified tile set are rendered in different colors.
It should be noted that, taking fig. 13 as an example, the state classified tile set includes the first tile A and the second tile B; the first tile A and the second tile B can then be given different colors, for example blue for the first tile A and green for the second tile B, to better distinguish them.
It can be understood that when the robot is adjusted after the state execution diagram or the program logic diagram is generated, in order to more easily track the execution state of the robot, the method further comprises the following two steps:
step 1, acquiring an execution state of the robot terminal.
It should be noted that, in the state execution diagram, each state component corresponds to an ID. When execution reaches a certain state (corresponding to one state component), the ID of the currently executing state component is acquired (that is, the execution state is the executing state of that state component), and the corresponding instruction is sent to the robot terminal; the interactive platform can then obtain the execution state of the robot terminal according to the ID.
Step 2, highlighting the corresponding state in the diagram according to the execution state.
It should be noted that the corresponding state component in the state drawing can be highlighted through JavaScript. When the robot terminal finishes executing the instruction corresponding to the state component, it feeds back the result; the next execution state is then obtained according to the flow of the state execution diagram and highlighted, so that the whole process flow can be visualized.
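A sketch of such ID-based highlighting, assuming each state component is rendered as an element carrying a data-state-id attribute and that a CSS class named 'executing' changes its appearance; both are assumptions:

```typescript
// Assumed shape of the execution-result feedback from the robot terminal.
interface ExecutionStatus { stateId: string; done: boolean; }

// Highlight the state component whose ID matches the instruction currently executing;
// when the robot reports completion, clear it so the next state can be highlighted.
function onExecutionStatus(status: ExecutionStatus): void {
  const el = document.querySelector<SVGElement>(`[data-state-id="${status.stateId}"]`);
  if (!el) return;
  if (status.done) {
    el.classList.remove('executing');
  } else {
    el.classList.add('executing');   // e.g. CSS class that changes fill/stroke colour
  }
}
```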
In this specification, the schematic representations of the terms used above do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
While embodiments of the present application have been shown and described, it will be understood by those of ordinary skill in the art that: various changes, modifications, substitutions and alterations can be made to the embodiments without departing from the principles and spirit of the application, the scope of which is defined by the claims and their equivalents.
The embodiments of the present application have been described in detail with reference to the drawings, but the present application is not limited to the embodiments, and various changes can be made without departing from the spirit of the present application within the knowledge of those skilled in the art.

Claims (20)

1. A method of robotic interaction, the method comprising:
receiving a view request triggered by a user;
acquiring view data according to the view request;
according to the view data and the view request, switching the current view to a preset program tile editing view or to a preset state view;
displaying a program logic diagram on the program tile editing view or a state execution diagram on the state view;
receiving an editing request of the user;
and updating the view data according to the editing request, and updating the program logic diagram or the state execution diagram.
2. The method of robotic interaction of claim 1,
if the current view is the status view, the displaying a program logic diagram on the program tile editing view or a status execution diagram on the status view includes:
generating a state tile set from the view data; wherein the state tile set comprises one of a state component, a connecting line component and a thread component;
and arranging each tile in the state tile set in the state view through an automatic layout algorithm to obtain the state execution diagram.
3. The method of robotic interaction of claim 1,
the status view comprises a status drawing component area and a first canvas area;
if the current view is the state view, updating the view data according to the editing request, and updating the program logic diagram or updating the state execution diagram, including:
monitoring a first moving track of the selected state drawing component in the state drawing component area in the first canvas area; the state drawing component comprises at least one of a state component, a connecting line component and a thread component;
obtaining an updated state execution diagram according to the first movement track;
according to the first moving track, updating state information corresponding to the state component or updating connection information corresponding to the connection component or updating thread information corresponding to the thread component in the view data; the state information comprises position information, a sub-state set and a trigger event set; the connection information comprises path information and state component connection information; the thread information comprises connecting line component set information and state component set information.
4. The method of robotic interaction of claim 1,
and if the current view is a program tile editing view, updating the view data and the program logic diagram through a Blockly editor.
5. The method of robotic interaction of claim 4,
the program tile editing view comprises a program drawing component area and a second canvas area; the program drawing component area is provided with a functional component;
if the current view is a program tile editing view, updating the view data according to the editing request, and updating the program logic diagram or updating the state execution diagram, including:
monitoring a second moving track of the selected functional component in a second canvas area to obtain a functional program image block;
displaying a selection list of the functional program image blocks in the second canvas area according to the selected state of the functional program image blocks;
and updating the functional program image blocks according to the selection state of the selection list so as to update the program logic diagram and the view data.
6. A method of robotic interaction according to any of claims 1 to 5,
the program tile editing view comprises an external editing component;
if the current view is a program tile editing view, updating the view data according to the editing request, and updating the program logic diagram or updating the state execution diagram, further comprising:
monitoring the selected external editing assembly to obtain an external input advanced block;
receiving program information input by the external input high-level block, wherein the program information comprises at least one of path information and code segments;
and updating the view data and the program logic diagram according to the program information.
7. A method of robotic interaction according to any of claims 1 to 5, wherein the method further comprises:
acquiring a program tile set which forms the program logic diagram in the program tile editing view;
classifying each program tile in the program tile set to obtain at least one classified tile set, wherein each classified tile set is a set of program tiles of the same class;
and displaying the program tile set by classification according to the classified tile sets.
8. The method of robotic interaction of claim 7, further comprising:
receiving a classified tile editing request;
expanding or folding a first program tile in the selected classified tile set according to the classified tile editing request;
if the first program tile is expanded and the first program tile overlaps a second program tile, setting transparency for the overlapping first program tile and second program tile; the second program tile is any program tile, other than the first program tile, in the selected classified tile set or in an unselected classified tile set.
9. The method of robotic interaction of claim 7, further comprising:
rendering two adjacent program tiles in each of the classified tile sets in different colors.
10. A method of robotic interaction according to any of claims 1 to 5, wherein the method further comprises:
creating a program drawing component in the program tile editing view; wherein the program drawing component is used for generating or updating the program logic diagram;
creating a state drawing component in the state view; wherein the state drawing component is used for generating or updating the state execution diagram;
and associating the program drawing component with the state drawing component.
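The association of claim 10 can be read as a bidirectional change subscription between the two drawing components. A sketch with invented interface names:

```typescript
// Either drawing component can report changes and redraw its diagram.
interface DrawingComponent {
  onChanged(cb: () => void): void;
  refresh(): void;
}

// Associate the two components so an edit in either one refreshes the other:
// program edits update the state execution diagram, state edits update the
// program logic diagram.
function associate(program: DrawingComponent, state: DrawingComponent): void {
  program.onChanged(() => state.refresh());
  state.onChanged(() => program.refresh());
}
```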
11. The method of robotic interaction of claim 10,
the creating a program drawing component in the program tile editing view, comprising:
creating a basic behavior tile component and a logic tile editing component; the basic behavior tile component and the logic tile editing component are both program drawing components;
the logic tile editing component and the basic behavior tile component are displayed in separate partitions;
creating a constraint relation between the basic behavior tile component and the logic tile editing component; wherein the constraint relation comprises a combination constraint;
and creating constraint relations among all the tiles in the basic behavior tile component and constraint relations among all the tiles in the logic tile editing component.
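In Blockly, combination constraints of the kind named in claim 11 are commonly expressed as connection type checks. The block definitions below are invented for illustration and are not the patent's blocks:

```typescript
import * as Blockly from 'blockly';

// Type checks on previous/next connections act as the combination constraint:
// a basic behaviour block may only sit inside a compatible logic block.
Blockly.defineBlocksWithJsonArray([
  {
    type: 'basic_move',                       // illustrative basic behaviour block
    message0: 'move to %1',
    args0: [{ type: 'field_input', name: 'POINT', text: 'P1' }],
    previousStatement: 'robot_action',
    nextStatement: 'robot_action',
    colour: 210,
  },
  {
    type: 'logic_repeat',                     // illustrative logic editing block
    message0: 'repeat %1 times %2',
    args0: [
      { type: 'field_number', name: 'TIMES', value: 3 },
      { type: 'input_statement', name: 'BODY', check: 'robot_action' },
    ],
    previousStatement: 'robot_action',
    nextStatement: 'robot_action',
    colour: 120,
  },
]);
```

With these type checks the editor itself rejects an incompatible combination, which is one way to realise the combination constraint without extra validation code.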
12. The method of robotic interaction of claim 11,
the basic behavior tile component comprises an independent behavior block set; wherein the independent behavior block set comprises: a point position block, a speed block, a tool coordinate block, an I/O block, a system block, and a thread block.
13. The method of robotic interaction of claim 12,
the basic behavior tile component further comprises a behavior combination block set; wherein the behavior combination block set comprises elements of at least one independent behavior block set; the behavior combination block set comprises a motion block and a robot block.
14. The method of robotic interaction of claim 13,
the basic behavior tile component further comprises a state block, and the state block is arranged in correspondence with the state drawing component; the state block is combined with elements of at least one independent behavior block set and/or elements of a behavior combination block set, together with the logic tile editing component, to form the program logic diagram.
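The block catalogue of claims 12 to 14 could be captured as a simple type model; every identifier below is an assumption, not taken from the patent:

```typescript
// Illustrative catalogue mirroring the block sets named in claims 12 to 14.
type IndependentBehaviourBlock =
  | 'point_position' | 'speed' | 'tool_coordinate' | 'io' | 'system' | 'thread';

type BehaviourCombinationBlock = 'motion' | 'robot'; // each composes independent blocks

interface StateBlock {
  stateId: string;   // links the block to the corresponding state drawing component
  body: (IndependentBehaviourBlock | BehaviourCombinationBlock)[];
}

// A motion block might, for example, combine a point position block with a speed block.
const examplePickState: StateBlock = {
  stateId: 'pick',
  body: ['point_position', 'speed', 'motion'],
};
```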
15. The method of robotic interaction of claim 1, further comprising:
acquiring an execution state of a robot terminal;
and highlighting the state execution diagram according to the execution state.
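The highlighting of claim 15 can be implemented by polling the robot terminal and toggling a CSS class on the active node of the state execution diagram. The endpoint, element ids, and class name below are assumptions:

```typescript
// Poll the robot terminal for its execution state and highlight the matching
// node of the state execution diagram.
async function pollExecutionState(): Promise<void> {
  const res = await fetch('/api/robot/execution-state');  // hypothetical endpoint
  const { activeStateId } = (await res.json()) as { activeStateId: string };

  document.querySelectorAll('.state-node.active')
    .forEach((el) => el.classList.remove('active'));
  document.getElementById(`state-${activeStateId}`)?.classList.add('active');
}

setInterval(() => { void pollExecutionState(); }, 500);   // refresh twice a second
```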
16. A system for robotic interaction, comprising:
an interactive platform, wherein the interactive platform executes the method of robot interaction according to any one of claims 1 to 15 to obtain a program logic diagram and a state execution diagram, and sends execution instructions corresponding to the program logic diagram or the state execution diagram;
and a robot terminal, wherein the robot terminal receives the execution instructions sent by the interactive platform and sends the execution results of the execution instructions to the interactive platform.
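A sketch of the instruction/result exchange between the interactive platform and the robot terminal of claim 16, using WebSocket messages; the URL and the message fields are assumptions, as the patent does not fix a transport:

```typescript
// Minimal message shapes for the platform <-> robot terminal exchange.
interface ExecutionInstruction { id: string; command: string; params?: unknown; }
interface ExecutionResult      { id: string; ok: boolean; detail?: string; }

const socket = new WebSocket('ws://robot-terminal.local/exec'); // hypothetical address

function sendInstruction(instr: ExecutionInstruction): void {
  socket.send(JSON.stringify(instr));
}

socket.addEventListener('message', (ev: MessageEvent<string>) => {
  const result = JSON.parse(ev.data) as ExecutionResult;
  // Feed the result back into the interactive platform, e.g. to highlight the diagram.
  console.log(`instruction ${result.id} ${result.ok ? 'succeeded' : 'failed'}`);
});
```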
17. The system for robotic interaction of claim 16,
the interactive platform comprises a resource library and an interactive terminal, wherein the interactive terminal is used for acquiring a view request and an editing request of a user so as to acquire corresponding resource information from the resource library to generate the program logic diagram or the state execution diagram; the resource library is in remote communication connection with the interactive terminal or is arranged at the interactive terminal.
18. An apparatus for robotic interaction, comprising:
the interactive module is used for acquiring an operation request of a user, wherein the operation request comprises a view request and an editing request;
a display module for displaying a program tile edit view or a status view;
a storage module to store view data displayed on the program tile edit view and the status view;
a processing module for responding to the view request to switch a current view to a program tile editing view or a state view; the processing module is further used for responding to the editing request to update the view data, and for updating the program logic diagram in the program tile editing view or updating the state execution diagram in the state view.
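For illustration only, the four modules of claim 18 could be expressed as TypeScript interfaces; the method names are assumptions:

```typescript
// Hypothetical module boundaries mirroring claim 18.
interface InteractionModule { getRequest(): { kind: 'view' | 'edit'; payload: unknown }; }
interface DisplayModule     { show(view: 'programTileEdit' | 'state'): void; }
interface StorageModule     { load(): unknown; save(viewData: unknown): void; }
interface ProcessingModule  {
  switchView(request: unknown): void;   // responds to a view request
  applyEdit(request: unknown): void;    // responds to an editing request, updates view data
}
```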
19. An electronic device, comprising:
at least one processor, and,
a memory communicatively coupled to the at least one processor; wherein,
the memory stores instructions for execution by the at least one processor to cause the at least one processor, when executing the instructions, to implement a method of robotic interaction as claimed in any one of claims 1 to 15.
20. A storage medium comprising stored computer-executable instructions for performing the method of robotic interaction of any of claims 1-15.
CN202110464834.8A 2021-04-28 2021-04-28 Method, system, device, electronic equipment and storage medium for robot interaction Pending CN113254006A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110464834.8A CN113254006A (en) 2021-04-28 2021-04-28 Method, system, device, electronic equipment and storage medium for robot interaction

Publications (1)

Publication Number Publication Date
CN113254006A true CN113254006A (en) 2021-08-13

Family

ID=77222003

Country Status (1)

Country Link
CN (1) CN113254006A (en)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020083413A1 (en) * 2000-12-20 2002-06-27 National Instruments Corporation System and method for programmatically generating a graphical program in response to a state diagram
CN101438240A (en) * 2006-05-11 2009-05-20 Abb公司 Synchronization of a graphical program and a robot program
US20160284232A1 (en) * 2013-11-27 2016-09-29 Engino. Net Ltd. System and method for teaching programming of devices
CN107765612A (en) * 2017-12-07 2018-03-06 南京诚思机器人科技有限公司 A kind of motion control method of robot, robot and system
CN109848985A (en) * 2018-12-31 2019-06-07 深圳市越疆科技有限公司 A kind of the graphical programming method, apparatus and intelligent terminal of robot
CN110497412A (en) * 2019-08-26 2019-11-26 中科新松有限公司 Robot graphic programming interactive system based on webpage and mobile terminal
CN111708530A (en) * 2020-06-24 2020-09-25 武汉久同智能科技有限公司 Industrial robot graphical programming system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
夏姝 (Xia Shu): "Design and Implementation of a Visual Teaching System for Industrial Robots", China Master's Theses Full-text Database, no. 2021, pages 140-1042 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114029931A (en) * 2021-11-12 2022-02-11 珠海格力电器股份有限公司 Robot programming control method and device and robot system
CN114029931B (en) * 2021-11-12 2024-01-16 珠海格力电器股份有限公司 Robot programming control method and device and robot system

Similar Documents

Publication Publication Date Title
EP3798817B1 (en) User interface logical and execution view navigation and shifting
CN107844297A (en) A kind of data visualization realizes system and method
CN105608258B (en) A kind of Model-based diagnosis and information flow visual simulation system and method
US20070242082A1 (en) Scalable vector graphics, tree and tab as drag and drop objects
US20140258894A1 (en) Visual Timeline Of An Application History
US20140258969A1 (en) Web-Based Integrated Development Environment For Real-Time Collaborative Application Development
US20080270101A1 (en) Building Finite State Machine Model
CN112579050A (en) Task-based configuration rendering context
CN102640112A (en) Program creation support device
CN113900636A (en) Self-service channel business process development system and development method thereof
JP6370503B1 (en) Program creation device
CN113254006A (en) Method, system, device, electronic equipment and storage medium for robot interaction
CN104081347A (en) Graphical representation of an order of operations
CN104063212A (en) Method For Creating A User Interface
KR20070061326A (en) Method for supporting robot application programming and programming tool for the same
US8786612B2 (en) Animation editing device, animation playback device and animation editing method
KR101118536B1 (en) Method for providing authoring means of interactive contents
JP4476223B2 (en) Screen data creation device, screen data editing method, and screen data editing program
JP2010033500A (en) Gui data conversion system and gui data conversion method
CN115617441A (en) Method and device for binding model and primitive, storage medium and computer equipment
CN115237387A (en) Rapid development method and system for digital twin application
CN115495069B (en) Model-driven coal industry software process implementation method, device and equipment
CN110704537A (en) Intelligent contract generation method, device, equipment and storage medium
KR101940719B1 (en) Task graph construct apparatus and method of conversational processing system based on task graph
JP7009001B1 (en) Information processing equipment and programs

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination