CN110000775B - Device management method, control device, and storage medium - Google Patents

Device management method, control device, and storage medium

Info

Publication number
CN110000775B
Authority
CN
China
Prior art keywords
model
control
controlled
user
window
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910154718.9A
Other languages
Chinese (zh)
Other versions
CN110000775A (en)
Inventor
王志彦
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
MGA Technology Shenzhen Co Ltd
Original Assignee
MGA Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by MGA Technology Shenzhen Co Ltd
Priority to CN201910154718.9A
Publication of CN110000775A
Application granted
Publication of CN110000775B

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00 Controls for manipulators
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1602 Programme controls characterised by the control system, structure, architecture
    • B25J9/161 Hardware, e.g. neural networks, fuzzy logic, interfaces, processor
    • B25J9/1612 Programme controls characterised by the hand, wrist, grip control

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • Orthopedic Medicine & Surgery (AREA)
  • Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • General Health & Medical Sciences (AREA)
  • Evolutionary Computation (AREA)
  • Fuzzy Systems (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Manipulator (AREA)

Abstract

Embodiments of the invention provide a device management method, a control device, and a storage medium. The method comprises the following steps: establishing a connection with at least one device to be controlled, wherein the at least one device to be controlled comprises at least one robot and/or at least one motion control component; activating a target device model based on an activation instruction input by a user, wherein the target device model is a robot model or a motion control component model; and establishing an association between the target device model and the corresponding device to be controlled among the at least one device to be controlled, based on an association instruction input by the user. Embodiments of the invention thereby provide a control device capable of simultaneously managing and controlling a plurality of devices to be controlled (including robots and/or motion control components), together with a device management method applied to that control device. The method helps improve the management efficiency of the control device and gives the control system as a whole good extensibility.

Description

Device management method, control device, and storage medium
Technical Field
The present invention relates to the field of motion control technologies, and in particular, to a device management method, a control device, and a storage medium.
Background
In a motion control system based on a robot (e.g., a robotic arm) or similar technology, a device to be controlled (e.g., a robot or a drive controller) establishes a connection with a control device (e.g., an upper computer), and the user can control the robot's motion through the control device.
Most existing robots on the market use a teach pendant as the control device, with a one-to-one correspondence between pendant and robot: each robot is equipped with its own teach pendant. If a user wants to combine various robots and motors into a complex motion system, this one-to-one pairing becomes inconvenient. For example, the traditional scheme requires laboriously programming each robot separately and then adjusting the robots step by step into a roughly synchronized state, which generally takes more than ten days of debugging and greatly increases the cost of using the robots.
Disclosure of Invention
The present invention has been made in view of the above problems. The invention provides a device management method, a control device and a storage medium.
According to an aspect of the present invention, there is provided a device management method, including: establishing a connection with at least one device to be controlled, wherein the at least one device to be controlled comprises at least one robot and/or at least one motion control component; activating a target device model based on an activation instruction input by a user, wherein the target device model is a robot model or a motion control component model; and establishing an association between the target device model and the corresponding device to be controlled among the at least one device to be controlled, based on an association instruction input by the user.
Illustratively, the target device model is a robot model and the corresponding device to be controlled is a motion control component comprising at least one axis. Establishing the association between the target device model and the corresponding device to be controlled based on the association instruction input by the user then includes: for each joint of the target device model, receiving a joint-axis association instruction, input by the user, indicating the corresponding axis of the corresponding device to be controlled; and associating that joint with the indicated axis according to the joint-axis association instruction.
Illustratively, before establishing the association between the target device model and the corresponding device to be controlled based on the association instruction input by the user, the method further includes: displaying, on a human-computer interaction interface, a joint-axis association control corresponding to each joint of the target device model, wherein the joint-axis association control is a text box control or a list control. For each joint of the target device model, receiving the joint-axis association instruction indicating the corresponding axis of the corresponding device to be controlled comprises: receiving identification information of the corresponding axis entered by the user in the joint-axis association control for that joint; or receiving a selection instruction by which the user selects the identification information of the corresponding axis from that control. The joint-axis association instruction for a joint comprises the instruction corresponding to the input or selection operation the user performs on the joint-axis association control for that joint.
Illustratively, the method further comprises: displaying identification information of at least one device model on a human-computer interaction interface, wherein the target device model is one of the at least one device model.
Illustratively, the human-computer interaction interface includes a robot window, and displaying the identification information of the at least one device model on the human-computer interaction interface includes: displaying the identification information of the at least one device model in the robot window.
Illustratively, the human-computer interaction interface further comprises an editing window, and activating the target device model based on the activation instruction input by the user comprises: activating the target device model in response to the user clicking the identification information of the target device model in the robot window, or dragging that identification information from the robot window to the editing window; the activation instruction comprises the instruction corresponding to the click or drag operation.
Illustratively, after establishing the connection with the at least one device to be controlled, the method further comprises: generating a list of devices to be controlled containing identification information of the at least one device to be controlled; and displaying the list of devices to be controlled on the human-computer interaction interface.
Illustratively, the human-computer interaction interface comprises a device window, and displaying the list of devices to be controlled on the human-computer interaction interface comprises: displaying the list of devices to be controlled in the device window.
Illustratively, after activating the target device model based on the activation instruction input by the user, the method further comprises: displaying the identification information of the target device model in an editing window and/or a device window of the human-computer interaction interface.
Illustratively, before establishing the association between the target device model and the corresponding device to be controlled based on the association instruction input by the user, the method further includes: displaying a configuration window of the target device model on the human-computer interaction interface in response to the user clicking the identification information of the target device model displayed in the editing window or the device window; and displaying an associated information setting window for inputting the association instruction in response to the user clicking an associated interface selection control on the configuration window, wherein the associated interface selection control is a button control for opening and closing the associated information setting window.
According to another aspect of the present invention, there is provided a control apparatus comprising: a connection module for establishing a connection with at least one device to be controlled, wherein the at least one device to be controlled comprises at least one robot and/or at least one motion control component; a selection module for activating a target device model based on an activation instruction input by a user, the target device model being a robot model or a motion control component model; and an association module for establishing an association between the target device model and the corresponding device to be controlled among the at least one device to be controlled, based on an association instruction input by the user.
According to another aspect of the present invention, there is provided a control device comprising a processor and a memory, wherein the memory stores computer program instructions which, when executed by the processor, perform the device management method described above.
According to another aspect of the present invention, there is provided a storage medium storing program instructions which, when executed, perform the device management method described above.
According to embodiments of the present invention, there is provided a control device capable of simultaneously managing and controlling a plurality of devices to be controlled (including robots and/or motion control components). In addition, a device management method applied to the control device is provided; the method helps improve the management efficiency of the control device and gives the control system as a whole good extensibility.
Drawings
The above and other objects, features and advantages of the present invention will become more apparent by describing in more detail embodiments of the present invention with reference to the attached drawings. The accompanying drawings are included to provide a further understanding of the embodiments of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description serve to explain the principles of the invention and not to limit the invention. In the drawings, like reference numbers generally represent like parts or steps.
FIG. 1 shows a schematic block diagram of a motion control system according to one embodiment of the present invention;
FIG. 2 shows a schematic block diagram of a motion control system according to another embodiment of the present invention;
FIG. 3 shows a schematic diagram of a human-machine interface on a control device according to one embodiment of the invention;
FIG. 4 shows a schematic diagram of an engineering window, according to one embodiment of the invention;
FIG. 5 shows a schematic view of a robot window according to an embodiment of the invention;
FIG. 6 shows a schematic diagram of a device window according to one embodiment of the invention;
FIG. 7 is a diagram illustrating a configuration window of a motion control component and configuration information displayed in a partial area of the configuration window according to an embodiment of the present invention;
FIG. 8 is a diagram illustrating a configuration window of a motion control component and configuration information displayed in a partial area of the configuration window according to another embodiment of the present invention;
FIG. 9 shows a schematic flow diagram of a device management method according to one embodiment of the invention;
FIG. 10 is a diagram illustrating a configuration window of an MRX-T4 robot model and an associated information setting window displayed in a partial area of the configuration window according to an embodiment of the present invention;
FIG. 11 shows a schematic flow diagram of a motion control method according to one embodiment of the invention;
FIG. 12 illustrates a schematic diagram of a parameter editing window, according to one embodiment of the invention;
FIG. 13 shows a schematic diagram of a parameter editing window according to another embodiment of the invention;
FIG. 14 shows a schematic flow chart diagram of a user interaction method according to one embodiment of the invention;
FIG. 15 shows a schematic flow chart diagram of a user interaction method according to one embodiment of the invention;
FIG. 16 shows a schematic block diagram of a control device according to one embodiment of the present invention; and
FIG. 17 shows a schematic block diagram of a control device according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, exemplary embodiments according to the present invention will be described in detail below with reference to the accompanying drawings. It is to be understood that the described embodiments are merely a subset of embodiments of the invention and not all embodiments of the invention, with the understanding that the invention is not limited to the example embodiments described herein. All other embodiments, which can be derived by a person skilled in the art from the embodiments of the invention described herein without inventive step, shall fall within the scope of protection of the invention.
To at least partially solve the above problem, an embodiment of the present invention provides a control apparatus. The control device can realize the simultaneous management and control of a plurality of devices to be controlled (including robots and/or motion control components) in a hardware mode or a hardware and software combined mode. The control device described herein may be any suitable computing device with data processing capabilities and/or instruction execution capabilities, which may be implemented using conventional computers. For example, the control device may be an upper computer, a teach pendant, or the like. For example, the control device described herein may provide a control interface, i.e. a human-machine interaction interface, to a user via a display, with which the user may effect control of the device to be controlled and/or the device model. Further, the control device described herein may also be equipped with control software for implementing algorithmic functions associated with the control interface, for example.
According to the embodiments of the invention, one control device can manage and control a plurality of devices to be controlled simultaneously. Compared with the existing one-to-one correspondence scheme, the scheme provided by the embodiments of the invention has at least the following advantages:
1. Operation time is saved, and synchronization among different devices to be controlled is better.
2. The robot or motion control component is easier to control, and each joint or axis can be controlled accurately.
3. Because multiple robots can be controlled simultaneously, programming the actions of multiple robots is easier.
4. Because different robots can be controlled simultaneously, it is easier to coordinate different actions among the robots and thus to build a complex motion system.
It should be understood that, although the control device provided in the embodiment of the present invention has the capability of simultaneously managing and controlling a plurality of devices to be controlled, this is not a limitation of the present invention, and the control device may also be applied to a control scheme in which the control device and the devices to be controlled correspond to each other one by one.
For the above control device, an embodiment of the present invention provides a device management method applicable thereto. The equipment management method can establish the incidence relation between the equipment to be controlled (including the robot and/or the motion control component) actually connected with the control equipment and the pre-established equipment model (including the robot model and/or the motion control component model), and is convenient for a user to perform subsequent control, such as trajectory planning and the like, on the equipment to be controlled.
The device management method and the control device according to the embodiments of the present invention are not limited to the control of robots. For example, motion control components may be used to control and drive moving components (e.g., motors), which are used on both robotic and non-robotic products. For example, a conveyor belt on a production line is driven by a plurality of motors, and a motion control component can drive and control those motors; the device management method and the control device according to the embodiments of the present invention can therefore also manage and control motion control components in the production-line field. In summary, the device management method and the control device can be applied to the control of any robot, any device that operates in a robot-like manner, or any other device having a moving part.
In the following description, the present invention is described in the context of an application environment of the device management method according to the embodiments of the present invention, namely a motion control system. The motion control system described herein may include a control device and a device to be controlled. As described above, the control device may be, for example, an upper computer or a teach pendant. The device to be controlled may include, for example, a robot, or a motion control component for driving the robot. The robot may be a four-axis robot, an H2-type planar gantry robot, or a robot of various other configurations, and the motion control component may be a driver such as a single-axis drive controller or a multi-axis drive controller. The moving part described herein may be a motor alone, a motor combined with a reducer, a motor combined with a lead screw, etc.
The robots described herein may be machine devices that perform work automatically. A robot may include a robot body and an end effector (also referred to as a tool). The body may include a plurality of joints, such as a base, a large arm, a small arm, and a wrist. The end effector is, for example, a gripper that can open and close, but may also be another operating tool. The end effector is controlled by the control device to move along a corresponding route and complete preset actions; for example, it is manipulated by the control device to move in three dimensions and to perform a related action, such as grasping or releasing, at a specified position.
Taking a motor paired with a reducer as an example: such a motor is the main motion-execution component of a robotic arm (also called a manipulator arm, multi-axis robot, or multi-joint robot). The arm is mainly used to grip a target object and move it from an initial position to a target position along a preset route, making it suitable for automated mechanical operation in various industrial fields.
Robotic arms on the market are mainly four-axis robots (with four joints) and six-axis robots (with six joints). Each comprises a base, an arm, and an end gripper; the number of joints on the arm determines the number of 'axes' of the robot, and each joint is driven by the rotation of a motor.
A motion control system according to an embodiment of the present invention is described below with reference to fig. 1 to help understand an exemplary application environment of a device management method according to an embodiment of the present invention. FIG. 1 shows a schematic block diagram of a motion control system 100 according to one embodiment of the present invention. It should be noted that the device management method provided by the embodiment of the present invention can be implemented on other systems similar to the motion control system 100, and is not limited to the specific example shown in fig. 1.
As shown in fig. 1, the motion control system 100 may include a human-machine interaction unit (i.e., control device) 110, a Controller Area Network (CAN) data line 120, a motion control component 130, and a motor (i.e., motion component) 140. The motion control part 130 includes a CAN data transceiving unit 1302, a buffer 1304, a resolving unit 1306, a wave table 1308, a PWM waveform generator 1310, and a motor driving unit 1312.
When controlling the motor 140 using the motion control component (e.g., a driver) 130, the user may edit motion parameters through the human-computer interaction unit 110. The human-computer interaction unit 110 sends the motion parameters edited by the user to the motion control component 130 through the CAN data line 120; the motion control component 130 computes wave-table data from the received motion parameters and then generates a PWM waveform to drive the motor.
Specifically, the resolving unit 1306 in the motion control component 130 may read the motion parameters, process them (for example, performing interpolation using a calculation formula), convert them into wave-table data, and store the wave-table data in the wave table 1308.
The wave table 1308 may be implemented by a DDR memory or the like, and is used to store wave table data, and the size of the storage depth of the wave table 1308 may be set according to design requirements.
The PWM waveform generator 1310 is configured to generate corresponding PWM waveform data according to the wave-table data stored in the wave table 1308. A PWM waveform, sometimes called a pulse waveform, has two states, high level and low level; in the motion control field, adjusting the duty cycle of the PWM waveform controls the rotational speed of a motor, the switching state of a solenoid valve, and the like. The PWM waveform generator 1310 may be implemented using various existing PWM waveform generators, such as one based on direct digital synthesis (DDS) signal generation or one based on digital counting.
Thus, the resolving unit 1306 converts the actual motion parameters set by the user into wave-table data for generating PWM waveforms, and the PWM waveform generator 1310 generates the corresponding PWM waveform data according to the wave-table data; after digital-to-analog conversion, amplification, and filtering, the PWM waveform data is sent to the motor driving unit 1312 to drive the motor 140.
The motor driving unit 1312 is configured to drive the motor 140 to move according to the PWM waveform, and may be implemented by using various motor driving chips.
In the example shown in fig. 1, only one motion control component 130 is shown connected to the human-computer interaction unit (control device) 110. An example in which one control device controls a plurality of motion control components is described below. Fig. 2 shows a schematic block diagram of a motion control system 200 according to another embodiment of the present invention. As shown in fig. 2, the motion control system 200 includes a control device 201, a gateway 202, N motion control components 203, N motion components 204, and N sensors 205, where N may be an integer greater than 1.
The control device 201 is used to implement human-computer interaction, and is, for example, a computer with control software installed therein. The user can set various parameters of the robot or the motion control component on the man-machine interaction interface provided by the control device 201, so as to realize the control of the robot.
The gateway 202 is used to implement data exchange between the control device 201 and the motion control section 203. Gateway 202 may be implemented, for example, using CAN data line 120 described above.
The motion control component 203 is configured to calculate the motion parameters sent by the control device 201, and generate a driving current for controlling and driving the motion component 204 to move based on the motion parameters edited by the user, so as to drive the motion component 204 to move, and further drive a corresponding moving body (for example, a joint of a robot) to move. Each of the motion control sections 203 may be implemented using a structure similar to that of the motion control section 130 described above.
The moving parts 204 may be motors, each of which may be used to drive one joint of the robot.
The sensor 205 is used for detecting the moving position of the moving body in real time, and may be an encoder, an angle sensor, an optoelectronic switch, a machine vision system, or the like. Some robots may not have sensors.
As shown in fig. 2, a plurality of motion control units 203 may be simultaneously connected to the control apparatus 201, and the control apparatus 201 may manage and control these motion control units 203 at the same time.
The following presents a simplified overview of the human-computer interaction scheme implemented on the control device to facilitate a better understanding of the present invention.
According to embodiments of the invention, a plurality of device models can be established in advance and stored in a model library. Alternatively, the model library may be stored on a server, then downloaded and used by different control devices; or the model library may be created by the user on the current control device. A device model described herein may be a robot model or a motion control component model. On the human-computer interaction interface, the user can select and edit device models and establish associations between device models and the actually connected devices to be controlled. After an association is established, the motion of the device to be controlled can be controlled through the motion parameters set for the device model.
The benefits of modeling the device are numerous. For example, the types of motion control components of different robots may be the same, and with this approach, it may be possible to identify the motion control components and associate them with the robot model without having to identify the actual connected robots one by one. In addition, the scheme enables the expandability of the whole control device to be good. For example, if a new robot configuration is developed, it is not necessary to develop a new control device, but it is only necessary to add a new robot model to the model library used by the control device.
Therefore, the human-computer interaction interface of the control device can display not only the devices to be controlled that are connected to the control device, but also the device models in the model library, for the user to operate on.
FIG. 3 shows a schematic diagram of a human-machine interface on a control device according to one embodiment of the invention. It should be noted that the layout of the human-computer interface shown in fig. 3 is only an example and not a limitation of the present invention.
As shown in FIG. 3, the human-computer interaction interface may include one or more of the following four display areas: an engineering window (engineering area), a robot window (robot area), an editing window (editing area), and a device window (device area). Each window may be considered a region whose position may be fixed or adjustable; for example, the user may rearrange the human-computer interaction interface by dragging, adjusting the position of each window. Any of the four windows can be opened or closed as needed. The human-computer interaction interface can also include other windows; for example, a log output window appears at the bottom of the interface shown in FIG. 3. In one example, the editing window may be fixed in position with the user having no authority to close it, while the other three windows are fixed in position but may be closed by the user.
An engineering window: for managing all current file resources, for example displaying a file list related to project files. FIG. 4 shows a schematic diagram of an engineering window, according to one embodiment of the invention. As shown in FIG. 4, the engineering window displays the file list of the project file "1212.prj" (file name "1212", extension "prj"). The file list of a project file may include the file names of all files belonging to the current project. The control software of the control device may generate files in a variety of formats, including but not limited to: project files (extension prj), scene files (extension sce), robot motion files (extension mc), motor motion files (extension pvt), and settings files (extension stp).
In the description herein, a project file may be understood as a file generated when a project is created in the control software; it integrates and manages the other files under the project, such as scene files and motor motion files. A scene file may be understood as a file generated when a scene (i.e., the control scene described herein) is created in the control software. In any scene, a device model may be added (i.e., activated), or a device to be controlled connected to the control device may be added, and operations such as parameter editing may be performed on the added device model or device to be controlled; when a device in a scene is operated, the scene file records the information of these operations. For example, the following rule may be set in the control software: after a scene file is generated (whether newly created or imported) and selected in the engineering window, the user can select a required device model in the robot window for activation and subsequent parameter editing.
The robot motion file may be a file containing information related to a motion trajectory of the robot (e.g., motion parameters of the robot). The motor motion file may be a file containing information related to a motion trajectory of the motor (e.g., a motion parameter of the motor).
The setup file may be a file containing configuration information of the device or device model to be controlled. In case the device to be controlled is a motion control component or the device model is a motion control component model, the configuration information may comprise one or more of the following information of the motion control component or the motion control component model: basic information, motor parameters of each shaft, motion trajectory planning parameters, trigger parameters, IO interface parameters and the like. The basic information may include, for example, a model number, a version number, an Identifier (ID), and the like. The motor parameters may include, for example, motor size, step angle, maximum value, encoder information, and the like. The motion trajectory planning parameters may include an interpolation mode, an interpolation curve duty ratio, a zero position, an emergency stop mode, and the like, which are used for interpolation in the motion parameters. In the description herein, the motion trajectory planning parameter is a parameter for planning a motion trajectory of the device or device model to be controlled, and contains different content from the motion parameter.
In case the device to be controlled is a robot or the device model is a robot model, the configuration information may comprise one or more of the following information of the robot or robot model: basic information, corresponding relation between each joint and the axis of the motion control component, step length adopted for interpolation in motion parameters, zero position and the like. The basic information may include, for example, a model number, a version number, an Identifier (ID), and the like.
Robot window: for displaying the identification information of various pre-established device models. The device models displayed in the robot window may include robot models and/or motion control component models; the set of all device models displayed in the robot window may be referred to as the first model set. The identification information of a device model may include, for example, a model number and/or an icon. Fig. 5 shows a schematic view of a robot window according to an embodiment of the invention. As shown in fig. 5, identification information of a plurality of device models is displayed in the robot window, where MRX denotes a robot-type model. When the current scene file or current control scene is in an editable state (e.g., the user has selected the current scene file), the user may select the device model to be activated; the model is then activated in the current scene file or control scene and displayed in the editing window. In the description herein, activating a device model in a scene file has the same meaning as activating it in the control scene corresponding to that scene file. Referring back to fig. 3, the editing window shows a plurality of robot models in scene "3.sce".
Device window: for displaying identification information of editable devices. For example, the editable devices may include all device models in the second model set and/or at least one device to be controlled that has established a connection with the control device. The second model set may include all device models activated in at least one control scene. The at least one device to be controlled that has established a connection with the control device may be, for example, all the motion control components 203 shown in fig. 2 that are connected to the control device 201.
After the user connects the control device and the devices to be controlled together through a bus (for example, up to 100 devices can be connected through the CAN bus), the device window may display a list of the devices to be controlled, which may include identification information of all the connected devices. When a device to be controlled is a motion control component, the list may further include identification information of each axis of that component. FIG. 6 shows a schematic diagram of a device window, according to one embodiment of the invention. As shown in fig. 6, the device window displays identification information of a motion control component (a five-axis driver) of model MRQ-M2305, along with identification information of its five axes, denoted CH1-CH5.
Upon receiving an activation instruction input by a user indicating activation of a certain device model, the device window may display identification information of the activated device model. Referring back to FIG. 3, the device window displays the identification information MRX-AS, MRX-T4 of the three robot models activated in the scene file "3.sce", as well as the identification information MRX-AS, MRX-T4, MRX-DT, MRX-H2Z of the four robot models activated in the scene file "11111.sce".
Editing window: for editing control scenes. An interface for editing the current control scene may be provided to the user in the editing window. An operation performed in the editing window corresponds to an operation on a scene file; for example, adding a device model in the editing window corresponds to activating that device model in the current control scene, and the scene file may record information related to the activation operation. Upon receiving an activation instruction input by the user indicating activation of a certain device model, the editing window may display the identification information of the activated device model.
Note that the form of the identification information displayed in the different windows may be the same or different. For example, the identification information of the device models displayed by the robot window and the editing window may each include a model number and an icon, and the identification information of the device models displayed by the device window may include only the model number.
The user may select an editable device in the editing window or the device window, where the editable device may be a device to be controlled or a device model, and then the user may view and/or edit various pieces of information of the editable device, such as the configuration information or the motion parameters.
Fig. 7 is a diagram illustrating a configuration window of a motion control component and configuration information displayed in a partial area of the configuration window according to an embodiment of the present invention. The configuration information shown in fig. 7 is the basic information of the motion control component of model MRQ-M2304. As shown in fig. 7, the left area of the configuration window displays a number of controls: "information", "CH1", "CH2", "CH3", "CH4", "input", "output", "sen.", "system", etc. These may be button controls, and the user may select any one of them by clicking. When the user selects the "information" control in the left area, the basic information of the current editable device is displayed in the right area, where the user can view it and edit it as needed.
Fig. 8 is a diagram illustrating a configuration window of a motion control component and configuration information displayed in a partial area of the configuration window according to another embodiment of the present invention. As shown in fig. 8, the left area of the configuration window displays a number of controls: "information", "Axes1" (similar in function to the "CH1" control shown in fig. 7), "Axes2", "Axes3", and "Axes4" (likewise similar to the "CH2"-"CH4" controls shown in fig. 7), "input/output" (similar in function to the "input" and "output" controls shown in fig. 7), "sensors" (similar in function to the "sen." control shown in fig. 7), "system", etc. These may be button controls, any of which the user can select by clicking. The configuration information shown in fig. 8 is the motor parameters of the first axis (Axes1) of the motion control component model MRQ-M2304. When the user selects an axis control such as "Axes1" or "Axes2" in the left area, the motor parameters of that axis of the current editable device are displayed in the right area, where the user can view them and edit them when needed. The display and editing of the remaining configuration information are similar and are not described in detail.
In the description herein, the motion parameter refers to a parameter indicating a motion trajectory of a device to be controlled or a device model. It will be understood by those skilled in the art that each motion trajectory may be composed of a plurality of points containing information such as position, time, etc., and each point is a motion parameter. A motion parameter sequence including a plurality of motion parameters may be regarded as one motion trajectory. In the case where the device to be controlled is a robot or the device model is a robot model, the motion parameter may be a motion parameter of an end effector of the robot or the robot model, or may be a motion parameter of each of at least one joint of the robot or the robot model. In the case where the device to be controlled is a motion control component or the device model is a motion control component model, the motion parameter may be a motion parameter of each of at least one axis of the motion control component or the motion control component model.
The content of the motion parameters may vary depending on the actual configuration of the moving part (e.g., motor). Illustratively, the motion parameters may include one or more of position data, velocity data, and time data. The position data may be coordinate data in a rectangular spatial coordinate system, or may be rotation angle or other data related to a position. In the case where the position data is coordinate data in a spatial rectangular coordinate system, the motion parameter may be referred to as an LVT parameter. In case the position data is a rotation angle, the motion parameters may be referred to as PVT parameters. The LVT parameters may include coordinates in a rectangular spatial coordinate system (which may be referred to as X, Y, Z) and a time to reach a corresponding coordinate point (which may be referred to as T). The PVT parameters may include a rotation angle (which may be referred to as P), a rotation speed (which may be referred to as V), a rotation time (which may be referred to as T).
According to an aspect of the present invention, there is provided a device management method. Fig. 9 shows a schematic flow diagram of a device management method 900 according to an embodiment of the invention. As shown in fig. 9, the device management method 900 includes steps S910, S920, S930.
In step S910, a connection is established with at least one device to be controlled, wherein the at least one device to be controlled comprises at least one robot and/or at least one motion control component. In the description herein, "at least one" may be equivalent to "one or more".
As described above, the control device can establish a connection with at least one device to be controlled via a bus (e.g., a CAN bus).
In step S920, a target device model, which is a robot model or a motion control part model, is activated based on an activation instruction input by a user.
Activating the target device model may mean adding it to the current control scene so that it transitions from a non-editable state to an editable state. The user may input the activation instruction by dragging the identification information of the target device model from the robot window to the editing window, or by clicking that identification information in the robot window. Optionally, the click may be a double click. Upon receiving the operation information corresponding to such an operation (i.e., upon receiving the activation instruction), the control device may set the target device model to the editable state. After activation, the user can further edit the configuration information and motion parameters of the target device model on the human-computer interaction interface, and associate it with a device to be controlled.
In step S930, based on the association instruction input by the user, an association relationship between the target device model and a corresponding device to be controlled in the at least one device to be controlled is established.
A device model is a virtual device, whereas a device to be controlled is a device actually connected to the control device. A device model models a particular model of robot or motion control component, and a device to be controlled is itself a robot of a particular model, or a motion control component driving a robot of a particular model. Therefore, some device models can be associated with corresponding devices to be controlled. Of course, some device models may have no device to be controlled that can be associated with them, and some devices to be controlled may have no matching device model.
When needed, a device model can be associated with a device to be controlled, so that the user can edit motion parameters on the device model and send them to the associated device to be controlled, to actually drive it to move.
The target device model may be any device model selected by the user. For example, the user may click the identification information of any device model in the editing window or the device window to select it as the target device model. Optionally, the click may be a double click.
In the description herein, a user refers to a person interacting with a control device, which may be the same person or different persons. For example, the activation instruction of step S920 and the association instruction of step S930 may be from different users.
It should be noted that the steps of the device management method according to the embodiment of the present invention are not limited to the execution order shown in fig. 9, and may have other reasonable execution orders. For example, step S920 may be performed before or simultaneously with step S910.
According to the equipment management method provided by the embodiment of the invention, the pre-established equipment model can be associated with the corresponding equipment to be controlled, so that a user can conveniently control the actually connected equipment to be controlled through the equipment model. This method is advantageous for improving the management efficiency of the control device, because it allows the control device to recognize only the motion control part and associate it with the robot model without recognizing the actually connected robots one by one. In addition, the method enables the expandability of the whole control device to be good.
According to an embodiment of the present invention, the method 900 may further include: displaying identification information of at least one equipment model on a human-computer interaction interface; wherein the target device model is one of the at least one device model.
According to the embodiment of the invention, the human-computer interaction interface may include a robot window, and displaying the identification information of the at least one device model on the human-computer interaction interface may include: displaying the identification information of the at least one device model in the robot window.
As described above, the human-machine-interaction interface may include a robot window in which identification information for several device models may be displayed for viewing by a user. The identification information may include a model number and/or an icon of the device model. The device models displayed by the robot window belong to a first set of models, which may be from a library of models. The layout of the robot window and the content contained in the robot window have already been described above with reference to fig. 3 and 5, and are not described again here.
It should be understood that the robot window is merely an example, and the identification information of the device models included in the first set of models may be displayed in other manners, for example, by way of a list control in a menu bar.
Displaying at least one device model (namely, the first model set) on the human-computer interaction interface allows the user to conveniently select a required device model at any time for operations such as motion parameter editing.
According to the embodiment of the present invention, the human-computer interaction interface may further include an editing window, and activating the target device model based on the activation instruction input by the user (step S920) may include: activating the target device model in response to the user clicking the identification information of the target device model in the robot window, or dragging that identification information from the robot window to the editing window; the activation instruction comprises the instruction corresponding to the click or drag operation.
The layout of the editing window and the content contained in the editing window have already been described above with reference to fig. 3, and are not described herein again.
As described above, the user can click on or drag the identification information of any device model displayed by the robot window to the editing window. When the user performs the above-described click operation or drag operation with respect to the target device model, the target device model is activated, that is, the target device model is set from the non-editable state to the editable state, in response to these operations. Further, optionally, in response to the above-described click operation or drag operation, identification information of the target device model may be displayed in the edit window and/or the device window. The clicking operation may be a single clicking operation. Furthermore, the click operation may be performed in the current control scene, that is, in a state where the current scene file is editable. In this way, the target device model may be activated in the current control scenario but not in other control scenarios.
The equipment model displayed by the editing window and/or the equipment window belongs to the second model set. The second set of models may be understood as a set of activated device models, the first set of models may be understood as a set of device models that may be activated, the second set of models being a subset of the first set of models.
According to an embodiment of the present invention, after establishing a connection with the at least one device to be controlled, the method 900 may further include: generating a list of devices to be controlled containing identification information of the at least one device to be controlled; and displaying the list of devices to be controlled on the human-computer interaction interface.
Illustratively, the human-computer interaction interface may include a device window, and displaying the list of devices to be controlled on the human-computer interaction interface may include: and displaying a list of the devices to be controlled in the device window.
As described above, the human-computer interaction interface may include a device window in which identification information for a number of editable devices may be displayed for viewing by the user. The identification information may include a model number and/or an icon of the editable device. The layout of the device window and the content contained in the device window have already been described above with reference to fig. 3 and 6, and are not described herein again.
The editable device may comprise all device models in the second set of models and/or at least one device to be controlled to establish a connection with the controlling device. The device window may display a list of devices to be controlled, and the identification information of at least one device to be controlled that establishes a connection with the control device may be displayed through the list of devices to be controlled. In addition, the device window may further display a list of device models, and the identification information of the device models in the second set of models may be displayed through the list of device models.
It should be understood that the device window is merely an example, and the identification information of the editable device may be displayed in other manners, e.g., by way of a list control in a menu bar.
According to an embodiment of the present invention, after activating the target device model based on the activation instruction input by the user (step S920), the method 900 may further include: and displaying the identification information of the target equipment model in an editing window and/or an equipment window of the human-computer interaction interface.
The above describes an embodiment of displaying the identification information of the target device model in the editing window and/or the device window, and details are not repeated here.
According to the embodiment of the present invention, before establishing the association between the target device model and the corresponding device to be controlled among the at least one device to be controlled based on the association instruction input by the user (step S930), the method 900 may further include: displaying a configuration window of the target device model on the human-computer interaction interface in response to the user clicking the identification information of the target device model displayed in the editing window or the device window; and displaying an associated information setting window for inputting the association instruction in response to the user clicking an associated interface selection control on the configuration window, wherein the associated interface selection control is a button control for opening and closing the associated information setting window.
Alternatively, the user's click on the identification information of the target device model displayed in the editing window or the device window may be a double click, and the user's click on the associated interface selection control in the configuration window may be a single click.
Fig. 10 is a diagram illustrating a configuration window of an MRX-T4 robot model, with an associated information setting window displayed in a partial area of the configuration window, according to an embodiment of the present invention. Illustratively, the configuration window may be displayed in the editing window. When a user double-clicks the identification information of the MRX-T4 robot model displayed in the editing window or the device window, a configuration window as shown in fig. 10 may pop up.
As shown in fig. 10, the configuration window may be divided into two regions: a left region, which may be referred to as a control display region and displays controls such as "information", "detailed information", "option", "zero position", and "properties", and a right region, which displays the window corresponding to the currently selected control. In fig. 10, the currently selected control is the "option" control, so the associated information setting window is displayed on the right; that is, the "option" control is the associated interface selection control. Each control in the control display region shown in fig. 10 is a button control. When the user single-clicks any button control, the setting interface corresponding to that control is displayed in the right region of the configuration window.
According to the embodiment of the present invention, the step of establishing an association relationship between the target device model and the corresponding device to be controlled in the at least one device to be controlled based on the association instruction input by the user (step S930) may include: for any joint of the target device model, receiving a joint axis association instruction input by the user for indicating the corresponding axis of the corresponding device to be controlled that corresponds to the joint; and associating the joint with that axis according to the joint axis association instruction.
The association instruction related to step S930 may include the joint axis association instructions corresponding to all joints of the robot model. When associating a robot model with an actually connected robot, or a motion control component model with an actually connected motion control component, the association may be made directly. When associating a robot model with an actually connected motion control component, each joint of the robot model may be associated with an axis of the motion control component in one-to-one correspondence. To this end, each joint of the robot model is assigned the axis of the motion control component corresponding to it, and the instruction input by the user to indicate which axis a given joint corresponds to is a joint axis association instruction.
For example, before establishing an association relationship between the target device model and a corresponding device to be controlled in the at least one device to be controlled based on an association instruction input by the user (step S930), the method 900 may further include: displaying, on the human-computer interaction interface, a joint axis association control corresponding to each joint of the target device model, wherein the joint axis association control is a text box control or a list control. For any joint of the target device model, receiving a joint axis association instruction input by the user for indicating the corresponding axis of the corresponding device to be controlled may then include: receiving identification information of the corresponding axis that the user inputs in the corresponding joint axis association control; or receiving a selection instruction by which the user selects the identification information of the corresponding axis from the corresponding joint axis association control. The joint axis association instruction corresponding to any joint of the target device model comprises an instruction corresponding to the input or selection operation performed by the user on the joint axis association control corresponding to that joint.
The target device model may be a robot model, the robot having a plurality of joints, and the corresponding device to be controlled may be a motion control part having a plurality of axes. In this case, a one-to-one correspondence relationship between each joint of the robot and each axis of the motion control part may be established.
As described above, when the user double-clicks the identification information of the target device model in the editing window or the device window, a configuration window of the target device model may pop up, i.e., be displayed, on the human-computer interaction interface. The configuration window includes several controls, such as "information", "detailed information", "option", "zero position", "properties", and the like. The control corresponding to "option" is the associated interface selection control, which may be a button control. When the user clicks the "option" control, the associated information setting window pops up, i.e., is displayed, and the correspondence between the joints of the robot model and the axes of the motion control part can be set in the associated information setting window of the robot model.
With continued reference to fig. 10, in the associated information setting window, the Base of the MRX-T4 robot model is set to axis 1 of the motion control unit device1 (i.e., CH1), the Big Arm to axis 2 (CH2), the Little Arm to axis 3 (CH3), the Wrist to axis 4 (CH4), and the Hand to axis 5 (CH5).
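By way of illustration, the association of fig. 10 can be represented as a simple mapping. The following is a minimal sketch, assuming a plain dictionary from joint name to a (device, axis) pair; the joint and axis names are taken from fig. 10, but the dictionary form itself is an assumption, not the software's actual data structure.

```python
# Joint-to-axis association corresponding to fig. 10.
joint_axis_map = {
    "Base":       ("device1", "CH1"),
    "Big Arm":    ("device1", "CH2"),
    "Little Arm": ("device1", "CH3"),
    "Wrist":      ("device1", "CH4"),
    "Hand":       ("device1", "CH5"),
}

def handle_joint_axis_association(joint: str, device: str, axis: str) -> None:
    # One joint axis association instruction updates one entry.
    joint_axis_map[joint] = (device, axis)
```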
In the above manner, each joint of the robot model can be associated with an axis of the corresponding motion control part. In one example, a user may edit or import the motion parameters of the end effector of the robot model, which are converted by the control device into the motion parameters of the joints of the robot model. In another example, the user may directly edit or import the motion parameters of the joints of the robot model. The motion parameters can then be transmitted to the corresponding axes of the motion control component according to the established association relationship, driving the robot actually controlled by the motion control component to move.
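Building on the mapping sketch above, this parameter flow might look as follows; `inverse_kinematics` and `send_to_axis` are placeholders, since the text states only that the control device performs the conversion and transmission, not how.

```python
def inverse_kinematics(point):
    # Placeholder: a real implementation depends on the robot's kinematic
    # model; it returns joint angles keyed by joint name.
    return {"Base": 0.0, "Big Arm": 0.0, "Little Arm": 0.0,
            "Wrist": 0.0, "Hand": 0.0}

def dispatch_point(point, joint_axis_map, send_to_axis):
    joint_angles = inverse_kinematics(point)      # end effector -> joints
    for joint, angle in joint_angles.items():
        device, axis = joint_axis_map[joint]      # established association
        send_to_axis(device, axis, angle)         # drive the mapped axis
```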
In the case where the device to be controlled is a motion control means connected to an actually existing robot, the motion parameters of the target robot model may be used to drive that robot in motion. Preferably, the actual robot has the same configuration as the target robot model, e.g., both are MRX-T4 robots.
According to another aspect of the present invention, a motion control method is provided. FIG. 11 shows a schematic flow diagram of a motion control method 1100 according to one embodiment of the invention. As shown in fig. 11, the motion control method 1100 includes steps S1110, S1120, S1130.
In step S1110, a parameter editing window related to a target device model, which is a robot model or a motion control component model, is displayed on the human-computer interaction interface.
Illustratively, a parameter editing window related to the target device model can be displayed on the human-computer interaction interface based on a motion control instruction input by the user for the target device model. For example, the user may click (double-click or single-click) the identification information of the target device model displayed in the editing window or the device window. In response to the user's click operation, a parameter editing window may pop up.
The content displayed in the parameter editing window may differ for different types of devices. Fig. 12 shows a schematic diagram of a parameter editing window according to one embodiment of the invention. Fig. 13 shows a schematic diagram of a parameter editing window according to another embodiment of the present invention. The motion parameter being edited in the parameter editing window shown in fig. 12 is an LVT parameter, which may be a motion parameter of the end effector of a robot or robot model. The motion parameter being edited in the parameter editing window shown in fig. 13 is a PVT parameter, which may be a motion parameter of any one axis of a motion control part or motion control part model.
The target device model may be any device model selected by the user. For example, the user may click the identification information of any device model in the editing window or the device window to select that device model as the target device model. Alternatively, the click may be a double click.
In step S1120, the motion parameters edited by the user in the parameter editing window are received.
Referring to fig. 12, the user may enter the data of one motion parameter in each row, including time data, coordinate data, jaw displacement data, etc. The coordinate data refers to the coordinates of a fixed point on the end effector, such as the coordinates of a certain center point of the gripper. The coordinate data of any one motion parameter indicates the position that the end effector should reach at the time indicated by that motion parameter (i.e., its time data). The time data of any one motion parameter indicates the time at which the fixed point on the end effector reaches the position indicated by that motion parameter (i.e., its coordinate data). Jaw displacement data refers to the distance the two jaws of the end effector are displaced laterally, and is optional. By way of example and not limitation, the end effector may have jaws that can open and close, i.e., be displaced in a lateral direction; in this case, the end effector may have jaw displacement data. The jaw displacement is represented by h (in mm) in the example shown in fig. 12.
Further, each motion parameter may also include parameter validity data and/or interpolation validity data. The parameter validity data of any one motion parameter indicates whether that motion parameter is valid. For example, in fig. 12, the first column of each motion parameter, i.e., the data in the "allow" column, is the parameter validity data: when it is "true", the motion parameter is valid and may be included in the motion trajectory of the corresponding device (robot or robot model); when it is "false", the motion parameter is invalid and is not included in the motion trajectory, i.e., it is ignored. The interpolation validity data of any one motion parameter indicates whether interpolation is performed between that motion parameter and the next one. For example, in fig. 12, the eighth column of each motion parameter, i.e., the data in the "interpolation mode" column, is the interpolation validity data: when it is "true", interpolation is performed between that motion parameter and the next; when it is "false", no interpolation is performed between them.
In the example shown in fig. 13, the form of the motion parameter differs from that of fig. 12. The motion parameters shown in fig. 13 include parameter validity data, time data, position data, and velocity data, where the position data is a rotation angle.
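For illustration, the two row formats described for figs. 12 and 13 might be modeled as follows; the field names follow the columns described above, but the dataclass form is an assumption.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class LVTPoint:                  # one row of fig. 12 (end-effector motion)
    allow: bool                  # parameter validity data ("allow" column)
    t: float                     # time data
    x: float                     # coordinate data of a fixed point
    y: float
    z: float
    h: Optional[float] = None    # optional jaw displacement (mm)
    interpolate: bool = True     # interpolation validity data

@dataclass
class PVTPoint:                  # one row of fig. 13 (single joint/axis)
    allow: bool                  # parameter validity data
    t: float                     # time data
    position: float              # rotation angle
    velocity: float

def trajectory(points):
    # Rows whose validity flag is "false" are ignored.
    return [p for p in points if p.allow]
```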
The user can edit one or more motion parameters in the parameter editing window and, as required, interpolate the edited motion parameters to generate a larger number of motion parameters. For any one data item of a motion parameter, a text box, list box, or other type of control may be used as its editing interface, and the user may set its value by entering, selecting, clicking, etc.
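As a sketch of the interpolation step, assuming simple linear interpolation between consecutive rows of the LVTPoint form above (the actual interpolation scheme is not specified in the text):

```python
def densify(points, steps=10):
    if not points:
        return []
    out = []
    for a, b in zip(points, points[1:]):
        out.append(a)
        if a.interpolate:        # interpolate only when the flag is "true"
            for i in range(1, steps):
                f = i / steps
                out.append(LVTPoint(
                    allow=True,
                    t=a.t + f * (b.t - a.t),
                    x=a.x + f * (b.x - a.x),
                    y=a.y + f * (b.y - a.y),
                    z=a.z + f * (b.z - a.z),
                ))
    out.append(points[-1])
    return out
```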
In step S1130, the motion parameters are sent to the corresponding device to be controlled that is associated with the target device model, to control the motion of that device, where the corresponding device to be controlled is a robot or a motion control unit that has established a connection with the control device.
As described above, the target device model may be associated with the device to be controlled in the manner described earlier. After the user edits the motion parameters, they can be sent to the corresponding device to be controlled. If the device to be controlled is a robot, the robot may move based on the motion parameters. If the device to be controlled is a motion control unit, the motion control unit may generate a PWM waveform based on the motion parameters and then drive the robot connected to it to move.
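The branch described in this step might be sketched as follows; `Robot` and `MotionController` are hypothetical stand-ins for the two kinds of connected devices, not classes from the actual software.

```python
class Robot:
    def move(self, params):
        pass                     # the robot executes the trajectory itself

class MotionController:
    def drive(self, params):
        pass                     # generate PWM waveforms and drive the robot

def send_motion_parameters(device, params):
    if isinstance(device, Robot):
        device.move(params)
    elif isinstance(device, MotionController):
        device.drive(params)     # PWM generation happens in the controller
```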
The motion control method provided by the embodiment of the invention can interact with a user and use the motion parameters edited by the user for the target device model to control the motion of the device to be controlled that is actually connected to the control device. This makes it possible to flexibly and conveniently control different devices to be controlled by means of different types of device models.
According to the embodiment of the present invention, before displaying the parameter editing window related to the target device model on the human-computer interaction interface (step S1110), the method 1100 may further include: establishing a current control scene; and activating the target device model in the current control scene based on the activation instruction input by the user.
Illustratively, establishing the current control scene may include: generating a scene file representing the current control scene. The manner of generating the scene file and establishing the control scene has been described above, and this embodiment may be understood with reference to that description; it is not repeated here.
According to an embodiment of the present invention, the human-computer interaction interface may include a robot window, and the method 1100 may further include: displaying the identification information of each device model in the first model set in the robot window; wherein the first model set comprises at least one device model, and the target device model belongs to the first model set.
The layout manner of the robot window and the content included in the robot window have already been described above with reference to fig. 3 and 5, and the present embodiment can be understood with reference to the above description, which is not repeated herein.
According to the embodiment of the present invention, the human-computer interaction interface may further include an editing window for editing the control scene, and activating the target device model in the current control scene based on the activation instruction input by the user may include: activating the target device model in response to the user's click operation on the identification information of the target device model in the robot window, or the user's drag operation of dragging the identification information of the target device model from the robot window to the editing window; and displaying the identification information of the target device model in the editing window; wherein the activation instruction is an instruction corresponding to the click operation or the drag operation.
The implementation of activating the target device model by inputting the activation instruction by the user has been described above, and the present embodiment may be understood with reference to the above description, which is not described herein again.
According to an embodiment of the present invention, establishing the current control scenario may include: generating a scene file representing a current control scene; prior to establishing the current control scenario, method 1100 may further include: and generating a project file, wherein the project file is used for managing at least one scene file of the control scene.
The content and the function of the project file have been described above, and this embodiment can be understood with reference to that description; it is not repeated here.
According to an embodiment of the present invention, the method 1100 may further include: grouping the project file to obtain at least one sub-project file corresponding one-to-one to at least one control scene.
Referring back to figs. 3 and 4, two sub-projects Untitled_0 and Untitled_1 are shown in the project window. The two sub-projects correspond to two sub-project files that manage the scene files "3.sce" and "11111.sce" of two control scenes, respectively. In this way, different control scenes can be managed in groups, which is convenient for users to view and operate.
For example, the operation of grouping the project files may be performed based on a grouping instruction input by a user, or may be performed automatically by the control device.
According to an embodiment of the present invention, the method 1100 may further include: receiving a parameter assignment instruction input by the user in the parameter editing window; and assigning the motion parameters to the specified device indicated by the parameter assignment instruction.
The motion parameters edited by the user can be sent to the corresponding device to be controlled associated with the target device model, and can also be assigned to other specified devices. In the case where the specified device is a device to be controlled that is actually connected to the control device, the assigned motion parameters may cause the specified device to produce actual motion. In the case where the specified device is a device model, the assigned motion parameters may be used to simulate motion without necessarily producing actual motion. In this way, the motion parameters edited by the user can be flexibly assigned to one or more editable devices, which facilitates synchronous control of different editable devices.
For example, the parameter editing window may include a device designation control, which is a text box control or a list control, and receiving the parameter assignment instruction input by the user in the parameter editing window may include: receiving identification information of the specified device that the user inputs in the device designation control; or receiving operation information of the user selecting the identification information of the specified device from the device designation control. The parameter assignment instruction comprises an instruction corresponding to the input or selection operation performed by the user on the device designation control.
Referring back to FIG. 12, a device designation control is illustrated. In fig. 12, the device designation control is indicated by a dashed box. The device designation control shown in fig. 12 is a list control. In the list control, identification information corresponding one-to-one to all of the editable devices may be displayed for selection by the user. Illustratively, the identification information of any device displayed in the list control may include the following information: the model of the corresponding device, and the scene name of the control scene to which the corresponding device belongs.
In the example shown in fig. 12, the currently selected option is "MRX-AS@3.sce", where MRX-AS denotes the model number of the robot model and 3.sce denotes that the robot model belongs to the control scene of "3.sce". This option therefore indicates that the motion parameters are assigned to the MRX-AS robot model in the control scene of "3.sce".
Alternatively, where the device designation control is a text box control, the user may enter the identification information of the specified device directly in the text box.
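For illustration, the "&lt;model&gt;@&lt;scene&gt;" identifier of fig. 12 could be parsed as follows; the "@" separator is taken from the example, while the helper itself is an assumption.

```python
def parse_device_id(device_id: str):
    model, scene = device_id.split("@", 1)
    return model, scene

model, scene = parse_device_id("MRX-AS@3.sce")
# model == "MRX-AS": the robot model
# scene == "3.sce":  the control scene to which the model belongs
```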
According to the embodiment of the invention, the designated device is one of editable devices, the editable device comprises all device models in the second model set and/or at least one device to be controlled which is connected with the control device, the second model set comprises at least one device model, and the target device model belongs to the second model set.
According to an embodiment of the invention, the second model set comprises all device models activated in at least one control scene.
The meanings of the editable device and the second model set and the contents contained in the editable device and the second model set have been described above, and the present embodiment can be understood with reference to the above description, which is not described herein again.
According to another aspect of the present invention, a user interaction method is provided. FIG. 14 shows a schematic flow diagram of a user interaction method 1400 in accordance with one embodiment of the present invention. As shown in FIG. 14, the user interaction method 1400 includes steps S1410, S1420, S1430.
In step S1410, a human-computer interaction interface is displayed. The human-computer interaction interface includes one or more of an engineering window, a robot window, a device window, and an editing window: the engineering window displays a file list related to the engineering files, the robot window displays the identification information of each device model in the first model set, the device window displays the identification information of the editable devices, and the editing window is used to edit the control scene. The editable devices comprise all device models in the second model set and/or at least one device to be controlled that has established a connection with the control device; the at least one device to be controlled comprises at least one robot and/or at least one motion control component; the first model set and the second model set each comprise at least one device model, and each device model is a robot model or a motion control component model.
The layout and the content of the windows on the human-computer interaction interface have been described above with reference to fig. 3-6, and will not be described herein again.
In step S1420, an instruction input by a user on the human-computer interaction interface is received.
The instructions input by the user on the human-computer interaction interface may include, but are not limited to, the above-mentioned activation instructions, association instructions, motion control instructions, and the like. The interaction between the user and the human-computer interaction interface can be realized by an interaction device of the control equipment or by an independent interaction device. Optionally, the interaction means may comprise an input device and an output device. The user can input instructions through the input device, and the control equipment can display the human-computer interaction interface and other related information through the output device for the user to view. The input device may include, but is not limited to, one or more of a mouse, a keyboard, and a touch screen. The output device may include, but is not limited to, a display. In one example, the interaction device includes a touch screen, which can simultaneously implement the functionality of the input device and the output device.
In step S1430, a corresponding operation is performed based on the instruction input by the user.
For example, upon receiving an activation instruction input by a user, the target device model may be activated accordingly. For another example, when a transmission instruction for transmitting the motion parameter input by the user is received, the edited motion parameter may be transmitted to the device to be controlled.
According to the user interaction method provided by the embodiment of the invention, the device models and/or the devices to be controlled that are actually connected to the control device can be displayed on the human-computer interaction interface, and the user can edit motion parameters and the like for them. This interaction mode makes it convenient for the user to manage and control the at least one device to be controlled, and to simulate and test the robot or the motion control component.
According to an embodiment of the present invention, performing a corresponding operation based on an instruction input by a user (step S1430) may include: establishing a current control scene based on a scene establishing instruction or a scene importing instruction input by a user; and providing an interface for editing the current control scene to the user in the editing window.
For example, the user may click the "file" menu command in the menu bar and select the "new scene file" control; in response to this operation, the control device may create a new scene file and provide, in the editing window, an interface for editing the control scene corresponding to that scene file. Illustratively, the user may also click the "file" menu command in the menu bar and select the "import scene file" control; in response, the control device may import an existing scene file selected by the user and provide, in the editing window, an interface for editing the control scene corresponding to that scene file.
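A hypothetical sketch of these two menu handlers follows; the function names and the in-memory scene structure are assumptions for illustration only.

```python
from pathlib import Path

scenes = {}

def new_scene(name: str) -> dict:
    scene = {"file": f"{name}.sce", "models": []}
    scenes[name] = scene          # the editing window would now show it
    return scene

def import_scene(path: str) -> dict:
    scene = {"file": path, "models": []}   # real code would parse the file
    scenes[Path(path).stem] = scene
    return scene
```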
According to an embodiment of the present invention, performing a corresponding operation based on an instruction input by a user (step S1430) may include: the target device models in the first set of models are activated in the current control scenario based on an activation instruction input by a user.
According to an embodiment of the present invention, activating a target device model in the first model set in the current control scene based on an activation instruction input by the user may include: activating the target device model in response to the user's click operation on the identification information of the target device model in the robot window, or the user's drag operation of dragging the identification information of the target device model from the robot window to the editing window; and displaying the identification information of the target device model in the editing window; wherein the activation instruction is an instruction corresponding to the click operation or the drag operation.
The implementation of activating the target device model by inputting the activation instruction by the user has been described above, and the present embodiment may be understood with reference to the above description, which is not described herein again.
According to an embodiment of the invention, the second model set may comprise all device models activated in at least one control scene.
According to an embodiment of the present invention, performing a corresponding operation based on an instruction input by a user (step S1430) may include: displaying a parameter editing window related to a specified device on the human-computer interaction interface based on a motion control instruction input by the user for the specified device among the editable devices; receiving the motion parameters edited by the user in the parameter editing window; and assigning the motion parameters to the specified device.
The manner of displaying the parameter editing window and the manner of interacting with the user have been described above, and the present embodiment may be understood with reference to the above description, which is not described herein again.
According to the embodiment of the invention, the parameter editing window can be displayed in the editing window. Referring back to figs. 12 and 13, when the parameter editing window pops up, the interface for editing the current control scene in the editing window may be replaced by the parameter editing window.
According to the embodiment of the present invention, displaying a parameter editing window related to a specified device on the human-computer interaction interface based on a motion control instruction input by the user for that specified device may include: in response to the user's click operation on the identification information of the specified device in the device window, displaying the parameter editing window on the human-computer interaction interface.
The implementation of the parameter editing window interactive with the user has been described above, and the present embodiment may be understood with reference to the above description, which is not described herein again.
According to an embodiment of the present invention, the file list may include identification information of at least one scene file respectively representing at least one control scene.
The information displayed in the project window may be considered a file list. The file list may include identification information of one or more types of files, such as identification information of a project file, of a motor motion file, and of a scene file. Referring back to fig. 4, the file list includes the identification information "1212.prj" of the project file 1212, the identification information "a.pvt" of the motor motion file a, and the identification information "3.sce" and "11111.sce" of the scene files 3 and 11111.
In accordance with another aspect of the invention, a method 1500 of user interaction is provided. FIG. 15 shows a schematic flow diagram of a user interaction method 1500 in accordance with one embodiment of the present invention. As shown in fig. 15, the user interaction method 1500 includes steps S1510, S1520, S1530, S1540.
In step S1510, a parameter editing window is displayed on the human-computer interaction interface.
For example, a parameter editing window related to any editable device can be displayed on the human-computer interaction interface based on a motion control instruction input by the user for that editable device. For example, the user may click (double-click or single-click) the identification information of the target device model displayed in the editing window or the device window. In response to the user's click operation, the parameter editing window corresponding to the target device model pops up. The user can edit motion parameters in the parameter editing window and assign them to any specified device.
Illustratively, the user may click the "file" menu command in the menu bar and select the "new robot motion file" or "import robot motion file" control; in response to this operation, the control apparatus may pop up a parameter editing window for editing the motion parameters of the end effector of a robot or robot model (see fig. 12).
For example, the user may click the "file" menu command in the menu bar and select the "new motor motion file" or "import motor motion file" control; in response to this operation, the control apparatus may pop up a parameter editing window for editing the motion parameters of a certain joint of a robot or robot model, or of a certain axis of a motion control part or motion control part model (see fig. 13).
In the case where the motion parameters shown in the parameter editing window are the motion parameters of the end effector of a robot or robot model, the motion parameters edited by the user may be stored in a robot motion file. In the case where the motion parameters shown in the parameter editing window are the motion parameters of a certain joint of a robot or robot model, or of a certain axis of a motion control part or motion control part model, the motion parameters edited by the user may be stored in a motor motion file. This storage scheme is not limited to the manner in which the parameter editing window is popped up.
In step S1520, the motion parameters edited by the user in the parameter editing window are received.
In step S1530, a parameter assignment instruction input by the user in the parameter editing window is received.
In step S1540, the motion parameter is assigned to the specified device indicated by the parameter assignment instruction.
The implementation of the user editing the motion parameters and assigning the motion parameters to the specified device has already been described above with reference to fig. 12 and 13, and the present embodiment may be understood with reference to the above description, which is not described here again.
According to the embodiment of the invention, the designated device is one of editable devices, the editable device comprises all device models in the second model set and/or at least one device to be controlled, which is connected with the control device, the at least one device to be controlled comprises at least one robot and/or at least one motion control component, the second model set comprises at least one device model, and each device model is a robot model or a motion control component model.
According to an embodiment of the present invention, the method 1500 may further include: establishing at least one control scene; and activating at least one device model in the at least one control scene based on the activation instruction input by the user, to obtain the second model set.
The implementation of establishing a control scenario and activating a device model has been described above, and the present embodiment may be understood with reference to the above description, which is not described herein again.
According to the embodiment of the present invention, the parameter editing window includes a device designation control, which is a text box control or a list control, and receiving the parameter assignment instruction input by the user in the parameter editing window (step S1530) may include: receiving identification information of the specified device that the user inputs in the device designation control; or receiving operation information of the user selecting the identification information of the specified device from the device designation control. The parameter assignment instruction comprises an instruction corresponding to the input or selection operation performed by the user on the device designation control.
In the case where the motion parameters are assigned to the specified device as a whole, the user only needs to indicate to the control device which device the motion parameters are assigned to. For example, if the motion parameters contained in a robot motion file need to be assigned to a robot or robot model, the user may specify the robot or robot model through the device designation control.
According to the embodiment of the invention, under the condition that the equipment designation control is the list control, the equipment designation control is used for displaying the identification information which is in one-to-one correspondence with all the equipment in the editable equipment for the user to select; the editable device comprises all device models in a second model set and/or at least one device to be controlled which is connected with the control device, the at least one device to be controlled comprises at least one robot and/or at least one motion control component, the second model set comprises at least one device model, and each device model is a robot model or a motion control component model.
According to an embodiment of the present invention, the identification information of any one of the editable devices may include the following information: the model of the corresponding device, and the scene name of the control scene to which the corresponding device belongs.
According to an embodiment of the present invention, the specified device is a motion control part or a motion control part model including at least one axis, and assigning the motion parameters to the specified device indicated by the parameter assignment instruction (step S1540) may include: assigning the motion parameters to the specified axis of the specified device indicated by the parameter assignment instruction.
According to the embodiment of the present invention, the parameter editing window includes a device designation control and an axis designation control, each of which is a text box control or a list control, and receiving the parameter assignment instruction input by the user in the parameter editing window may include a device designation operation and an axis designation operation. The device designation operation includes: receiving identification information of the specified device that the user inputs in the device designation control; or receiving operation information of the user selecting the identification information of the specified device from the device designation control. The axis designation operation includes: receiving identification information of the specified axis of the specified device that the user inputs in the axis designation control; or receiving operation information of the user selecting the identification information of the specified axis from the axis designation control. The parameter assignment instruction comprises the instructions corresponding to the device designation operation and the axis designation operation.
In the case where the given device is a motion control component or a model of a motion control component, the motion parameters may be assigned to any axis of the given device, at which point the user may indicate to the controlling device which axis of which device the motion parameters are assigned to. For example, if it is required to assign a motion parameter contained in the motor motion file to a certain axis of the motion control part or the motion control part model, the user may specify the motion control part or the motion control part model through the device specification control and specify an axis on the motion control part or the motion control part model through the axis specification control.
Referring back to fig. 13, a device designation control and an axis designation control are illustrated; both are list controls. In the device designation control, identification information corresponding one-to-one to all of the editable devices may be displayed for selection by the user. In the axis designation control, identification information corresponding one-to-one to all axes of the specified device may be displayed for selection by the user. For example, in the case where the specified device is a motion control part of model MRQ-M2305 (a kind of five-axis driver), the identifiers "CH1", "CH2", "CH3", "CH4", and "CH5" of its five axes may be displayed in the axis designation control.
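For illustration, axis designation for such a five-axis driver might be checked as follows; the function and its validation are assumptions, with only the CH1 to CH5 identifiers taken from the example.

```python
MRQ_M2305_AXES = ("CH1", "CH2", "CH3", "CH4", "CH5")

def assign_to_axis(device: str, axis: str, pvt_points) -> None:
    if axis not in MRQ_M2305_AXES:
        raise ValueError(f"{device} has no axis {axis}")
    # Real code would transfer the PVT rows to the selected axis here.
    print(f"assigned {len(pvt_points)} PVT rows to {device}:{axis}")
```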
According to the embodiment of the invention, under the condition that the equipment designation control is the list control, the equipment designation control is used for displaying the identification information which is in one-to-one correspondence with all the equipment in the editable equipment for the user to select; under the condition that the axis designation control is a list control, the axis designation control is used for displaying identification information corresponding to all axes of the designated equipment one by one for selection by a user; the editable device comprises all device models in a second model set and/or at least one device to be controlled which is connected with the control device, the at least one device to be controlled comprises at least one robot and/or at least one motion control component, the second model set comprises at least one device model, and each device model is a robot model or a motion control component model.
According to another aspect of the present invention, a control apparatus is provided. Fig. 16 shows a schematic block diagram of a control device 1600 according to an embodiment of the invention.
As shown in fig. 16, the control apparatus 1600 according to an embodiment of the present invention includes a connection module 1610, a selection module 1620, and an association module 1630. The respective modules may respectively perform the respective steps/functions of the device management method described above in connection with fig. 9. Only the main functions of the respective components of the control apparatus 1600 will be described below, and details that have been described above will be omitted.
The connection module 1610 is configured to establish a connection with at least one device to be controlled, wherein the at least one device to be controlled includes at least one robot and/or at least one motion control component.
The selection module 1620 is configured to activate a target device model based on an activation instruction input by a user, where the target device model is a robot model or a motion control component model.
The associating module 1630 is configured to establish an association relationship between the target device model and a corresponding device to be controlled in the at least one device to be controlled based on an associating instruction input by the user.
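The three-module structure might be sketched as follows; only the module split mirrors the description above, and the method bodies are illustrative stubs rather than the actual implementation.

```python
class ConnectionModule:
    def connect(self, devices):
        self.devices = list(devices)      # robots / motion control parts

class SelectionModule:
    def activate(self, model):
        self.target_model = model         # robot or motion control model

class AssociationModule:
    def associate(self, model, device):
        self.binding = (model, device)    # model <-> device to be controlled

class ControlApparatus1600:
    def __init__(self):
        self.connection = ConnectionModule()
        self.selection = SelectionModule()
        self.association = AssociationModule()
```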
Fig. 17 shows a schematic block diagram of a control device 1700 according to an embodiment of the invention. The control apparatus 1700 includes a storage (i.e., memory) 1710 and a processor 1720.
The storage 1710 stores computer program instructions for implementing the corresponding steps in the device management method 900 according to an embodiment of the invention.
The processor 1720 is configured to execute computer program instructions stored in the storage 1710 to perform the corresponding steps of the device management method 900 according to an embodiment of the present invention.
In one embodiment, the computer program instructions, when executed by processor 1720, are for performing the steps of: establishing a connection with at least one device to be controlled, wherein the at least one device to be controlled comprises at least one robot and/or at least one motion control component; activating a target device model based on an activation instruction input by a user, wherein the target device model is a robot model or a motion control component model; and establishing an association relation between the target equipment model and the corresponding equipment to be controlled in the at least one equipment to be controlled based on the association instruction input by the user.
Illustratively, the control device 1700 may further include a display for displaying the human-machine interaction interface described above.
Illustratively, the control apparatus 1700 may further include an input device for receiving an instruction input by a user.
Illustratively, the display and the input device may be implemented using the same touch screen.
Furthermore, according to still another aspect of the present invention, there is also provided a storage medium on which program instructions are stored, which, when executed by a computer or processor, cause the computer or processor to execute the respective steps of the above-described device management method of the embodiment of the present invention. The storage medium may include, for example, a storage component of a tablet computer, a hard disk of a personal computer, read-only memory (ROM), erasable programmable read-only memory (EPROM), portable compact disc read-only memory (CD-ROM), USB memory, or any combination of the above storage media. The computer-readable storage medium may be any combination of one or more computer-readable storage media.
A person skilled in the art can understand specific implementation schemes of the control device and the storage medium by reading the above description related to the device management method, and details are not described herein for brevity.
Although the illustrative embodiments have been described herein with reference to the accompanying drawings, it is to be understood that the foregoing illustrative embodiments are merely exemplary and are not intended to limit the scope of the invention thereto. Various changes and modifications may be effected therein by one of ordinary skill in the pertinent art without departing from the scope or spirit of the present invention. All such changes and modifications are intended to be included within the scope of the present invention as set forth in the appended claims.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described device embodiments are merely illustrative, and for example, the division of the units is only one logical functional division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another device, or some features may be omitted, or not executed.
In the description provided herein, numerous specific details are set forth. It is understood, however, that embodiments of the invention may be practiced without these specific details. In some instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.
Similarly, it should be appreciated that in the description of exemplary embodiments of the invention, various features of the invention are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the invention and aiding in the understanding of one or more of the various inventive aspects. However, the method of the present invention should not be construed to reflect the intent: that the invention as claimed requires more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single disclosed embodiment. Thus, the claims following the detailed description are hereby expressly incorporated into this detailed description, with each claim standing on its own as a separate embodiment of this invention.
It will be understood by those skilled in the art that all of the features disclosed in this specification (including any accompanying claims, abstract and drawings), and all of the processes or elements of any method or apparatus so disclosed, may be combined in any combination, except combinations where such features are mutually exclusive. Each feature disclosed in this specification (including any accompanying claims, abstract and drawings) may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise.
Furthermore, those skilled in the art will appreciate that while some embodiments described herein include some features included in other embodiments, rather than other features, combinations of features of different embodiments are meant to be within the scope of the invention and form different embodiments. For example, in the claims, any of the claimed embodiments may be used in any combination.
The various component embodiments of the invention may be implemented in hardware, or in software modules running on one or more processors, or in a combination thereof. Those skilled in the art will appreciate that a microprocessor or Digital Signal Processor (DSP) may be used in practice to implement some or all of the functions of some of the modules in a control device according to embodiments of the present invention. The present invention may also be embodied as apparatus programs (e.g., computer programs and computer program products) for performing a portion or all of the methods described herein. Such programs implementing the present invention may be stored on computer-readable media or may be in the form of one or more signals. Such a signal may be downloaded from an internet website or provided on a carrier signal or in any other form.
It should be noted that the above-mentioned embodiments illustrate rather than limit the invention, and that those skilled in the art will be able to design alternative embodiments without departing from the scope of the appended claims. In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word "comprising" does not exclude the presence of elements or steps not listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The invention may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In the unit claims enumerating several means, several of these means may be embodied by one and the same item of hardware. The usage of the words first, second and third, etcetera do not indicate any ordering. These words may be interpreted as names.
The above description is only for the specific embodiment of the present invention or the description thereof, and the protection scope of the present invention is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present invention, and the changes or substitutions should be covered within the protection scope of the present invention. The protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (13)

1. A method of device management, the method comprising:
establishing a connection with at least one device to be controlled, wherein the at least one device to be controlled comprises at least one robot and/or at least one motion control component;
activating a target device model based on an activation instruction input by a user, wherein the target device model is a robot model or a motion control component model, and the activation adds the target device model to a current control scene, changing it from a non-editable state to an editable state; and
establishing an association relationship between the target device model and a corresponding device to be controlled in the at least one device to be controlled based on an association instruction input by the user.
2. The method of claim 1, wherein the target device model is a robot model, the corresponding device to be controlled is a motion control component including at least one axis, and the establishing the association relationship between the target device model and the corresponding device to be controlled of the at least one device to be controlled based on the association instruction input by the user comprises:
for any joint of the target device model,
receiving a joint axis association instruction input by a user for indicating the corresponding axis of the corresponding device to be controlled that corresponds to the joint;
and associating the joint with the corresponding axis of the corresponding device to be controlled according to the joint axis association instruction.
3. The method of claim 2, wherein,
before the establishing of the association relationship between the target device model and the corresponding device to be controlled in the at least one device to be controlled based on the association instruction input by the user, the method further includes:
displaying a joint axis association control corresponding to any joint of the target equipment model on a human-computer interaction interface, wherein the joint axis association control is a text box control or a list control;
for any joint of the target device model, receiving a joint axis association instruction input by a user and used for indicating a corresponding axis of the corresponding device to be controlled, which corresponds to the joint, comprises the following steps:
for any joint of the target device model,
receiving identification information of the corresponding axis of the corresponding device to be controlled, which is input in a corresponding joint axis association control by the user; or
receiving a selection instruction of selecting the identification information of the corresponding axis of the corresponding device to be controlled from the corresponding joint axis association control by the user;
wherein, the joint axis association instruction corresponding to any joint of the target device model includes an instruction corresponding to an input or selection operation executed by the user for the joint axis association control corresponding to the joint.
4. The method of claim 1, wherein the method further comprises:
displaying identification information of at least one equipment model on a human-computer interaction interface;
wherein the target device model is one of the at least one device model.
5. The method of claim 4, wherein the human-machine interface comprises a robot window, and the displaying identification information of at least one device model on the human-machine interface comprises:
displaying identification information of the at least one equipment model in the robot window.
6. The method of claim 5, wherein the human-machine-interaction interface further comprises an editing window for providing the user with an interface for editing the current control scene, the activating a target device model based on the user-entered activation instruction comprising:
activating the target device model in response to a click operation of the user on the identification information of the target device model in the robot window or a drag operation of dragging the identification information of the target device model from the robot window to the editing window;
wherein the activation instruction comprises an instruction corresponding to the click operation or the drag operation.
7. The method of claim 1, wherein after the establishing a connection with at least one device to be controlled, the method further comprises:
generating a device to be controlled list containing the identification information of the at least one device to be controlled;
and displaying the list of the equipment to be controlled on a human-computer interaction interface.
8. The method of claim 7, wherein the human-machine interface comprises a device window, and the displaying the list of devices to be controlled on the human-machine interface comprises:
and displaying the list of the devices to be controlled in the device window.
9. The method of any of claims 1 to 8, wherein after the activation of the target device model based on the user-input activation instruction, the method further comprises:
and displaying the identification information of the target device model in an editing window and/or a device window of a human-computer interaction interface.
10. The method of claim 9, wherein before the establishing of the association relationship between the target device model and the corresponding device to be controlled in the at least one device to be controlled based on the association instruction input by the user, the method further comprises:
displaying a configuration window of the target device model on the human-computer interaction interface in response to a click operation by the user on the identification information of the target device model displayed in the editing window or the device window; and
displaying, in response to a click operation by the user on an association interface selection control in the configuration window, an association information setting window for inputting the association instruction, wherein the association interface selection control is a button control that controls the opening and closing of the association information setting window.
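Claim 10 thus describes a two-click chain: a click on the model's identification information opens its configuration window, and a click on the association interface selection control toggles the association information setting window. Sketched below with hypothetical window classes, for illustration only:

class AssociationInfoWindow:
    def __init__(self):
        self.visible = False

    def toggle(self):
        # The association interface selection control is a button that
        # opens and closes this window.
        self.visible = not self.visible

class ConfigurationWindow:
    def __init__(self, model):
        self.model = model
        self.association_window = AssociationInfoWindow()

    def on_association_button_clicked(self):
        self.association_window.toggle()

def on_model_identification_clicked(model):
    # Clicking the model's identification info in the editing or device
    # window opens its configuration window.
    return ConfigurationWindow(model)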
11. A control device, comprising:
a connection module configured to establish a connection with at least one device to be controlled, wherein the at least one device to be controlled comprises at least one robot and/or at least one motion control component;
a selection module configured to activate a target device model based on an activation instruction input by a user, wherein the target device model is a robot model or a motion control component model, and activating the target device model adds it to the current control scene and changes it from a non-editable state to an editable state; and
an association module configured to establish an association relationship between the target device model and the corresponding device to be controlled in the at least one device to be controlled based on an association instruction input by the user.
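The three claimed modules map naturally onto three cooperating components. The sketch below illustrates that structure only and is not the patented implementation; the method bodies are elided:

class ConnectionModule:
    def connect(self, endpoints):
        """Establish connections with the devices to be controlled
        (robots and/or motion control components)."""
        ...

class SelectionModule:
    def activate(self, model_id, scene):
        """Activate a target device model: add it to the current control
        scene and switch it from non-editable to editable."""
        ...

class AssociationModule:
    def associate(self, model, device, joint_axis_mapping):
        """Bind the activated model to its device using the user's
        joint-axis association instructions."""
        ...

class ControlDevice:
    """Control device composed of the three claimed modules."""
    def __init__(self):
        self.connection = ConnectionModule()
        self.selection = SelectionModule()
        self.association = AssociationModule()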
12. A control device comprising a processor and a memory, wherein the memory stores computer program instructions which, when executed by the processor, perform the device management method of any one of claims 1 to 10.
13. A storage medium having stored thereon program instructions which, when run, perform the device management method of any one of claims 1 to 10.
CN201910154718.9A 2019-02-28 2019-02-28 Device management method, control device, and storage medium Active CN110000775B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910154718.9A CN110000775B (en) 2019-02-28 2019-02-28 Device management method, control device, and storage medium

Publications (2)

Publication Number Publication Date
CN110000775A CN110000775A (en) 2019-07-12
CN110000775B true CN110000775B (en) 2021-09-21

Family

ID=67166166

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910154718.9A Active CN110000775B (en) 2019-02-28 2019-02-28 Device management method, control device, and storage medium

Country Status (1)

Country Link
CN (1) CN110000775B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110497406B (en) * 2019-08-14 2021-04-16 北京猎户星空科技有限公司 Equipment grouping method, device, equipment and medium
CN113492414B (en) * 2021-06-29 2023-05-26 江苏集萃华科智能装备科技有限公司 Web-based cross-platform man-machine interaction system for robot and implementation method

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106985150A (en) * 2017-03-21 2017-07-28 深圳泰坦创新科技有限公司 Method and apparatus for controlling robot motion
CN107220099A (en) * 2017-06-20 2017-09-29 华中科技大学 Robot visual virtual teaching system and method based on a three-dimensional model
CN107363835A (en) * 2017-08-06 2017-11-21 北京镁伽机器人科技有限公司 Configuration method, apparatus, medium and robot system for motion control components
CN107932504A (en) * 2017-11-13 2018-04-20 浙江工业大学 Mechanical arm operation control system based on PyQt

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104647377B (en) * 2014-12-30 2016-08-24 杭州新松机器人自动化有限公司 Industrial robot based on a cognitive system and control method therefor
CN105116785B (en) * 2015-06-26 2018-08-24 北京航空航天大学 Multi-platform general-purpose control system for teleoperated robots
CN105093949A (en) * 2015-07-13 2015-11-25 小米科技有限责任公司 Method and apparatus for controlling device
EP3979018A1 (en) * 2015-11-02 2022-04-06 The Johns Hopkins University Generation of robotic user interface responsive to connection of peripherals to robot
CA3008562A1 (en) * 2015-12-16 2017-06-22 Mbl Limited Robotic manipulation methods and systems for executing a domain-specific application in an instrumented environment with containers and electronic minimanipulation libraries
CN106426186B (en) * 2016-12-14 2019-02-12 国网江苏省电力公司常州供电公司 Autonomous operation method for live-line working robots based on multi-sensor information fusion
CN106940561B (en) * 2017-02-20 2020-11-24 上海大学 Mobile robot control system for container loading and unloading
JP7091609B2 (en) * 2017-04-14 2022-06-28 セイコーエプソン株式会社 Simulation equipment, robot control equipment and robots
CN108733278A (en) * 2018-05-29 2018-11-02 努比亚技术有限公司 Matching and friend-making method, mobile terminal and computer storage medium
CN109227542B (en) * 2018-10-16 2021-04-13 深圳市睿科智联科技有限公司 Cooperative robot construction method and system, mobile terminal and storage medium

Also Published As

Publication number Publication date
CN110000775A (en) 2019-07-12

Similar Documents

Publication Publication Date Title
CN109986559B (en) Parameter editing method and system, control device and storage medium
US11103997B2 (en) Software interface for authoring robotic manufacturing process
US10521522B2 (en) Robot simulator and file generation method for robot simulator
CN110000753B (en) User interaction method, control device and storage medium
CN109807898B (en) Motion control method, control device, and storage medium
CN109910004B (en) User interaction method, control device and storage medium
US7076322B2 (en) System and method for satisfying move constraints when performing a motion control sequence
CN109807896B (en) Motion control method and system, control device, and storage medium
CN109551485B Motion control method, device and system, and storage medium
CN203449306U (en) Master-slave-type double-industrial-robot coordination operation control system
KR20080051112A Systems and methods for generating 3D simulations
CN104457566A Spatial positioning method for a robot system without teaching
CN110000775B (en) Device management method, control device, and storage medium
CN109605378B (en) Method, device and system for processing motion parameters and storage medium
WO2019064917A1 (en) Robot simulator
CN104353926A Motion control method suitable for automatic welding of complex curved devices
CN109807897B (en) Motion control method and system, control device, and storage medium
CN115816459A (en) Robot control method, device, computer equipment, storage medium and product
CN105425728A (en) Multi-axis motion serial control teaching programming method
CN113733107B (en) Robot drag teaching method, robot and computer storage medium
CN110253538B (en) Motion data storage and robot control method, device, system and storage medium
Liu et al. DOREP: an educational experiment platform for robot control based on MATLAB and the real-time controller
CN110632895B (en) Management method of motion control component, control device and motion control system
JP2022524385A (en) Processes, systems, and non-volatile storage media
CN112486098A (en) Computer-aided machining system and computer-aided machining method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20191210

Address after: No. 1705, Building 8, Qianhai Excellence Financial Center (Phase I), Unit 2, Guiwan District, Nanshan Street, Qianhai Shenzhen-Hong Kong Cooperation Zone, Shenzhen, Guangdong Province

Applicant after: MGA Technology (Shenzhen) Co., Ltd.

Address before: Room 1109, Unit 1, Building 1, Courtyard 1, Longyuan Middle Street, Changping District, Beijing 102208

Applicant before: Beijing MGA Robot Technology Co., Ltd.

CB02 Change of applicant information

Address after: Room 1705, Building 8, Qianhai Excellence Financial Center (Phase I), Unit 2, Guiwan District, Nanshan Street, Qianhai Shenzhen-Hong Kong Cooperation Zone, Shenzhen 518052, Guangdong Province

Applicant after: Shenzhen MGA Technology Co., Ltd.

Address before: Room 1705, Building 8, Qianhai Excellence Financial Center (Phase I), Unit 2, Guiwan District, Nanshan Street, Qianhai Shenzhen-Hong Kong Cooperation Zone, Shenzhen, Guangdong 518000

Applicant before: Mga Technology (Shenzhen) Co.,Ltd.

GR01 Patent grant