CN110000753A - User interaction method, control device and storage medium - Google Patents

User interaction method, control device and storage medium

Info

Publication number
CN110000753A
Authority
CN
China
Prior art keywords
control
equipment
model
user
window
Prior art date
Legal status
Granted
Application number
CN201910154654.2A
Other languages
Chinese (zh)
Other versions
CN110000753B (en)
Inventor
王志彦
Current Assignee
MGA Technology Shenzhen Co Ltd
Original Assignee
Megarobo Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Megarobo Technologies Co Ltd
Priority to CN201910154654.2A
Publication of CN110000753A
Application granted
Publication of CN110000753B
Legal status: Active
Anticipated expiration

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J11/00 Manipulators not otherwise provided for
    • B25J11/0005 Manipulators having means for high-level communication with users, e.g. speech generator, face recognition means
    • B25J13/00 Controls for manipulators
    • B25J9/00 Programme-controlled manipulators
    • B25J9/0081 Programme-controlled manipulators with master teach-in means
    • B25J9/16 Programme controls
    • B25J9/1602 Programme controls characterised by the control system, structure, architecture
    • B25J9/161 Hardware, e.g. neural networks, fuzzy logic, interfaces, processor

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Human Computer Interaction (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • Fuzzy Systems (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Manipulator (AREA)
  • Numerical Control (AREA)

Abstract

Embodiments of the present invention provide a user interaction method, a control device and a storage medium. The method includes: displaying a human-computer interaction interface that includes one or more of a project window, a robot window, a device window and an editing window, where the project window is used to display a file list related to a project file, the robot window is used to display identification information of each device model in a first model set, the device window is used to display identification information of editable devices, and the editing window is used to edit a control scene; receiving instructions input by a user on the human-computer interaction interface; and executing operations corresponding to the instructions input by the user. A control device is provided that can simultaneously manage and control multiple devices to be controlled (including robots and/or motion control components). In addition, a user interaction method applied to the control device is provided, which makes it convenient for the user to manage and control at least one robot and/or motion control component.

Description

User interaction method, control device and storage medium
Technical field
The present invention relates to the field of motion control technology, and more specifically to a user interaction method, a control device and a storage medium.
Background art
In a motion control system based on robots (such as mechanical arms) and similar technologies, a device to be controlled (such as a robot or a drive controller) establishes a connection with a control device (such as a host computer), and the user can control the motion of the robot through the control device.
Most existing robots on the market use a teach pendant as the device for controlling the robot, with teach pendants and robots in one-to-one correspondence: each robot is equipped with its own teach pendant. If a user wants to combine various robots and motors into a more complex motion system, this one-to-one teach-pendant scheme becomes inconvenient. For example, if 10 robots are installed on a production line and need to perform the same motion synchronously, the traditional scheme requires programming each robot separately and then gradually adjusting them into relative synchronization, which usually takes more than ten days of debugging and considerably increases the cost of using the robots.
Summary of the invention
The present invention is proposed in view of the above problem. The present invention provides a user interaction method, a control device and a storage medium.
According to one aspect of the present invention, a user interaction method applied to a control device is provided. The method includes: displaying a human-computer interaction interface that includes one or more of a project window, a robot window, a device window and an editing window, where the project window is used to display a file list related to a project file, the robot window is used to display identification information of each device model in a first model set, the device window is used to display identification information of editable devices, and the editing window is used to edit a control scene; the editable devices include all device models in a second model set and/or at least one device to be controlled that has established a connection with the control device; the at least one device to be controlled includes at least one robot and/or at least one motion control component; the first model set and the second model set each include at least one device model, and each device model is a robot model or a motion control component model; receiving instructions input by the user on the human-computer interaction interface; and executing operations corresponding to the instructions input by the user.
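A minimal sketch of the interface structure and the instruction-to-operation dispatch described in this aspect is given below; the class, field and instruction names are illustrative assumptions, not taken from the patent.

```python
from dataclasses import dataclass, field

@dataclass
class Interface:
    project_files: list[str] = field(default_factory=list)    # project window
    first_model_set: list[str] = field(default_factory=list)  # robot window
    editable_devices: list[str] = field(default_factory=list) # device window
    current_scene: str = ""                                    # editing window

    def execute(self, instruction: str, arg: str) -> None:
        """Execute the operation corresponding to a user instruction."""
        if instruction in ("create_scene", "import_scene"):
            self.current_scene = arg
        elif instruction == "activate_model" and self.current_scene:
            self.editable_devices.append(arg)  # model joins the second model set

ui = Interface(first_model_set=["MRX-T4", "MRX-AS"])
ui.execute("create_scene", "3.sce")
ui.execute("activate_model", "MRX-T4")
```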
Illustratively, executing an operation corresponding to an instruction input by the user includes: establishing a current control scene based on a scene creation instruction or a scene import instruction input by the user; and providing the user with an interface in the editing window for editing the current control scene.
Illustratively, executing an operation corresponding to an instruction input by the user includes: activating a target device model in the first model set in the current control scene, based on an activation instruction input by the user.
Illustratively, activating the target device model in the first model set in the current control scene based on the activation instruction input by the user includes: activating the target device model in response to a click operation by the user on the identification information of the target device model in the robot window, or a drag operation that drags the identification information of the target device model from the robot window to the editing window; and displaying the identification information of the target device model in the editing window, where the activation instruction is the instruction corresponding to the click operation or the drag operation.
Illustratively, the second model set includes all device models that have been activated in at least one control scene.
Illustratively, executing an operation corresponding to an instruction input by the user includes: displaying a parameter editing window related to a designated device on the human-computer interaction interface, based on a motion control instruction input by the user that corresponds to the designated device among the editable devices; receiving motion parameters edited by the user in the parameter editing window; and assigning the motion parameters to the designated device.
Illustratively, the parameter editing window is displayed in the editing window.
Illustratively, displaying the parameter editing window related to the designated device on the human-computer interaction interface based on the motion control instruction input by the user that corresponds to the designated device among the editable devices includes: displaying the parameter editing window on the human-computer interaction interface in response to a click operation by the user on the identification information of the designated device in the device window.
Illustratively, the file list includes identification information of at least one scene file, each scene file being used to indicate one of at least one control scene.
According to another aspect of the present invention, a control device is provided, including: a display module for displaying a human-computer interaction interface, where the human-computer interaction interface includes one or more of a project window, a robot window, a device window and an editing window, the project window is used to display a file list related to a project file, the robot window is used to display identification information of each device model in a first model set, the device window is used to display identification information of editable devices, the editing window is used to edit a control scene, the editable devices include all device models in a second model set and/or at least one device to be controlled that has established a connection with the control device, the at least one device to be controlled includes at least one robot and/or at least one motion control component, the first model set and the second model set each include at least one device model, and each device model is a robot model or a motion control component model; a receiving module for receiving instructions input by a user on the human-computer interaction interface; and an execution module for executing operations corresponding to the instructions input by the user.
According to another aspect of the present invention, a control device is provided, including a display, a processor and a memory, where the display is used to display the human-computer interaction interface, the memory stores computer program instructions, and the computer program instructions, when run by the processor, are used to execute the above user interaction method.
According to another aspect of the present invention, a storage medium is provided, on which program instructions are stored, and the program instructions, when run, are used to execute the above user interaction method.
According to embodiments of the present invention, a control device is provided that can simultaneously manage and control multiple devices to be controlled (including robots and/or motion control components). In addition, a user interaction method applied to the control device is provided. This interaction mode makes it convenient for the user to manage and control at least one robot and/or motion control component, and also makes it convenient for the user to simulate and test robots and/or motion control components.
Brief description of the drawings
The above and other objects, features and advantages of the present invention will become more apparent from the more detailed description of embodiments of the present invention given in conjunction with the accompanying drawings. The drawings are provided for a further understanding of the embodiments of the present invention and form a part of the specification; together with the embodiments of the present invention, they serve to explain the present invention and are not to be construed as limiting the present invention. In the drawings, identical reference labels generally represent the same parts or steps.
Fig. 1 shows a schematic block diagram of a motion control system according to an embodiment of the present invention;
Fig. 2 shows a schematic block diagram of a motion control system according to another embodiment of the present invention;
Fig. 3 shows a schematic diagram of the human-computer interaction interface on a control device according to an embodiment of the present invention;
Fig. 4 shows a schematic diagram of a project window according to an embodiment of the present invention;
Fig. 5 shows a schematic diagram of a robot window according to an embodiment of the present invention;
Fig. 6 shows a schematic diagram of a device window according to an embodiment of the present invention;
Fig. 7 shows the configuration window of a motion control component according to an embodiment of the present invention and the configuration information displayed in a partial region of the configuration window;
Fig. 8 shows the configuration window of a motion control component according to another embodiment of the present invention and the configuration information displayed in a partial region of the configuration window;
Fig. 9 shows a schematic flow chart of a device management method according to an embodiment of the present invention;
Fig. 10 shows the configuration window of an MRX-T4 robot model according to an embodiment of the present invention and the association information setting window displayed in a partial region of the configuration window;
Fig. 11 shows a schematic flow chart of a motion control method according to an embodiment of the present invention;
Fig. 12 shows a schematic diagram of a parameter editing window according to an embodiment of the present invention;
Fig. 13 shows a schematic diagram of a parameter editing window according to another embodiment of the present invention;
Fig. 14 shows a schematic flow chart of a user interaction method according to an embodiment of the present invention;
Fig. 15 shows a schematic flow chart of a user interaction method according to an embodiment of the present invention;
Fig. 16 shows a schematic block diagram of a control device according to an embodiment of the present invention; and
Fig. 17 shows a schematic block diagram of a control device according to an embodiment of the present invention.
Detailed description of embodiments
In order to make the objects, technical solutions and advantages of the present invention more apparent, example embodiments according to the present invention are described in detail below with reference to the accompanying drawings. Obviously, the described embodiments are only some, rather than all, of the embodiments of the present invention, and it should be understood that the present invention is not limited by the example embodiments described herein. Based on the embodiments of the present invention described herein, all other embodiments obtained by those skilled in the art without creative labor shall fall within the scope of the present invention.
In order to at least partially solve the above problem, embodiments of the present invention provide a control device. The control device can, by means of hardware or a combination of hardware and software, simultaneously manage and control multiple devices to be controlled (including robots and/or motion control components). The control device described herein can be any suitable computing device having data processing capability and/or instruction execution capability, and can be implemented using a conventional computer. For example, the control device can be a host computer, a teach pendant, or the like. Illustratively, the control device described herein can provide a control interface, i.e. a human-computer interaction interface, to the user through a display, and the user can use the control interface to realize control of devices to be controlled and/or device models. In addition, illustratively, the control device described herein can also be equipped with control software for realizing the algorithm functions related to the control interface.
According to embodiments of the present invention, a single control device simultaneously manages and controls multiple devices to be controlled. Compared with the existing one-to-one scheme, the solution according to embodiments of the present invention has at least the following advantages: 1. it saves operation time, and the synchronism between different devices to be controlled is better; 2. the control of a robot or motion control component is easier, and precise control of each joint or axis can be realized; 3. since multiple robots can be controlled simultaneously, motion programming for multiple robots is easier; 4. since different robots can be controlled simultaneously, motion cooperation between different robots is easier to realize, making it easier to compose complex motion systems.
Although the control device provided by embodiments of the present invention has the ability to simultaneously manage and control multiple devices to be controlled, it can be understood that this is not a limitation of the present invention; the control device can equally be applied in a one-to-one control scheme between the control device and a device to be controlled.
For the above control device, embodiments of the present invention provide a user interaction method that can be applied to it. According to this user interaction method, device models and/or the devices to be controlled actually connected to the control device can be displayed on the human-computer interaction interface, and the user can perform operations such as motion parameter editing on these device models and/or devices to be controlled. This interaction mode makes it convenient for the user to manage and control at least one robot and/or motion control component, and makes it convenient for the user to simulate and test robots and/or motion control components.
The user interaction method and control device according to embodiments of the present invention are not limited to the control of robots. For example, a motion control component can be used to control and drive a motion component (such as a motor), and the motion component can be used either on a robot product or on a non-robot product. For example, a conveyor belt on an assembly line is controlled by many motors, and a motion control component can realize control of this kind of motor; further, the user interaction method and control device according to embodiments of the present invention can be used to manage and control the motion control components in the assembly-line field. In short, the user interaction method and control device according to embodiments of the present invention can be applied to the control field of any robot, of equipment that works in a manner similar to a robot, or of other equipment having motion components.
In the following description, the present invention is described in conjunction with the application environment of the user interaction method according to embodiments of the present invention, i.e. a motion control system. The motion control system described herein may include a control device and devices to be controlled. As described above, the control device may include, for example, a host computer or a teach pendant. The devices to be controlled may include, for example, robots and motion control components for driving robot motion. A robot can be of various structural types such as a four-axis robot or an H2-type planar gantry robot, and a motion control component can be various types of drive controllers such as a single-axis drive controller or a multi-axis drive controller. A motion component described herein can be a motor alone, a motor combined with a reducer, or a motor combined with a lead screw, etc.
A robot described herein can be an automatic device that performs work. A robot may include a robot body and an end effector (also called a tool). The body may include multiple joints, such as a base, an upper arm, a forearm, a wrist, etc. The end effector is, for example, a jaw/gripping part that can open and close, or another operating tool. The end effector is controlled by the control device to move along a corresponding route and complete predetermined actions. Specifically, for example, the end effector, manipulated by the control device, moves in three-dimensional space and performs related actions at specified positions, such as grasping, releasing or other actions.
Taking a motor combined with a reducer as an example: the motor-plus-reducer combination is the main motion execution unit of a mechanical arm (also called a manipulator, multi-axis robot, articulated robot, etc.). A mechanical arm mainly clamps a target object at an initial position and carries it along a predetermined route to a target position, and is suitable for mechanically automated operation in many industrial fields.
Mechanical arms currently on the market mainly include four-axis robots (having four joints) and six-axis robots (having six joints). They include a base, arms and an end clamping part; the number of joints on the arms determines the number of "axes" of the robot, and each joint is driven by the rotation of a motor to realize the motion of that joint.
A motion control system according to an embodiment of the present invention is described below with reference to Fig. 1, to help understand the exemplary usage context of the user interaction method according to embodiments of the present invention. Fig. 1 shows a schematic block diagram of a motion control system 100 according to an embodiment of the present invention. It may be noted that the user interaction method provided by embodiments of the present invention can be realized in other systems similar to the motion control system 100, and is not limited to the specific example shown in Fig. 1.
As shown in Fig. 1, the motion control system 100 may include a human-computer interaction unit (i.e. control device) 110, a Controller Area Network (CAN) data line 120, a motion control component 130 and a motor (i.e. motion component) 140. The motion control component 130 includes a CAN data transceiving unit 1302, a cache 1304, a solving unit 1306, a wave table 1308, a PWM waveform generator 1310 and a motor drive unit 1312.
When using the motion control component (e.g. drive controller) 130 to control the motor 140, the user can edit motion parameters through the human-computer interaction unit 110. The human-computer interaction unit 110 sends the motion parameters edited by the user to the motion control component 130 via the CAN data line 120; the motion control component 130 resolves the received motion parameters to obtain wave-table data, then generates a PWM waveform to drive the motion of the motor.
Specifically, the solving unit 1306 in the motion control component 130 can read the motion parameters, then perform processing such as interpolation on the read motion parameters using solution formulas, converting the motion parameters into wave-table data, which is stored in the wave table 1308.
The wave table 1308 can be implemented using DDR memory or the like, and is used to store wave-table data; the storage depth of the wave table 1308 can be set according to design needs.
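A minimal sketch, assuming linear interpolation, of how a solving unit might expand user-edited PVT points (position, velocity, time; defined later in this description) into wave-table samples at a fixed tick rate; the function and field names are illustrative assumptions, not taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class PVTPoint:
    p: float  # rotation angle, degrees
    v: float  # rotation speed, degrees/s
    t: float  # time at which this point should be reached, s

def solve_to_wavetable(points: list[PVTPoint], tick_hz: int = 1000) -> list[float]:
    """Interpolate a PVT trajectory into evenly spaced position samples."""
    samples: list[float] = []
    for prev, cur in zip(points, points[1:]):
        n = max(1, round((cur.t - prev.t) * tick_hz))
        for i in range(n):
            u = i / n
            # linear interpolation; a real solver might use cubic (Hermite)
            # interpolation so that the velocities v are also respected
            samples.append(prev.p + u * (cur.p - prev.p))
    samples.append(points[-1].p)
    return samples

trajectory = [PVTPoint(0, 0, 0.0), PVTPoint(90, 30, 1.0), PVTPoint(90, 0, 1.5)]
wavetable = solve_to_wavetable(trajectory)
```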
The PWM waveform generator 1310 is used to generate corresponding PWM waveform data according to the wave-table data stored in the wave table 1308. A PWM waveform, sometimes called a pulse waveform, has two states, high level and low level; in the motion control field, the purposes of controlling motor speed and solenoid valve switching state can be achieved by adjusting the duty ratio of the PWM waveform. The PWM waveform generator 1310 can be implemented using various existing PWM waveform generators, for example a PWM waveform generator realized using direct digital synthesis (DDS) signal generation technology, or a PWM waveform generator realized using digital counting technology.
Therefore, the actual motion parameters set by the user are converted by the solving unit 1306 into the wave-table data used to generate the PWM waveform; the PWM waveform generator 1310 generates corresponding PWM waveform data according to the wave-table data, and after processing such as digital-to-analog conversion, amplification and filtering, the result is sent to the motor drive unit 1312 to drive the motion of the motor 140.
The motor drive unit 1312 is used to drive the motion of the motor 140 according to the PWM waveform, and can be implemented using various kinds of motor driver chips.
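A rough sketch, under assumed semantics, of turning successive wave-table position samples into step counts per tick for a counter-based pulse stage of a stepper drive; the names and the steps-per-degree figure are illustrative assumptions, not from the patent.

```python
STEPS_PER_DEGREE = 1600 / 360  # e.g. a 1.8-degree stepper with 8x microstepping

def wavetable_to_step_counts(samples: list[float]) -> list[int]:
    """Convert position samples (degrees) into signed step counts per tick."""
    counts = []
    residual = 0.0
    for prev, cur in zip(samples, samples[1:]):
        exact = (cur - prev) * STEPS_PER_DEGREE + residual
        whole = int(exact)        # steps to emit during this tick
        residual = exact - whole  # carry the fraction so no steps are lost
        counts.append(whole)
    return counts

steps = wavetable_to_step_counts([0.0, 0.09, 0.27, 0.54])
```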
In the example depicted in Fig. 1, only one motion control component 130 connected to the control device is shown. An example in which one control device controls multiple motion control components is described below. Fig. 2 shows a schematic block diagram of a motion control system 200 according to another embodiment of the present invention. As shown in Fig. 2, the motion control system 200 includes a control device 201, a gateway 202, N motion control components 203, N motion components 204 and N sensors 205. N may be an integer greater than 1.
The control device 201 is used to realize human-computer interaction, and is for example a computer on which control software is installed. The user can set various parameters of robots or motion control components on the human-computer interaction interface provided by the control device 201, realizing control of the robots or motion control components.
The gateway 202 is used to realize data exchange between the control device 201 and the motion control components 203. The gateway 202 can be implemented, for example, using the above-described CAN data line 120.
The motion control component 203 is used to resolve the motion parameters sent by the control device 201, and to generate, based on the motion parameters edited by the user, a driving current that controls and drives the motion of the motion component 204, so as to drive the motion component 204 to move and in turn drive the corresponding movable body (such as a robot joint). Each motion control component 203 can be implemented with a structure similar to the above-described motion control component 130.
The motion component 204 can be a motor, and each motor can be used to drive one joint of a robot.
The sensor 205 is used to detect the motion position of the movable body in real time, and can be an encoder, an angle sensor, a photoelectric switch, a machine vision system, etc. Some robots may have no sensor.
As shown in Fig. 2, multiple motion control components 203 can be connected to the control device 201 at the same time, and the control device 201 can manage and control these motion control components 203 simultaneously.
The following briefly introduces the human-computer interaction scheme realized on the control device, to facilitate a better understanding of the present invention.
According to embodiments of the present invention, several device models can be pre-established, and these device models can be stored in a model library. Optionally, the model library can be stored on a server, to be downloaded and used by different control devices. Optionally, the model library can be established on the current control device by the user. A device model described herein can be a robot model or a motion control component model. On the human-computer interaction interface, the user can select and edit these device models, and can establish association relationships between device models and the devices to be controlled that are actually connected. After an association relationship is established, the motion of a device to be controlled can be controlled through the motion parameters set for the device model.
Establishing device models has many benefits. For example, the types of the motion control components of different robots may be the same; with this scheme there is no need to identify the actually connected robots one by one, as it is only necessary to identify the motion control component and associate it with a robot model. In addition, this scheme gives the whole control device good scalability. For example, if a new robot shape is developed, there is no need to develop a new control device; it is only necessary to add a new robot model to the model library used by the control device.
Therefore, on the human-computer interaction interface of the control device, in addition to the devices to be controlled that are connected to the control device, the device models in the model library can also be displayed for the user to control.
Fig. 3 shows a schematic diagram of the human-computer interaction interface on a control device according to an embodiment of the present invention. It may be noted that the layout of the human-computer interaction interface shown in Fig. 3 is only an example and not a limitation of the present invention.
As shown in Fig. 3, the human-computer interaction interface may include one or more of the following four display areas: a project window (project area), a robot window (robot area), an editing window (editing area) and a device window (device area). Each window can be regarded as a region whose position can be fixed or adjustable. For example, the user can re-lay-out the human-computer interaction interface by means such as dragging, adjusting the position of each window. In addition, any of the above four windows can be opened or closed as needed. The human-computer interaction interface can also include other windows; for example, at the bottom of the human-computer interaction interface shown in Fig. 3 there is also a record output window. In one example, the editing window can be fixed in position with the user having no right to close it, while the other three windows can be fixed in position but the user has the right to close them.
Project window: used to manage all current file resources, for example displaying the file list related to a project file. Fig. 4 shows a schematic diagram of a project window according to an embodiment of the present invention. As shown in Fig. 4, the project window displays the file list of the files managed by the project file "1212.prj" (i.e. file name 1212, extension prj). The file list of a project file may include the file names of all files belonging to the current project. The control software of the control device can generate files of various formats, including but not limited to: project files (extension prj), scene files (extension sce), robot motion files (extension mc), motor motion files (extension pvt), setting files (extension stp), etc.
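A small sketch of how a project window might filter the file names it displays by the extensions listed above; the dictionary layout and helper are illustrative assumptions, while the extensions are those named in the text.

```python
FILE_TYPES = {
    ".prj": "project file",
    ".sce": "scene file",
    ".mc":  "robot motion file",
    ".pvt": "motor motion file",
    ".stp": "setting file",
}

def project_file_list(project: dict) -> list[str]:
    """Return the file names a project window would display for one project."""
    return [f for f in project.get("files", [])
            if any(f.endswith(ext) for ext in FILE_TYPES)]

project = {"name": "1212.prj",
           "files": ["3.sce", "11111.sce", "arm1.mc", "axis2.pvt", "rig.stp"]}
print(project_file_list(project))
```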
In the description herein, a project file can be understood as a file generated when a project is established in the control software; it can be used to integrate and manage the other files under that project, such as scene files and motor motion files. A scene file can be understood as a file generated when a scene (i.e. a control scene described herein) is established in the control software. Device models can be added (which can be called "activated") in any scene, and devices to be controlled that are connected to the control device can also be added; operations such as parameter editing can be performed on the added device models or devices to be controlled, and when the devices in a scene are operated, the scene file can record the information of these operations. Illustratively, the following rule can be set in the control software: only after a scene file has been generated in the project window by means such as creation or import and has been woken up (selected) can the user select the required device model in the robot window for operations such as activation and subsequent parameter editing.
A robot motion file can be a file containing information related to the motion trajectory of a robot (such as the motion parameters of the robot). A motor motion file can be a file containing information related to the motion trajectory of a motor (such as motor motion parameters).
A setting file can be a file containing the configuration information of a device to be controlled or a device model. In the case where the device to be controlled is a motion control component or the device model is a motion control component model, the configuration information may include one or more of the following information of the motion control component or motion control component model: basic information, the motor parameters of each axis, motion trajectory planning parameters, trigger parameters, I/O interface parameters, etc. The basic information may include, for example, the model, version number and identifier (ID). The motor parameters may include, for example, motor size, step angle, maximum value and encoder information. The motion trajectory planning parameters may include the interpolation method used for interpolating the motion parameters, the duty ratio of the interpolation curve, the zero position, the emergency stop mode, etc. In the description herein, motion trajectory planning parameters are parameters used to plan the motion trajectory of a device to be controlled or device model, and their content differs from that of motion parameters.
In the case where the device to be controlled is a robot or the device model is a robot model, the configuration information may include one or more of the following information of the robot or robot model: basic information, the correspondence between each joint and the axes of a motion control component, the step length used for interpolating the motion parameters, the zero position, etc. The basic information may include, for example, the model, version number and identifier (ID).
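A compact sketch of the two kinds of configuration information described in the preceding two paragraphs, modeled as dataclasses; the field names are illustrative assumptions based on the listed items, not a format defined by the patent.

```python
from dataclasses import dataclass, field

@dataclass
class AxisMotorConfig:
    step_angle_deg: float
    max_speed: float
    encoder_lines: int

@dataclass
class MotionControllerConfig:                 # e.g. an MRQ-series drive controller
    model: str
    version: str
    device_id: int
    axes: dict[str, AxisMotorConfig] = field(default_factory=dict)  # "CH1".."CH5"
    interpolation: str = "linear"             # a trajectory planning parameter
    estop_mode: str = "decelerate"

@dataclass
class RobotConfig:                            # e.g. an MRX-T4 robot
    model: str
    version: str
    device_id: int
    joint_to_axis: dict[str, str] = field(default_factory=dict)  # joint -> axis
    zero_position: list[float] = field(default_factory=list)
```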
Robot window: used to display the identification information of the various pre-established device models. The device models displayed in the robot window may include robot models and/or motion control component models. The set composed of all device models displayed in the robot window can be called the first model set. Illustratively, the identification information of a device model may include the model number and/or icon of the device model. Fig. 5 shows a schematic diagram of a robot window according to an embodiment of the present invention. As shown in Fig. 5, the identification information of multiple device models is displayed in the robot window, where MRX indicates a robot-type model. In the case where the current scene file, or in other words the current control scene, is in an editable state (for example, the user has selected the current scene file), the user can select the device model that needs to be activated; that model is then activated in the current scene file (or the current control scene) and displayed in the editing window. In the description herein, activating a device model in a certain scene file has the same meaning as activating a device model in the control scene corresponding to that scene file. Referring back to Fig. 3, the editing window there displays multiple robot models in the scene "3.sce".
Device window: used to display the identification information of the editable devices. Illustratively, the editable devices may include all device models in the second model set and/or at least one device to be controlled that has established a connection with the control device. The second model set may include all device models that have been activated in at least one control scene. The at least one device to be controlled that has established a connection with the control device can be, for example, all the motion control components 203 connected to the control device 201 shown in Fig. 2.
After the user connects the control device and the devices to be controlled through a bus (for example, a CAN bus can connect up to 100 devices), the device window can display a to-be-controlled device list, which may include the identification information of all connected devices to be controlled. In the case where a device to be controlled is a motion control component, the to-be-controlled device list can also include the identification information of each axis of the motion control component. Fig. 6 shows a schematic diagram of a device window according to an embodiment of the present invention. As shown in Fig. 6, the device window displays the identification information of one motion control component of model MRQ-M2305 (a five-axis drive controller), and also displays the identification information of the five axes of the motion control component, denoted CH1-CH5 respectively.
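A schematic sketch of building the to-be-controlled device list after bus enumeration; `scan_can_bus` stands in for whatever enumeration the control software actually performs, and it, along with the tree layout, is an assumption for illustration only.

```python
def scan_can_bus() -> list[dict]:
    # Placeholder: a real implementation would query each CAN node ID and
    # read back its model string and axis count over the bus.
    return [{"model": "MRQ-M2305", "node_id": 3, "axes": 5}]

def device_list_entries() -> list[str]:
    """Produce one entry per device plus one indented entry per axis."""
    entries = []
    for dev in scan_can_bus():
        entries.append(dev["model"])
        entries += [f"  CH{i}" for i in range(1, dev["axes"] + 1)]
    return entries

print("\n".join(device_list_entries()))  # MRQ-M2305, then CH1 .. CH5
```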
When an activation instruction input by the user to activate a certain device model is received, the device window can display the identification information of the activated device model. Referring back to Fig. 3, the device window displays the identification information MRX-AS, MRX-AS, MRX-T4 of three robot models activated in the scene file "3.sce", and also displays the identification information MRX-AS, MRX-T4, MRX-DT, MRX-H2Z of four robot models activated in the scene file "11111.sce".
Editing window: used to edit a control scene. An interface for editing the current control scene can be provided to the user in the editing window. An operation performed in the editing window is equivalent to an operation on the scene file; for example, adding a device model in the editing window is equivalent to activating the device model in the current control scene, and the scene file can record the information related to the activation operation. When an activation instruction input by the user to activate a certain device model is received, the editing window can display the identification information of the activated device model.
It may be noted that the form of the identification information displayed in different windows may be the same or different. For example, the identification information of device models displayed in the robot window and the editing window can include both the model number and an icon, while the identification information of device models displayed in the device window may include only the model number.
The user can select an editable device in the editing window or the device window; the editable device can be a device to be controlled, or can be a device model. The user can then view and/or edit the items of information of the editable device, for example the above-described configuration information or motion parameters.
Fig. 7 shows the configuration window of a motion control component according to an embodiment of the present invention and the configuration information displayed in a partial region of the configuration window. The configuration information shown in Fig. 7 is the basic information of a motion control component of model MRQ-M2304. As shown in Fig. 7, the left region of the configuration window displays several controls: "Information", "CH1", "CH2", "CH3", "CH4", "Input", "Output", "SEN.", "System", etc. These controls can be button controls, and the user can select any of them by clicking. When the user selects the "Information" control in the left region, the basic information of the current editable device can be displayed in the right region. The user can view the basic information, and can also edit it when needed.
Fig. 8 shows the configuration window of a motion control component according to another embodiment of the present invention and the configuration information displayed in a partial region of the configuration window. As shown in Fig. 8, the left region of the configuration window displays several controls: "Information", "Axes1" (similar in function to the "CH1" control shown in Fig. 7), "Axes2" (similar to "CH2"), "Axes3" (similar to "CH3"), "Axes4" (similar to "CH4"), "Input/Output" (similar in function to the "Input" and "Output" controls shown in Fig. 7), "Sensor" (similar to the "SEN." control shown in Fig. 7), "System", etc. These controls can be button controls, and the user can select any of them by clicking. The configuration information shown in Fig. 8 is the motor parameters of the first axis (Axes1) of the motion control component of model MRQ-M2304. When the user selects the control corresponding to an axis, such as "Axes1" or "Axes2", in the left region, the motor parameters of that axis of the current editable device are displayed in the right region. The user can view the motor parameters, and can also edit them when needed. The display and editing of the remaining configuration information are similar and are not repeated here.
In the description herein, motion parameters are parameters used to indicate the motion trajectory of a device to be controlled or a device model. Those skilled in the art will understand that each motion trajectory can be composed of multiple points containing information such as position and time, each point being one motion parameter. A motion parameter sequence including multiple motion parameters can be regarded as one motion trajectory. In the case where the device to be controlled is a robot or the device model is a robot model, the motion parameters can be the motion parameters of the end effector of the robot or robot model, or the motion parameters of each joint in at least one joint of the robot or robot model. In the case where the device to be controlled is a motion control component or the device model is a motion control component model, the motion parameters can be the motion parameters of each axis in at least one axis of the motion control component or motion control component model.
The content of motion parameters can vary depending on the actual composition of the motion component (such as a motor). Illustratively, motion parameters may include one or more of position data, speed data and time data. The position data can be coordinate data in a spatial rectangular coordinate system, or a rotation angle, or other data related to position. In the case where the position data is coordinate data in a spatial rectangular coordinate system, the motion parameter can be called an LVT parameter. In the case where the position data is a rotation angle, the motion parameter can be called a PVT parameter. An LVT parameter may include coordinates in the spatial rectangular coordinate system (which can be called X, Y, Z) and the time for reaching the corresponding coordinate point (which can be called T). A PVT parameter may include a rotation angle (which can be called P), a rotation speed (which can be called V) and a rotation time (which can be called T).
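A minimal sketch of the two parameter forms named above; the PVTPoint class from the earlier wave-table sketch covers the joint/axis case, and this adds the Cartesian LVT case. The field names and units are illustrative assumptions; a trajectory is simply an ordered sequence of such points.

```python
from dataclasses import dataclass

@dataclass
class LVTPoint:
    x: float  # mm, spatial rectangular coordinates of the end effector
    y: float
    z: float
    t: float  # s, time at which this coordinate point should be reached

# one end-effector trajectory with three waypoints
path = [LVTPoint(0, 0, 100, 0.0),
        LVTPoint(50, 0, 100, 0.5),
        LVTPoint(50, 50, 100, 1.0)]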
According to one aspect of the present invention, a device management method is provided. Fig. 9 shows a schematic flow chart of a device management method 900 according to an embodiment of the present invention. As shown in Fig. 9, the device management method 900 includes steps S910, S920 and S930.
In step S910, a connection is established with at least one device to be controlled, where the at least one device to be controlled includes at least one robot and/or at least one motion control component. In the description herein, "at least one" is equivalent to "one or more".
As described above, the control device can establish a connection with the at least one device to be controlled through a bus (such as a CAN bus).
In step S920, a target device model is activated based on an activation instruction input by the user; the target device model is a robot model or a motion control component model.
Activating the target device model can be adding the target device model to the current control scene, so that the target device model changes from a non-editable state to an editable state. The user can input the activation instruction by dragging the identification information of the target device model from the robot window to the editing window, or by clicking the identification information of the target device model in the robot window. Optionally, the click can be a double-click. When the control device receives the input operation information of the above operations performed by the user (i.e. receives the activation instruction), it can set the target device model to the editable state. After activation, the user can further perform operations on the target device model on the human-computer interaction interface, such as configuration information editing, motion parameter editing and association with a device to be controlled.
In step S930, based on an association instruction input by the user, an association relationship is established between the target device model and the corresponding device to be controlled among the at least one device to be controlled.
A device model is a virtual device, while a device to be controlled is a real device connected to the control device. Meanwhile, a device model is the model of a robot or motion control component of a certain model number, and a device to be controlled itself can be a robot of a certain model number or a motion control component for driving a robot of a certain model number. Therefore, certain device models can be associated, in correspondence, with certain devices to be controlled. Of course, there may be device models that have no device to be controlled that can be associated with them, and there may be devices to be controlled that have no device model that can be associated with them.
When needed, a certain device model can be associated with a certain device to be controlled; in this way, the user can edit the motion parameters of the device model, and the motion parameters are sent to the associated device to be controlled, so as to actually drive the motion of the device to be controlled.
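A brief sketch of the association idea in this step: motion parameters are edited against a virtual model and forwarded to whichever real device the model is bound to. `send_over_can` is a stand-in name defined here as a stub, an assumption for illustration rather than an API from the patent.

```python
def send_over_can(node_id: int, payload) -> None:
    # Placeholder transport; a real system would frame this onto the CAN bus.
    print(f"-> node {node_id}: {payload}")

associations: dict[str, int] = {}  # device-model instance -> CAN node id

def associate(model_name: str, node_id: int) -> None:
    associations[model_name] = node_id

def assign_parameters(model_name: str, pvt_rows: list[tuple]) -> None:
    """Send edited motion parameters to the device associated with a model."""
    node = associations.get(model_name)
    if node is None:
        raise ValueError(f"{model_name} is not associated with a real device")
    send_over_can(node, pvt_rows)

associate("MRX-T4#1", node_id=3)
assign_parameters("MRX-T4#1", [(0, 0, 0.0), (90, 30, 1.0)])
```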
The target device model can be any device model, optionally selected by the user. For example, the user can click the identification information of any device model in the editing window or the device window to select that device model as the target device model. Optionally, the click can be a double-click.
In the description herein, "user" refers to a person who interacts with the control device, and can be the same person or different people. For example, the activation instruction of step S920 and the association instruction of step S930 can come from different users.
It should be noted that the steps of the device management method according to embodiments of the present invention are not limited to the execution order shown in Fig. 9 and can have other reasonable execution orders. For example, step S920 can be executed before step S910 or simultaneously with it.
According to the device management method of embodiments of the present invention, a pre-established device model can be associated with a corresponding device to be controlled, which makes it convenient for the user to control the actually connected device to be controlled through the device model. This method helps improve the management efficiency of the control device, because it means the control device does not need to identify the actually connected robots one by one; it only needs to identify the motion control component and associate it with the robot model. In addition, this method gives the whole control device good scalability.
According to embodiments of the present invention, method 900 can also include: displaying the identification information of at least one device model on the human-computer interaction interface, where the target device model is one of the at least one device model.
According to embodiments of the present invention, the human-computer interaction interface may include a robot window, and displaying the identification information of at least one device model on the human-computer interaction interface may include: displaying the identification information of the at least one device model in the robot window.
As described above, the human-computer interaction interface may include a robot window, in which the identification information of several device models can be displayed for the user to view. The identification information may include the model number and/or icon of the device model. The device models displayed in the robot window belong to the first model set, which can come from the model library. The layout of the robot window and the content it includes have been described above in conjunction with Figs. 3 and 5, and are not repeated here.
It should be understood that the robot window is only an example; the identification information of the device models included in the first model set can be displayed in other ways, for example by means of a list control in a menu bar.
Displaying at least one device model (i.e. the first model set) on the human-computer interaction interface makes it convenient for the user to select a needed device model at any time and perform operations such as motion parameter editing.
According to embodiments of the present invention, the human-computer interaction interface can also include an editing window, and activating the target device model based on the activation instruction input by the user (step S920) may include: activating the target device model in response to a click operation by the user on the identification information of the target device model in the robot window, or a drag operation that drags the identification information of the target device model from the robot window to the editing window, where the activation instruction includes the instruction corresponding to the click operation or the drag operation.
The layout of the editing window and the content it includes have been described above in conjunction with Fig. 3, and are not repeated here.
As described above, the user can click the identification information of any device model displayed in the robot window, or drag it to the editing window. When the user performs the above click operation or drag operation on the target device model, in response to these operations, the target device model is activated, i.e. set from a non-editable state to an editable state. In addition, optionally, in response to the above click operation or drag operation, the identification information of the target device model can be displayed in the editing window and/or the device window. The click operation can be a single click. In addition, the click operation can be executed under the current control scene, i.e. executed in the case where the current scene file is in an editable state. In this way, the target device model can be activated in the current control scene rather than in another control scene.
The device models displayed in the editing window and/or device window belong to the second model set. The second model set can be understood as the set of activated device models, while the first model set can be understood as the set of device models that can be activated; the second model set is a subset of the first model set.
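A simplified sketch, with assumed handler names, of how click and drag events in the robot window might both resolve to the same activation instruction described in step S920; none of these names come from the patent.

```python
first_model_set = {"MRX-T4", "MRX-AS", "MRX-H2Z", "MRX-DT"}
activated: set[str] = set()  # the second model set for the current scene

def activate(model_id: str, scene_editable: bool) -> bool:
    """Activate a model only while the current scene file is editable."""
    if not scene_editable or model_id not in first_model_set:
        return False
    activated.add(model_id)  # now shown in the editing and device windows
    return True

def on_robot_window_click(model_id: str) -> bool:
    return activate(model_id, scene_editable=True)

def on_drag_to_editing_window(model_id: str) -> bool:
    return activate(model_id, scene_editable=True)

on_robot_window_click("MRX-T4")
```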
According to embodiments of the present invention, after establishing a connection with the at least one device to be controlled, method 900 can also include: generating a to-be-controlled device list containing the identification information of the at least one device to be controlled; and displaying the to-be-controlled device list on the human-computer interaction interface.
Illustratively, the human-computer interaction interface may include a device window, and displaying the to-be-controlled device list on the human-computer interaction interface may include: displaying the to-be-controlled device list in the device window.
As described above, the human-computer interaction interface may include a device window, in which the identification information of several editable devices can be displayed for the user to view. The identification information may include the model number and/or icon of the editable device. The layout of the device window and the content it includes have been described above in conjunction with Figs. 3 and 6, and are not repeated here.
The editable devices may include all device models in the second model set and/or at least one device to be controlled that has established a connection with the control device. The device window can display the to-be-controlled device list; the identification information of the at least one device to be controlled that has established a connection with the control device can be displayed through the to-be-controlled device list. In addition, the device window can also display a device model list, and the identification information of the device models in the second model set can be displayed through the device model list.
It should be understood that the device window is only an example; the identification information of the editable devices can be displayed in other ways, for example by means of a list control in a menu bar.
According to embodiments of the present invention, after activating the target device model based on the activation instruction input by the user (step S920), method 900 can also include: displaying the identification information of the target device model in the editing window and/or device window of the human-computer interaction interface.
The embodiment of displaying the identification information of the target device model in the editing window and/or device window has been described above and is not repeated here.
According to embodiments of the present invention, before establishing, based on the association instruction input by the user, the association relationship between the target device model and the corresponding device to be controlled among the at least one device to be controlled (step S930), method 900 can also include: in response to a click operation by the user on the identification information of the target device model displayed in the editing window or the device window, displaying the configuration window of the target device model on the human-computer interaction interface; and in response to a click operation by the user on an association interface selection control on the configuration window, displaying an association information setting window for inputting the association instruction, where the association interface selection control is a button control for controlling the opening and closing of the association information setting window.
Optionally, the user's click on the identification information of the target device model displayed in the editing window or device window can be a double-click. Optionally, the user's click on the association interface selection control on the configuration window can be a single click.
Fig. 10 shows the configuration window of an MRX-T4 robot model according to an embodiment of the present invention and the association information setting window displayed in a partial region of the configuration window. Illustratively, the configuration window can be displayed in the editing window. When the user double-clicks the identification information of the MRX-T4 robot model displayed in the editing window or the device window, the configuration window shown in Fig. 10 can pop up.
As shown in Figure 10, the configuration window can be divided into two regions. The left region may be called the control display region and is used to display controls such as "information", "details", "option", "zero position" and "attribute"; the right region is used to display the window corresponding to the currently selected control. As shown in Figure 10, the currently selected control is the "option" control, so the association information setting window is displayed on the right; that is, the "option" control is the association-interface selection control. Each control in the control display region shown in Figure 10 is a button control. When the user clicks any button control once, the setting interface corresponding to that control is displayed in the right region of the configuration window.
According to an embodiment of the present invention, the target device model is a robot model and the corresponding device to be controlled is a motion control component containing at least one axis. Establishing, based on the association instruction input by the user, the association between the target device model and the corresponding device to be controlled among the at least one device to be controlled (step S930) may include: for any joint of the target device model, receiving a joint-axis association instruction input by the user that indicates the axis of the corresponding device to be controlled that corresponds to the joint; and associating the joint with the corresponding axis of the corresponding device to be controlled according to the joint-axis association instruction.
The association instruction involved in step S930 may include joint-axis association instructions corresponding to all joints of the robot model. When a robot model is associated with an actually connected robot, or a motion control component model is associated with an actually connected motion control component, the association can be made directly. When a robot model is associated with an actually connected motion control component, each joint of the robot model can optionally be associated one-to-one with an axis of the motion control component. To this end, an axis of the corresponding motion control component can be specified for each joint of the robot model; the instruction input by the user indicating which axis it is constitutes the joint-axis association instruction.
Illustratively, before the association between the target device model and the corresponding device to be controlled among the at least one device to be controlled is established based on the association instruction input by the user (step S930), the method 900 may further include: for any joint of the target device model, displaying a joint-axis association control corresponding to the joint on the human-computer interaction interface, the joint-axis association control being a text box control or a list control. For any joint of the target device model, receiving the joint-axis association instruction input by the user that indicates the axis of the corresponding device to be controlled corresponding to the joint may include: for any joint of the target device model, receiving the identification information of the corresponding axis of the device to be controlled that the user inputs in the corresponding joint-axis association control; or receiving a selection instruction by which the user selects the identification information of the corresponding axis of the corresponding device to be controlled from the corresponding joint-axis association control. The joint-axis association instruction corresponding to any joint of the target device model includes the instruction corresponding to the input or selection operation performed by the user on the joint-axis association control corresponding to that joint.
The target device model may be a robot model, the robot having multiple joints, and the corresponding device to be controlled may be a motion control component with multiple axes. In this case, a one-to-one correspondence can be established between each joint of the robot and each axis of the motion control component.
As described above, when the user double-clicks the identification information of the target device model in the editing window or the device window, the configuration window of the target device model can pop up, i.e., be displayed, on the human-computer interaction interface. The configuration window contains several controls, such as "information", "details", "option", "zero position" and "attribute". The control corresponding to "option" is the association-interface selection control and can be a button control. When the user clicks the "option" control, the association information setting window can pop up, i.e., be displayed, and the correspondence between the joints of the robot model and the axes of the motion control component can be configured in the association information setting window of the robot model.
Continuing to refer to Figure 10, in the association information setting window, the Basement (base) of the MRX-T4 robot model is set to correspond to axis 1 (i.e., CH1) of the motion control component device1, the Big Arm of the MRX-T4 robot model is set to correspond to axis 2 (i.e., CH2) of device1, the Little Arm (forearm) is set to axis 3 (CH3) of device1, the Wrist is set to axis 4 (CH4) of device1, and the Hand (manipulator) is set to axis 5 (CH5) of device1.
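By way of illustration only, the association of Figure 10 can be written down as a simple mapping table. The following Python sketch is not part of the original description; the name JOINT_AXIS_MAP is hypothetical, and the joint and axis labels merely restate the Figure 10 example.

# Hypothetical sketch: the joint-to-axis association of Figure 10,
# mapping each MRX-T4 joint name to an axis (channel) of device1.
JOINT_AXIS_MAP = {
    "Basement": "CH1",    # base
    "Big Arm": "CH2",     # upper arm
    "Little Arm": "CH3",  # forearm
    "Wrist": "CH4",
    "Hand": "CH5",        # manipulator
}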
In the above manner, each joint of the robot model can be associated with an axis of the corresponding motion control component. In one example, the user can edit or import the kinematic parameters of the end effector of the robot model, and the control device converts those kinematic parameters into kinematic parameters for each joint of the robot model. In another example, the user can directly edit or import the kinematic parameters of each joint of the robot model. These kinematic parameters can then be transmitted, according to the established association, to the corresponding axes of the motion control component, which in turn drives the motion of the robot it actually controls.
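A minimal sketch of this dispatch step follows, assuming a hypothetical send_to_axis transport callable and per-joint parameter lists keyed by joint name; the conversion from end-effector parameters to joint parameters is left abstract here.

from typing import Callable, Dict, List

def dispatch_joint_params(
    joint_params: Dict[str, List[float]],
    joint_axis_map: Dict[str, str],
    send_to_axis: Callable[[str, List[float]], None],  # hypothetical transport
) -> None:
    """Forward each joint's kinematic parameters to its associated axis,
    following an association such as JOINT_AXIS_MAP above."""
    for joint, params in joint_params.items():
        axis = joint_axis_map[joint]
        send_to_axis(axis, params)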
In the case where the device to be controlled is a motion control component connected to a physically existing robot, the kinematic parameters of the target robot model can be used to drive the motion of that physical robot. It can be appreciated that the physical robot has the same configuration as the target robot model, for example both being MRX-T4 robots.
According to another aspect of the present invention, a motion control method is provided. Figure 11 shows a schematic flow chart of a motion control method 1100 according to an embodiment of the present invention. As shown in Figure 11, the motion control method 1100 includes steps S1110, S1120 and S1130.
In step S1110, a parameter editing window related to a target device model is displayed on the human-computer interaction interface, the target device model being a robot model or a motion control component model.
Illustratively, the parameter editing window related to the target device model may be displayed on the human-computer interaction interface based on a motion control instruction, corresponding to the target device model, that is input by the user. For example, the user can click (double-click or single-click) the identification information of the target device model shown in the editing window or the device window; in response to the user's click operation, the parameter editing window can pop up.
The content displayed in the parameter editing window can differ for different types of devices. Figure 12 shows a schematic diagram of a parameter editing window according to an embodiment of the present invention. Figure 13 shows a schematic diagram of a parameter editing window according to another embodiment of the present invention. The kinematic parameters edited in the parameter editing window shown in Figure 12 are LVT parameters, which can be the kinematic parameters of the end effector of a robot or robot model. The kinematic parameters edited in the parameter editing window shown in Figure 13 are PVT parameters, which can be the kinematic parameters of any one axis of a motion control component or motion control component model.
The target device model can be any device model and can optionally be selected by the user. For example, the user can click the identification information of any device model in the editing window or the device window to select that device model as the target device model. Optionally, the click can be a double-click.
In step S1120, the kinematic parameters edited by the user in the parameter editing window are received.
Referring to Figure 12, the user can input the data of one kinematic parameter per row, including time data, coordinate data, jaw displacement data and so on. The coordinate data refers to the coordinates of a fixed point on the end effector, for example the coordinates of a certain center point of the gripper. The coordinate data of any kinematic parameter indicates the position that the end effector should reach at the time indicated by that kinematic parameter (i.e., its time data). The time data of any kinematic parameter indicates the time at which the fixed point on the end effector reaches the position indicated by that kinematic parameter (i.e., its coordinate data). The jaw displacement data refers to the distance by which the two jaws of the end effector move laterally, and is optional. By way of example and not limitation, the end effector may have jaws that can open and close, i.e., be displaced in the lateral direction; in this case the end effector can have jaw displacement data. In the example shown in Figure 12, the jaw displacement data is denoted by h (in mm).
In addition, illustratively, each kinematic parameter may also include parameter validity data and/or interpolation validity data. The parameter validity data of any kinematic parameter indicates whether this kinematic parameter is valid. For example, in Figure 12, the first column of each kinematic parameter, i.e., the "allow" column, is the parameter validity data: when this datum is "true", the kinematic parameter is valid and can be included in the motion trajectory of the corresponding device (robot or robot model); when it is "false", the kinematic parameter is invalid, is not included in the motion trajectory of the corresponding device, and is ignored. The interpolation validity data of any kinematic parameter indicates whether interpolation is performed between this kinematic parameter and the next one. For example, in Figure 12, the eighth column of each kinematic parameter, i.e., the "interpolation method" column, is the interpolation validity data: when this datum is "true", interpolation is performed between this kinematic parameter and the next; when it is "false", no interpolation is performed between this kinematic parameter and the next.
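The row structure just described can be sketched as a small record type. The field names below are hypothetical stand-ins for the Figure 12 columns, under the assumption of a three-coordinate end-effector position.

from dataclasses import dataclass
from typing import List, Optional

@dataclass
class LvtRow:
    """One row of the LVT table of Figure 12 (hypothetical field names)."""
    allow: bool          # parameter validity: include this row in the trajectory?
    t: float             # time data (s)
    x: float             # coordinate data: target position of the fixed point
    y: float
    z: float
    h: Optional[float]   # jaw displacement (mm); optional
    interpolate: bool    # interpolation validity: interpolate to the next row?

def trajectory(rows: List[LvtRow]) -> List[LvtRow]:
    """Keep only the rows whose validity datum is true; invalid rows are
    ignored, as described above."""
    return [r for r in rows if r.allow]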
In the example shown in Figure 13, the form of the kinematic parameters differs from Figure 12. The kinematic parameters shown in Figure 13 include parameter validity data, time data, position data and velocity data; the position data is a rotation angle.
The user can edit one or more kinematic parameters in the parameter editing window and, as needed, interpolate among the edited kinematic parameters to generate a larger number of kinematic parameters. For any datum in a kinematic parameter, a text box, list box or other type of control can serve as the editing interface for that datum; the user can set the value of the datum in that editing interface by input, selection, clicking and similar operations.
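As a minimal sketch of the interpolation step, the following assumes the PVT form of Figure 13, a row-level interpolation flag like the one in Figure 12, and simple linear interpolation; the actual interpolation method is not specified in the text.

from dataclasses import dataclass
from typing import List

@dataclass
class PvtRow:
    """One row of the PVT table of Figure 13 (hypothetical field names)."""
    allow: bool        # parameter validity data
    t: float           # time data (s)
    p: float           # position data: rotation angle
    v: float           # velocity data
    interpolate: bool  # assumed flag: interpolate towards the next row?

def interpolate_pvt(rows: List[PvtRow], steps: int = 10) -> List[PvtRow]:
    """Insert linearly interpolated points after each row whose
    interpolation flag is set, producing a larger number of parameters."""
    out: List[PvtRow] = []
    for a, b in zip(rows, rows[1:]):
        out.append(a)
        if a.interpolate:
            for k in range(1, steps):
                u = k / steps
                out.append(PvtRow(True, a.t + u * (b.t - a.t),
                                  a.p + u * (b.p - a.p),
                                  a.v + u * (b.v - a.v), False))
    if rows:
        out.append(rows[-1])
    return out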
In step S1130, the kinematic parameters are sent to the device to be controlled that is associated with the target device model, so as to control the motion of that device; the corresponding device to be controlled is a robot or motion control component that has established a connection with the control device.
As described above, the target device model can be associated with a device to be controlled; the manner of association can refer to the description above. After the user edits the kinematic parameters, the kinematic parameters can be sent to the corresponding device to be controlled. In the case where the corresponding device to be controlled is a robot, the robot can move based on the kinematic parameters. In the case where the corresponding device to be controlled is a motion control component, the motion control component can generate a PWM waveform based on the kinematic parameters and thereby drive the motion of the robot connected to it.
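A minimal sketch of step S1130 under stated assumptions: a hypothetical registry mapping model identifiers to associated device identifiers, and connection objects exposing a send() method; none of these names come from the original description.

def send_kinematic_params(model_id, rows, associations, connections):
    """Send the edited parameters to the device associated with model_id.

    `associations` maps model identifier -> device identifier, and
    `connections` maps device identifier -> an object with send();
    both are hypothetical structures for illustration."""
    device_id = associations[model_id]
    connections[device_id].send([r for r in rows if r.allow])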
The motion control method according to the embodiments of the present invention can interact with the user and use the kinematic parameters that the user edits for a target device model to control the motion of the device actually connected to the control device. This approach makes it possible to control different devices flexibly and conveniently through device models of different types.
According to an embodiment of the present invention, before the parameter editing window related to the target device model is displayed on the human-computer interaction interface (step S1110), the method 1100 may further include: establishing a current control scene; and activating the target device model in the current control scene based on an activation instruction input by the user.
Illustratively, establishing the current control scene may include generating a scene file for representing the current control scene. The manner of establishing a control scene by generating a scene file has been described above; the present embodiment can be understood with reference to that description and is not repeated here.
According to an embodiment of the present invention, the human-computer interaction interface may include a robot window, and the method 1100 may further include: displaying, in the robot window, the identification information of each device model in the first model set, wherein the first model set includes at least one device model and the target device model belongs to the first model set.
The layout of the robot window and its contents have been described above in conjunction with Fig. 3 and Fig. 5; the present embodiment can be understood with reference to that description and is not repeated here.
According to an embodiment of the present invention, the human-computer interaction interface may also include an editing window for editing the control scene, and activating the target device model in the current control scene based on the activation instruction input by the user may include: activating the target device model in response to a click operation by the user on the identification information of the target device model in the robot window, or a drag operation dragging the identification information of the target device model from the robot window into the editing window; and displaying the identification information of the target device model in the editing window, wherein the activation instruction is the instruction corresponding to the click operation or the drag operation.
The implementation in which the user inputs an activation instruction to activate the target device model has been described above; the present embodiment can be understood with reference to that description and is not repeated here.
According to an embodiment of the present invention, establishing the current control scene may include generating a scene file for representing the current control scene. Before the current control scene is established, the method 1100 may further include generating a project file, the project file being used to manage the scene files of at least one control scene.
The contents of the project file and their roles have been described above; the present embodiment can be understood with reference to that description and is not repeated here.
According to an embodiment of the present invention, the method 1100 may further include: grouping the project file to obtain at least one sub-project file in one-to-one correspondence with the at least one control scene.
Referring back to Fig. 3 and Fig. 4, it can be seen that two sub-projects, Untitled_0 and Untitled_1, are shown in the project window. These two sub-projects correspond to two sub-project files, which respectively manage the scene files "3.sce" and "11111.sce" of two control scenes. This approach enables grouped management of different control scenes, which is convenient for the user to view and operate.
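Illustratively, this grouping can be pictured with two small record types mirroring the project window of Fig. 4; the type names are hypothetical.

from dataclasses import dataclass, field
from typing import List

@dataclass
class SubProject:
    """A sub-project managing the scene file of one control scene."""
    name: str        # e.g. "Untitled_0"
    scene_file: str  # e.g. "3.sce"

@dataclass
class Project:
    """A project file grouping sub-projects, one per control scene."""
    name: str
    sub_projects: List[SubProject] = field(default_factory=list)

project = Project("1212.prj", [
    SubProject("Untitled_0", "3.sce"),
    SubProject("Untitled_1", "11111.sce"),
])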
Illustratively, the operation of grouping the project file can be performed based on a grouping instruction input by the user, or performed automatically by the control device.
According to an embodiment of the present invention, the method 1100 may further include: receiving a parameter distribution instruction input by the user in the parameter editing window; and distributing the kinematic parameters to the designated device indicated by the parameter distribution instruction.
The kinematic parameters edited by the user can be sent to the device to be controlled associated with the target device model, and can also be distributed to other designated devices. In the case where the designated device is a device to be controlled actually connected to the control device, the distributed kinematic parameters can control the designated device to produce actual motion. In the case where the designated device is a device model, the distributed kinematic parameters can be used to simulate motion and do not necessarily produce actual motion. In this way, the kinematic parameters edited by the user can be flexibly distributed to one or more editable devices, which facilitates synchronized control of different editable devices.
Illustratively, the parameter editing window may include a device-specifying control, the device-specifying control being a text box control or a list control. Receiving the parameter distribution instruction input by the user in the parameter editing window may include: receiving the identification information of the designated device that the user inputs in the device-specifying control; or receiving operation information by which the user selects the identification information of the designated device from the device-specifying control. The parameter distribution instruction includes the instruction corresponding to the input or selection operation performed by the user on the device-specifying control.
Referring back to Figure 12, a device-specifying control is shown and has been marked with a dashed box. The device-specifying control shown in Figure 12 is a list control. The list control can display identification information in one-to-one correspondence with all of the editable devices for the user to select. Illustratively, the identification information of any device shown in the list control may include the following information: the model of the corresponding device and the scene name of the control scene to which the corresponding device belongs.
In the example shown in Figure 12, the currently selected option is "MRX-AS@3.sce", where MRX-AS indicates the model of a robot model and 3.sce indicates that the robot model belongs to the control scene "3.sce". This option therefore indicates that the kinematic parameters are distributed to the MRX-AS robot model in the control scene "3.sce".
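A minimal sketch of how such an identifier could be decomposed, assuming the "model@scene" convention visible in Figure 12; the helper name is hypothetical.

def parse_device_id(identifier: str):
    """Split an identifier such as 'MRX-AS@3.sce' into (model, scene name)."""
    model, _, scene = identifier.partition("@")
    return model, scene

assert parse_device_id("MRX-AS@3.sce") == ("MRX-AS", "3.sce")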
Optionally, in the case where the device-specifying control is a text box control, the user can directly input the identification information of the designated device in the text box.
According to an embodiment of the present invention, the designated device is one of the editable devices, the editable devices include all device models in the second model set and/or at least one device to be controlled that has established a connection with the control device, the second model set includes at least one device model, and the target device model belongs to the second model set.
According to an embodiment of the present invention, the second model set includes all device models activated in at least one control scene.
The meanings of the editable devices and the second model set and their contents have been described above; the present embodiment can be understood with reference to that description and is not repeated here.
According to another aspect of the present invention, a user interaction method is provided. Figure 14 shows a schematic flow chart of a user interaction method 1400 according to an embodiment of the present invention. As shown in Figure 14, the user interaction method 1400 includes steps S1410, S1420 and S1430.
In step S1410, a human-computer interaction interface is displayed, the human-computer interaction interface including one or more of a project window, a robot window, a device window and an editing window. The project window is used to display a file list related to a project file; the robot window is used to display the identification information of each device model in a first model set; the device window is used to display the identification information of editable devices; and the editing window is used to edit a control scene. The editable devices include all device models in a second model set and/or at least one device to be controlled that has established a connection with the control device; the at least one device to be controlled includes at least one robot and/or at least one motion control component; the first model set and the second model set each include at least one device model; and each device model is a robot model or a motion control component model.
The layouts of the various windows on the human-computer interaction interface and their contents have been described above in conjunction with Figs. 3 to 6 and are not repeated here.
In step S1420, an instruction input by the user on the human-computer interaction interface is received.
The instructions input by the user on the human-computer interaction interface can include, but are not limited to, the activation instruction, association instruction and motion control instruction described above. The interaction between the user and the human-computer interaction interface can be realized by an interaction device of the control device or by an independent interaction device. Optionally, the interaction device may include an input apparatus and an output apparatus: the user can input instructions through the input apparatus, and the control device can display the human-computer interaction interface and other relevant information through the output apparatus for the user to view. The input apparatus can include, but is not limited to, one or more of a mouse, a keyboard and a touch screen; the output apparatus can include, but is not limited to, a display. In one example, the interaction device includes a touch screen, which can simultaneously realize the functions of the input apparatus and the output apparatus.
In step S1430, an operation corresponding to the instruction input by the user is executed.
For example, when an activation instruction input by the user is received, the corresponding target device model can be activated. As another example, when a transmission instruction input by the user for transmitting kinematic parameters is received, the edited kinematic parameters can be sent to the device to be controlled.
According to the user interaction method of the embodiments of the present invention, device models and/or the devices to be controlled actually connected to the control device can be displayed on the human-computer interaction interface, and the user can perform operations such as kinematic parameter editing for these device models and/or devices to be controlled. This interaction mode facilitates the user's management and control of at least one robot and/or motion control component, and facilitates the user's simulation and testing of robots and/or motion control components.
According to an embodiment of the present invention, executing the operation corresponding to the instruction input by the user (step S1430) may include: establishing a current control scene based on a scene establishment instruction or scene import instruction input by the user; and providing the user, in the editing window, with an interface for editing the current control scene.
Illustratively, the user can click the "File" menu command in the menu bar and select the "New scene file" control from it; in response to these operations, the control device can create a scene file and can provide, in the editing window, an interface for editing the control scene corresponding to that scene file. Illustratively, the user can also click the "File" menu command in the menu bar and select the "Import scene file" control from it; in response to these operations, the control device can import an existing scene file selected by the user and can provide, in the editing window, an interface for editing the control scene corresponding to that scene file.
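A minimal sketch of these two menu handlers follows; the SceneFile and EditingWindow types are hypothetical stand-ins, since the text does not specify a UI toolkit.

from dataclasses import dataclass

@dataclass
class SceneFile:
    """Hypothetical scene file representing one control scene."""
    name: str

    @classmethod
    def load(cls, path: str) -> "SceneFile":
        # Assumption: the scene name is simply the file name on disk.
        return cls(name=path.rsplit("/", 1)[-1])

class EditingWindow:
    """Hypothetical stand-in for the editing window."""
    def open(self, scene: SceneFile) -> None:
        print(f"editing the control scene of {scene.name}")

def on_new_scene_file(win: EditingWindow) -> SceneFile:
    """Handler for File -> New scene file: create a scene file and
    provide an interface for editing its control scene."""
    scene = SceneFile("Untitled.sce")
    win.open(scene)
    return scene

def on_import_scene_file(win: EditingWindow, path: str) -> SceneFile:
    """Handler for File -> Import scene file: import an existing file."""
    scene = SceneFile.load(path)
    win.open(scene)
    return scene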
According to an embodiment of the present invention, executing the operation corresponding to the instruction input by the user (step S1430) may include: activating, in a current control scene, a target device model in the first model set based on an activation instruction input by the user.
According to an embodiment of the present invention, activating the target device model in the first model set in the current control scene based on the activation instruction input by the user may include: activating the target device model in response to a click operation by the user on the identification information of the target device model in the robot window, or a drag operation dragging the identification information of the target device model from the robot window into the editing window; and displaying the identification information of the target device model in the editing window, wherein the activation instruction is the instruction corresponding to the click operation or the drag operation.
The implementation in which the user inputs an activation instruction to activate the target device model has been described above; the present embodiment can be understood with reference to that description and is not repeated here.
According to an embodiment of the present invention, the second model set may include all device models activated in the at least one control scene.
According to an embodiment of the present invention, executing the operation corresponding to the instruction input by the user (step S1430) may include: displaying, on the human-computer interaction interface, a parameter editing window related to a designated device among the editable devices, based on a motion control instruction input by the user that corresponds to the designated device; receiving the kinematic parameters edited by the user in the parameter editing window; and distributing the kinematic parameters to the designated device.
The manner of displaying the parameter editing window and the manner in which the user interacts with it have been described above; the present embodiment can be understood with reference to that description and is not repeated here.
According to an embodiment of the present invention, the parameter editing window can be displayed in the editing window. Referring back to Figure 12 and Figure 13, when the parameter editing window pops up, it can replace the interface for editing the current control scene as the window shown in the editing window.
According to an embodiment of the present invention, displaying, on the human-computer interaction interface, the parameter editing window related to the designated device based on the motion control instruction input by the user that corresponds to the designated device among the editable devices may include: displaying the parameter editing window on the human-computer interaction interface in response to a click operation by the user on the identification information of the designated device in the device window.
The implementation of interacting with the user to display the parameter editing window has been described above; the present embodiment can be understood with reference to that description and is not repeated here.
According to an embodiment of the present invention, the file list may include the identification information of at least one scene file respectively used to represent the at least one control scene.
The information displayed in the project window can be regarded as a file list. The file list may include the identification information of one or more types of files, such as the identification information of project files, the identification information of motor motion files and the identification information of scene files. Referring back to Fig. 4, the file list includes the identification information "1212.prj" of project file 1212, the identification information "a.pvt" of motor motion file a, and the identification information "3.sce" and "11111.sce" of scene files 3 and 11111.
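As a small illustration, the file types in such a list can be told apart by extension; the mapping below simply restates the Fig. 4 examples and is an assumption, not a specification of the software.

import pathlib

# Hypothetical extension-to-type mapping inferred from the Fig. 4 examples.
FILE_TYPES = {
    ".prj": "project file",
    ".pvt": "motor motion file",
    ".sce": "scene file",
}

def classify(file_name: str) -> str:
    return FILE_TYPES.get(pathlib.Path(file_name).suffix, "unknown")

assert classify("1212.prj") == "project file"
assert classify("a.pvt") == "motor motion file"
assert classify("3.sce") == "scene file"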
According to another aspect of the present invention, a user interaction method 1500 is provided. Figure 15 shows a schematic flow chart of the user interaction method 1500 according to an embodiment of the present invention. As shown in Figure 15, the user interaction method 1500 includes steps S1510, S1520, S1530 and S1540.
In step S1510, a parameter editing window is displayed on the human-computer interaction interface.
Illustratively, the parameter editing window related to any editable device may be displayed on the human-computer interaction interface based on a motion control instruction, corresponding to that editable device, that is input by the user. For example, the user can click (double-click or single-click) the identification information of the target device model shown in the editing window or the device window; in response to the user's click operation, the parameter editing window corresponding to that target device model can pop up. The user can edit kinematic parameters in the parameter editing window and distribute the kinematic parameters to any designated device.
Illustratively, the user can click the "File" menu command in the menu bar and select the "New robot motion file" or "Import robot motion file" control from it; in response to the user's operation, the control device can pop up a parameter editing window for editing the kinematic parameters of the end effector of a robot or robot model (see Figure 12).
Illustratively, the user can click the "File" menu command in the menu bar and select the "New motor motion file" or "Import motor motion file" control from it; in response to the user's operation, the control device can pop up a parameter editing window for editing the kinematic parameters of a certain joint of a robot or robot model, or the kinematic parameters of a certain axis of a motion control component or motion control component model (see Figure 13).
In the case where the kinematic parameters displayed in the parameter editing window are the kinematic parameters of the end effector of a robot or robot model, the kinematic parameters edited by the user can be stored in a robot motion file. In the case where the kinematic parameters displayed in the parameter editing window are the kinematic parameters of a certain joint of a robot or robot model, or of a certain axis of a motion control component or motion control component model, the kinematic parameters edited by the user can be stored in a motor motion file. This storage scheme is not restricted by the pop-up manner of the parameter editing window.
In step S1520, the kinematic parameters edited by the user in the parameter editing window are received.
In step S1530, a parameter distribution instruction input by the user in the parameter editing window is received.
In step S1540, the kinematic parameters are distributed to the designated device indicated by the parameter distribution instruction.
The implementations in which the user edits kinematic parameters and distributes them to a designated device have been described above in conjunction with Figure 12 and Figure 13; the present embodiment can be understood with reference to that description and is not repeated here.
According to an embodiment of the present invention, the designated device is one of the editable devices, the editable devices include all device models in the second model set and/or at least one device to be controlled that has established a connection with the control device, the at least one device to be controlled includes at least one robot and/or at least one motion control component, the second model set includes at least one device model, and each device model is a robot model or a motion control component model.
According to an embodiment of the present invention, the method 1500 may further include: establishing at least one control scene; and activating at least one device model in the at least one control scene based on an activation instruction input by the user, so as to obtain the second model set.
The implementations of establishing a control scene and activating a device model have been described above; the present embodiment can be understood with reference to that description and is not repeated here.
According to an embodiment of the present invention, the parameter editing window includes a device-specifying control, the device-specifying control being a text box control or a list control, and receiving the parameter distribution instruction input by the user in the parameter editing window (step S1530) may include: receiving the identification information of the designated device that the user inputs in the device-specifying control; or receiving operation information by which the user selects the identification information of the designated device from the device-specifying control. The parameter distribution instruction includes the instruction corresponding to the input or selection operation performed by the user on the device-specifying control.
In the case where the kinematic parameters are distributed to the designated device as a whole, the user only needs to indicate to the control device which device the kinematic parameters are distributed to. For example, if the kinematic parameters contained in a robot motion file need to be distributed to a robot or robot model, the user specifies that robot or robot model through the device-specifying control.
According to an embodiment of the present invention, in the case where the device-specifying control is a list control, the device-specifying control is used to display identification information in one-to-one correspondence with all of the editable devices for the user to select; the editable devices include all device models in the second model set and/or at least one device to be controlled that has established a connection with the control device, the at least one device to be controlled includes at least one robot and/or at least one motion control component, the second model set includes at least one device model, and each device model is a robot model or a motion control component model.
According to an embodiment of the present invention, the identification information of any device among the editable devices may include the following information: the model of the corresponding device and the scene name of the control scene to which the corresponding device belongs.
According to an embodiment of the present invention, the designated device is a motion control component or motion control component model containing at least one axis, and distributing the kinematic parameters to the designated device indicated by the parameter distribution instruction (step S1540) may include: distributing the kinematic parameters to the designated axis of the designated device indicated by the parameter distribution instruction.
According to an embodiment of the present invention, the parameter editing window includes a device-specifying control and an axis-specifying control, each being a text box control or a list control, and receiving the parameter distribution instruction input by the user in the parameter editing window may include a device-specifying operation and an axis-specifying operation. The device-specifying operation includes: receiving the identification information of the designated device that the user inputs in the device-specifying control; or receiving operation information by which the user selects the identification information of the designated device from the device-specifying control. The axis-specifying operation includes: receiving the identification information of the designated axis of the designated device that the user inputs in the axis-specifying control; or receiving operation information by which the user selects the identification information of the designated axis of the designated device from the axis-specifying control. The parameter distribution instruction includes the instructions corresponding to the device-specifying operation and the axis-specifying operation.
In the case where the designated device is a motion control component or motion control component model, the kinematic parameters can be distributed to any one axis of the designated device; in this case the user can indicate to the control device which axis of which device the kinematic parameters are distributed to. For example, if the kinematic parameters contained in a motor motion file need to be distributed to a certain axis of a motion control component or motion control component model, the user can specify the motion control component or motion control component model through the device-specifying control and specify an axis on it through the axis-specifying control.
Referring back to Figure 13, a device-specifying control and an axis-specifying control are shown; both controls shown in Figure 13 are list controls. The device-specifying control can display identification information in one-to-one correspondence with all of the editable devices for the user to select. The axis-specifying control can display identification information in one-to-one correspondence with all axes of the designated device for the user to select. For example, in the case where the designated device is a motion control component of model MRQ-M2305 (a five-axis drive control device), the identifiers "CH1", "CH2", "CH3", "CH4" and "CH5" of the five axes can be shown in the axis-specifying control.
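A minimal sketch of this per-axis distribution, assuming a hypothetical nested registry keyed first by device identifier and then by axis identifier; "device1" and the rows argument are illustrative only.

from typing import Dict, List

# Hypothetical registry: device identifier -> axis identifier -> parameter rows.
axis_params: Dict[str, Dict[str, List]] = {
    "device1": {ch: [] for ch in ("CH1", "CH2", "CH3", "CH4", "CH5")},
}

def distribute_to_axis(device_id: str, axis_id: str, rows: List) -> None:
    """Assign edited PVT rows to the designated axis of the designated
    device, as chosen through the device- and axis-specifying controls."""
    axis_params[device_id][axis_id] = rows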
According to an embodiment of the present invention, in the case where the device-specifying control is a list control, it is used to display identification information in one-to-one correspondence with all of the editable devices for the user to select; in the case where the axis-specifying control is a list control, it is used to display identification information in one-to-one correspondence with all axes of the designated device for the user to select. The editable devices include all device models in the second model set and/or at least one device to be controlled that has established a connection with the control device; the at least one device to be controlled includes at least one robot and/or at least one motion control component; the second model set includes at least one device model; and each device model is a robot model or a motion control component model.
According to another aspect of the present invention, a control device is provided. Figure 16 shows a schematic block diagram of a control device 1600 according to an embodiment of the present invention.
As shown in Figure 16, the control device 1600 according to an embodiment of the present invention includes a display module 1610, a receiving module 1620 and an execution module 1630. These modules can respectively execute the steps/functions of the user interaction method described above in conjunction with Figure 14. Only the main functions of the components of the control device 1600 are described below; details already described above are omitted.
The display module 1610 is used to display the human-computer interaction interface, which includes one or more of a project window, a robot window, a device window and an editing window. The project window is used to display a file list related to a project file; the robot window is used to display the identification information of each device model in the first model set; the device window is used to display the identification information of the editable devices; and the editing window is used to edit a control scene. The editable devices include all device models in the second model set and/or at least one device to be controlled that has established a connection with the control device; the at least one device to be controlled includes at least one robot and/or at least one motion control component; the first model set and the second model set each include at least one device model; and each device model is a robot model or a motion control component model.
The receiving module 1620 is used to receive the instructions input by the user on the human-computer interaction interface.
The execution module 1630 is used to execute the operation corresponding to the instruction input by the user.
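Illustratively, the three-module decomposition can be sketched as follows; the method names are hypothetical and mirror only the block diagram of Figure 16, not any concrete implementation.

class ControlDevice:
    """Hypothetical sketch of control device 1600 and its three modules."""

    def display(self) -> None:
        """Display module 1610: render the human-computer interaction
        interface (project, robot, device and editing windows)."""
        ...

    def receive(self) -> str:
        """Receiving module 1620: return the next instruction input by
        the user on the human-computer interaction interface."""
        ...

    def execute(self, instruction: str) -> None:
        """Execution module 1630: perform the operation corresponding
        to the received instruction."""
        ...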
Figure 17 shows a schematic block diagram of a control device 1700 according to an embodiment of the present invention. The control device 1700 includes a display 1710, a storage device (i.e., memory) 1720 and a processor 1730.
The display 1710 is used to display the above-described human-computer interaction interface.
The storage device 1720 stores the computer program instructions for implementing the corresponding steps of the user interaction method 1400 according to an embodiment of the present invention.
The processor 1730 is used to run the computer program instructions stored in the storage device 1720, so as to execute the corresponding steps of the user interaction method 1400 according to an embodiment of the present invention.
Illustratively, the control device 1700 may also include an input apparatus for receiving the instructions input by the user.
Illustratively, the aforementioned display and input apparatus can be realized by the same touch screen.
In one embodiment, the computer program instructions, when run by the processor 1730, are used to execute the following steps: displaying a human-computer interaction interface, the human-computer interaction interface including one or more of a project window, a robot window, a device window and an editing window, wherein the project window is used to display a file list related to a project file, the robot window is used to display the identification information of each device model in a first model set, the device window is used to display the identification information of editable devices, and the editing window is used to edit a control scene, the editable devices including all device models in a second model set and/or at least one device to be controlled that has established a connection with the control device, the at least one device to be controlled including at least one robot and/or at least one motion control component, the first model set and the second model set each including at least one device model, and each device model being a robot model or a motion control component model; receiving an instruction input by the user on the human-computer interaction interface; and executing an operation corresponding to the instruction input by the user.
In addition, according to another aspect of the present invention, a storage medium is also provided, on which program instructions are stored; when the program instructions are run by a computer or processor, they cause the computer or processor to execute the corresponding steps of the above-described user interaction method of the embodiments of the present invention. The storage medium may include, for example, a storage component of a tablet computer, a hard disk of a personal computer, read-only memory (ROM), erasable programmable read-only memory (EPROM), portable compact disc read-only memory (CD-ROM), USB memory, or any combination of the above storage media. The computer-readable storage medium can be any combination of one or more computer-readable storage media.
By reading the above description of the user interaction method, a person of ordinary skill in the art can understand the specific implementations of the above-described control device and storage medium; for brevity, they are not repeated here.
Although the example embodiments have been described here with reference to the accompanying drawings, it should be understood that the above example embodiments are only exemplary and are not intended to limit the scope of the present invention thereto. A person of ordinary skill in the art can make various changes and modifications therein without departing from the scope and spirit of the present invention. All such changes and modifications are intended to be included within the scope of the present invention as claimed in the appended claims.
A person of ordinary skill in the art may realize that the units and algorithm steps of the examples described in conjunction with the embodiments disclosed herein can be implemented in electronic hardware, or in a combination of computer software and electronic hardware. Whether these functions are executed in hardware or software depends on the specific application and design constraints of the technical solution. A skilled person may use different methods to implement the described functions for each specific application, but such implementation should not be considered to go beyond the scope of the present invention.
In the several embodiments provided in this application, it should be understood that the disclosed device and method can be implemented in other ways. For example, the device embodiments described above are merely illustrative; for instance, the division of the units is only a division by logical function, and other division manners are possible in actual implementation; for instance, multiple units or components may be combined or integrated into another device, or some features may be ignored or not executed.
Numerous specific details are set forth in the specification provided here. However, it is to be understood that the embodiments of the present invention can be practiced without these specific details. In some instances, well-known methods, structures and techniques have not been shown in detail so as not to obscure the understanding of this specification.
Similarly, it should be understood that, in order to streamline the present invention and to aid in understanding one or more of the various inventive aspects, in the description of the exemplary embodiments of the present invention the features of the present invention are sometimes grouped together into a single embodiment, figure, or description thereof. However, the method of the invention should not be construed as reflecting an intention that the claimed invention requires more features than are expressly recited in each claim. Rather, as the corresponding claims reflect, the inventive point lies in that fewer than all features of a single disclosed embodiment can be used to solve the corresponding technical problem. Therefore, the claims following the specific embodiments are hereby expressly incorporated into the specific embodiments, with each claim standing on its own as a separate embodiment of the present invention.
It will be understood by those skilled in the art that, except where such features are mutually exclusive, all features disclosed in this specification (including the accompanying claims, abstract and drawings) and all processes or units of any method or device so disclosed may be combined in any combination. Unless expressly stated otherwise, each feature disclosed in this specification (including the accompanying claims, abstract and drawings) may be replaced by an alternative feature serving the same, equivalent or similar purpose.
In addition, those skilled in the art will appreciate that, although some embodiments described herein include certain features included in other embodiments rather than other features, combinations of features of different embodiments are meant to be within the scope of the present invention and form different embodiments. For example, in the claims, any one of the claimed embodiments can be used in any combination.
The various component embodiments of the present invention may be implemented in hardware, or in software modules running on one or more processors, or in a combination thereof. Those skilled in the art will understand that a microprocessor or digital signal processor (DSP) may be used in practice to implement some or all of the functions of some modules in the control device according to the embodiments of the present invention. The present invention may also be implemented as an apparatus program (for example, a computer program and a computer program product) for executing part or all of the method described herein. Such a program implementing the present invention may be stored on a computer-readable medium, or may be in the form of one or more signals; such signals may be downloaded from an Internet website, provided on a carrier signal, or provided in any other form.
It should be noted that the above-described embodiments illustrate rather than limit the invention, and those skilled in the art can design alternative embodiments without departing from the scope of the appended claims. In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word "comprising" does not exclude the presence of elements or steps not listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The present invention can be implemented by means of hardware comprising several distinct elements and by means of a suitably programmed computer. In a unit claim enumerating several devices, several of these devices can be embodied by one and the same item of hardware. The use of the words first, second and third does not indicate any ordering; these words can be interpreted as names.
The above is merely the specific embodiments of the present invention or descriptions of the specific embodiments, and the protection scope of the present invention is not limited thereto. Any change or replacement that can be easily conceived by a person skilled in the art within the technical scope disclosed by the present invention shall be covered by the protection scope of the present invention. The protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (12)

1. A user interaction method applied to a control device, the method comprising:
displaying a human-computer interaction interface, the human-computer interaction interface comprising one or more of a project window, a robot window, a device window and an editing window, wherein the project window is used to display a file list related to a project file, the robot window is used to display the identification information of each device model in a first model set, the device window is used to display the identification information of editable devices, and the editing window is used to edit a control scene, the editable devices comprising all device models in a second model set and/or at least one device to be controlled that has established a connection with the control device, the at least one device to be controlled comprising at least one robot and/or at least one motion control component, the first model set and the second model set each comprising at least one device model, and each device model being a robot model or a motion control component model;
receiving an instruction input by a user on the human-computer interaction interface; and
executing an operation corresponding to the instruction input by the user.
2. The method of claim 1, wherein executing the operation corresponding to the instruction input by the user comprises:
establishing a current control scene based on a scene establishment instruction or scene import instruction input by the user; and
providing the user, in the editing window, with an interface for editing the current control scene.
3. The method of claim 1, wherein executing the operation corresponding to the instruction input by the user comprises:
activating, in a current control scene, a target device model in the first model set based on an activation instruction input by the user.
4. The method of claim 3, wherein activating the target device model in the first model set in the current control scene based on the activation instruction input by the user comprises:
activating the target device model in response to a click operation by the user on the identification information of the target device model in the robot window, or a drag operation dragging the identification information of the target device model from the robot window into the editing window; and
displaying the identification information of the target device model in the editing window;
wherein the activation instruction is the instruction corresponding to the click operation or the drag operation.
5. The method of any one of claims 1 to 4, wherein the second model set comprises all device models activated in the at least one control scene.
6. The method of any one of claims 1 to 4, wherein executing the operation corresponding to the instruction input by the user comprises:
displaying, on the human-computer interaction interface, a parameter editing window related to a designated device among the editable devices, based on a motion control instruction input by the user that corresponds to the designated device;
receiving the kinematic parameters edited by the user in the parameter editing window; and
distributing the kinematic parameters to the designated device.
7. The method of claim 6, wherein the parameter editing window is displayed in the editing window.
8. The method of claim 6, wherein displaying, on the human-computer interaction interface, the parameter editing window related to the designated device based on the motion control instruction input by the user that corresponds to the designated device among the editable devices comprises:
displaying the parameter editing window on the human-computer interaction interface in response to a click operation by the user on the identification information of the designated device in the device window.
9. The method of claim 1, wherein the file list comprises the identification information of at least one scene file respectively used to represent the at least one control scene.
10. A control device, comprising:
a display module for displaying a human-computer interaction interface, the human-computer interaction interface comprising one or more of a project window, a robot window, a device window and an editing window, wherein the project window is used to display a file list related to a project file, the robot window is used to display the identification information of each device model in a first model set, the device window is used to display the identification information of editable devices, and the editing window is used to edit a control scene, the editable devices comprising all device models in a second model set and/or at least one device to be controlled that has established a connection with the control device, the at least one device to be controlled comprising at least one robot and/or at least one motion control component, the first model set and the second model set each comprising at least one device model, and each device model being a robot model or a motion control component model;
a receiving module for receiving an instruction input by a user on the human-computer interaction interface; and
an execution module for executing an operation corresponding to the instruction input by the user.
11. a kind of control equipment, including display, processor and memory, wherein the display is for showing human-computer interaction Interface is stored with computer program instructions in the memory, and the computer program instructions are used when being run by the processor In execution user interaction approach as described in any one of claim 1 to 9.
12. A storage medium having program instructions stored thereon, the program instructions being used, when run, to perform the user interaction method of any one of claims 1 to 9.
CN201910154654.2A 2019-02-28 2019-02-28 User interaction method, control device and storage medium Active CN110000753B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910154654.2A CN110000753B (en) 2019-02-28 2019-02-28 User interaction method, control device and storage medium

Publications (2)

Publication Number Publication Date
CN110000753A 2019-07-12
CN110000753B CN110000753B (en) 2021-10-26

Family

ID=67166215

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910154654.2A Active CN110000753B (en) 2019-02-28 2019-02-28 User interaction method, control device and storage medium

Country Status (1)

Country Link
CN (1) CN110000753B (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113043250A (en) * 2021-04-21 2021-06-29 深圳先进技术研究院 Robot teaching system and method, and robot control system and method
CN113874175A (en) * 2020-03-09 2021-12-31 深圳市大疆创新科技有限公司 Control system, method, electronic device, movable device, and computer-readable storage medium
CN114237487A (en) * 2021-11-26 2022-03-25 浙江长兴和良智能装备有限公司 Control method of pipe fitting machining equipment, interface generation method and storage medium

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070233928A1 (en) * 2006-03-31 2007-10-04 Robert Gough Mechanism and apparatus for dynamically providing required resources for a hot-added PCI express endpoint or hierarchy
CN106457557A (en) * 2014-03-19 2017-02-22 株式会社乐博特思 Robot assembly device
CN106794580A (en) * 2014-06-03 2017-05-31 波特与多利有限责任公司 System and method for instructing robot manipulation
CN105643607A (en) * 2016-04-08 2016-06-08 深圳市中科智敏机器人科技有限公司 Intelligent industrial robot with sensing and cognitive abilities
CN107220099A (en) * 2017-06-20 2017-09-29 华中科技大学 Robot visualized virtual teaching system and method based on a three-dimensional model
CN107562017A (en) * 2017-08-06 2018-01-09 北京镁伽机器人科技有限公司 Parameter editing method for a motion control component, computer-readable medium, and computer

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
刘杰, 王涛 (LIU, Jie; WANG, Tao): "Industrial Robot Offline Programming and Simulation Project Tutorial" (《工业机器人离线编程与仿真项目教程》), 31 January 2019 *

Also Published As

Publication number Publication date
CN110000753B (en) 2021-10-26

Similar Documents

Publication Publication Date Title
CN106200983B System combining virtual reality and BIM to realize architectural design in virtual reality scenes
US10521522B2 (en) Robot simulator and file generation method for robot simulator
Garg et al. Digital twin for fanuc robots: Industrial robot programming and simulation using virtual reality
US7076322B2 (en) System and method for satisfying move constraints when performing a motion control sequence
CN109986559A Parameter editing method and system, control device and storage medium
CN102906652B (en) Method and system for closed-loop controller programming
US7076332B2 (en) System and method for invoking execution of a sequence of operations that includes motion control, machine vision, and data acquisition (DAQ) functionality
US7849416B2 (en) System and method for graphically creating a sequence of motion control, machine vision, and data acquisition (DAQ) operations
Abidi et al. Contribution of virtual reality for lines production’s simulation in a lean manufacturing environment
CN110000753A User interaction method, control device and storage medium
EP1842631A1 (en) Apparatus and method for automatic path generation for an industrial robot
CN105159754B (en) In-circuit emulation method and device based on business intelligence cloud platform
CN109910004A User interaction method, control device and storage medium
US7930643B2 (en) System and method for previewing a sequence of motion control operations
CN109807898A Motion control method, control device and storage medium
CN102629388B (en) Mechanical equipment simulation system generating method
EP2596446A1 (en) A non-programmer method for creating simulation-enabled 3d robotic models for immediate robotic simulation, without programming intervention
CN109807896A Motion control method and system, control device and storage medium
CN109551485A Motion control method, device and system, and storage medium
Balzerkiewitz et al. The evolution of virtual reality towards the usage in early design phases
CN110000775A Device management method, control device and storage medium
CN109605378A Kinematic parameter processing method, device and system, and storage medium
Waurich et al. Interactive FMU-Based Visualization for an Early Design Experience.
CN109551484A Kinematic parameter processing method, device and system, and storage medium
Hossain et al. Virtual control system development platform with the application of PLC device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20191211

Address after: No. 1705, building 8, Qianhai excellence Financial Center (phase I), unit 2, guiwan area, Nanshan street, Qianhai Shenzhen Hong Kong cooperation zone, Shenzhen City, Guangdong Province

Applicant after: Mga Technology (Shenzhen) Co., Ltd

Address before: Room 1109, Unit 1, Building 1, Yard 1, Longyu Middle Street, Changping District, Beijing 102208

Applicant before: Beijing Megarobo Technologies Co., Ltd.

CB02 Change of applicant information

Address after: 518052 1705, building 8, Qianhai excellence Financial Center (phase I), unit 2, guiwan area, Nanshan street, Qianhai Shenzhen Hong Kong cooperation zone, Shenzhen City, Guangdong Province

Applicant after: Shenzhen mga Technology Co.,Ltd.

Address before: 1705, building 8, Qianhai excellence Financial Center (phase I), unit 2, guiwan area, Nanshan street, Qianhai Shenzhen Hong Kong cooperation zone, Shenzhen, Guangdong 518000

Applicant before: Mga Technology (Shenzhen) Co.,Ltd.

GR01 Patent grant