CN109807896A - Motion control method and system, control equipment and storage medium - Google Patents


Info

Publication number
CN109807896A
CN109807896A (application CN201910154655.7A)
Authority
CN
China
Prior art keywords
control
equipment
user
motion
parameter
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910154655.7A
Other languages
Chinese (zh)
Other versions
CN109807896B (en)
Inventor
王志彦
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
MGA Technology Shenzhen Co Ltd
Original Assignee
Megarobo Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Megarobo Technologies Co Ltd
Priority application: CN201910154655.7A
Publication of CN109807896A
Application granted; publication of CN109807896B
Legal status: Active


Abstract

An embodiment of the present invention provides a motion control method and system, a control device, and a storage medium. The method includes: displaying, on a human-computer interaction interface, a motion control window associated with a designated device, the designated device being one of a robot, a motion control component, a robot model, and a motion control component model; determining, based on a motion control instruction input by the user in the motion control window, a respective motion parameter for each of at least one object, where the at least one object includes one of: an end effector of the designated device, at least one joint of the designated device, and at least one axis of the designated device; and allocating the respective motion parameters to the at least one object so as to control the motion of the at least one object. Also provided is a control device capable of simultaneously managing and controlling multiple devices to be controlled (including robots and/or motion control components), as well as a motion control method applied to that control device, which makes motion control convenient for the user.

Description

Motion control method and system, control equipment and storage medium
Technical field
The present invention relates to the field of motion control technology, and more specifically to a motion control method and system, a control device, and a storage medium.
Background technique
In a motion control system based on robots (e.g., robot arms) or similar technology, a device to be controlled (e.g., a robot or a drive controller) establishes a connection with a control device (e.g., a host computer), and the user can control the robot's motion through the control device.
Most robots currently on the market use a teach pendant as the device for controlling the robot, with a one-to-one correspondence between teach pendants and robots: each robot is paired with its own teach pendant. If the user wants to combine multiple robots and motors into a more complex motion system, this one-to-one scheme becomes inconvenient. For example, suppose ten robots are installed on a production line and must perform the same motion in synchrony. The traditional scheme requires programming each robot separately in a complicated way and then gradually tuning the robots toward synchronization, which typically takes more than ten days of debugging and considerably increases the cost of using the robots.
Summary of the invention
The present invention is proposed in view of the above problem. The present invention provides a motion control method and system, a control device, and a storage medium.
According to one aspect of the present invention, a motion control method is provided, including: displaying, on a human-computer interaction interface, a motion control window associated with a designated device, where the designated device is one of a robot, a motion control component, a robot model, and a motion control component model; determining, based on a motion control instruction input by the user in the motion control window, a respective motion parameter for each of at least one object, where the at least one object includes one of: an end effector of the designated device, at least one joint of the designated device, and at least one axis of the designated device; and allocating the respective motion parameters to the at least one object so as to control the motion of the at least one object.
Illustratively, the motion control window includes a parameter setting control, and determining the respective motion parameters of the at least one object based on the motion control instruction input by the user in the motion control window includes: determining the respective motion parameters of the at least one object based on an operation performed by the user on the parameter setting control, where the motion control instruction includes the instruction corresponding to the operation performed by the user on the parameter setting control.
Illustratively, the parameter setting control includes slider controls in one-to-one correspondence with the objects in a first object set, the first object set including at least some of the at least one object. Determining the respective motion parameters of the at least one object based on the operation performed by the user on the parameter setting control includes, for any object in the first object set: receiving operation information of a sliding operation performed by the user on the slider of the slider control corresponding to that object; determining the position data in the object's motion parameter based on the positional difference between the slider's initial position and its end position during the sliding process; and/or determining the time data in the object's motion parameter based on the time difference between the moment the user presses the slider at the initial position and the moment the user releases the slider at the end position. The position data indicates a target position, a rotation angle, or a movement distance; the time data indicates the time consumed to reach the target position indicated by the position data, or to rotate through the rotation angle or travel the movement distance indicated by the position data.
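The slider behavior described above can be sketched in a few lines. The helper below is a hypothetical illustration, not the patent's implementation: the function name and the pixel-to-unit scale factor are assumptions. It maps one slide gesture (start/end slider positions plus press/release timestamps) to the position data and time data of a motion parameter.

```python
# Hypothetical helper: turn one slider gesture into (position_data, time_data).
# The proportionality constant `units_per_pixel` is an assumed UI scale factor.

def slider_to_motion_parameter(start_pos, end_pos, press_time, release_time,
                               units_per_pixel=0.5):
    """Map a slide gesture to motion-parameter data.

    position_data is proportional to the slider's displacement (e.g. degrees
    of joint rotation per pixel of travel); time_data equals the press/release
    time difference, matching the embodiment where the time data's magnitude
    equals the time difference.
    """
    position_data = (end_pos - start_pos) * units_per_pixel  # e.g. degrees
    time_data = release_time - press_time                    # seconds
    return position_data, time_data

# A 120-pixel slide held for 1.5 s -> rotate 60 degrees over 1.5 s.
pos, t = slider_to_motion_parameter(start_pos=40, end_pos=160,
                                    press_time=10.0, release_time=11.5)
```

This also shows why the press/release timing is a natural input method: the user expresses "how far" and "how fast" in a single gesture.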
Illustratively, the magnitude of the time data is equal to the time difference.
Illustratively, the magnitude of the position data is proportional to the positional difference.
Illustratively, the motion control window further includes at least one current-parameter display area in one-to-one correspondence with the objects in a second object set, where the second object set is a subset of the first object set. For any object in the second object set, while the slider of the slider control corresponding to that object is being dragged, the corresponding current-parameter display area displays in real time a predicted value of the position the object would currently reach and/or a predicted value of the time data in the object's motion parameter.
Illustratively, the parameter setting control includes groups of text box controls in one-to-one correspondence with the objects in a third object set, each group including at least one text box control, and the third object set including at least some of the at least one object. Determining the respective motion parameters of the at least one object based on the operation performed by the user on the parameter setting control includes, for any object in the third object set: receiving the position data and/or time data of the object's motion parameter entered by the user in the group of text box controls corresponding to that object, where the position data indicates a target position, a rotation angle, or a movement distance, and the time data indicates the time consumed to reach the indicated target position or to rotate through the indicated rotation angle or travel the indicated movement distance.
Illustratively, in the case where the at least one object includes the end effector of the designated device, the parameter setting control includes six button controls, divided into three pairs in one-to-one correspondence with the three axes of a spatial rectangular coordinate system. The two button controls in each pair correspond to two opposite directions, and each of the six button controls indicates movement of the end effector along the direction, on the coordinate axis, corresponding to that button control. Determining the respective motion parameters of the at least one object based on the operation performed by the user on the parameter setting control includes: receiving operation information of click operations performed by the user on at least some of the six button controls; and determining the position data in the end effector's motion parameter based on that operation information.
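The six-button jog scheme can be sketched as follows. This is a minimal illustration under assumptions: the button labels, the direction table, and the step-accumulation logic are invented for clarity and are not specified by the patent.

```python
# Hypothetical sketch of the six-button jog scheme: three axis pairs
# (X+/X-, Y+/Y-, Z+/Z-), each click moving the end effector one step
# along the corresponding direction.

JOG_DIRECTIONS = {
    "X+": (1, 0, 0), "X-": (-1, 0, 0),
    "Y+": (0, 1, 0), "Y-": (0, -1, 0),
    "Z+": (0, 0, 1), "Z-": (0, 0, -1),
}

def apply_clicks(position, clicks, step=5.0):
    """Accumulate the position data produced by a sequence of button clicks.

    `step` plays the role of the step-length setting control: the distance
    the end effector moves per click.
    """
    x, y, z = position
    for button in clicks:
        dx, dy, dz = JOG_DIRECTIONS[button]
        x, y, z = x + dx * step, y + dy * step, z + dz * step
    return (x, y, z)

# Two clicks on X+ and one on Z- from the origin.
end = apply_clicks((0.0, 0.0, 0.0), ["X+", "X+", "Z-"])
```

With a step of 5.0 per click, the example ends at (10.0, 0.0, -5.0): the per-click step times the signed direction is all the position data such a panel needs to produce.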
Illustratively, the parameter setting control further includes a step-length setting control, which indicates the distance the end effector moves for each click of each of the six button controls; the step-length setting control is a text box control.
Illustratively, the parameter setting control further includes a time setting control, which indicates the duration of the end effector's motion for each click of each of the six button controls; the time setting control is a text box control.
Illustratively, the parameter setting control includes discrete zeroing controls in one-to-one correspondence with the objects in a fourth object set, the fourth object set including at least some of the at least one object. Determining the respective motion parameters of the at least one object based on the operation performed by the user on the parameter setting control includes, for any object in the fourth object set: in response to a click operation by the user on the discrete zeroing control corresponding to that object, setting the object's motion parameter to a preset initial value.
Illustratively, the parameter setting control includes an overall zeroing control, and determining the respective motion parameters of the at least one object based on the operation performed by the user on the parameter setting control includes: in response to a click operation by the user on the overall zeroing control, setting the motion parameters of the at least one object to their respective preset initial values.
Illustratively, the designated device is a robot or a robot model, and the motion control window includes a current-position display area for displaying, in real time, the position data corresponding to the position currently reached by the end effector of the designated device.
Illustratively, before the motion control window associated with the designated device is displayed on the human-computer interaction interface, the method further includes: displaying identification information of the designated device or of the at least one object on the human-computer interaction interface. Displaying the motion control window associated with the designated device then includes: displaying the motion control window in response to a first click operation by the user on the identification information of the designated device or of the at least one object.
Illustratively, before the motion control window associated with the designated device is displayed on the human-computer interaction interface, the method further includes: displaying identification information of the designated device or of the at least one object on the human-computer interaction interface. Displaying the motion control window associated with the designated device then includes: in response to a second click operation by the user on the identification information of the designated device or of the at least one object, displaying a menu window that includes a motion control menu item for opening and closing the motion control window; and, in response to a third click operation by the user on the motion control menu item, displaying the motion control window.
Illustratively, the motion parameter of each of the at least one object indicates that the object should move a predetermined distance, or reach a target position, within a preset time after the current moment.
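A motion parameter of this kind pairs position data with time data. The minimal representation below is an assumption for illustration (the patent does not define a data structure); it also shows the average speed implied when the position data is a distance.

```python
# Assumed minimal representation of a motion parameter: position data plus
# the time within which it should be realized after the current moment.
from dataclasses import dataclass

@dataclass
class MotionParameter:
    position_data: float  # e.g. movement distance in mm
    time_data: float      # e.g. seconds allowed for the movement

    def average_speed(self):
        """Implied average speed, assuming position_data is a distance."""
        return self.position_data / self.time_data

p = MotionParameter(position_data=120.0, time_data=4.0)  # 120 mm in 4 s
```

Here `p.average_speed()` is 30.0 (mm/s): specifying distance and time together implicitly fixes how fast the object must move.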
Illustratively, the motion control window includes an allocation control, and allocating the respective motion parameters to the at least one object includes: in response to a click operation performed by the user on the allocation control, allocating the respective motion parameters to the at least one object.
Illustratively, the designated device is one of a set of editable devices. The editable devices include all device models in a model set and/or at least one device to be controlled that has established a connection with the control device; the at least one device to be controlled includes at least one robot and/or at least one motion control component, and the model set includes at least one device model, each device model being a robot model or a motion control component model.
According to another aspect of the present invention, a control device is provided, including: a display module for displaying, on a human-computer interaction interface, a motion control window associated with a designated device, the designated device being one of a robot, a motion control component, a robot model, and a motion control component model; a determining module for determining, based on a motion control instruction input by the user in the motion control window, a respective motion parameter for each of at least one object, where the at least one object includes one of: an end effector of the designated device, at least one joint of the designated device, and at least one axis of the designated device; and an allocation module for allocating the respective motion parameters to the at least one object so as to control the motion of the at least one object.
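The display/determining/allocation decomposition can be made concrete with a toy class. Everything below — the class name, method names, and the shape of the "instruction" — is assumed for illustration only; the patent names the modules but not their interfaces.

```python
# Assumed sketch of the control device's three modules: display, determining,
# and allocation.

class ControlDevice:
    def __init__(self):
        self.windows = []     # display-module state: open windows
        self.parameters = {}  # determined motion parameters per object

    def show_motion_control_window(self, device_name):
        """Display module: open a motion control window for a device."""
        self.windows.append(device_name)

    def determine_parameters(self, instruction):
        """Determining module: derive per-object motion parameters from a
        user instruction, here modeled as a mapping object -> value."""
        self.parameters = dict(instruction)

    def allocate(self):
        """Allocation module: hand each object its own parameter."""
        return [(obj, param) for obj, param in self.parameters.items()]

cd = ControlDevice()
cd.show_motion_control_window("robot-1")
cd.determine_parameters({"joint1": 30.0, "joint2": -15.0})
allocations = cd.allocate()
```

The point of the decomposition is that each stage is independent: the same determining logic can serve a real robot or a robot model, and allocation is just a per-object dispatch of the results.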
According to another aspect of the present invention, a control device is provided, including a display, a processor, and a memory, where the display is used for showing the human-computer interaction interface, and the memory stores computer program instructions that, when run by the processor, execute the motion control method described above.
According to another aspect of the present invention, a motion control system is provided, including a control device and at least one designated device, where the control device is used for executing the motion control method described above so as to control the motion of the objects of the at least one designated device.
According to another aspect of the present invention, a storage medium is provided, on which program instructions are stored, the program instructions being used at runtime for executing the motion control method described above.
According to the embodiments of the present invention, a control device is provided that can simultaneously manage and control multiple devices to be controlled (including robots and/or motion control components). In addition, a motion control method applied to that control device is provided, which can interact with the user to determine the motion parameters of a designated device (a robot, a motion control component, a robot model, or a motion control component model). The method makes it convenient for the user to perform motion control of actually connected robots or motion control components, and to perform motion simulation, testing, and the like on robot models or motion control component models.
Detailed description of the invention
The above and other objects, features, and advantages of the present invention will become more apparent from the following detailed description of embodiments of the present invention taken in conjunction with the accompanying drawings. The drawings are provided for a further understanding of the embodiments of the present invention, constitute a part of the specification, and serve to explain the present invention together with its embodiments without limiting it. In the drawings, identical reference labels generally represent the same components or steps.
Fig. 1 shows a schematic block diagram of a motion control system according to an embodiment of the present invention;
Fig. 2 shows a schematic block diagram of a motion control system according to another embodiment of the present invention;
Fig. 3 shows a schematic diagram of the human-computer interaction interface on a control device according to an embodiment of the present invention;
Fig. 4 shows a schematic diagram of a project window according to an embodiment of the present invention;
Fig. 5 shows a schematic diagram of a robot window according to an embodiment of the present invention;
Fig. 6 shows a schematic diagram of a device window according to an embodiment of the present invention;
Fig. 7 shows a schematic diagram of the configuration window of a motion control component, and of the configuration information displayed in a partial region of that window, according to an embodiment of the present invention;
Fig. 8 shows a schematic diagram of the configuration window of a motion control component, and of the configuration information displayed in a partial region of that window, according to another embodiment of the present invention;
Fig. 9 shows a schematic flowchart of a device management method according to an embodiment of the present invention;
Fig. 10 shows a schematic diagram of the configuration window of an MRX-T4 robot model, and of the associated-information setting window displayed in a partial region of that window, according to an embodiment of the present invention;
Fig. 11 shows a schematic flowchart of a motion control method according to an embodiment of the present invention;
Fig. 12 shows a schematic diagram of a parameter editor according to an embodiment of the present invention;
Fig. 13 shows a schematic diagram of a parameter editor according to another embodiment of the present invention;
Fig. 14 shows a schematic flowchart of a user interaction method according to an embodiment of the present invention;
Fig. 15 shows a schematic flowchart of a user interaction method according to an embodiment of the present invention;
Fig. 16 shows a schematic flowchart of a motion control method according to an embodiment of the present invention;
Fig. 17 shows a schematic diagram of a motion control window according to an embodiment of the present invention;
Fig. 18 shows a schematic diagram of a motion control window for setting the motion parameters of the first axis of the motion control component shown in Fig. 6, according to an embodiment of the present invention;
Fig. 19 shows a schematic diagram of a motion control window according to another embodiment of the present invention;
Fig. 20 shows a schematic flowchart of a parameter editing method according to an embodiment of the present invention;
Fig. 21 shows a schematic flowchart of a motion control method according to an embodiment of the present invention;
Fig. 22 shows a schematic block diagram of a control device according to an embodiment of the present invention; and
Fig. 23 shows a schematic block diagram of a control device according to an embodiment of the present invention.
Specific embodiment
In order to make the objects, technical solutions, and advantages of the present invention more apparent, example embodiments of the present invention are described in detail below with reference to the accompanying drawings. Obviously, the described embodiments are only some, rather than all, of the embodiments of the present invention, and it should be understood that the present invention is not limited by the example embodiments described herein. All other embodiments obtained by those skilled in the art based on the embodiments of the present invention described herein, without creative effort, shall fall within the scope of the present invention.
In order to at least partially solve the above problem, an embodiment of the present invention provides a control device. The control device can, by means of hardware or a combination of hardware and software, simultaneously manage and control multiple devices to be controlled (including robots and/or motion control components). The control device described herein can be any suitable computing device with data processing capability and/or instruction execution capability, and can be implemented with a conventional computer; for example, it can be a host computer or a teach pendant. Illustratively, the control device can provide the user, through a display, with a control interface, i.e., a human-computer interaction interface, through which the user can control the devices to be controlled and/or the device models. In addition, illustratively, the control device can be equipped with control software for implementing the algorithmic functions associated with the control interface.
According to the embodiments of the present invention, simultaneous management and control of multiple devices to be controlled is achieved with a single control device. Compared with the existing one-to-one scheme, the scheme of the embodiments of the present invention has at least the following advantages: 1. it saves operating time, and the synchronism between different devices to be controlled is better; 2. controlling a robot or a motion control component is easier, and precise control of each joint or axis can be achieved; 3. since multiple robots can be controlled simultaneously, programming the motions of multiple robots is easier; 4. since different robots can be controlled simultaneously, coordinating different motions between robots is easier, making it easier to compose a complex motion system.
It should be understood that, although the control device provided by the embodiments of the present invention is capable of simultaneously managing and controlling multiple devices to be controlled, this does not limit the present invention; the control device is equally applicable to a one-to-one control scheme between the control device and the device to be controlled.
For the above control device, an embodiment of the present invention provides a motion control method applicable to it. The motion control method can interact with the user to determine the motion parameters of a designated device (a robot, a motion control component, a robot model, or a motion control component model). The method makes it convenient for the user to perform motion control of actually connected robots or motion control components, and to perform motion simulation, testing, and the like on robot models or motion control component models.
The motion control method and control device according to the embodiments of the present invention are not limited to controlling robots. For example, a motion control component can be used to control and drive a motion component (such as a motor), and the motion component can be used either in robot products or in non-robot products. For example, a conveyor belt on an assembly line is driven by many motors; a motion control component can control this kind of motor, and accordingly the motion control method and control device according to the embodiments of the present invention can be used to manage and control motion control components in the assembly-line field. In short, the motion control method and control device according to the embodiments of the present invention can be applied to the control of any robot, of any equipment operating in a manner similar to a robot, or of other equipment with motion components.
In the following description, the present invention is described in connection with the application environment of the motion control method according to the embodiments of the present invention, i.e., a motion control system. The motion control system described herein may include a control device and devices to be controlled. As described above, the control device may include, for example, a host computer or a teach pendant. The devices to be controlled may include, for example, robots and motion control components for driving robot motion. A robot can be of various structural types, such as a four-axis robot or an H2-type planar gantry robot, and a motion control component can be various types of drive controllers, such as a single-axis drive controller or a multi-axis drive controller. The motion component described herein can be a motor alone, a motor combined with a reducer, or a motor combined with a lead screw, among others.
A robot described herein can be an automatic device that executes work. A robot may include a robot body and an end effector (also called a tool). The body may include multiple joints, such as a base, an upper arm, a forearm, and a wrist. The end effector is, for example, a jaw that can open and close to clamp an object, or some other operational tool. The end effector is controlled by the control device to move along a given route and complete predetermined actions. Specifically, for example, the end effector, manipulated by the control device, moves in three-dimensional space and performs relevant actions at designated positions, such as grasping, releasing, or other movements.
Taking a motor combined with a reducer as an example, the motor-plus-reducer is the main motion execution unit of a robot arm (also called a manipulator, multi-axis robot, articulated robot, etc.). A robot arm mainly grips a target object at an initial position and carries it along a predetermined route to a target position — a mechanically automated operation suitable for many industrial fields.
Robot arms currently on the market mainly include four-axis robots (with four joints) and six-axis robots (with six joints). They include a base, arms, and an object-clamping part at the end; the number of joints on the arms determines the number of the robot's "axes", and each joint is driven by the rotation of a motor to realize the joint's movement.
A motion control system according to an embodiment of the present invention is described below with reference to Fig. 1, to help understand the exemplary context of use of the motion control method according to the embodiments of the present invention. Fig. 1 shows a schematic block diagram of a motion control system 100 according to an embodiment of the present invention. It may be noted that the motion control method provided by the embodiments of the present invention can be implemented in other systems similar to the motion control system 100 and is not limited to the specific example shown in Fig. 1.
As shown in Fig. 1, the motion control system 100 may include a human-computer interaction unit (i.e., a control device) 110, a Controller Area Network (CAN) data line 120, a motion control component 130, and a motor (i.e., a motion component) 140. The motion control component 130 includes a CAN data transceiving unit 1302, a cache 1304, a solving unit 1306, a wave table 1308, a PWM waveform generator 1310, and a motor drive unit 1312.
When controlling the motor 140 using the motion control component (e.g., a drive controller) 130, the user can edit motion parameters through the human-computer interaction unit 110. The human-computer interaction unit 110 sends the motion parameters edited by the user to the motion control component 130 via the CAN data line 120; the motion control component 130 resolves the received motion parameters to obtain wave table data, then generates a PWM waveform and drives the motor's motion.
Specifically, the solving unit 1306 in the motion control component 130 can read the motion parameters and then subject them to processing such as interpolation using a solution formula, converting the motion parameters to wave table data that is stored in the wave table 1308.
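The patent does not disclose the solution formula, so the sketch below is only one common possibility: linear interpolation of a position command into a table of evenly spaced intermediate positions, which a downstream stage could turn into step pulses. The function name and the choice of linear interpolation are assumptions.

```python
# Assumed illustration of interpolation "resolving": expand one motion
# parameter (start -> target) into a wave table of intermediate positions.

def interpolate_to_wave_table(start, target, n_points):
    """Linearly interpolate n_points positions from start to target."""
    step = (target - start) / (n_points - 1)
    return [start + i * step for i in range(n_points)]

# Rotate from 0 to 90 degrees, expanded into 10 intermediate set-points.
wave_table = interpolate_to_wave_table(start=0.0, target=90.0, n_points=10)
```

A real solving unit would typically also apply acceleration/deceleration profiles rather than a constant step, but the essential idea — a coarse parameter expanded into a dense table the waveform generator can consume — is the same.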
The wave table 1308 can be implemented with DDR memory or the like for storing the wave table data, and the storage depth of the wave table 1308 can be set according to design needs.
The PWM waveform generator 1310 is used to generate corresponding PWM waveform data according to the wave table data stored in the wave table 1308. A PWM waveform, sometimes called a pulse waveform, has two states, high level and low level; in the motion control field, the purposes of controlling motor speed and solenoid valve switching state can be achieved by adjusting the duty cycle of the PWM waveform. The PWM waveform generator 1310 can be implemented using various existing PWM waveform generators, for example a PWM waveform generator implemented with direct digital synthesis (DDS) signal generation technology, or one implemented with digital counting technology.
Thus, the actual motion parameters set by the user are converted by the solving unit 1306 into wave table data for generating a PWM waveform; the PWM waveform generator 1310 generates the corresponding PWM waveform data according to the wave table data, and after processing such as digital-to-analog conversion and amplification/filtering, the result is sent to the motor drive unit 1312 to drive the motion of the motor 140.
The motor drive unit 1312 is used to drive the motion of the motor 140 according to the PWM waveform, and can be implemented with various kinds of motor driver chips.
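As a rough illustration of the duty-cycle idea mentioned above (not the patent's actual waveform generator), one PWM period can be modeled as a sequence of high/low samples: the fraction of samples that are high is the duty cycle, and the average output level — which roughly determines motor speed — equals that fraction. All numbers here are illustrative.

```python
# Toy model of PWM duty-cycle control: within one period, the fraction of
# time the output is high sets the average output level.

def pwm_samples(duty_cycle, period_samples=10):
    """Generate one PWM period as a list of 1s (high) and 0s (low)."""
    high = round(duty_cycle * period_samples)
    return [1] * high + [0] * (period_samples - high)

def average_level(samples):
    """Average output level over one period (equals the duty cycle)."""
    return sum(samples) / len(samples)

wave = pwm_samples(duty_cycle=0.5)  # 50% duty cycle
avg = average_level(wave)
```

Raising the duty cycle raises the average level delivered to the motor drive stage, which is why adjusting the duty cycle adjusts motor speed.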
In the example shown in Fig. 1, only one motion control component 130 connected to the control device 110 is illustrated. An example in which one control device controls multiple motion control components is described below. Fig. 2 shows a schematic block diagram of a motion control system 200 according to another embodiment of the present invention. As shown in Fig. 2, the motion control system 200 includes a control device 201, a gateway 202, N motion control components 203, N motion components 204, and N sensors 205, where N may be an integer greater than 1.
The control device 201 is used for realizing human-computer interaction and is, for example, a computer with control software installed. The user can set the various parameters of a robot or motion control component on the human-computer interaction interface provided by the control device 201, realizing control of the robot.
The gateway 202 is used for realizing data exchange between the control device 201 and the motion control components 203. The gateway 202 can be implemented, for example, using the CAN data line 120 described above.
A motion control component 203 is used to resolve the motion parameters sent by the control device 201 and, based on the motion parameters edited by the user, generate a driving current that controls and drives the motion of a motion component 204, so as to drive the motion of the motion component 204 and, in turn, of the corresponding movable body (e.g., a robot joint). Each motion control component 203 can be implemented with a structure similar to that of the motion control component 130 described above.
A motion component 204 can be a motor, and each motor can be used to drive one joint of a robot.
A sensor 205 is used for real-time detection of the motion position of a movable body, and can be an encoder, an angle sensor, a photoelectric switch, a machine vision system, etc. Some robots may have no sensors.
As shown in Fig. 2, there can be multiple control parts of motion 203 while be connected to control equipment 201, equipment 201 is controlled Can these control parts of motion 203 be managed and be controlled simultaneously.
It following is a brief introduction of the human-computer interaction scheme realized on the control device, to facilitate a better understanding of the present invention.
According to an embodiment of the present invention, several device models may be established in advance and stored in a model library. Optionally, the model library may be stored on a server and downloaded and used by different control devices. Optionally, the model library may be established by the user on the current control device. A device model described herein may be a robot model or a motion control component model. On the human-computer interaction interface, the user can select and edit these device models and can establish an association between a device model and an actually connected device to be controlled. After the association is established, the motion of the device to be controlled can be controlled through the motion parameters set for the device model.
Establishing device models has many benefits. For example, the motion control components of different robots may be of the same type; with this scheme, there is no need to identify the actually connected robots one by one — it suffices to identify the motion control component and associate it with a robot model. In addition, this scheme gives the entire control device good scalability. For example, if a new robot is developed, there is no need to develop a new control device; it suffices to add a new robot model to the model library used by the control device.
Therefore, on the human-computer interaction interface of the control device, in addition to the devices to be controlled that are connected to the control device, the device models in the model library can also be displayed for the user to control.
Fig. 3 shows a schematic diagram of the human-computer interaction interface on the control device according to an embodiment of the present invention. It may be noted that the layout of the human-computer interaction interface shown in Fig. 3 is merely exemplary and does not limit the present invention.
As shown in Fig. 3, the human-computer interaction interface may include one or more of the following four display areas: a project window (project area), a robot window (robot area), an editing window (editing area), and a device window (device area). Each window can be regarded as an area whose position may be fixed or adjustable. For example, the user may re-lay out the human-computer interaction interface, e.g., by dragging, to adjust the position of each window. Any of the four windows can be opened, closed, or scaled (e.g., minimized or maximized) as needed. The human-computer interaction interface may also include other windows; for example, at the bottom of the interface shown in Fig. 3 there is a record output window. In one example, the editing window is fixed in position and cannot be closed by the user, while the other three windows are fixed in position but can be closed by the user.
Project window: used to manage all current file resources, e.g., to display a file list related to a project file. Fig. 4 shows a schematic diagram of the project window according to an embodiment of the present invention. As shown in Fig. 4, the project window displays the file list of the files managed by the project file "1212.prj" (i.e., file name 1212, extension prj). The file list of a project file may include the file names of all files belonging to the current project. The control software of the control device can generate files in a variety of formats, including but not limited to: project files (extension prj), scene files (extension sce), robot motion files (extension mc), motor motion files (extension pvt), and settings files (extension stp).
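The extension-to-type mapping described above can be sketched as a small lookup table. The mapping below is taken directly from the extensions named in the text; the helper function itself is an illustrative assumption:

```python
from pathlib import Path

# File extensions named in the description and their file types.
FILE_TYPES = {
    ".prj": "project file",
    ".sce": "scene file",
    ".mc":  "robot motion file",
    ".pvt": "motor motion file",
    ".stp": "settings file",
}

def classify(filename):
    """Return the human-readable type of a control-software file."""
    return FILE_TYPES.get(Path(filename).suffix.lower(), "unknown")
```

For example, `classify("1212.prj")` yields `"project file"`, matching the project file shown in Fig. 4.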
In the description herein, a project file can be understood as a file generated when a project is created in the control software; it can be used to integrate and manage the other files under that project, such as scene files and motor motion files. A scene file can be understood as a file generated when a scene (i.e., a control scene as described herein) is created in the control software. A device model can be added (which may be described as being activated) in any scene; a device to be controlled that is connected to the control device can also be added; and operations such as parameter editing can be performed on the added device model or device to be controlled. When a device is operated in a scene, the scene file can record the information of these operations. Illustratively, the following rule may be set in the control software: only after a scene file has been generated in the project window, e.g., by creation or import, and has been awakened (selected), can the user select the required device model in the robot window for activation and subsequent operations such as parameter editing.
A robot motion file may be a file containing information related to the motion trajectory of a robot (e.g., the motion parameters of the robot). A motor motion file may be a file containing information related to the motion trajectory of a motor (e.g., motor motion parameters).
A settings file may be a file containing the configuration information of a device to be controlled or a device model. In the case where the device to be controlled is a motion control component, or the device model is a motion control component model, the configuration information may include one or more of the following items of the motion control component or motion control component model: basic information, the motor parameters of each axis, motion trajectory planning parameters, trigger parameters, I/O interface parameters, and so on. The basic information may include, for example, the model, version number, and identifier (ID). The motor parameters may include, for example, the motor size, step angle, maximum value, and encoder information. The motion trajectory planning parameters may include the interpolation method used for interpolating the motion parameters, the interpolation curve duty cycle, the zero position, the emergency stop mode, and so on. In the description herein, the motion trajectory planning parameters are parameters used to plan the motion trajectory of the device to be controlled or the device model, and their content differs from that of the motion parameters.
In the case where the device to be controlled is a robot, or the device model is a robot model, the configuration information may include one or more of the following items of the robot or robot model: basic information, the correspondence between each joint and the axes of the motion control component, the step length used for interpolating the motion parameters, the zero position, and so on. The basic information may include, for example, the model, version number, and identifier (ID).
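The configuration information above can be modeled as a small set of data structures. The field names and default values below are illustrative assumptions chosen to mirror the items listed in the text (basic information, per-axis motor parameters), not a disclosed schema:

```python
from dataclasses import dataclass, field

@dataclass
class AxisMotorParams:
    """Per-axis motor parameters (example fields from the description)."""
    step_angle_deg: float = 1.8   # assumed example value
    max_value: float = 3000.0     # units assumed
    encoder: str = "none"         # encoder information

@dataclass
class MotionControllerConfig:
    """Configuration information of a motion control component (model)."""
    model: str
    version: str
    device_id: int
    axes: dict = field(default_factory=dict)  # axis name -> AxisMotorParams
```

Such a structure could plausibly be serialized to and from a settings (stp) file, though the patent does not specify the file format.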
Robot window: used to display the identification information of the pre-established device models. The device models displayed in the robot window may include robot models and/or motion control component models. The set composed of all device models displayed in the robot window may be referred to as a first model set. Illustratively, the identification information of a device model may include the model number and/or icon of the device model. Fig. 5 shows a schematic diagram of the robot window according to an embodiment of the present invention. As shown in Fig. 5, the identification information of multiple device models is displayed in the robot window, where MRX indicates a robot-type model. When the current scene file — in other words, the current control scene — is in an editable state (e.g., the user has selected the current scene file), the user can select the device model that needs to be activated, after which the model is activated in the current scene file (that is, in the current control scene) and displayed in the editing window. In the description herein, activating a device model in a certain scene file has the same meaning as activating that device model in the control scene corresponding to that scene file. Referring back to Fig. 3, the editing window displays multiple robot models in the scene "3.sce".
Device window: used to display the identification information of editable devices. Illustratively, the editable devices may include all device models in a second model set and/or at least one device to be controlled that has established a connection with the control device. The second model set may include all device models activated in at least one control scene. The at least one device to be controlled that has established a connection with the control device may be, for example, all the motion control components 203 connected to the control device 201 shown in Fig. 2.
After the user connects the control device and the devices to be controlled through a bus (e.g., a CAN bus, which can connect up to 100 devices), the device window can display a list of devices to be controlled, which may include the identification information of all connected devices to be controlled. In the case where a device to be controlled is a motion control component, the list may also include the identification information of each axis of the motion control component. Fig. 6 shows a schematic diagram of the device window according to an embodiment of the present invention. As shown in Fig. 6, the device window displays the identification information of a motion control component of model MRQ-M2305 (a five-axis drive controller), and also displays the identification information of the five axes of this motion control component, indicated by CH1-CH5 respectively.
When an activation instruction input by the user to activate a certain device model is received, the device window can display the identification information of the activated device model. Referring back to Fig. 3, the device window displays the identification information MRX-AS, MRX-AS, MRX-T4 of three robot models activated in the scene file "3.sce", and also displays the identification information MRX-AS, MRX-T4, MRX-DT, MRX-H2Z of four robot models activated in the scene file "11111.sce".
Editing window: used to edit a control scene. The editing window can provide the user with an interface for editing the current control scene. An operation performed in the editing window is equivalent to an operation on the scene file; for example, adding a device model in the editing window is equivalent to activating that device model in the current control scene, and the scene file can record information related to the activation operation. When an activation instruction input by the user to activate a certain device model is received, the editing window can display the identification information of the activated device model.
It may be noted that the forms of the identification information displayed in different windows may be the same or different. For example, the identification information of a device model displayed in the robot window and the editing window may include both the model number and the icon, while the identification information of a device model displayed in the device window may include only the model number.
The user can select an editable device in the editing window or the device window; the editable device may be a device to be controlled or a device model. The user can then view and/or edit the various items of information of the editable device, for example the above-mentioned configuration information or motion parameters.
Fig. 7 shows a schematic diagram of the configuration window of a motion control component according to an embodiment of the present invention and the configuration information displayed in a partial area of that window. The configuration information shown in Fig. 7 is the basic information of the motion control component of model MRQ-M2304. As shown in Fig. 7, the left area of the configuration window displays several controls: "Information", "CH1", "CH2", "CH3", "CH4", "Input", "Output", "SEN.", "System", etc. These controls may be button controls, and the user can select any of them by clicking. When the user selects the "Information" control in the left area, the basic information of the current editable device can be displayed in the right area. The user can view this basic information and, when needed, edit it.
Fig. 8 shows a schematic diagram of the configuration window of a motion control component according to another embodiment of the present invention and the configuration information displayed in a partial area of that window. As shown in Fig. 8, the left area of the configuration window displays several controls: "Information", "Axes1" (similar in function to the "CH1" control shown in Fig. 7), "Axes2" (similar to "CH2"), "Axes3" (similar to "CH3"), "Axes4" (similar to "CH4"), "Input/Output" (similar to the "Input" and "Output" controls shown in Fig. 7), "Sensor" (similar to the "SEN." control shown in Fig. 7), "System", etc. These controls may be button controls, and the user can select any of them by clicking. The configuration information shown in Fig. 8 is the motor parameters of the first axis (Axes1) of the motion control component of model MRQ-M2304. When the user selects the control corresponding to an axis, such as "Axes1" or "Axes2", in the left area, the motor parameters of that axis of the current editable device are displayed in the right area. The user can view the motor parameters and, when needed, edit them. The display and editing of the remaining configuration information are similar and are not described again.
In the description herein, a motion parameter refers to a parameter used to indicate the motion trajectory of a device to be controlled or a device model. Those skilled in the art will understand that each motion trajectory can be composed of multiple points containing information such as position and time, and each point can correspond to one motion parameter. A motion parameter sequence comprising a plurality of motion parameters can be regarded as one motion trajectory. In the case where the device to be controlled is a robot or the device model is a robot model, the motion parameters may be the motion parameters of the end effector of the robot or robot model, or may be the motion parameters of each joint among at least one joint of the robot or robot model. In the case where the device to be controlled is a motion control component or the device model is a motion control component model, the motion parameters may be the motion parameters of each axis among at least one axis of the motion control component or motion control component model.
The content of a motion parameter can vary according to the actual composition of the motion component (e.g., the motor). Illustratively, a motion parameter may include one or more of position data, speed data, and time data. The position data may be coordinate data in a spatial rectangular coordinate system, a rotation angle, or other position-related data. In the case where the position data is coordinate data in a spatial rectangular coordinate system, the motion parameter may be referred to as an LVT parameter. In the case where the position data is a rotation angle, the motion parameter may be referred to as a PVT parameter. An LVT parameter may include coordinates in the spatial rectangular coordinate system (which may be denoted X, Y, Z) and the time for reaching the corresponding coordinate point (which may be denoted T). A PVT parameter may include a rotation angle (which may be denoted P), a rotation speed (which may be denoted V), and a rotation time (which may be denoted T).
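The PVT and LVT parameters described above can be sketched as simple record types, with a trajectory being an ordered sequence of such points. The class names are illustrative; only the P/V/T and X/Y/Z/T fields come from the text:

```python
from dataclasses import dataclass

@dataclass
class PVTPoint:
    p: float  # rotation angle (P)
    v: float  # rotation speed (V)
    t: float  # rotation time (T)

@dataclass
class LVTPoint:
    x: float  # coordinates in the spatial rectangular coordinate system
    y: float
    z: float
    t: float  # time for reaching the coordinate point (T)

# A motion trajectory is a sequence of motion parameters (points).
trajectory = [PVTPoint(0.0, 0.0, 0.0), PVTPoint(90.0, 45.0, 2.0)]
```

Per-joint trajectories for a robot would then be one such sequence per joint, while an end-effector trajectory would use LVT points.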
Optionally, the position data of each motion parameter may indicate a target position (which can be regarded as an absolute position), or may indicate a movement distance or rotation angle (which can be regarded as a relative position). Optionally, the time data of each motion parameter may be the time at which the target position is reached (a time point, which can be regarded as absolute time), or the time consumed by the movement distance or rotation angle indicated by the position data (a time period, which can be regarded as relative time).
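The relationship between the relative and absolute forms described above is simply cumulative: summing relative displacements and durations yields absolute positions and time points. A minimal sketch, assuming a one-dimensional position for clarity:

```python
def to_absolute(rel_points):
    """Convert (relative_position, relative_time) pairs into
    (absolute_position, absolute_time) pairs by accumulation."""
    pos, t = 0.0, 0.0
    out = []
    for dp, dt in rel_points:
        pos += dp   # accumulate displacement into absolute position
        t += dt     # accumulate duration into absolute time point
        out.append((pos, t))
    return out
```

The inverse (absolute to relative) would take successive differences; a control device could accept either form and normalize internally.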
In one example, the user may set the motion parameters one by one; each time a motion parameter is set, the control device can control the device to be controlled or the device model to move according to that motion parameter in real time. In another example, the user may set a plurality of motion parameters at once; after obtaining the plurality of motion parameters, the control device can, at any moment (which may be set by the user or determined automatically by the control device), control the device to be controlled or the device model to move along the motion trajectory formed by the plurality of motion parameters.
According to an aspect of the present invention, a device management method is provided. Fig. 9 shows a schematic flowchart of a device management method 900 according to an embodiment of the present invention. As shown in Fig. 9, the device management method 900 includes steps S910, S920, and S930.
In step S910, a connection is established with at least one device to be controlled, where the at least one device to be controlled includes at least one robot and/or at least one motion control component. In the description herein, "at least one" is equivalent to "one or more".
As described above, the control device can establish a connection with the at least one device to be controlled through a bus (e.g., a CAN bus).
In step S920, a target device model is activated based on an activation instruction input by the user, the target device model being a robot model or a motion control component model.
Activating the target device model may consist of adding the target device model to the current control scene so that the target device model changes from a non-editable state to an editable state. The user may input the activation instruction by dragging the identification information of the target device model from the robot window to the editing window, or by clicking the identification information of the target device model in the robot window. Optionally, the click may be a double-click. When the control device receives the operation information input by the user performing the above operation (i.e., receives the activation instruction), it can set the target device model to the editable state. After activation, the user can further perform operations on the target device model on the human-computer interaction interface, such as editing configuration information, editing motion parameters, and associating it with a device to be controlled.
In step S930, based on an association instruction input by the user, an association is established between the target device model and the corresponding device to be controlled among the at least one device to be controlled.
A device model is a virtual device, while a device to be controlled is a real device connected to the control device. Meanwhile, a device model is the model of a robot or motion control component of a certain type, and a device to be controlled may itself be a robot of a certain type, or a motion control component for driving a robot of a certain type. Therefore, certain device models can be associated with certain corresponding devices to be controlled. Of course, there may be device models with no associable device to be controlled, and there may be devices to be controlled with no associable device model.
When needed, a certain device model can be associated with a certain device to be controlled. In this way, the user can edit the motion parameters of the device model, and the motion parameters are sent to the associated device to be controlled to actually drive that device to move.
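The association and parameter-forwarding behavior described here can be sketched as a small registry that maps virtual model identifiers to connected device identifiers and routes motion parameters across the mapping. The class, its method names, and the in-memory send log (standing in for a bus write) are all illustrative assumptions:

```python
class ModelDeviceRegistry:
    """Associates virtual device models with connected physical devices
    and forwards motion parameters across the association."""

    def __init__(self):
        self._assoc = {}   # model identifier -> device identifier
        self.sent = []     # (device_id, parameter) log; stands in for a bus write

    def associate(self, model_id, device_id):
        """Record the association established in step S930."""
        self._assoc[model_id] = device_id

    def send_motion_parameter(self, model_id, parameter):
        """Forward a motion parameter edited on the model to the real device."""
        device_id = self._assoc.get(model_id)
        if device_id is None:
            raise KeyError(f"model {model_id!r} has no associated device")
        self.sent.append((device_id, parameter))
        return device_id
```

In the actual system the forwarding would go through the gateway 202 to the motion control component rather than to an in-memory list.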
The target device model may be any device model and may optionally be selected by the user. For example, the user may click the identification information of any device model in the editing window or the device window to select that device model as the target device model. Optionally, the click may be a double-click.
In the description herein, "user" refers to a person interacting with the control device, and may be the same person or different people. For example, the activation instruction of step S920 and the association instruction of step S930 may come from different users.
It should be noted that the steps of the device management method according to the embodiment of the present invention are not limited to the execution order shown in Fig. 9 and may have other reasonable execution orders. For example, step S920 may be executed before step S910 or simultaneously with it.
With the device management method according to the embodiment of the present invention, a pre-established device model can be associated with a corresponding device to be controlled, making it convenient for the user to control the actually connected device to be controlled through the device model. This method helps improve the management efficiency of the control device, because it allows the control device not to identify the actually connected robots one by one — it suffices to identify the motion control component and associate it with the robot model. In addition, this method gives the entire control device good scalability.
According to an embodiment of the present invention, the method 900 may further include: displaying the identification information of at least one device model on the human-computer interaction interface, where the target device model is one of the at least one device model.
According to an embodiment of the present invention, the human-computer interaction interface may include a robot window, and displaying the identification information of the at least one device model on the human-computer interaction interface may include: displaying the identification information of the at least one device model in the robot window.
As described above, the human-computer interaction interface may include a robot window, in which the identification information of several device models can be displayed for the user to view. The identification information may include the model number and/or icon of the device model. The device models displayed in the robot window belong to the first model set, which may come from the model library. The layout of the robot window and its contents have been described above in conjunction with Fig. 3 and Fig. 5 and are not described again here.
It should be understood that the robot window is merely an example; the identification information of the device models included in the first model set can also be displayed in other ways, for example, in a menu bar in the form of a list control.
Displaying at least one device model (i.e., the first model set) on the human-computer interaction interface makes it convenient for the user to select a required device model at any time for operations such as motion parameter editing.
According to an embodiment of the present invention, the human-computer interaction interface may further include an editing window, and activating the target device model based on the activation instruction input by the user (step S920) may include: activating the target device model in response to the user's click operation on the identification information of the target device model in the robot window, or in response to a drag operation that drags the identification information of the target device model from the robot window to the editing window; where the activation instruction includes the instruction corresponding to the click operation or the drag operation.
The layout of the editing window and its contents have been described above in conjunction with Fig. 3 and are not described again here.
As described above, the user can click the identification information of any device model displayed in the robot window, or drag it to the editing window. When the user performs the above click operation or drag operation on the target device model, the target device model is activated in response to these operations, i.e., the target device model is set from the non-editable state to the editable state. In addition, optionally, in response to the above click operation or drag operation, the identification information of the target device model can be displayed in the editing window and/or the device window. The click operation may be a single-click operation. Moreover, the click operation may be performed under the current control scene, i.e., performed while the current scene file is in an editable state. In this way, the target device model is activated in the current control scene rather than in other control scenes.
The device models displayed in the editing window and/or the device window belong to the second model set. The second model set can be understood as the set of activated device models, while the first model set can be understood as the set of device models that can be activated; the second model set is a subset of the first model set.
According to an embodiment of the present invention, after establishing the connection with the at least one device to be controlled, the method 900 may further include: generating a list of devices to be controlled containing the identification information of the at least one device to be controlled; and displaying the list of devices to be controlled on the human-computer interaction interface.
Illustratively, the human-computer interaction interface may include a device window, and displaying the list of devices to be controlled on the human-computer interaction interface may include: displaying the list of devices to be controlled in the device window.
As described above, the human-computer interaction interface may include a device window, in which the identification information of several editable devices can be displayed for the user to view. The identification information may include the model number and/or icon of the editable device. The layout of the device window and its contents have been described above in conjunction with Fig. 3 and Fig. 6 and are not described again here.
The editable devices may include all device models in the second model set and/or the at least one device to be controlled that has established a connection with the control device. The device window can display the list of devices to be controlled; the identification information of the at least one device to be controlled that has established a connection with the control device can be displayed through this list. In addition, the device window can also display a device model list, through which the identification information of the device models in the second model set can be displayed.
It should be understood that the device window is merely an example; the identification information of the editable devices can also be displayed in other ways, for example, in a menu bar in the form of a list control.
According to an embodiment of the present invention, after activating the target device model based on the activation instruction input by the user (step S920), the method 900 may further include: displaying the identification information of the target device model in the editing window and/or the device window of the human-computer interaction interface.
The embodiment of displaying the identification information of the target device model in the editing window and/or the device window has been described above and is not described again here.
According to an embodiment of the present invention, before establishing, based on the association instruction input by the user, the association between the target device model and the corresponding device to be controlled among the at least one device to be controlled (step S930), the method 900 may further include: displaying the configuration window of the target device model on the human-computer interaction interface in response to the user's click operation on the identification information of the target device model displayed in the editing window or the device window; and displaying an association information setting window for inputting the association instruction in response to the user's click operation on an association interface selection control on the configuration window, where the association interface selection control is a button control for controlling the opening and closing of the association information setting window.
Optionally, the user's click on the identification information of the target device model displayed in the editing window or the device window may be a double-click. Optionally, the user's click on the association interface selection control on the configuration window may be a single click.
Fig. 10 shows a schematic diagram of the configuration window of the MRX-T4 robot model according to an embodiment of the present invention and the association information setting window displayed in a partial area of the configuration window. Illustratively, the configuration window may be displayed in the editing window. When the user double-clicks the identification information of the MRX-T4 robot model displayed in the editing window or the device window, the configuration window shown in Fig. 10 can pop up.
As shown in Fig. 10, the configuration window can be divided into two areas. The left area may be referred to as the control display area and displays controls such as "Information", "Details", "Options", "Zero", and "Attributes". The right area displays the window corresponding to the currently selected control. As shown in Fig. 10, the currently selected control is the "Options" control, and the association information setting window is displayed on the right; that is, the "Options" control is the association interface selection control. Each control in the control display area shown in Fig. 10 is a button control. When the user clicks any button control, the setting interface corresponding to that control can be displayed in the right area of the configuration window.
According to an embodiment of the present invention, the target device model is a robot model, and the corresponding device to be controlled is a motion control component comprising at least one axis. Establishing, based on the association instruction input by the user, the association between the target device model and the corresponding device to be controlled among the at least one device to be controlled (step S930) may include: for any joint of the target device model, receiving a joint-axis association instruction input by the user for indicating the axis of the corresponding device to be controlled that corresponds to that joint; and associating the joint with the corresponding axis of the corresponding device to be controlled according to the joint-axis association instruction.
The association instruction involved in step S930 may include joint-axis association instructions corresponding to all joints of the robot model. When a robot model is associated with an actually connected robot, or a motion control component model is associated with an actually connected motion control component, they can be associated directly. When a robot model is associated with an actually connected motion control component, each joint of the robot model can optionally be associated one-to-one with each axis of the motion control component. For this purpose, an axis of the corresponding motion control component can be specified for each joint of the robot model; the instruction input by the user to indicate which axis it is constitutes the joint-axis association instruction.
Illustratively, before establishing the association between the target device model and the corresponding device among the at least one device to be controlled based on the association instruction input by the user (step S930), method 900 may further include: for any joint of the target device model, displaying a joint-axis association control corresponding to that joint on the human-computer interaction interface, the joint-axis association control being a text-box control or a list control. For any joint of the target device model, receiving the joint-axis association instruction input by the user that indicates the axis of the corresponding device to be controlled that corresponds to the joint may include: for any joint of the target device model, receiving identification information of the corresponding axis of the device to be controlled input by the user in the corresponding joint-axis association control; or receiving a selection instruction by which the user selects the identification information of the corresponding axis of the device to be controlled from the corresponding joint-axis association control. The joint-axis association instruction corresponding to any joint of the target device model includes the instruction corresponding to the input or selection operation performed by the user on the joint-axis association control for that joint.
The target device model can be a robot model, the robot having multiple joints, and the corresponding device to be controlled can be a motion control component having multiple axes. In that case, a one-to-one correspondence can be established between each joint of the robot and each axis of the motion control component.
As described above, when the user double-clicks the identification information of the target device model in the editing window or the device window, the configuration window of the target device model can pop up, i.e. be displayed, on the human-computer interaction interface. The configuration window contains several controls, such as "information", "details", "options", "zero position", and "attributes". The control corresponding to "options" is the association-interface selection control and may be a button control. When the user clicks the "options" control, the association-information setting window can pop up, i.e. be displayed, and the correspondence between the joints of the robot model and the axes of the motion control component can be configured in the association-information setting window of the robot model.
Continuing to refer to Figure 10, in the association-information setting window, the Basement (base) of the MRX-T4 robot model is set to correspond to axis 1 (i.e. CH1) of motion control component device1, the Big Arm of the MRX-T4 robot model is set to correspond to axis 2 (i.e. CH2) of motion control component device1, the Little Arm (forearm) of the MRX-T4 robot model is set to correspond to axis 3 (CH3) of motion control component device1, the Wrist of the MRX-T4 robot model is set to correspond to axis 4 (CH4) of motion control component device1, and the Hand (manipulator) of the MRX-T4 robot model is set to correspond to axis 5 (CH5) of motion control component device1.
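The joint-to-axis correspondence configured above can be pictured as a simple lookup table. The sketch below is an assumption for illustration only (the patent does not define a data structure); the joint names and channel labels are taken from the MRX-T4 example.

```python
# Hypothetical in-memory form of the joint-axis association of Figure 10.
# Each joint of the robot model maps to (motion control component, axis).
JOINT_AXIS_MAP = {
    "Basement": ("device1", "CH1"),
    "Big Arm": ("device1", "CH2"),
    "Little Arm": ("device1", "CH3"),
    "Wrist": ("device1", "CH4"),
    "Hand": ("device1", "CH5"),
}

def axis_for_joint(joint: str) -> str:
    """Return the axis associated with a joint, e.g. 'device1.CH4'."""
    device, channel = JOINT_AXIS_MAP[joint]
    return f"{device}.{channel}"
```

With such a table, kinematic parameters edited for a joint can be routed to the associated axis by a single lookup.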
In the above manner, each joint of the robot model can be associated with a corresponding axis of the motion control component. In one example, the user can edit or import the kinematic parameters of the end effector of the robot model, and the control device converts those kinematic parameters into kinematic parameters for each joint of the robot model. In another example, the user can directly edit or import the kinematic parameters of each joint of the robot model. These kinematic parameters can then be transmitted, according to the established association, to the corresponding axes of the motion control component, which in turn drives the motion of the robot actually controlled by that motion control component.
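Converting end-effector kinematic parameters into per-joint parameters is, in general, an inverse-kinematics computation. The patent does not specify how the control device performs this conversion; as a hedged illustration only, here is the textbook closed-form solution for a planar two-link arm, together with a forward-kinematics check:

```python
import math

def two_link_ik(x, y, l1, l2):
    """Joint angles (elbow-down solution) of a planar 2-link arm whose
    end effector should reach the point (x, y). Link lengths l1, l2."""
    c2 = (x * x + y * y - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    theta2 = math.acos(max(-1.0, min(1.0, c2)))  # clamp against rounding
    theta1 = math.atan2(y, x) - math.atan2(l2 * math.sin(theta2),
                                           l1 + l2 * math.cos(theta2))
    return theta1, theta2

def forward(theta1, theta2, l1, l2):
    """Forward kinematics, used here only to verify the conversion."""
    return (l1 * math.cos(theta1) + l2 * math.cos(theta1 + theta2),
            l1 * math.sin(theta1) + l2 * math.sin(theta1 + theta2))
```

A real MRX-T4-style arm has more joints and a 3-D workspace, so the actual conversion would be correspondingly more involved; the sketch only shows the nature of the end-effector-to-joint mapping.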
In the case where the device to be controlled is a motion control component connected to a physically existing robot, the kinematic parameters of the target robot model can be used to drive the motion of that physically existing robot. It will be appreciated by comparison that the physically existing robot has the same configuration as the target robot model, for example both being MRX-T4 robots.
According to a further aspect of the invention, a motion control method is provided. Figure 11 shows a schematic flowchart of a motion control method 1100 according to an embodiment of the present invention. As shown in Figure 11, motion control method 1100 includes steps S1110, S1120, and S1130.
In step S1110, a parameter editing window related to a target device model is displayed on the human-computer interaction interface, the target device model being a robot model or a motion-control-component model.
Illustratively, the parameter editing window related to the target device model can be displayed on the human-computer interaction interface based on a parameter editing instruction, corresponding to the target device model, input by the user. For example, the user can click (double-click or single-click) the identification information of the target device model shown in the editing window or the device window. In response to the user's click operation, the parameter editing window can pop up.
For different types of devices, the content displayed in the parameter editing window can differ. Figure 12 shows a schematic diagram of a parameter editing window according to one embodiment of the invention. Figure 13 shows a schematic diagram of a parameter editing window according to another embodiment of the invention. The kinematic parameters edited in the parameter editing window shown in Figure 12 are LVT parameters, which can be the kinematic parameters of the end effector of a robot or robot model. The kinematic parameters edited in the parameter editing window shown in Figure 13 are PVT parameters, which can be the kinematic parameters of any one axis of a motion control component or motion-control-component model.
The target device model can be any device model, optionally selected by the user. For example, the user can click the identification information of any device model in the editing window or the device window to select that device model as the target device model. Optionally, the click can be a double-click.
In step S1120, the kinematic parameters edited by the user in the parameter editing window are received.
Referring to Figure 12, the user can input the data of one kinematic parameter per row, including time data, coordinate data, jaw displacement data, and so on. The coordinate data refers to the coordinates of a fixed point on the end effector, for example the coordinates of a certain center point of the gripper. The coordinate data of any kinematic parameter indicates the position the end effector should reach at the time indicated by that kinematic parameter (i.e. its time data). The time data of any kinematic parameter indicates the time at which the fixed point on the end effector should reach the position indicated by that kinematic parameter (i.e. its coordinate data). The jaw displacement data refers to the distance by which the two jaws of the end effector move laterally, and is optional. Illustratively and without limitation, the end effector can have jaws that open and close, i.e. that can be displaced in the lateral direction; in that case the end effector can have jaw displacement data. In the example shown in Figure 12, the jaw displacement data is denoted h (in mm).
In addition, illustratively, each kinematic parameter can also include parameter validity data and/or interpolation validity data. The parameter validity data of any kinematic parameter indicates whether that kinematic parameter is valid. For example, in Figure 12 the first column of each kinematic parameter, i.e. the "allow" column, is the parameter validity data: when the data is "true", the kinematic parameter is valid and can be included in the motion trajectory of the corresponding device (robot or robot model); when the data is "false", the kinematic parameter is invalid and is not included in the motion trajectory of the corresponding device, i.e. it is ignored. The interpolation validity data of any kinematic parameter indicates whether interpolation is performed between that kinematic parameter and the next one. For example, in Figure 12 the eighth column of each kinematic parameter, i.e. the "interpolation" column, is the interpolation validity data: when the data is "true", interpolation is performed between that kinematic parameter and the next one; when the data is "false", no interpolation is performed between them.
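The row structure described for Figure 12 can be sketched as follows. The field names and layout are assumptions for illustration; only the semantics of the validity flag (invalid rows are excluded from the trajectory) come from the text above.

```python
from dataclasses import dataclass

@dataclass
class LvtRow:
    """One row of the Figure-12 style editor (assumed field layout)."""
    enable: bool                    # parameter validity ("allow" column)
    t: float                        # time data, s
    x: float; y: float; z: float    # coordinate data of the end-effector point
    h: float = 0.0                  # optional jaw displacement, mm
    interpolate: bool = True        # interpolation validity column

def trajectory(rows):
    """Keep only valid rows: rows whose validity flag is false are ignored
    and do not enter the motion trajectory."""
    return [r for r in rows if r.enable]
```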
In the example shown in Figure 13, the form of the kinematic parameters differs from that of Figure 12. The kinematic parameters shown in Figure 13 include parameter validity data, time data, position data, and velocity data, the position data being a rotation angle.
The user can edit one or more kinematic parameters in the parameter editing window and, as needed, perform interpolation on the edited kinematic parameters to generate a larger number of kinematic parameters. For any datum in a kinematic parameter, a text box, list box, or other type of control can serve as the editing interface for that datum; the user can set the value of the datum in that interface by input, selection, clicking, or other means.
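Generating a larger number of kinematic parameters from the edited ones can be sketched as linear interpolation between consecutive PVT rows. The patent does not specify the interpolation method, so the linear scheme below is an illustrative assumption.

```python
def densify_pvt(points, steps=4):
    """Insert linearly interpolated rows between consecutive PVT rows.

    points: list of (time, position, velocity) tuples, in time order.
    steps:  number of sub-intervals per original segment.
    Returns a denser list ending with the original final row.
    """
    out = []
    for (t0, p0, v0), (t1, p1, v1) in zip(points, points[1:]):
        for k in range(steps):
            a = k / steps
            out.append((t0 + a * (t1 - t0),
                        p0 + a * (p1 - p0),
                        v0 + a * (v1 - v0)))
    out.append(points[-1])
    return out
```

A real controller might instead fit cubic segments so that both position and velocity are continuous; the linear version only shows where the extra rows come from.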
In step S1130, the kinematic parameters are sent to the device to be controlled that corresponds to the target device model, so as to control the motion of that device. The corresponding device to be controlled is a robot or motion control component that has established a connection with the control device.
As described above, the target device model can be associated with a device to be controlled; the manner of association can refer to the description above. After the user edits the kinematic parameters, they can be sent to the corresponding device to be controlled. Where the corresponding device to be controlled is a robot, the robot can move based on the kinematic parameters. Where the corresponding device to be controlled is a motion control component, the motion control component can generate a PWM waveform based on the kinematic parameters and thereby drive the motion of the robot connected to it.
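The text only states that the motion control component generates a PWM waveform from the kinematic parameters. As one hypothetical illustration, for a stepper-driven axis the required step-pulse frequency could be derived from a commanded joint velocity as below; the step angle and microstepping values are invented for the example and are not taken from the patent.

```python
def pulse_frequency(velocity_deg_s: float,
                    deg_per_step: float = 1.8,
                    microstep: int = 16) -> float:
    """Step-pulse frequency (Hz) needed to realize a joint velocity
    (deg/s) on a stepper axis with the given step angle and microstepping.
    Purely illustrative: the patent does not describe the PWM generation."""
    return abs(velocity_deg_s) / (deg_per_step / microstep)
```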
The motion control method according to embodiments of the present invention can interact with the user and use the kinematic parameters the user edits for the target device model to control the motion of the device actually connected to the control device. This approach makes it possible to flexibly and conveniently control different devices to be controlled through device models of different types.
According to an embodiment of the present invention, before displaying the parameter editing window related to the target device model on the human-computer interaction interface (step S1110), method 1100 may further include: establishing a current control scene; and activating the target device model in the current control scene based on an activation instruction input by the user.
Illustratively, establishing the current control scene may include generating a scene file representing the current control scene. The manner of establishing a control scene by generating a scene file has been described above; the present embodiment can be understood with reference to that description, which is not repeated here.
According to an embodiment of the present invention, the human-computer interaction interface may include a robot window, and method 1100 may further include: displaying, in the robot window, the identification information of each device model in a first model set, where the first model set includes at least one device model and the target device model belongs to the first model set.
The layout of the robot window and its contents have been described above in connection with Figures 3 and 5; the present embodiment can be understood with reference to that description, which is not repeated here.
According to an embodiment of the present invention, the human-computer interaction interface may further include an editing window for editing the control scene, and activating the target device model in the current control scene based on the activation instruction input by the user may include: activating the target device model in response to the user's click operation on the identification information of the target device model in the robot window, or the user's drag operation dragging the identification information of the target device model from the robot window into the editing window; and displaying the identification information of the target device model in the editing window, where the activation instruction is the instruction corresponding to the click operation or the drag operation.
The implementation by which the user inputs an activation instruction to activate the target device model has been described above; the present embodiment can be understood with reference to that description, which is not repeated here.
According to an embodiment of the present invention, establishing the current control scene may include generating a scene file representing the current control scene. Before establishing the current control scene, method 1100 may further include generating a project file, the project file being used to manage the scene files of at least one control scene.
The contents of the project file and their functions have been described above; the present embodiment can be understood with reference to that description, which is not repeated here.
According to an embodiment of the present invention, method 1100 may further include grouping the project file to obtain at least one sub-project file in one-to-one correspondence with the at least one control scene.
Referring back to Figures 3 and 4, it can be seen that two sub-projects, Untitled_0 and Untitled_1, are shown in the project window. These two sub-projects correspond to two sub-project files, used respectively to manage the scene files "3.sce" and "11111.sce" of two control scenes. This approach can realize grouped management of different control scenes, making it convenient for the user to view and operate them.
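The relationship between a project file, its sub-project files, and the scene files they manage can be pictured as a small tree. The in-memory form below is an assumption for illustration, with the file names taken from Figures 3 and 4.

```python
# Hypothetical in-memory form of the Figure 3/4 project: one project file
# grouped into sub-project files, each managing the scene file of one scene.
project = {
    "file": "1212.prj",
    "subprojects": {
        "Untitled_0": ["3.sce"],
        "Untitled_1": ["11111.sce"],
    },
}

def scenes_of(proj, subproject):
    """Scene files managed by one sub-project file."""
    return proj["subprojects"][subproject]
```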
Illustratively, the operation of grouping the project file can be executed based on a grouping instruction input by the user, or can be executed automatically by the control device.
According to an embodiment of the present invention, method 1100 may further include: receiving a parameter distribution instruction input by the user in the parameter editing window; and distributing the kinematic parameters to the designated device indicated by the parameter distribution instruction.
The kinematic parameters edited by the user can be sent to the device to be controlled corresponding to the target device model, or can be distributed to another designated device. Where the designated device is a device to be controlled actually connected to the control device, the distributed kinematic parameters can control the designated device to produce actual motion. Where the designated device is a device model, the distributed kinematic parameters can be used to simulate motion and need not produce actual motion. In this way, the kinematic parameters edited by the user can be flexibly distributed to one or more editable devices, which facilitates synchronized control of different editable devices.
Illustratively, the parameter editing window may include a device designation control, the device designation control being a text-box control or a list control. Receiving the parameter distribution instruction input by the user in the parameter editing window may include: receiving identification information of the designated device input by the user in the device designation control; or receiving operation information by which the user selects the identification information of the designated device from the device designation control. The parameter distribution instruction includes the instruction corresponding to the input or selection operation performed by the user on the device designation control.
Referring back to Figure 12, a device designation control is shown, indicated with a dashed box. The device designation control shown in Figure 12 is a list control. The list control can display identification information in one-to-one correspondence with all of the editable devices, for the user to select. Illustratively, the identification information of any device shown in the list control may include the following information: the model of the corresponding device and the scene name of the control scene to which the corresponding device belongs.
In the example shown in Figure 12, the currently selected option is "MRX-AS@3.sce", where MRX-AS indicates the model of the robot model and 3.sce indicates that the robot model belongs to the control scene "3.sce". This option therefore indicates that the kinematic parameters are distributed to the MRX-AS robot model in the control scene "3.sce".
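An identifier of the form "model@scene file" can be split mechanically into its two components. The separator convention below is inferred from the "MRX-AS@3.sce" example and is an assumption, not a documented format.

```python
def parse_device_id(ident: str):
    """Split an identifier like 'MRX-AS@3.sce' into (model, scene name)."""
    model, scene = ident.split("@", 1)
    return model, scene
```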
Optionally, where the device designation control is a text-box control, the user can directly input the identification information of the designated device in the text box.
According to an embodiment of the present invention, the designated device is one of the editable devices, and the editable devices include all device models in a second model set and/or at least one device to be controlled that has established a connection with the control device, where the second model set includes at least one device model and the target device model belongs to the second model set.
According to an embodiment of the present invention, the second model set includes all device models activated in at least one control scene.
The meanings and contents of the editable devices and the second model set have been described above; the present embodiment can be understood with reference to that description, which is not repeated here.
According to a further aspect of the invention, a user interaction method is provided. Figure 14 shows a schematic flowchart of a user interaction method 1400 according to an embodiment of the present invention. As shown in Figure 14, user interaction method 1400 includes steps S1410, S1420, and S1430.
In step S1410, a human-computer interaction interface is displayed. The human-computer interaction interface includes one or more of a project window, a robot window, a device window, and an editing window. The project window is used to display a file list related to project files; the robot window is used to display the identification information of each device model in a first model set; the device window is used to display the identification information of the editable devices; and the editing window is used to edit control scenes. The editable devices include all device models in a second model set and/or at least one device to be controlled that has established a connection with the control device, the at least one device to be controlled including at least one robot and/or at least one motion control component. The first model set and the second model set each include at least one device model, and each device model is a robot model or a motion-control-component model.
The layouts and contents of the various windows of the human-computer interaction interface have been described above in connection with Figures 3-6 and are not repeated here.
In step S1420, an instruction input by the user on the human-computer interaction interface is received.
The instructions input by the user on the human-computer interaction interface can include, but are not limited to, the above-described activation instruction, association instruction, parameter editing instruction, motion control instruction, and so on. The interaction between the user and the human-computer interaction interface can be realized through an interaction device of the control device or through an independent interaction device. Optionally, the interaction device may include an input apparatus and an output apparatus. The user can input instructions through the input apparatus, and the control device can display the human-computer interaction interface and other related information through the output apparatus for the user to view. The input apparatus can include, but is not limited to, one or more of a mouse, a keyboard, and a touch screen. The output apparatus can include, but is not limited to, a display. In one example, the interaction device includes a touch screen, which can realize the functions of the input apparatus and the output apparatus simultaneously.
In step S1430, the operation corresponding to the instruction input by the user is executed.
For example, upon receiving an activation instruction input by the user, the corresponding target device model can be activated. As another example, upon receiving a transmission instruction input by the user for transmitting kinematic parameters, the edited kinematic parameters can be sent to the device to be controlled.
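The receive-instruction/execute-operation loop of steps S1420-S1430 is essentially a dispatch from instruction kind to handler. A minimal sketch under assumed instruction names ("activate", "send") follows; the patent does not define such an API, so the names and return values are illustrative only.

```python
def activate(model):
    """Handler for an activation instruction (assumed shape)."""
    return f"activated {model}"

def send_params(device, params):
    """Handler for a transmit-kinematic-parameters instruction (assumed shape)."""
    return f"sent {len(params)} parameters to {device}"

HANDLERS = {"activate": activate, "send": send_params}

def handle(kind, *args):
    """Step S1430: execute the operation corresponding to a received instruction."""
    return HANDLERS[kind](*args)
```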
According to the user interaction method of embodiments of the present invention, device models and/or devices to be controlled actually connected to the control device can be displayed on the human-computer interaction interface, and the user can perform operations such as kinematic-parameter editing for these device models and/or devices to be controlled. This interactive approach makes it convenient for the user to manage and control at least one device to be controlled, and to simulate and test robots or motion control components.
According to an embodiment of the present invention, executing the operation corresponding to the instruction input by the user (step S1430) may include: establishing a current control scene based on a scene establishment instruction or a scene import instruction input by the user; and providing the user, in the editing window, with an interface for editing the current control scene.
Illustratively, the user can click the "file" menu command of the menu bar and select the "new scene file" control from it; in response to these operations, the control device can create a scene file and can provide, in the editing window, an interface for editing the control scene corresponding to that scene file. Illustratively, the user can also click the "file" menu command of the menu bar and select the "import scene file" control from it; in response to these operations, the control device can import an existing scene file selected by the user and can provide, in the editing window, an interface for editing the control scene corresponding to that scene file.
According to an embodiment of the present invention, executing the operation corresponding to the instruction input by the user (step S1430) may include: activating a target device model in the first model set in the current control scene based on an activation instruction input by the user.
According to an embodiment of the present invention, activating the target device model in the first model set in the current control scene based on the activation instruction input by the user may include: activating the target device model in response to the user's click operation on the identification information of the target device model in the robot window, or the user's drag operation dragging the identification information of the target device model from the robot window into the editing window; and displaying the identification information of the target device model in the editing window, where the activation instruction is the instruction corresponding to the click operation or the drag operation.
The implementation by which the user inputs an activation instruction to activate the target device model has been described above; the present embodiment can be understood with reference to that description, which is not repeated here.
According to an embodiment of the present invention, the second model set may include all device models activated in at least one control scene.
According to an embodiment of the present invention, executing the operation corresponding to the instruction input by the user (step S1430) may include: displaying, on the human-computer interaction interface, a parameter editing window related to a designated device based on a parameter editing instruction, corresponding to the designated device among the editable devices, input by the user; receiving the kinematic parameters edited by the user in the parameter editing window; and distributing the kinematic parameters to the designated device.
The manner of displaying the parameter editing window and the manner of interacting with the user have been described above; the present embodiment can be understood with reference to that description, which is not repeated here.
According to an embodiment of the present invention, the parameter editing window can be displayed in the editing window. Referring back to Figures 12 and 13, when the parameter editing window pops up, it can replace the interface in the editing window used for editing the current control scene.
According to an embodiment of the present invention, displaying the parameter editing window related to the designated device on the human-computer interaction interface based on the parameter editing instruction, corresponding to the designated device among the editable devices, input by the user may include: displaying the parameter editing window on the human-computer interaction interface in response to the user's click operation on the identification information of the designated device in the device window.
The implementation of interacting with the user to display the parameter editing window has been described above; the present embodiment can be understood with reference to that description, which is not repeated here.
According to an embodiment of the present invention, the file list may include identification information of at least one scene file respectively representing at least one control scene.
The information displayed in the project window can be regarded as a file list. The file list may include identification information of one or more types of files, such as identification information of project files, identification information of motor motion files, identification information of scene files, and so on. Referring back to Figure 4, the file list includes the identification information "1212.prj" of project file 1212, the identification information "a.pvt" of motor motion file a, and the identification information "3.sce" and "11111.sce" of scene files 3 and 11111.
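Classifying file-list entries by extension, using the three file types named above, can be sketched as follows; the mapping itself is an assumption for illustration, not a format defined by the patent.

```python
# Assumed extension-to-kind mapping for the file list of Figure 4.
FILE_KINDS = {
    ".prj": "project file",
    ".sce": "scene file",
    ".pvt": "motor motion file",
}

def kind_of(name: str) -> str:
    """Classify a file-list entry by its extension."""
    dot = name.rfind(".")
    return FILE_KINDS.get(name[dot:], "unknown") if dot != -1 else "unknown"
```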
According to a further aspect of the invention, a user interaction method 1500 is provided. Figure 15 shows a schematic flowchart of a user interaction method 1500 according to an embodiment of the present invention. As shown in Figure 15, user interaction method 1500 includes steps S1510, S1520, S1530, and S1540.
In step S1510, a parameter editing window is displayed on the human-computer interaction interface.
Illustratively, a parameter editing window related to any editable device can be displayed on the human-computer interaction interface based on a parameter editing instruction, corresponding to that editable device, input by the user. For example, the user can click (double-click or single-click) the identification information of the target device model shown in the editing window or the device window. In response to the user's click operation, the parameter editing window corresponding to that target device model can pop up. The user can edit kinematic parameters in the parameter editing window and redistribute the kinematic parameters to any designated device.
Illustratively, the user can click the "file" menu command of the menu bar and select the "new robot motion file" or "import robot motion file" control from it; in response to the user's operation, the control device can pop up the parameter editing window for editing the kinematic parameters of the end effector of a robot or robot model (see Figure 12).
Illustratively, the user can click the "file" menu command of the menu bar and select the "new motor motion file" or "import motor motion file" control from it; in response to the user's operation, the control device can pop up the parameter editing window for editing the kinematic parameters of a certain joint of a robot or robot model, or the kinematic parameters of a certain axis of a motion control component or motion-control-component model (see Figure 13).
Where the kinematic parameters displayed in the parameter editing window are the kinematic parameters of the end effector of a robot or robot model, the kinematic parameters edited by the user can be stored in a robot motion file. Where the kinematic parameters displayed in the parameter editing window are the kinematic parameters of a certain joint of a robot or robot model, or of a certain axis of a motion control component or motion-control-component model, the kinematic parameters edited by the user can be stored in a motor motion file. This storage scheme is not restricted by the pop-up manner of the parameter editing window.
In step S1520, the kinematic parameters edited by the user in the parameter editing window are received.
In step S1530, a parameter distribution instruction input by the user in the parameter editing window is received.
In step S1540, the kinematic parameters are distributed to the designated device indicated by the parameter distribution instruction.
The implementations by which the user edits kinematic parameters and distributes them to a designated device have been described above in connection with Figures 12 and 13; the present embodiment can be understood with reference to that description, which is not repeated here.
According to an embodiment of the present invention, the designated device is one of the editable devices. The editable devices include all device models in a second model set and/or at least one device to be controlled that has established a connection with the control device, the at least one device to be controlled including at least one robot and/or at least one motion control component; the second model set includes at least one device model, and each device model is a robot model or a motion-control-component model.
According to an embodiment of the present invention, method 1500 may further include: establishing at least one control scene; and activating at least one device model in the at least one control scene based on an activation instruction input by the user, to obtain the second model set.
The implementations of establishing a control scene and activating a device model have been described above; the present embodiment can be understood with reference to that description, which is not repeated here.
According to an embodiment of the present invention, the parameter editing area includes a device-specifying control, which is a text box control or a list control. Receiving the parameter assignment instruction input by the user in the parameter editing area (step S1530) may include: receiving identification information of the designated device input by the user in the device-specifying control; or receiving operation information of the user selecting the identification information of the designated device from the device-specifying control. The parameter assignment instruction includes the instruction corresponding to the input or selection operation performed by the user on the device-specifying control.
In the case where a kinematic parameter is assigned to the designated device as a whole, the user only needs to indicate to the control device which device the kinematic parameter is to be assigned to. For example, if the kinematic parameters contained in a robot motion file need to be assigned to a robot or a robot model, the user specifies that robot or robot model via the device-specifying control.
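The whole-device assignment described above can be sketched minimally as follows. This is an illustrative Python sketch only, not the patent's implementation; the `Device` class, the registry dictionary, and the identifier format (model plus scene name, e.g. "MRX-T4@Scene1") are all assumptions made for the example.

```python
from dataclasses import dataclass, field

@dataclass
class Device:
    # Hypothetical device record; identifier combines model and scene name,
    # mirroring the identification information described in the text.
    identifier: str
    assigned: list = field(default_factory=list)

def assign_to_device(motion_file_params, device_id, registry):
    """Assign every kinematic parameter in a motion file to one device."""
    device = registry[device_id]  # raises KeyError for an unknown device
    device.assigned.extend(motion_file_params)
    return device

# The user picks "MRX-T4@Scene1" in the device-specifying control:
registry = {"MRX-T4@Scene1": Device("MRX-T4@Scene1")}
target = assign_to_device([{"joint": "base", "angle": 40}], "MRX-T4@Scene1", registry)
```

In a real control device, the final step would transmit the parameters to the robot rather than store them in a list.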
According to an embodiment of the present invention, in the case where the device-specifying control is a list control, the device-specifying control is used to display identification information in one-to-one correspondence with all of the editable devices, for selection by the user. The editable devices include all device models in the second model set and/or at least one device to be controlled that has established a connection with the control device; the at least one device to be controlled includes at least one robot and/or at least one motion control component; the second model set includes at least one device model, each of which is a robot model or a motion control component model.
According to an embodiment of the present invention, the identification information of any device among the editable devices may include the following information: the model of the corresponding device, and the scene name of the control scene to which the corresponding device belongs.
According to an embodiment of the present invention, in the case where the designated device is a motion control component or a motion control component model comprising at least one axis, assigning the kinematic parameter to the designated device indicated by the parameter assignment instruction (step S1540) may include: assigning the kinematic parameter to a specified axis of the designated device indicated by the parameter assignment instruction.
According to an embodiment of the present invention, the parameter editing area includes a device-specifying control and an axis-specifying control; the device-specifying control is a text box control or a list control, and the axis-specifying control is a text box control or a list control. Receiving the parameter assignment instruction input by the user in the parameter editing area may include a device-specifying operation and an axis-specifying operation. The device-specifying operation includes: receiving identification information of the designated device input by the user in the device-specifying control; or receiving operation information of the user selecting the identification information of the designated device from the device-specifying control. The axis-specifying operation includes: receiving identification information of the specified axis of the designated device input by the user in the axis-specifying control; or receiving operation information of the user selecting the identification information of the specified axis of the designated device from the axis-specifying control. The parameter assignment instruction includes the instructions corresponding to the device-specifying operation and the axis-specifying operation.
In the case where the designated device is a motion control component or a motion control component model, a kinematic parameter may be assigned to any one axis of the designated device; the user may then indicate to the control device which axis of which device the kinematic parameter is to be assigned to. For example, if the kinematic parameters contained in a motor motion file need to be assigned to a certain axis of a motion control component or motion control component model, the user may specify the motion control component or motion control component model via the device-specifying control, and specify an axis on that motion control component or motion control component model via the axis-specifying control.
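The two-level addressing (device, then axis) can be illustrated with a short sketch. Again this is a hypothetical Python rendering under assumed names; the `MotionController` class and `assign_axis` method are inventions for the example, with the axis identifiers "CH1"–"CH5" taken from the MRQ-M2305 example in the text.

```python
class MotionController:
    """Hypothetical stand-in for a motion control component (or its model)."""

    def __init__(self, identifier, axes):
        self.identifier = identifier
        # One parameter list per axis, keyed by the axis identifier.
        self.axis_params = {axis: [] for axis in axes}

    def assign_axis(self, axis, params):
        # The axis-specifying operation must name an axis the device has.
        if axis not in self.axis_params:
            raise ValueError(f"unknown axis {axis!r} on {self.identifier}")
        self.axis_params[axis].extend(params)

# A five-axis controller like the MRQ-M2305 described in the text:
mrq = MotionController("MRQ-M2305", ["CH1", "CH2", "CH3", "CH4", "CH5"])
mrq.assign_axis("CH1", [{"distance": 10.0, "time_ms": 500}])
```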
Referring back to Figure 13, a device-specifying control and an axis-specifying control are shown. The device-specifying control and the axis-specifying control shown in Figure 12 are list controls. The device-specifying control may display identification information in one-to-one correspondence with all of the editable devices, for selection by the user. The axis-specifying control may display identification information in one-to-one correspondence with all axes of the designated device, for selection by the user. For example, in the case where the designated device is a motion control component of model MRQ-M2305 (a five-axis drive controller), the identifiers of the five axes, "CH1", "CH2", "CH3", "CH4", and "CH5", may be displayed in the axis-specifying control.
According to an embodiment of the present invention, in the case where the device-specifying control is a list control, the device-specifying control is used to display identification information in one-to-one correspondence with all of the editable devices, for selection by the user; in the case where the axis-specifying control is a list control, the axis-specifying control is used to display identification information in one-to-one correspondence with all axes of the designated device, for selection by the user. The editable devices include all device models in the second model set and/or at least one device to be controlled that has established a connection with the control device; the at least one device to be controlled includes at least one robot and/or at least one motion control component; the second model set includes at least one device model, each of which is a robot model or a motion control component model.
According to an aspect of the present invention, a motion control method is provided. Figure 16 shows a schematic flowchart of a motion control method 1600 according to an embodiment of the present invention. As shown in Figure 16, the motion control method 1600 includes steps S1610, S1620, and S1630.
In step S1610, a motion control window relevant to a designated device is displayed on a human-computer interaction interface, the designated device being one of a robot, a motion control component, a robot model, and a motion control component model.
Illustratively, the designated device is one of the editable devices. The editable devices include all device models in a model set (i.e., the second model set described above) and/or at least one device to be controlled that has established a connection with the control device. The at least one device to be controlled includes at least one robot and/or at least one motion control component. The model set includes at least one device model, each of which is a robot model or a motion control component model.
The meaning and contents of the editable devices and the second model set have been described above; the present embodiment may be understood with reference to that description, and details are not repeated here. When a device to be controlled establishes a connection with the control device, the control device may first read the information of the device to be controlled, such as its model and the information of each of its axes. The identification information of the connected devices to be controlled may then be displayed in the device window described above. After the user activates a device model, the identification information of the activated device model may also be displayed in the device window. Optionally, an association may be established between an activated device model and a connected device to be controlled. For any device to be controlled or device model in the device window, a motion control window may be called up to perform motion control.
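The connection step described above can be sketched as a small handler. This is a speculative Python illustration only; the function name `on_device_connected`, the `raw_info` dictionary shape, and the list standing in for the device window are all assumptions, since the patent does not specify a data format.

```python
def on_device_connected(raw_info, device_window):
    """On connection, read the device's model and per-axis information,
    then list its identification in the device window (here, a plain list)."""
    record = {
        "model": raw_info["model"],
        "axes": list(raw_info.get("axes", [])),  # per-axis info, if any
    }
    device_window.append(record["model"])  # show identification information
    return record

window = []
rec = on_device_connected(
    {"model": "MRQ-M2305", "axes": ["CH1", "CH2", "CH3", "CH4", "CH5"]},
    window,
)
```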
In the case where the designated device is a robot or a motion control component, the user can use the motion control window to set the kinematic parameters of the designated device so as to control its actual motion. In the case where the designated device is a robot model or a motion control component model, the motion of the designated device can be simulated; the user can use the motion control window to set the kinematic parameters of the designated device so as to control its motion during simulation. In the case where the designated device is a robot model or a motion control component model, the user can also use the motion control window to set the kinematic parameters of the designated device so as to control the motion of a device to be controlled that is associated with the designated device.
Figure 17 shows a schematic diagram of a motion control window according to an embodiment of the present invention. The motion control window shown in Figure 17 is the motion control window relevant to a robot model of model MRX-T4. As shown in Figure 17, the motion control window may include slider controls, text box controls, and the like corresponding to each joint of the robot model; these slider controls and text box controls can receive motion control instructions input by the user, and the kinematic parameters of each joint are determined based on the received motion control instructions.
In step S1620, the respective kinematic parameters of at least one object are determined based on the motion control instruction input by the user in the motion control window, where the at least one object includes one of the following: an end effector of the designated device, at least one joint of the designated device, or at least one axis of the designated device.
Illustratively, the at least one object in step S1620 may include all objects of the designated device. An object may be an end effector, a joint, or an axis. In the case where the designated device is a robot or a robot model, the at least one object may be an end effector or at least one joint; where the at least one object is at least one joint, each object is one joint. In the case where the designated device is a motion control component or a motion control component model, the at least one object may be at least one axis, each object being one axis.
In step S1630, the respective kinematic parameters of the at least one object are assigned to the at least one object so as to control the motion of the at least one object. The kinematic parameters of the at least one object may be assigned to the at least one object in one-to-one correspondence.
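The one-to-one assignment of step S1630 can be sketched as a simple dispatch loop. This is an illustrative Python sketch, not the patent's implementation; the `dispatch` function and the joint names are assumptions, and a real control device would transmit each parameter to its joint or axis rather than collect them in a dictionary.

```python
def dispatch(objects, parameters):
    """Assign one kinematic parameter to each object, in order."""
    if len(objects) != len(parameters):
        raise ValueError("one parameter per object is required")
    moves = {}
    for obj, param in zip(objects, parameters):
        moves[obj] = param  # in a real system: send the parameter to obj
    return moves

# Two joints of a robot model, each receiving its own parameter:
moves = dispatch(
    ["base", "big_arm"],
    [{"angle": -114, "time_ms": 12378}, {"angle": 15, "time_ms": 500}],
)
```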
In one example, the kinematic parameter of each of the at least one object is used to indicate that the object should move a predetermined distance, or reach a target position, within a preset time after the current moment. The kinematic parameters set via the motion control window may be parameters for controlling the motion of the corresponding objects in real time. For example, the user may drag the slider in the slider control corresponding to the base (Basement) in the motion control window shown in Figure 17. Each time the user performs the process from pressing the slider, through dragging it, to releasing it, this may be regarded as one slide operation. Referring to Figure 17, each time the user performs a slide operation on the slider control corresponding to the base, the base of the robot model may correspondingly or synchronously move once. That is, for each slide operation performed by the user, the control device obtains one kinematic parameter accordingly, and the designated device may move once accordingly. Optionally, any kinematic parameter set in the motion control window may be added by the user to the kinematic parameter sequence displayed in the parameter editing area shown in Figure 12 or 13.
The motion control method according to the embodiment of the present invention can interact with the user to determine the kinematic parameters of the designated device (a robot, a motion control component, a robot model, or a motion control component model). This method makes it convenient for the user to perform motion control on an actually connected robot or motion control component, and to perform motion simulation, testing, and the like on a robot model or motion control component model.
The motion control interface may be referred to as an "APP". A variety of APPs in different forms can be developed for robots or motion control components to meet a variety of application demands. The APPs for robots of various configurations and for motion control components of various configurations may differ from one another; in addition, the user may develop an APP of his or her own to realize the manipulation of a robot or motion control component.
Exemplary ways of calling up the motion control window are described below.
According to an embodiment of the present invention, before the motion control window relevant to the designated device is displayed on the human-computer interaction interface (step S1610), method 1600 may further include: displaying the identification information of the designated device or of the at least one object on the human-computer interaction interface. Displaying the motion control window relevant to the designated device on the human-computer interaction interface (step S1610) may include: displaying the motion control window in response to a first click operation by the user on the identification information of the designated device or of the at least one object.
Illustratively and without limitation, the first click operation may be a left double-click operation. Illustratively, the identification information of the designated device may be displayed in the device window described above. Optionally, the identification information of any object of the designated device may also be displayed in the device window described above. In one example, the user may perform the first click operation on the identification information of the designated device to call up the motion control window for setting the kinematic parameters of the at least one object. In another example, the user may perform the first click operation on the identification information of the at least one object to call up the motion control window for setting the kinematic parameters of the at least one object. Illustratively, for each of the at least one object, the user may perform the first click operation on the identification information of that object to call up the motion control window for setting the kinematic parameters of that object.
Referring back to Fig. 6, a motion control component of model MRQ-M2305 and the identification information of the five axes of that motion control component are shown. The user may double-click with the left mouse button on the identification information of any one axis, for example on CH1, whereupon the motion control window for setting the kinematic parameters of that axis may be displayed. Figure 18 shows a schematic diagram of a motion control window, according to an embodiment of the present invention, for setting the kinematic parameters of the first axis of the motion control component shown in Fig. 6. As shown in Figure 18, the motion control window for setting the kinematic parameters of the axis CH1 is displayed.
In one example, after the user double-clicks with the left mouse button on the identification information of any robot or robot model, the motion control window for setting the kinematic parameters of all joints of that robot or robot model may be displayed (as shown in Figure 17).
Illustratively, the human-computer interaction interface includes a device window, and displaying the identification information of the designated device or of the at least one object on the human-computer interaction interface may include: displaying the identification information of the designated device or of the at least one object in the device window. The embodiment in which the identification information of devices to be controlled or device models is displayed in the device window has been described above; the embodiment of displaying the identification information of the designated device in the device window may be understood in conjunction with that description, and is not repeated here. In addition, the identification information of the at least one object of the designated device may also be displayed in the device window. Referring back to Fig. 6, the identification information of all axes of the motion control component may be displayed in the device window. Although not shown in the drawings, the identification information of all joints of a robot or robot model may likewise be displayed in the device window.
Through a first click operation on the identification information of any designated device, or of any object of a designated device, displayed in the device window, the motion control window can be called up. This way of calling up the window is simple and convenient, allowing the user to quickly call up the required motion control window.
According to an embodiment of the present invention, before the motion control window relevant to the designated device is displayed on the human-computer interaction interface (step S1610), method 1600 may further include: displaying the identification information of the designated device or of the at least one object on the human-computer interaction interface. Displaying the motion control window relevant to the designated device on the human-computer interaction interface (step S1610) may include: displaying a menu window in response to a second click operation by the user on the identification information of the designated device or of the at least one object, the menu window including a motion control menu item for controlling the opening and closing of the motion control window; and displaying the motion control window in response to a third click operation by the user on the motion control menu item.
Illustratively and without limitation, the second click operation may be a right single-click operation. Illustratively, the menu window may be a pop-up menu window, which pops up when the user performs the second click operation on the identification information of the designated device or of the at least one object. The menu window may display one or more menu items in the form of a list, which may include the motion control menu item for controlling the opening and closing of the motion control window.
Illustratively and without limitation, the third click operation may be a left single-click operation. When the user clicks any menu item in the menu window, the corresponding window (such as the motion control window) may pop up, or another corresponding operation may be executed.
According to an embodiment of the present invention, the motion control window includes parameter setting controls. Determining the respective kinematic parameters of the at least one object based on the motion control instruction input by the user in the motion control window (step S1620) may include: determining the respective kinematic parameters of the at least one object based on operations executed by the user on the parameter setting controls, where the motion control instruction includes the instructions corresponding to the operations executed by the user on the parameter setting controls.
The parameter setting controls may include operable controls of any suitable type that can interact with the user. Based on the user's operations on the parameter setting controls, the control device can determine the kinematic parameters of one or more objects. Illustratively, the above operable controls may be slider controls, text box controls, and the like. The operable controls corresponding to different objects may be controls of the same type or of different types. For example, among the at least one object, the operable controls corresponding to a first portion of the objects may be slider controls, the operable controls corresponding to a second portion may be text box controls, and the operable controls corresponding to a third portion may include both slider controls and text box controls.
According to an embodiment of the present invention, the parameter setting controls include at least one slider control in one-to-one correspondence with the objects in a first object set, the first object set including at least some of the at least one object. Determining the respective kinematic parameters of the at least one object based on the operations executed by the user on the parameter setting controls may include, for any object in the first object set: receiving operation information of a slide operation executed by the user on the slider in the slider control corresponding to that object; determining the position data in the kinematic parameter of that object based on the end position of the slider during the slide in the slider control corresponding to that object, and/or based on the position difference between the slider's initial position and end position; and/or determining the time data in the kinematic parameter of that object based on the time difference between the moment the user presses the slider at the initial position and the moment the user releases the slider at the end position. The position data indicates a target position, a rotation angle, or a movement distance; the time data indicates the time consumed to reach the target position indicated by the position data, or to complete the rotation angle or movement distance indicated by the position data.
Dragging the slider in a slider control can control the motion of the corresponding object. For example, the longer the distance over which the slider is dragged from one position to another, the larger the movement distance or rotation angle of the corresponding object may be; the shorter the time taken to drag the slider from one position to another and release it, the less time the corresponding object takes to move; and dragging the slider to the left or to the right can indicate the desired direction of motion of the corresponding object. The slider may be dragged by holding down the left mouse button and moving the mouse, or by rotating the mouse wheel.
Continuing to refer to Figure 17, five slider controls are shown corresponding to the five joints (base, upper arm, forearm, wrist, manipulator) of the robot model of model MRX-T4. The user may drag the slider control corresponding to any joint, and the control device can accordingly determine the magnitude of the kinematic parameter of that joint.
In the case where the position data of the kinematic parameter indicates a target position, the position data may be determined based on the end position of the slider during one slide operation. In the case where the position data of the kinematic parameter indicates a rotation angle or movement distance, the position data may be determined based on the initial position and end position of the slider during one slide operation. It can be understood that, in one slide operation, the initial position is the position of the slider when the user presses it, and the end position is the position of the slider when the user releases it. Illustratively, the magnitude of the position data may be proportional to the magnitude of the position difference between the slider's initial position and end position during the slide.
Illustratively, the position data may comprise an absolute value and a sign, and the position difference between the slider's initial position and end position may likewise comprise an absolute value and a sign. When the sign of the position difference is positive, for example when the slider slides to the right, the sign of the position data may be a first sign, which is used to indicate that the corresponding object moves in a first direction. When the sign of the position difference is negative, the sign of the position data may be a second sign, which is used to indicate that the corresponding object moves in a second direction. Illustratively, the first sign may be a plus sign and the first direction may be clockwise; the second sign may be a minus sign and the second direction may be counterclockwise.
Illustratively, the magnitude of the time data may be equal to the time difference between the moment the user presses the slider at the initial position and the moment the user releases the slider at the end position. The user may take different amounts of time to drag the slider the same distance; for example, the user may pause the slider for any length of time at any position of the slider control. The longer the user keeps holding the slider after pressing it, the larger the time data in the kinematic parameter of the corresponding object, which makes the corresponding object run more slowly. Therefore, for position data of the same magnitude, the user can select and set time data of different lengths as needed.
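The slide-to-parameter conventions above can be condensed into a small function. This is a hedged Python sketch under stated assumptions: the function name `slide_to_parameter` and the scale factor `GAIN` (slider units mapped one-to-one onto degrees) are inventions for the example, and the sign convention (rightward/positive difference means clockwise) follows the text.

```python
GAIN = 1.0  # assumed scale: one slider unit = one degree of rotation

def slide_to_parameter(press_pos, release_pos, press_ms, release_ms):
    """Derive a kinematic parameter from one slide operation:
    position data from the signed slider displacement,
    time data from the press/release interval."""
    displacement = release_pos - press_pos
    return {
        "position": GAIN * displacement,   # sign encodes direction of motion
        "time_ms": release_ms - press_ms,  # holding longer => slower motion
    }

# Figure 17's example: drag the base slider from 40 to -114 over 12378 ms.
param = slide_to_parameter(40, -114, 0, 12378)
```

A negative position value here corresponds to counterclockwise rotation, per the sign convention described above.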
A slider control is a simple, highly operable interactive tool; through slider controls the user can set kinematic parameters simply and conveniently.
According to an embodiment of the present invention, the motion control window may further include at least one current parameter display area corresponding to the objects in a second object set, the second object set being a subset of the first object set. For any object in the second object set, the current parameter display area corresponding to that object is used to display, in real time while the slider in the slider control corresponding to that object is being slid, the predicted value of the position the object currently reaches and/or the predicted value of the time data in the kinematic parameter of that object.
Continuing to refer to Figure 17, a current parameter display area A1 is shown above each slider control (as an example, Figure 17 marks with a dashed box only the A1 corresponding to the base). Each current parameter display area includes two labels, which respectively indicate the predicted value of the currently reached position (in °) and the predicted value of the time consumed from the initial position to the currently reached position (in ms). In the case where the position data in the kinematic parameter indicates a target position, the predicted value of the currently reached position displayed in the current parameter display area is the predicted value of the position data, and the predicted value of the time consumed from the initial position to the currently reached position is the predicted value of the time data in the kinematic parameter. The predicted values of position and time displayed above the slider control may change in real time: while the user drags the slider, they change with the slider's current position.
For example, Figure 17 shows the base currently at the 40° position. If the user presses the mouse, drags the slider to the left to -114°, and keeps holding it without releasing for 12378 ms, this process may indicate: making the base rotate counterclockwise from 40° to -114° over 12378 ms (a clockwise rotation angle of the base is positive, a counterclockwise angle is negative). In Figure 17, the 460 ms displayed in the current parameter display area corresponding to the base is the time data determined by the last slide operation.
The placement of the current parameter display areas may be arbitrary, and the present invention does not limit this. For example, the current parameter display area corresponding to any object may also be arranged below, to the left of, or to the right of the slider control corresponding to that object, and the placements of the current parameter display areas corresponding to different objects may be the same or different.
Displaying the predicted values of position and time in real time allows the user to learn in time whether the kinematic parameters being set meet the requirements, making it convenient for the user to configure and adjust the kinematic parameters; this approach can greatly improve the user experience.
According to an embodiment of the present invention, the parameter setting controls include at least one group of text box controls in one-to-one correspondence with the objects in a third object set, each group of text box controls including at least one text box control, the third object set including at least some of the at least one object. Determining the respective kinematic parameters of the at least one object based on the operations executed by the user on the parameter setting controls may include, for any object in the third object set: receiving the position data and/or time data in the kinematic parameter of that object input by the user in the group of text box controls corresponding to that object, where the position data indicates a target position, a rotation angle, or a movement distance, and the time data indicates the time consumed to reach the target position indicated by the position data, or to complete the rotation angle or movement distance indicated by the position data.
Optionally, any text box control in the at least one group of text box controls may be a text box control with a numeric increment/decrement function (as shown in Figure 17).
Illustratively, each group in the at least one group of text box controls in one-to-one correspondence with the objects in the third object set may include a position text box control for receiving the position data input by the user. Continuing to refer to Figure 17, a text box control is also shown to the left of each slider control; this text box control can be used to receive the position data input by the user. The user may input the coordinates of any target position in the text box control on the left of each slider control, and the control device can control the designated device to move to that target position next. The user may also input any predetermined distance or predetermined angle in the text box control on the left of each slider control, and the control device can control the designated device to move that predetermined distance or rotate that predetermined angle next.
In one example, each group in the at least one group of text box controls in one-to-one correspondence with the objects in the third object set may include a time text box control for receiving the time data input by the user. The user may input time data in each time text box control, and the control device can control the designated device to move to the target position, move the predetermined distance, or rotate the predetermined angle within the time indicated by the time data. In another example, the parameter setting controls may include a single time text box control, used to receive the time data of all objects in the third object set input by the user; that is, the time data of all objects in the third object set may be identical and may be set uniformly via the same time text box control. For example, referring to Figure 17, a text box control labeled "Time (s)" is shown in the "Step" column; this text box control is used to set the time data of the five joints uniformly.
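The single-shared-time variant above can be sketched as a parsing step. This is an illustrative Python sketch only; the function name `parse_parameters`, the dictionary of raw text inputs, and the positivity check are assumptions, since the patent does not describe validation.

```python
def parse_parameters(position_texts, time_text):
    """Build per-joint kinematic parameters from text box inputs:
    one position text per joint, one shared "Time (s)" text for all."""
    shared_time_s = float(time_text)
    if shared_time_s <= 0:
        raise ValueError("time must be positive")
    return {
        joint: {"position": float(text), "time_s": shared_time_s}
        for joint, text in position_texts.items()
    }

# Two joints typed in, one shared time, as in Figure 17's "Step" column:
params = parse_parameters({"base": "40", "wrist": "-15.5"}, "2.0")
```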
The motion control window may adopt any appropriate layout and may include any appropriate controls; it is not limited to the example shown in Figure 17. For example, Figure 19 shows a schematic diagram of a motion control window according to another embodiment of the present invention. User interactions relevant to the motion control window shown in Figure 19 are described below.
According to an embodiment of the present invention, in the case where the at least one object includes the end effector of the designated device, the parameter setting controls include six button controls. The six button controls are divided into three pairs in one-to-one correspondence with the three coordinate axes of a spatial rectangular coordinate system, the two button controls in each pair corresponding respectively to two opposite directions. Each of the six button controls is used to indicate motion of the end effector, on the coordinate axis corresponding to that button control, along the direction corresponding to that button control. Determining the respective kinematic parameters of the at least one object based on the operations executed by the user on the parameter setting controls may include: receiving operation information of click operations executed by the user on at least some of the six button controls; and determining the position data in the kinematic parameter of the end effector based on the user's operation information on the at least some button controls.
Referring to Figure 19, a region A2 is shown on the right side of the motion control window; this region includes six button controls "∧", "∨", "<", ">", "+", "-", where "∧" and "∨" may correspond respectively to the positive and negative directions of the Z axis in an XYZ coordinate system, "<" and ">" may correspond respectively to the negative and positive directions of the Y axis, and "+" and "-" may correspond respectively to the positive and negative directions of the X axis. For example, each time the user clicks the "<" control once, the end effector of the robot can be controlled to move a predetermined distance in the negative Y direction. This predetermined distance may be regarded as a step size. Optionally, the user may set the value of the step size using a step size setting control; optionally, the value of the step size may be equal to a default value.
With the six button controls shown in Figure 19, the end effector of the robot or robot model can be driven along any coordinate axis of the XYZ coordinate system, enabling real-time control of the end effector's motion in space. Ideally, the designated device is a physical robot connected to the control device, so that the robot produces actual motion.
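The button-to-direction mapping described above can be sketched in code. This is a minimal illustration, not an implementation from the patent: the names `BUTTON_DIRECTIONS` and `jog` are assumptions, while the six button labels, the axis directions, and the per-click step behavior are taken from the description.

```python
# Hypothetical sketch of the six-button jog mapping described above.
# The button labels and XYZ direction assignments follow the text of
# Figure 19; the jog() helper itself is illustrative.

BUTTON_DIRECTIONS = {
    "^": (0, 0, +1),   # "∧": Z axis, positive direction
    "v": (0, 0, -1),   # "∨": Z axis, negative direction
    "<": (0, -1, 0),   # "<": Y axis, negative direction
    ">": (0, +1, 0),   # ">": Y axis, positive direction
    "+": (+1, 0, 0),   # "+": X axis, positive direction
    "-": (-1, 0, 0),   # "-": X axis, negative direction
}

def jog(position, button, step=20.0):
    """Return the end-effector target after one click of `button`,
    moving `step` units along that button's direction."""
    dx, dy, dz = BUTTON_DIRECTIONS[button]
    x, y, z = position
    return (x + dx * step, y + dy * step, z + dz * step)
```

For instance, one click of "<" from the origin with the default 20 mm step would yield a target of (0, -20, 0), matching the negative-Y movement described for that button.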
In addition, a region A3 is shown on the left side of the motion control window. It contains four text box controls in which the position data and time data of the end effector can be set directly. The position data may include three coordinate values, namely the coordinates on the X, Y, and Z axes. The user may enter coordinate values in these three text box controls to indicate the next target position the end effector is to reach, and may enter time data in the topmost text box control of region A3 to indicate the time to be consumed in reaching the target position. Embodiments of setting kinematic parameters with text box controls have been described above and are not repeated here.
According to embodiments of the present invention, the parameter setting control further includes a step setting control, which indicates the distance the end effector moves per click of each of the six button controls. The step setting control is a text box control.
Optionally, the step setting control may be a text box control with increment/decrement functionality. Referring to Figure 19, a region A4 is shown below region A2 and contains two text box controls, of which the one labeled "Step (mm)" is the step setting control. The user may enter a value in the step setting control and/or adjust the value with the plus/minus buttons on its right side to obtain the step value. The distance the end effector moves per click of each of the six button controls may equal the step value set in the step setting control. The currently set step value is 20 mm, meaning that each time the user clicks any of the six button controls, the control device may control the end effector to move 20 mm in the direction corresponding to that button control.
According to embodiments of the present invention, the parameter setting control may further include a time setting control, which indicates the motion time of the end effector per click of each of the six button controls. The time setting control is a text box control.
Optionally, the time setting control may be a text box control with increment/decrement functionality. Continuing with Figure 19, the text box control labeled "t (s)" in region A4 is the time setting control. The user may enter a value in the time setting control and/or adjust it with the plus/minus buttons on its right side to obtain the time data. The motion time of the end effector per click of each of the six button controls may equal the time data set in the time setting control. The currently set time data is 1 s, meaning that each time the user clicks any of the six button controls, the control device may control the end effector to move the predetermined distance (i.e., the step length above) in the corresponding direction, taking 1 s to cover that distance.
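Taken together, one button click combines the step setting (distance) and the time setting (duration) into a single motion command. The sketch below illustrates that combination under assumed names (`click_command` is not from the patent); the 20 mm / 1 s values mirror the example settings described above.

```python
def click_command(direction, step_mm=20.0, time_s=1.0):
    """Build the motion command for one button click: move `step_mm`
    along `direction` (a unit axis vector), completing in `time_s`."""
    dx, dy, dz = direction
    return {
        "displacement": (dx * step_mm, dy * step_mm, dz * step_mm),
        "duration_s": time_s,
        # Average speed implied by the step/time pair.
        "speed_mm_s": step_mm / time_s,
    }
```

One design consequence worth noting: since step and time are set independently, the pair implicitly determines the end effector's average speed, so a smaller time value with the same step yields a faster move.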
According to embodiments of the present invention, the parameter setting control includes at least one discrete zero control corresponding one-to-one to the objects in a fourth object set, where the fourth object set includes at least some of the at least one object. Determining the respective kinematic parameters of the at least one object based on the operations the user performs on the parameter setting control includes: for any object in the fourth object set, in response to a click operation by the user on the discrete zero control corresponding to that object, setting the kinematic parameter of that object to a preset initial value.
Referring back to Figure 17, a "Zero" control is shown above the text box control used to set the position data of each object; this control is a discrete zero control. The user may click any discrete zero control to zero the kinematic parameter of the corresponding object, returning it to its preset initial value. For example, clicking the discrete zero control corresponding to the base zeroes the kinematic parameter of the base.
According to embodiments of the present invention, the parameter setting control includes an overall zero control. Determining the respective kinematic parameters of the at least one object based on the operations the user performs on the parameter setting control includes: in response to a click operation by the user on the overall zero control, setting the kinematic parameters of the at least one object to their respective preset initial values.
Referring back to Figure 17, a "To Zero" control is shown in the upper half of the motion control window; this control is the overall zero control. The user may click the overall zero control to zero the kinematic parameters of all joints of the robot together, returning them all to their respective preset initial values.
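The difference between the discrete zero controls and the overall zero control can be sketched as two small helpers. The function names and the dictionary-of-parameters representation are illustrative assumptions; the reset-to-preset-initial-value behavior is what the description specifies.

```python
def zero_object(params, name, initial=0.0):
    """Discrete zero control: reset one object's kinematic parameter
    to its preset initial value, leaving the others untouched."""
    params[name] = initial
    return params

def zero_all(params, initials=None):
    """Overall zero control: reset every object's kinematic parameter
    to its respective preset initial value in one operation."""
    initials = initials or {}
    for name in params:
        params[name] = initials.get(name, 0.0)
    return params
```

In this sketch the preset initial values default to 0.0, but `initials` allows each joint its own home value, matching the phrase "respective preset initial values".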
According to embodiments of the present invention, the designated device is a robot or robot model, and the motion control window includes a current position display area for displaying, in real time, the position data corresponding to the position currently reached by the end effector of the designated device.
Referring to Figures 17 and 19, the current position display area is marked with a dotted box. Where the designated device is a robot or robot model, it has an end effector. Whether the kinematic parameters of one or more joints are set or the kinematic parameter of the end effector is set, the end effector produces a corresponding motion. Optionally, the position of the end effector can be monitored in real time and the coordinates of the currently reached position displayed in real time. This makes it convenient for the user to check the motion of the designated device and thus know whether it is running correctly, and also to add a desirable position point to the kinematic parameter sequence of the designated device when needed, facilitating trajectory planning for the designated device.
According to embodiments of the present invention, the motion control window includes a distribution control. Respectively allocating the respective kinematic parameters of the at least one object to the at least one object (step S1630) may include: in response to a click operation performed by the user on the distribution control, respectively allocating the respective kinematic parameters of the at least one object to the at least one object.
Referring to Figure 19, a distribution control is shown in region A3. When the user clicks this control, the kinematic parameters set in the text box controls of region A3 can be allocated to the end effector, driving the end effector to move. It will be appreciated that a similar distribution control may also be provided in the motion control window shown in Figure 17, which is not described again.
For the first object set, second object set, third object set, fourth object set, and other object sets described herein, unless otherwise specified (for example, the second object set is a subset of the first object set), any two sets may share one or more objects or may share none.
According to another aspect of the present invention, a parameter editing method is provided. Figure 20 shows a schematic flowchart of a parameter editing method 2000 according to an embodiment of the present invention. As shown in Figure 20, the parameter editing method 2000 includes steps S2010 and S2020.
In step S2010, a motion control window related to a designated device is displayed on a human-computer interaction interface, where the designated device is one of a robot, a motion control component, a robot model, and a motion control component model, and the motion control window includes a parameter addition control.
The form and content of the motion control window have been described above with reference to Figures 17-19 and are not repeated here.
In step S2020, in response to a user operation on the parameter addition control, a to-be-added kinematic parameter is added to the kinematic parameter sequence of a specified object of the designated device, where the specified object is an end effector, a joint, or an axis.
Where the designated device is a robot or robot model, the specified object may be the end effector or any joint. Where the designated device is a motion control component or motion control component model, the specified object may be any axis.
Illustratively and without limitation, the parameter addition control may be a button control, and the operation on the parameter addition control in step S2020 may be a click operation, such as a left mouse click.
Referring back to Figure 19, a parameter addition control 1 is shown in region A3 and a parameter addition control 2 is shown in the upper half of the motion control window. When the user clicks parameter addition control 1, in response to the click, the kinematic parameter set in the text box controls of region A3 can be added to the kinematic parameter sequence of the end effector. The function and significance of the kinematic parameter sequence have been described above and are not repeated here. When the user clicks parameter addition control 2, in response to the click, the current kinematic parameter of the end effector can be added to the kinematic parameter sequence of the end effector. The meaning of the current kinematic parameter is described below.
While performing real-time motion control of the designated device, the user may at any time add a desirable kinematic parameter to the kinematic parameter sequence of the specified object of the designated device, facilitating trajectory planning for the specified object. For example, suppose the end effector of the robot has moved to a position P that the user considers a fairly key point. The user may then click parameter addition control 2 as shown in Figure 19, and the control device may obtain a kinematic parameter (i.e., the to-be-added kinematic parameter) based on the coordinate data of position P and add it to the kinematic parameter sequence of the end effector. The kinematic parameter sequence may initially include one or more kinematic parameters. After the to-be-added kinematic parameter is added, the sequence is updated, yielding a new kinematic parameter sequence. At any subsequent moment, the control device may send the new kinematic parameter sequence to the end effector of the robot, controlling the end effector to move along the trajectory corresponding to the new sequence.
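The add-then-dispatch workflow above can be modeled minimally as follows. The class name `KinematicSequence` and the dictionary parameter format are assumptions for illustration; the patent only specifies that a to-be-added parameter is appended and the updated sequence is later sent to the end effector.

```python
class KinematicSequence:
    """Minimal model of the kinematic parameter sequence of one object."""

    def __init__(self, params=None):
        # The sequence may initially hold one or more parameters.
        self.params = list(params or [])

    def add(self, param):
        """Add a to-be-added kinematic parameter, returning the
        updated (new) sequence."""
        self.params.append(param)
        return self.params

# The sequence starts with one parameter; the user then clicks
# parameter addition control 2 while the end effector is at P.
seq = KinematicSequence([{"x": 0, "y": 0, "z": 0, "t": 0}])
seq.add({"x": 100, "y": 50, "z": 0, "t": 5})
# At any later moment the control device could send seq.params to
# the end effector to drive motion along the corresponding trajectory.
```
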
With the parameter editing method according to embodiments of the present invention, the motion control window can interact with the user, enabling the user, while controlling the motion of the designated device (or its specified object) through the motion control window, to add any kinematic parameter to the kinematic parameter sequence at any time as needed. This facilitates both motion control of and trajectory planning for the designated device (or its specified object).
According to embodiments of the present invention, the parameter addition control may include a first button control indicating that the current kinematic parameter of the specified object is to be added to the kinematic parameter sequence; the to-be-added kinematic parameter includes the current kinematic parameter.
The current kinematic parameter may include data corresponding to the position currently reached by the specified object.
Continuing with Figure 19, when parameter addition control 2, serving as the first button control, is clicked, the coordinates shown in the current position display area may be taken as the position data of the current kinematic parameter. For example, assuming the X, Y, and Z values shown in the current position display area are 100, 50, and 0 respectively, then (X:100; Y:50; Z:0) may serve as the position data of the current kinematic parameter.
The time data in the current kinematic parameter may be a default value or may be set by the user. For example, the time data in the current kinematic parameter may default to 0, in which case the current kinematic parameter obtained would be (X:100; Y:50; Z:0; T:0).
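Forming the current kinematic parameter from the displayed coordinates plus a default time can be sketched in a few lines. The function name `current_kinematic_param` is an assumption; the (X, Y, Z, T) shape and the default time of 0 follow the example above.

```python
def current_kinematic_param(display_xyz, t_default=0.0):
    """Form the current kinematic parameter from the coordinates shown
    in the current position display area; time data defaults to 0."""
    x, y, z = display_xyz
    return {"X": x, "Y": y, "Z": z, "T": t_default}
```

With the displayed coordinates (100, 50, 0) this reproduces the worked example (X:100; Y:50; Z:0; T:0); the default time can later be edited by the user in the parameter editing area.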
The user may observe the coordinates shown in the current position display area at any time and, upon deciding that the currently reached position is a fairly key point, click the parameter addition control to add the current position to the kinematic parameter sequence as a trajectory point. This way of editing the kinematic parameter sequence requires no manual entry of kinematic parameter data by the user, which can reduce the user's workload to some extent.
According to embodiments of the present invention, the parameter addition control includes a second button control indicating that a user-set kinematic parameter of the specified object is to be added to the kinematic parameter sequence; the to-be-added kinematic parameter includes the user-set kinematic parameter.
Continuing with Figure 19, when parameter addition control 1, serving as the second button control, is clicked, the kinematic parameter set by the user may be added to the kinematic parameter sequence.
According to embodiments of the present invention, the motion control window includes a parameter setting control. Before the to-be-added kinematic parameter is added to the kinematic parameter sequence of the specified object of the designated device in response to the user operation on the parameter addition control (step S2020), the method 2000 may further include: determining the user-set kinematic parameter based on operations the user performs on the parameter setting control.
The parameter setting control involved in method 2000 is the same as that involved in the motion control method 1600 described above with reference to Figures 16-19; its form and operation can be understood from the description above and are not repeated here. The user may manually set the data of the kinematic parameter through controls such as the slider controls and text box controls described above, which makes it convenient to set and adjust the magnitude of the kinematic parameter and helps obtain fairly accurate kinematic parameters.
According to embodiments of the present invention, the parameter setting control includes at least one text box control, and determining the user-set kinematic parameter based on operations the user performs on the parameter setting control includes: receiving the position data and/or time data of the user-set kinematic parameter entered by the user in the at least one text box control.
According to embodiments of the present invention, where the specified object is the end effector, the parameter setting control includes six button controls divided into three pairs corresponding one-to-one to the three axes of a spatial rectangular coordinate system, the two button controls in each pair corresponding to the two opposite directions along that axis, and each of the six button controls indicating that the end effector is to move, along the coordinate axis corresponding to that button control, in the direction corresponding to that button control. Determining the user-set kinematic parameter based on operations the user performs on the parameter setting control includes: receiving operation information of click operations the user performs on at least some of the six button controls; and determining, based on that operation information, the position data in the user-set kinematic parameter.
According to embodiments of the present invention, the parameter setting control further includes a step setting control, which indicates the distance the end effector moves per click of each of the six button controls; the step setting control is a text box control.
According to embodiments of the present invention, the parameter setting control further includes a time setting control, which indicates the motion time of the end effector per click of each of the six button controls; the time setting control is a text box control.
According to embodiments of the present invention, the designated device is a robot or robot model, and the motion control window includes a current position display area for displaying, in real time, the position data corresponding to the position currently reached by the end effector of the designated device. The content and role of the current position display area have been described above and are not repeated here.
According to embodiments of the present invention, the motion control window includes an image display area, and the method 2000 may further include: receiving a real-time image of the designated device; and displaying the real-time image in the image display area.
The real-time image of the designated device may be captured by a camera of the control device or by a separate camera. The control device may receive the captured real-time image and display it on the human-computer interaction interface for the user to view. In this way, the user can view the motion of the designated device intuitively and at a glance, can know whether it is running correctly, and can identify which position points are suitable for addition to the kinematic parameter sequence, so as to edit the kinematic parameter sequence accurately and in time and better control the motion of the designated device (or its specified object).
According to embodiments of the present invention, the method 2000 may further include: displaying a parameter editing area on the human-computer interaction interface; and receiving the kinematic parameters edited by the user in the parameter editing area to obtain the kinematic parameter sequence. The content and display manner of the parameter editing area have been described above and are not repeated here.
According to embodiments of the present invention, the method 2000 may further include: displaying the kinematic parameter sequence in the parameter editing area; and, after the to-be-added kinematic parameter is added to the kinematic parameter sequence of the specified object of the designated device in response to the user operation on the parameter addition control (step S2020), displaying the to-be-added kinematic parameter in the parameter editing area as the line following the selected kinematic parameter in the sequence, or as the line following the last kinematic parameter in the sequence.
Illustratively, adding the to-be-added kinematic parameter to the kinematic parameter sequence may include adding it at a predetermined position in the sequence. The predetermined position may be set arbitrarily, including but not limited to: adding the to-be-added kinematic parameter as the line following the selected kinematic parameter in the sequence, or as the line following the last kinematic parameter in the sequence, where the kinematic parameters in the sequence are ordered by time data from small to large. In this case, the time data of the to-be-added kinematic parameter may be set to a default value, such as 0, and may then be modified by the user in the parameter editing area, as described below.
Illustratively, adding the to-be-added kinematic parameter to the kinematic parameter sequence may include adding it to the sequence according to the order of its time data. In this case, the time data of the to-be-added kinematic parameter may be set by the user before the addition.
The selected kinematic parameter may be the kinematic parameter currently selected by the cursor. For example, referring back to Figure 13, suppose the cursor has selected the kinematic parameter in the fourth row (serial number 3). When the to-be-added kinematic parameter is added to the sequence, it may by default be inserted between the fourth and fifth rows and displayed in the parameter editing area shown in Figure 13 according to the order after insertion.
In another example, regardless of which kinematic parameter the cursor has selected, the to-be-added kinematic parameter may by default be inserted after the last row of the sequence and displayed in the parameter editing area shown in Figure 13 according to the order after insertion.
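The two insertion policies just described (after the cursor-selected row, or always after the last row) can be sketched as one small function. The name `insert_param` and the use of a plain list are illustrative assumptions.

```python
def insert_param(sequence, param, selected_index=None):
    """Insert a to-be-added parameter after the cursor-selected row
    (when `selected_index` is given), or after the last row otherwise."""
    if selected_index is None:
        sequence.append(param)          # policy 2: always after last row
    else:
        sequence.insert(selected_index + 1, param)  # policy 1
    return sequence
```

With the Figure 13 example, inserting while row index 3 (serial number 3, the fourth row) is selected places the new parameter between the fourth and fifth rows.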
According to embodiments of the present invention, the to-be-added kinematic parameter includes position data, and the method 2000 may further include: in response to the user operation on the parameter addition control, setting the time data of the to-be-added kinematic parameter to a default value. The default value may be set as needed, for example 0, 1, or 2; the present invention does not limit it.
According to embodiments of the present invention, after the to-be-added kinematic parameter is displayed in the parameter editing area as the line following the selected kinematic parameter in the sequence or as the line following the last kinematic parameter in the sequence, the method 2000 may further include: in response to a modification operation by the user on the time data of the to-be-added kinematic parameter in the parameter editing area, modifying the time data of the to-be-added kinematic parameter.
At least some of the kinematic parameters displayed in the parameter editing area shown in Figures 12 and 13 are editable and modifiable. For example, after the time data of the to-be-added kinematic parameter defaults to 0 and is displayed in the parameter editing area, the user may revise it to 5.
According to embodiments of the present invention, the parameter editing area may include an addition function switch control, which is a button control for turning the addition function on and off. The step of adding the to-be-added kinematic parameter to the kinematic parameter sequence of the specified object of the designated device (step S2020) is executed only when the addition function is on.
Referring back to Figures 12 and 13, an addition function switch control is shown, which may be a button control. Illustratively, the user may click the addition function switch control with the left mouse button: the first click turns the addition function on, the second click turns it off, the third click turns it back on, and so on. Only when the addition function shown in Figure 12 or 13 is on can the above to-be-added kinematic parameter be added to the kinematic parameter sequence shown in Figure 12 or 13; otherwise it cannot be added.
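The alternating on/off behavior of the addition function switch is a simple toggle, and the add path is gated on its state. This is a minimal sketch under assumed names (`AddFunctionSwitch`, `try_add`); the patent specifies only the toggle semantics and the gating.

```python
class AddFunctionSwitch:
    """Button control that toggles the addition function:
    first click on, second click off, third click on, and so on."""

    def __init__(self):
        self.enabled = False

    def click(self):
        self.enabled = not self.enabled
        return self.enabled

def try_add(switch, sequence, param):
    """Perform step S2020 only while the addition function is on."""
    if switch.enabled:
        sequence.append(param)
        return True
    return False
```
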
The user may turn the addition function on or off as needed. This scheme can give the user more autonomy and helps reduce misoperation by the user in the parameter editing area.
According to another aspect of the present invention, a motion control method is provided. Figure 21 shows a schematic flowchart of a motion control method 2100 according to an embodiment of the present invention. As shown in Figure 21, the motion control method 2100 includes steps S2110, S2120, and S2130.
In step S2110, a motion control window related to a designated device is displayed on a human-computer interaction interface, where the designated device is one of a robot, a motion control component, a robot model, and a motion control component model, and the motion control window includes a parameter setting control.
In step S2120, the respective kinematic parameters of at least one object are determined based on operations the user performs on the parameter setting control, where the at least one object includes one of the following: the end effector of the designated device, at least one joint of the designated device, and at least one axis of the designated device.
In step S2130, the respective kinematic parameters of the at least one object are respectively allocated to the at least one object to control the motion of the at least one object.
According to embodiments of the present invention, the parameter setting control includes at least one slider control corresponding one-to-one to the objects in a first object set, where the first object set includes at least some of the at least one object. Determining the respective kinematic parameters of the at least one object based on the operations the user performs on the parameter setting control (step S2120) may include, for any object in the first object set: receiving operation information of a slide operation the user performs on the slider of the slider control corresponding to that object; determining the position data in the kinematic parameter of that object based on the end position of the slider during the slide and/or the position difference between the start position and the end position; and/or determining the time data in the kinematic parameter of that object based on the time difference between the moment the user presses the slider at the start position and the moment the user releases the slider at the end position. Here, the position data represents a target position, a rotation angle, or a movement distance, and the time data represents the time consumed in reaching the position indicated by the position data, or in traversing the rotation angle or movement distance indicated by the position data.
According to embodiments of the present invention, the magnitude of the time data equals the time difference.
According to embodiments of the present invention, the magnitude of the position data is proportional to the position difference.
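The slider-to-parameter mapping above (position data proportional to slider displacement, time data equal to the drag duration) can be sketched as follows. The function name `slider_to_param` and the `gain` proportionality constant are illustrative assumptions.

```python
def slider_to_param(start_pos, end_pos, t_press, t_release, gain=1.0):
    """Derive a kinematic parameter from one slider drag: the position
    data is proportional (by `gain`) to the slider's displacement, and
    the time data equals the press-to-release duration."""
    position_data = gain * (end_pos - start_pos)
    time_data = t_release - t_press
    return position_data, time_data
```

For example, dragging the slider 50 units over 2.5 seconds yields a position datum of 50 (with unit gain) and a time datum of 2.5 s, so a slower drag over the same distance produces a slower commanded motion.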
According to embodiments of the present invention, the motion control window further includes at least one current parameter display area corresponding one-to-one to the objects in a second object set, where the second object set is a subset of the first object set. For any object in the second object set, the current parameter display area corresponding to that object is used to display, in real time, the position currently reached by the slider of the corresponding slider control during sliding, the predicted value of the position data in the kinematic parameter of that object, and/or the predicted value of the time data in the kinematic parameter of that object.
According to embodiments of the present invention, the parameter setting control includes at least one group of text box controls corresponding one-to-one to the objects in a third object set, where each group includes at least one text box control and the third object set includes at least some of the at least one object. Determining the respective kinematic parameters of the at least one object based on the operations the user performs on the parameter setting control (step S2120) includes: for any object in the third object set, receiving the position data and/or time data of the kinematic parameter of that object entered by the user in the group of text box controls corresponding to that object, where the position data represents a target position, a rotation angle, or a movement distance, and the time data represents the time consumed in reaching the position indicated by the position data or in traversing the rotation angle or movement distance indicated by the position data.
According to embodiments of the present invention, the parameter setting control includes at least one discrete zero control corresponding one-to-one to the objects in a fourth object set, where the fourth object set includes at least some of the at least one object. Determining the respective kinematic parameters of the at least one object based on the operations the user performs on the parameter setting control (step S2120) includes: for any object in the fourth object set, in response to a click operation by the user on the discrete zero control corresponding to that object, setting the kinematic parameter of that object to a preset initial value.
According to embodiments of the present invention, the parameter setting control includes an overall zero control. Determining the respective kinematic parameters of the at least one object based on the operations the user performs on the parameter setting control (step S2120) includes: in response to a click operation by the user on the overall zero control, setting the kinematic parameters of the at least one object to their respective preset initial values.
The parameter setting control involved in the motion control method 2100 is the same as that involved in the motion control method 1600; its form and operation can be understood from the description above and are not repeated here.
According to embodiments of the present invention, the parameter setting control includes at least one group of button controls corresponding one-to-one to the objects in a fifth object set, where each group includes two button controls corresponding to two opposite directions, each button control in the group corresponding to an object indicates that the object is to move in the direction corresponding to that button control, and the fifth object set includes at least some of the at least one object. Determining the respective kinematic parameters of the at least one object based on the operations the user performs on the parameter setting control (step S2120) includes: for any object in the fifth object set, receiving operation information of click operations the user performs on at least some of the button controls in the group corresponding to that object; and determining, based on that operation information, the position data in the kinematic parameter of that object.
Referring back to Figure 17, a group of button controls is also shown to the right of the slider control corresponding to each joint of the robot model. Each group includes two button controls, "<" and ">", corresponding to two directions, such as clockwise and counterclockwise. Illustratively, clicking the ">" button control corresponding to the base may control the base to rotate clockwise by a predetermined angle. The magnitude of this predetermined angle may be configured with the step setting controls described below.
According to embodiments of the present invention, parameter setting control further includes at least one step-length setting control, each step-length setting Control is associated at least partly object in the 5th object set, and each of at least one step-length setting control is for referring to Show that each button control in one group of button control corresponding to any object associated with the step-length setting control clicks one The rotation angle or move distance of the secondary object, step-length setting control are Inputs.
With reference to Figure 17, the step setting controls are marked with a dashed box; the step currently set for each joint is 1, meaning that one click of a button control rotates the joint 1° clockwise or counter-clockwise. In one example, a separate step setting control can be configured for each object in the fifth object set, each setting the step of its associated object, i.e. the rotation angle or movement distance produced by a single click of any button control corresponding to that object. For example, five joints may be given five step setting controls. In another example, a single step setting control can be configured for all objects in the fifth object set, setting the step of every object associated with it. For example, five joints may share one step setting control. Other configurations are also possible: for example, five joints may be given three step setting controls, the first and second each setting the step of two joints and the third setting the step of one joint.
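The three step-control layouts just described — one control per joint, one shared control, or a mixed assignment — can be modelled as a simple joint-to-control mapping. All joint and control names below are illustrative:

```python
# Sketch of the three step-control layouts as dicts mapping each joint to the
# step setting control that governs it. Names are hypothetical.

joints = ["base", "shoulder", "elbow", "wrist", "hand"]

# Layout 1: one step setting control per joint (five controls).
per_joint = {j: f"step_{j}" for j in joints}

# Layout 2: one shared step setting control for all five joints.
shared = {j: "step_all" for j in joints}

# Layout 3: three controls; two govern two joints each, one governs one joint.
mixed = {"base": "step_A", "shoulder": "step_A",
         "elbow": "step_B", "wrist": "step_B",
         "hand": "step_C"}

def step_for(joint, layout, settings):
    """Look up the step a single button click should apply to the joint."""
    return settings[layout[joint]]

settings = {"step_all": 1.0}
print(step_for("elbow", shared, settings))  # 1.0
```

The lookup is identical in every layout; only the mapping changes, which is why the patent can treat the number of step setting controls as a free design choice.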
According to an embodiment of the present invention, the parameter setting controls further include at least one time setting control. Each time setting control is associated with at least some of the objects in the fifth object set and is used to indicate the motion time produced by a single click of any button control in the group corresponding to an object associated with that time setting control. The time setting controls are text box controls.
With reference to Figure 17, the time setting control is marked with a dashed box; the currently set time is 1, meaning that one click of either button control of any of the five joints rotates the corresponding joint by the predetermined angle, clockwise or counter-clockwise, in 1 second (s). Time setting controls are configured in the same ways as the step setting controls described above: any suitable number of time setting controls can be configured, each setting the motion time of one or more objects. The example shown in Figure 17 has a single time setting control, which sets the motion time of all five joints.
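Combining the step setting control (angle per click) with the time setting control (duration per click) fixes the speed of each jog motion. A minimal sketch, with hypothetical names:

```python
# Sketch: one button click becomes a motion command built from the step
# control (degrees) and the time control (seconds); the implied speed is
# step/duration degrees per second. All names are illustrative.

def jog_command(step_deg, duration_s, direction):
    sign = 1 if direction == ">" else -1
    return {"delta_deg": sign * step_deg,
            "duration_s": duration_s,
            "speed_deg_per_s": sign * step_deg / duration_s}

cmd = jog_command(step_deg=1.0, duration_s=1.0, direction=">")
print(cmd["speed_deg_per_s"])  # 1.0
```

With the Figure 17 defaults (step 1, time 1) each click requests a 1°/s rotation; halving the time control would double the requested speed for the same step.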
Illustratively, the motion time produced by a single click of a button control may also be a default value.
According to an embodiment of the present invention, the first object set and the fifth object set include at least one shared object, and the method further includes: for any object among the at least one shared object, synchronously adjusting the position of the thumb in the slider control corresponding to that object based on the user's operation of at least some of the button controls in the group corresponding to that object.
Continuing with Figure 17, suppose the user clicks the "<" button control corresponding to the base once; the thumb of the base's slider control then slides a predetermined distance to the left. This predetermined sliding distance can be determined from the step (rotation angle or movement distance) that a single click of the button control applies to the corresponding object; that step can be set with the step setting controls described above. Optionally, the step applied by a single click may also be a default value.
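The synchronous slider adjustment described above amounts to shifting the slider thumb by a distance proportional to the step applied to the object. A sketch, assuming an illustrative pixels-per-degree slider scale and hypothetical names:

```python
# Sketch: a jog-button click updates both the joint angle and the slider
# thumb, keeping the two in sync. The 2 px/degree scale is an assumption.

class SyncedJoint:
    PIXELS_PER_DEGREE = 2.0  # assumed slider scale, not from the patent

    def __init__(self, step_deg=1.0):
        self.step_deg = step_deg
        self.angle = 0.0      # joint angle in degrees
        self.slider_px = 0.0  # slider thumb position in pixels

    def click(self, button):
        sign = 1 if button == ">" else -1
        self.angle += sign * self.step_deg
        # synchronously shift the thumb by a proportional distance
        self.slider_px += sign * self.step_deg * self.PIXELS_PER_DEGREE

j = SyncedJoint(step_deg=1.0)
j.click("<")
print(j.angle, j.slider_px)  # -1.0 -2.0
```

Because both updates happen in the same handler, the thumb can never drift away from the angle the buttons have accumulated.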
Synchronously adjusting the thumb position in response to button-control operations keeps the thumb moving with the object, so the thumb always sits at the correct position. This helps the user control the motion of the designated equipment using button controls and slider controls together.
According to an embodiment of the present invention, the designated equipment is a robot or a robot model, and the motion control window includes a current position display area for displaying, in real time, the position data corresponding to the position currently reached by the end effector of the designated equipment. The content of the current position display area has been described above and is not repeated here.
According to a further aspect of the present invention, a control device is provided. Figure 22 shows a schematic block diagram of a control device 2200 according to an embodiment of the present invention.
As shown in Figure 22, the control device 2200 according to an embodiment of the present invention includes a display module 2210, a determination module 2220 and a distribution module 2230. These modules may respectively perform the steps/functions of the motion control method described above in conjunction with Figure 16. Only the main functions of the components of the control device 2200 are described below; details already given above are omitted.
The display module 2210 is configured to display, on a human-computer interaction interface, a motion control window related to designated equipment, the designated equipment being one of a robot, a motion control component, a robot model and a motion control component model.

The determination module 2220 is configured to determine the respective motion parameter of at least one object based on a motion control instruction that the user inputs in the motion control window, wherein the at least one object includes one of the following: an end effector of the designated equipment, at least one joint of the designated equipment, and at least one axis of the designated equipment.

The distribution module 2230 is configured to distribute the respective motion parameters of the at least one object to the at least one object, so as to control the motion of the at least one object.
Figure 23 shows a schematic block diagram of a control device 2300 according to an embodiment of the present invention. The control device 2300 includes a display 2310, a storage device (i.e. memory) 2320 and a processor 2330.
The display 2310 is used to display the above-mentioned human-computer interaction interface.

The storage device 2320 stores computer program instructions for implementing the corresponding steps of the motion control method 1600 according to an embodiment of the present invention.

The processor 2330 is used to run the computer program instructions stored in the storage device 2320, so as to perform the corresponding steps of the motion control method 1600 according to an embodiment of the present invention.

Illustratively, the control device 2300 may further include an input device for receiving instructions input by the user.

Illustratively, the aforementioned display and input device may be implemented by the same touch screen.
According to a further aspect of the present invention, a motion control system is provided, including a control device and at least one piece of designated equipment. The control device is used to perform the motion control method 1600 according to an embodiment of the present invention, so as to control the motion of the objects of the at least one piece of designated equipment. Illustratively, each of the at least one piece of designated equipment is a robot or a motion control component connected to the control device.
In addition, according to another aspect of the present invention, a storage medium is provided. The storage medium stores program instructions which, when run by a computer or processor, cause the computer or processor to perform the corresponding steps of the above motion control method 1600 according to an embodiment of the present invention. The storage medium may include, for example, a storage component of a tablet computer, a hard disk of a personal computer, read-only memory (ROM), erasable programmable read-only memory (EPROM), portable compact disc read-only memory (CD-ROM), USB memory, or any combination of the above storage media. The computer-readable storage medium may be any combination of one or more computer-readable storage media.
By reading the description of the motion control method 1600 above, those of ordinary skill in the art can understand the specific implementation of the above control device and storage medium; for brevity, details are not repeated here.
Although example embodiments have been described herein with reference to the accompanying drawings, it should be understood that the above example embodiments are merely exemplary and are not intended to limit the scope of the present invention thereto. Those of ordinary skill in the art may make various changes and modifications therein without departing from the scope and spirit of the present invention. All such changes and modifications are intended to be included within the scope of the present invention as claimed in the appended claims.
Those of ordinary skill in the art may appreciate that the units and algorithm steps described in conjunction with the embodiments disclosed herein can be implemented by electronic hardware, or by a combination of computer software and electronic hardware. Whether these functions are implemented in hardware or software depends on the specific application and design constraints of the technical solution. Skilled practitioners may use different methods to implement the described functions for each specific application, but such implementations should not be considered beyond the scope of the present invention.
It should be noted that the above embodiments illustrate rather than limit the present invention, and those skilled in the art can design alternative embodiments without departing from the scope of the appended claims. In the claims, any reference sign placed between parentheses shall not be construed as limiting the claim. The word "comprising" does not exclude the presence of elements or steps not listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The present invention may be implemented by means of hardware comprising several distinct elements and by means of a suitably programmed computer. In a unit claim enumerating several devices, several of these devices may be embodied by one and the same item of hardware. The use of the words first, second and third does not indicate any order; these words may be interpreted as names.
The above is merely a description of specific embodiments of the present invention, and the protection scope of the present invention is not limited thereto. Any change or substitution that can readily be conceived by anyone skilled in the art within the technical scope disclosed by the present invention shall be covered by the protection scope of the present invention. The protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (22)

1. A motion control method, applied to a control device, comprising:
displaying, on a human-computer interaction interface, a motion control window related to designated equipment, the designated equipment being one of a robot, a motion control component, a robot model and a motion control component model;
determining a respective motion parameter of at least one object based on a motion control instruction input by a user in the motion control window, wherein the at least one object includes one of the following: an end effector of the designated equipment, at least one joint of the designated equipment, and at least one axis of the designated equipment; and
distributing the respective motion parameters of the at least one object to the at least one object, so as to control the motion of the at least one object.
2. The method of claim 1, wherein the motion control window includes parameter setting controls, and
determining the respective motion parameter of the at least one object based on the motion control instruction input by the user in the motion control window includes:
determining the respective motion parameter of the at least one object based on an operation that the user performs on the parameter setting controls, wherein the motion control instruction includes an instruction corresponding to the operation that the user performs on the parameter setting controls.
3. The method of claim 2, wherein the parameter setting controls include at least one slider control in one-to-one correspondence with the objects in a first object set, the first object set including at least some of the at least one object, and
determining the respective motion parameter of the at least one object based on the operation that the user performs on the parameter setting controls includes:
for any object in the first object set,
receiving operation information generated when the user performs a sliding operation on the thumb of the slider control corresponding to the object;
determining the position data in the motion parameter of the object based on the end position of the thumb of the slider control corresponding to the object during the sliding and/or the position difference between its start position and end position; and/or
determining the time data in the motion parameter of the object based on the time difference between the moment the user presses the thumb at the start position and the moment the user releases the thumb at the end position;
wherein the position data indicates a target position, a rotation angle or a movement distance, and the time data indicates the time consumed to reach the target position indicated by the position data or to traverse the rotation angle or movement distance indicated by the position data.
4. The method of claim 3, wherein the size of the time data is equal to the time difference.
5. The method of claim 3, wherein the size of the position data is proportional to the size of the position difference.
6. The method of claim 3, wherein the motion control window further includes at least one current parameter display area in one-to-one correspondence with the objects in a second object set, the second object set being a subset of the first object set, and
for any object in the second object set, the current parameter display area corresponding to the object is used to display in real time, while the thumb of the slider control corresponding to the object is sliding, a predicted value of the position the object currently reaches and/or a predicted value of the time data in the motion parameter of the object.
7. The method of claim 2, wherein the parameter setting controls include at least one group of text box controls in one-to-one correspondence with the objects in a third object set, each group of text box controls including at least one text box control, the third object set including at least some of the at least one object, and
determining the respective motion parameter of the at least one object based on the operation that the user performs on the parameter setting controls includes:
for any object in the third object set, receiving the position data and/or time data in the motion parameter of the object that the user inputs in the group of text box controls corresponding to the object, wherein the position data indicates a target position, a rotation angle or a movement distance, and the time data indicates the time consumed to reach the target position indicated by the position data or to traverse the rotation angle or movement distance indicated by the position data.
8. The method of claim 2, wherein, in a case where the at least one object includes the end effector of the designated equipment, the parameter setting controls include six button controls divided into three pairs of button controls in one-to-one correspondence with the three coordinate axes of a spatial rectangular coordinate system, the two button controls of each pair corresponding respectively to two opposite directions, each of the six button controls being used to instruct the end effector to move, on the coordinate axis corresponding to the button control, along the direction corresponding to the button control, and
determining the respective motion parameter of the at least one object based on the operation that the user performs on the parameter setting controls includes:
receiving operation information generated when the user clicks at least some of the six button controls; and
determining the position data in the motion parameter of the end effector based on the user's operation information for the at least some button controls.
9. The method of claim 8, wherein the parameter setting controls further include a step setting control used to indicate the movement distance of the end effector produced by a single click of each of the six button controls, the step setting control being a text box control.
10. The method of claim 8 or 9, wherein the parameter setting controls further include a time setting control used to indicate the motion time of the end effector produced by a single click of each of the six button controls, the time setting control being a text box control.
11. The method of claim 2, wherein the parameter setting controls include at least one discrete zeroing control in one-to-one correspondence with the objects in a fourth object set, the fourth object set including at least some of the at least one object, and
determining the respective motion parameter of the at least one object based on the operation that the user performs on the parameter setting controls includes:
for any object in the fourth object set, setting the motion parameter of the object to a preset initial value in response to a click operation by the user on the discrete zeroing control corresponding to the object.
12. The method of claim 2, wherein the parameter setting controls include an overall zeroing control, and
determining the respective motion parameter of the at least one object based on the operation that the user performs on the parameter setting controls includes:
setting the motion parameters of the at least one object to their respective preset initial values in response to a click operation by the user on the overall zeroing control.
13. The method of any one of claims 1 to 9, wherein the designated equipment is a robot or a robot model, and the motion control window includes a current position display area for displaying in real time the position data corresponding to the position currently reached by the end effector of the designated equipment.
14. The method of any one of claims 1 to 9, wherein,
before displaying the motion control window related to the designated equipment on the human-computer interaction interface, the method further includes:
displaying identification information of the designated equipment or of the at least one object on the human-computer interaction interface; and
displaying the motion control window related to the designated equipment on the human-computer interaction interface includes:
displaying the motion control window in response to a first click operation by the user on the identification information of the designated equipment or of the at least one object.
15. The method of any one of claims 1 to 9, wherein,
before displaying the motion control window related to the designated equipment on the human-computer interaction interface, the method further includes:
displaying identification information of the designated equipment or of the at least one object on the human-computer interaction interface; and
displaying the motion control window related to the designated equipment on the human-computer interaction interface includes:
displaying, in response to a second click operation by the user on the identification information of the designated equipment or of the at least one object, a menu window including a motion control menu item for controlling the opening and closing of the motion control window; and
displaying the motion control window in response to a third click operation by the user on the motion control menu item.
16. The method of any one of claims 1 to 9, wherein the motion parameter of each of the at least one object is used to instruct the object to move a predetermined distance or reach a target position within a preset time after the current moment.
17. The method of any one of claims 1 to 9, wherein the motion control window includes a distribution control, and distributing the respective motion parameters of the at least one object to the at least one object includes:
distributing the respective motion parameters of the at least one object to the at least one object in response to a click operation that the user performs on the distribution control.
18. The method of any one of claims 1 to 9, wherein the designated equipment is one of a number of editable devices, the editable devices including all device models in a model set and/or at least one device to be controlled that has established a connection with the control device, the at least one device to be controlled including at least one robot and/or at least one motion control component, the model set including at least one device model, each device model being a robot model or a motion control component model.
19. A control device, comprising:
a display module for displaying, on a human-computer interaction interface, a motion control window related to designated equipment, the designated equipment being one of a robot, a motion control component, a robot model and a motion control component model;
a determination module for determining a respective motion parameter of at least one object based on a motion control instruction input by a user in the motion control window, wherein the at least one object includes one of the following: an end effector of the designated equipment, at least one joint of the designated equipment, and at least one axis of the designated equipment; and
a distribution module for distributing the respective motion parameters of the at least one object to the at least one object, so as to control the motion of the at least one object.
20. A control device, comprising a display, a processor and a memory, wherein the display is used to display a human-computer interaction interface, the memory stores computer program instructions, and the computer program instructions, when run by the processor, are used to perform the motion control method of any one of claims 1 to 18.
21. A motion control system, comprising a control device and at least one piece of designated equipment, the control device being used to perform the motion control method of any one of claims 1 to 18, so as to control the motion of the objects of the at least one piece of designated equipment.
22. A storage medium storing program instructions which, when run, are used to perform the motion control method of any one of claims 1 to 18.
CN201910154655.7A 2019-02-28 2019-02-28 Motion control method and system, control device, and storage medium Active CN109807896B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910154655.7A CN109807896B (en) 2019-02-28 2019-02-28 Motion control method and system, control device, and storage medium


Publications (2)

Publication Number Publication Date
CN109807896A true CN109807896A (en) 2019-05-28
CN109807896B CN109807896B (en) 2021-05-04

Family

ID=66607910

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910154655.7A Active CN109807896B (en) 2019-02-28 2019-02-28 Motion control method and system, control device, and storage medium

Country Status (1)

Country Link
CN (1) CN109807896B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110293539A (en) * 2019-06-24 2019-10-01 佛山智异科技开发有限公司 Implementation method, device and the teaching machine of industrial robot teaching device software architecture

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11898996B2 (en) * 2022-01-03 2024-02-13 Teng-Jen Yang Test system with detection feedback

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150273685A1 (en) * 2014-04-01 2015-10-01 Bot & Dolly, Llc Software Interface for Authoring Robotic Manufacturing Process
US20170206064A1 (en) * 2013-03-15 2017-07-20 JIBO, Inc. Persistent companion device configuration and deployment platform
CN106985150A (en) * 2017-03-21 2017-07-28 深圳泰坦创新科技有限公司 The method and apparatus of control machine human action
CN107932504A (en) * 2017-11-13 2018-04-20 浙江工业大学 Mechanical arm operation control system based on PyQt
CN109035740A (en) * 2018-09-27 2018-12-18 上海节卡机器人科技有限公司 Control method, device and the tele-control system of robot



Also Published As

Publication number Publication date
CN109807896B (en) 2021-05-04

Similar Documents

Publication Publication Date Title
CN109986559A (en) Parameter edit methods and system, control equipment and storage medium
US20220009100A1 (en) Software Interface for Authoring Robotic Manufacturing Process
Ma et al. Digital twin enhanced human-machine interaction in product lifecycle
CN106142091B (en) Information processing method and information processing unit
CN106200983B (en) A kind of system of combination virtual reality and BIM realization virtual reality scenario architectural design
US10521522B2 (en) Robot simulator and file generation method for robot simulator
US6718231B2 (en) Authoring system and authoring method, and storage medium
CN110000753A (en) User interaction approach, control equipment and storage medium
CN109910004A (en) User interaction approach, control equipment and storage medium
CN109807898A (en) Motion control method, control equipment and storage medium
CN109807896A (en) Motion control method and system, control equipment and storage medium
Holubek et al. An innovative approach of industrial robot programming using virtual reality for the design of production systems layout
CN109605378A (en) Processing method, device and system and the storage medium of kinematic parameter
CN110000775A (en) Device management method, control equipment and storage medium
CN109807897A (en) Motion control method and system, control equipment and storage medium
Wang et al. LabVIEW-based data acquisition system design
CN109551484A (en) Processing method, device and system and the storage medium of kinematic parameter
Buchholz et al. Design of a test environment for planning and interaction with virtual production processes
Talaba et al. Product engineering: tools and methods based on virtual reality
Osorio-Gómez et al. An augmented reality tool to validate the assembly sequence of a discrete product
Colceriu et al. User-centered design of an intuitive robot playback programming system
Aksonov et al. Interactive design of CNC equipment operator panels
CN106652258A (en) Book borrowing and returning control system for library
Nilles et al. Improv: Live coding for robot motion design
CN111862297A (en) Coon 3D-based collaborative robot visual simulation teaching method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20191212

Address after: No.1705, building 8, Qianhai preeminent Financial Center (phase I), unit 2, guiwan District, Nanshan street, Qianhai Shenzhen Hong Kong cooperation zone, Shenzhen City, Guangdong Province

Applicant after: Mga Technology (Shenzhen) Co., Ltd

Address before: 102208 1, unit 1, 1 hospital, lung Yuan middle street, Changping District, Beijing 1109

Applicant before: Beijing Megarobo Technologies Co., Ltd.

TA01 Transfer of patent application right
CB02 Change of applicant information

Address after: 518052 1705, building 8, Qianhai excellence Financial Center (phase I), unit 2, guiwan area, Nanshan street, Qianhai Shenzhen Hong Kong cooperation zone, Shenzhen City, Guangdong Province

Applicant after: Shenzhen mga Technology Co.,Ltd.

Address before: 1705, building 8, Qianhai excellence Financial Center (phase I), unit 2, guiwan area, Nanshan street, Qianhai Shenzhen Hong Kong cooperation zone, Shenzhen, Guangdong 518000

Applicant before: Mga Technology (Shenzhen) Co.,Ltd.

CB02 Change of applicant information
GR01 Patent grant