CN109822565A - Robot control method, system and storage medium - Google Patents

Robot control method, system and storage medium

Info

Publication number: CN109822565A
Authority: CN (China)
Prior art keywords: end effector, user, robot, user interface, operating area
Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Application number: CN201910037228.0A
Other languages: Chinese (zh)
Inventor: 胡军
Current Assignee: MGA Technology Shenzhen Co Ltd (the listed assignees may be inaccurate)
Original Assignee: Megarobo Technologies Co Ltd
Application filed by Megarobo Technologies Co Ltd
Priority to CN201910037228.0A
Publication of CN109822565A

Landscapes

  • Manipulator (AREA)

Abstract

The embodiment provides a kind of robot control method, system and storage mediums.Wherein the robot includes end effector, which comprises obtains the location information of the end effector;User interface is shown, wherein the user interface includes operating area;And it controls the end effector in the operation of the operating area in response to user and executes corresponding, relevant to location information movement.Above-mentioned technical proposal realizes simple, the flexible control to robot in the case where not increasing hardware device, solves the problems, such as the cost and human resources repetition and waste of most of robot controls at present, greatly improves user experience.

Description

Robot control method, system and storage medium
Technical field
The present invention relates to the field of robotics, and more particularly to a robot control method, a robot control system, and a storage medium.
Background technique
At present, robot control is mostly realized by measuring the task in advance and fixing a program according to the purpose of operation and a relatively uniform task. This control mode can only make the robot operate along a preset movement route and/or trajectory, within a single or fixed pattern. In other words, this control mode can only achieve one machine for one use. Once the task of the robot needs to change, reprogramming or even redesign is usually required, which causes repeated waste of cost and human resources and also results in a poor user experience.
Summary of the invention
The present invention is proposed in view of the above problem. The present invention provides a robot control method, a robot control system, and a storage medium.
According to one aspect of an embodiment of the present invention, a robot control method is provided, wherein the robot includes an end effector, and the method comprises:
obtaining position information of the end effector;
displaying a user interface, wherein the user interface includes an operating area; and
in response to a user operation in the operating area, controlling the end effector to perform a corresponding action related to the position information.
Illustratively, obtaining the position information of the end effector includes:
obtaining coordinate values of the end effector in a Cartesian rectangular coordinate system;
the operating area includes a first area, the first area corresponding to a first plane in which the end effector is located, the first plane being parallel to the plane spanned by the X-axis and Y-axis of the Cartesian coordinate system; and
the user interface further includes a first operable control located in the first area, to be operated by the user in order to control the end effector to perform the action in the first plane.
Illustratively, the first area is a circular area, and the first operable control is located at the center of the circular area, to be dragged by the user;
controlling the end effector, in response to the user operation in the operating area, to perform the corresponding action related to the position information includes:
determining, in the first plane, a moving direction corresponding to the direction of the drag operation;
determining a moving distance of the end effector according to the distance of the drag operation; and
controlling the end effector to move the moving distance in the moving direction.
Illustratively, determining the moving distance of the end effector according to the distance of the drag operation includes:
determining the moving distance according to the formula D_v = (D_f / R) · δ,
where D_v denotes the moving distance, D_f denotes the distance of the drag operation, R denotes the radius of the circular area, and δ denotes the maximum step.
Illustratively, the user interface further includes a second operable control, to be operated by the user to adjust the correspondence between each radius in the circular area and the directions in the first plane.
Illustratively, the user interface further includes a first display area for displaying the current pose of the end effector and the axes of the Cartesian coordinate system based on the current pose.
Illustratively, the operating area further includes a second area, the second area containing a line segment corresponding to the Z-axis of the Cartesian coordinate system; and
the user interface further includes a third operable control located on the line segment, to be dragged by the user in order to control the end effector to move a corresponding distance along the Z-axis.
Illustratively,
the end effector is a jaw;
the operating area includes a third area; and
the user interface further includes a fourth operable control located in the third area, to be clicked by the user in order to control the opening angle of the jaw.
Illustratively, the user interface further includes a second display area for displaying the position information of the end effector.
Illustratively, controlling the end effector, in response to the user operation in the operating area, to perform the corresponding action related to the position information includes:
in response to the user operation in the operating area, controlling the end effector in real time to perform the corresponding action related to the position information; or
after the user completes the operation in the operating area, controlling the end effector to perform the corresponding action related to the position information.
According to another aspect of an embodiment of the present invention, a robot control system is further provided, including a display and a processor:
the display is configured to display a user interface, wherein the user interface includes an operating area; and
the processor is configured to obtain position information of an end effector of the robot, and, in response to a user operation in the operating area, to control the end effector to perform a corresponding action related to the position information.
According to yet another aspect of an embodiment of the present invention, a storage medium is further provided, on which program instructions are stored, the program instructions being used at runtime to execute the above robot control method.
The above technical solution achieves simple, flexible control of the robot without adding hardware, solves the problem of repeated waste of cost and human resources in most current robot control, and greatly improves the user experience.
The above is only an overview of the technical solution of the present invention. In order to better understand the technical means of the present invention so that it can be implemented in accordance with the contents of the specification, and in order to make the above and other objects, features and advantages of the present invention clearer and more comprehensible, specific embodiments of the present invention are set forth below.
Detailed description of the invention
The above and other objects, features and advantages of the present invention will become more apparent from the following detailed description of embodiments of the present invention in conjunction with the accompanying drawings. The accompanying drawings are provided for a further understanding of the embodiments of the present invention, constitute a part of the specification, and serve, together with the embodiments, to explain the present invention; they are not to be construed as limiting the present invention. In the drawings, identical reference labels generally denote identical parts or steps.
Fig. 1 shows a schematic flowchart of a robot control method according to an embodiment of the present invention;
Fig. 2 shows a schematic diagram of a user interface according to an embodiment of the present invention.
Specific embodiment
In order to make the objects, technical solutions and advantages of the present invention clearer, example embodiments according to the present invention are described in detail below with reference to the accompanying drawings. Obviously, the described embodiments are only some, rather than all, of the embodiments of the present invention, and it should be understood that the present invention is not limited by the example embodiments described herein. All other embodiments obtained by those skilled in the art based on the embodiments described herein without creative labor shall fall within the scope of the present invention.
According to an embodiment of the present invention, a robot control method is provided. A robot is a mechanical device that performs work automatically. A robot may include a robot body and an end effector (also called a tool). The body may include multiple joints, such as a base, an upper arm, a forearm and a wrist. The end effector is, for example, a jaw that can be opened and closed, and may also be another working tool. The end effector is controlled by the robot control system to move along a given route and complete predetermined actions. Specifically, for example, the end effector is manipulated by the robot so as to move in three-dimensional space and perform related actions at a specified position, such as grasping, releasing, or other actions.
Fig. 1 shows a schematic flowchart of a robot control method 100 according to an embodiment of the present invention. As shown in Fig. 1, the robot control method 100 includes the following steps:
Step S110: obtain position information of the end effector of the robot.
When the end effector is manipulated, in order to accurately control its trajectory during motion and to enable it to perform related actions at specified spatial positions, a coordinate system of the robot may be established to determine the position information of the end effector. In this way, the motion trajectory of the end effector can be set or controlled, and it can be made to perform related actions at specified spatial positions.
Optionally, the coordinate system may be the robot body coordinate system, with the center point of the robot base as the coordinate origin. The base of the robot remains stationary while the joints of the robot perform operations. Therefore, performing robot control in the robot body coordinate system avoids transformations between coordinate systems and simplifies calculation.
It will be appreciated that the position information of the end effector may be obtained using various suitable sensors, such as encoders or angle sensors.
It will also be appreciated that the end effector is a tool occupying a certain space, rather than a single point. For convenience of calculation, the position information of one point in the coordinate system is used as the position information of the end effector. Optionally, the position of a certain point of the end effector, or of a certain point in the space it occupies, is used as the position of the end effector. Specifically, for example, if the end effector is a cone-like tool, the position of its tip endpoint may be used as the position of the end effector. As another example, if the end effector is a jaw that can be opened and closed, the position of the center point of the geometric planar figure formed by the endpoints of the jaw's teeth may be used as the position of the end effector.
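The representative-point idea above can be sketched as follows; this is an illustrative example only, not code from the patent, and the function name and tooth coordinates are hypothetical:

```python
# Sketch: represent a jaw end effector by the centroid of its tooth-tip
# endpoints, as described above. All names and values are illustrative.

def effector_position(tooth_tips):
    """Return the centroid (x, y, z) of the jaw's tooth-tip points."""
    n = len(tooth_tips)
    return tuple(sum(p[i] for p in tooth_tips) / n for i in range(3))

# Four tooth tips forming a square in the plane z = 512.0
tips = [(1.0, 1.0, 512.0), (-1.0, 1.0, 512.0),
        (-1.0, -1.0, 512.0), (1.0, -1.0, 512.0)]
print(effector_position(tips))  # (0.0, 0.0, 512.0)
```

For a four-tooth jaw this centroid coincides with the center of the rectangle formed by the tooth tips, which matches the second display area described later.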
Step S120: display a user interface, wherein the user interface includes an operating area.
In order to control the motion of the end effector of the robot and make it perform related actions in a more convenient and intuitive way, a user interface for human-computer interaction is provided. Based on this user interface, the user can control the robot more intuitively and visually. The user interface includes an operating area for robot control, which receives user operations and controls the end effector of the robot to perform corresponding actions according to those operations.
Step S130: in response to a user operation in the operating area, control the end effector to perform a corresponding action related to the position information obtained in step S110.
Based on the operating area provided in the user interface described in step S120, the user can perform related operations in the operating area. In response to a user operation in the operating area, i.e. upon receiving the user's operating instruction through the user interface, the end effector is controlled to perform the action corresponding to the operation. It can be understood that this action is related to the position information of the end effector obtained in step S110. For example, the action may be moving a specific distance in a specific direction from the position determined in step S110; it may also be performing an operation such as grasping at the position determined in step S110.
In one example, the operating mode based on the operating area in the user interface may include one or more human-computer interaction modes. Specifically, for example, the operation in the operating area may be realized by entering operating instructions; it may be completed in a more intuitive and visual way, such as a simulated drag, open or close performed in the operating area; or it may use any other human-computer interaction mode capable of completing the operation.
The above robot control method overcomes the problem that most current robots require the task to be measured in advance and a program to be fixed according to the purpose of operation and a relatively uniform task, and is not restricted to a preset movement route and/or trajectory within a single or fixed pattern. It can control the robot simply and flexibly for different tasks without adding hardware, and without reprogramming or redesigning for each task. This not only reduces installation cost and avoids repeated waste of human resources, but also greatly improves the user experience.
Fig. 2 shows a schematic diagram of a user interface according to an embodiment of the present invention. The user interface is used for an I-shaped, ergonomically shaped, stepless damped joystick. The joystick can be used to manipulate the jaw of the robot arm body to approach and reach the object to be handled, grip the object, and transport it to the target position. The upper part of the user interface in Fig. 2 is a display area, and the lower part is the operating area.
Illustratively, a Cartesian rectangular coordinate system is defined to determine the position information of the end effector. In step S110 of the above robot control method 100, the coordinate values of the end effector in the Cartesian coordinate system are obtained. The Cartesian coordinate system may be defined according to the installation position of the robot. For example, if the robot is installed on the ground, the X-axis and Y-axis of the Cartesian coordinate system may be defined in the horizontal plane. As another example, if the robot is mounted on a wall perpendicular to the ground, the X-axis and Y-axis may be defined in the vertical plane parallel to the wall.
The user interface shown in Fig. 2 includes an operating area for the user to operate, i.e. to control the end effector of the robot to perform related actions.
Optionally, the operating area includes a first area, the first area corresponding to a first plane in the robot's motion space. The first plane is the plane in which the end effector is located, parallel to the plane spanned by the X-axis and Y-axis of the Cartesian coordinate system. It can be understood that operations in the first area are mapped to controlling the end effector to perform corresponding actions in the first plane.
The first area further contains a first operable control. The first operable control is operated by the user to control the end effector to perform, in the first plane, the action corresponding to the user's operation. The action of the end effector may include movement in the first plane, in different directions, starting from the initial position (for example the position determined in step S110). It can be understood that step S110 may be executed again after the end effector has performed the action, so as to update the position information of the end effector. This allows the user to know the position of the end effector of the robot in a more timely manner.
The above technical solution obtains the coordinate values of the end effector in the Cartesian coordinate system and provides a control for making the end effector perform corresponding actions in the first plane, which simplifies operation and makes it convenient for the user.
Illustratively, the first area is a circular area, and the first operable control is located at the center of the circular area, to be dragged by the user. In the user interface shown in Fig. 2, the first operable control is displayed as a round block smaller than the first area.
The user can drag the first operable control within the first area to control the end effector to move in the first plane. For example, the user can select the first operable control, drag it in some direction without releasing it, and release the control after reaching the target position.
Optionally, step S130 of the above robot control method 100, in which the end effector is controlled, in response to the user operation in the operating area, to perform the corresponding action related to the position information, may include the following steps.
First, the moving direction in the first plane corresponding to the direction in which the user drags the first operable control is determined. Then, the moving distance of the end effector is determined according to the distance of the drag operation. Finally, the end effector is controlled to move the moving distance in the moving direction. Illustratively, the end effector may be driven by stepless speed regulation to move the moving distance.
When the user is about to drag the first operable control, the current position of the end effector (for example the position determined in step S110) corresponds to the center of the first area. When the user drags the first operable control, the direction of the drag corresponds to the direction in which the end effector moves, and the distance over which the user drags the first operable control corresponds to the distance the end effector moves. In short, the end effector moves a corresponding distance in the first plane according to the direction in which the first operable control is dragged.
The above robot control method uses a joystick-like operating mode with a dashboard-like real-time display, realizing simple, flexible control of the robot across multiple operating dimensions.
Illustratively, the moving distance of the end effector in the first plane is determined from the drag distance of the first operable control in the first area according to the formula D_v = (D_f / R) · δ, where D_v denotes the moving distance of the end effector, D_f denotes the distance of the drag operation, R denotes the radius of the circular area, and δ denotes the maximum step. The maximum step is the moving distance of the end effector when the drag distance equals the radius R of the circular area.
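The drag-to-movement mapping above can be sketched as a small function; this is an illustrative sketch only (function and parameter names are hypothetical, not from the patent):

```python
import math

def drag_to_move(dx, dy, radius, max_step):
    """Map a drag of the control knob by (dx, dy) inside a circular region
    of the given radius to a (direction, distance) movement in the first
    plane, using D_v = (D_f / R) * delta as described above."""
    d_f = math.hypot(dx, dy)            # drag distance D_f
    d_v = (d_f / radius) * max_step     # moving distance D_v
    angle = math.atan2(dy, dx)          # moving direction in the plane
    return angle, d_v

# Dragging the knob to the edge of the circle (D_f == R) moves the
# end effector by exactly the maximum step delta.
angle, dist = drag_to_move(3.0, 4.0, radius=5.0, max_step=10.0)
```

With a 3-4-5 drag vector, `d_f` equals the radius, so the returned distance is the full maximum step.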
Optionally, after being released, the first operable control automatically returns to the center of the first area, ready for the next operation.
Illustratively, when the end effector is to be moved a longer distance, the first operable control can be dragged multiple times: the operations of selecting, dragging without releasing, and then releasing are repeated until the end effector reaches the target position.
Sometimes, when operating the robot, the user finds that his or her sitting posture and operating habits are inconsistent with the directions of the robot coordinate system. This easily causes errors, or an awkward feeling, when the user controls the end effector of the robot by operating the first operable control in the first area. Specifically, for example, the user may habitually regard the top of the first area as the front of the robot and the left of the first area as the left of the robot. If, in the user's current sitting posture, the robot coordinate system is consistent with this habit, the user does not need to worry about direction transformations and will not feel awkward.
Specifically, suppose the user expects the end effector of the robot to move forward. If, in the user's current sitting posture, the robot coordinate system matches the user's habit, then dragging the first operable control straight upward will move the end effector forward. If, however, the direction of the robot coordinate system differs from the user's habitual direction by 180°, then dragging the first operable control upward will move the end effector backward. In other words, to make the end effector move forward in this case, the user must drag the first operable control downward. The user is thus required to take this into account when performing drag operations, which is not only inconvenient but also prone to misoperation.
In order to solve the problem described above, the user interface may further include a second operable control. As shown in Fig. 2, the control marked above the first area is the second operable control. The second operable control is operated by the user to adjust the correspondence between each radius in the circular first area and the directions in the first plane.
Illustratively, one radius, or two perpendicular radii, may be marked in the circular area to identify the directions of radii in the circular area. The circular area in Fig. 2 shows two perpendicular radii. The directions of the two marked radii correspond respectively to the X-axis and Y-axis of the Cartesian coordinate system. In response to the user's operation of the second operable control, the two radii marked in the circular area can be rotated, indicating a change in the correspondence between the radii in the circular area and the directions in the first plane.
The second operable control may be a button; the user can click the button to adjust the correspondence between the radii in the circular area and the directions in the first plane. The second operable control may also be a slider bar; the user can slide it to adjust the correspondence. The second operable control may also be an input box; the user can enter a specific rotation angle to rotate the two radii in the circular area. Those of ordinary skill in the art will understand that the above implementations of the second operable control are examples and are not to be construed as limiting the present invention.
By providing a second operable control to adjust the correspondence between the radii in the circular area and the directions in the first plane, on the one hand the robot no longer has to be deliberately placed at a particular angle; on the other hand, the interface adapts to the operating habits of different users. This not only simplifies the direction conversion the user must perform during operation, but also avoids misoperation.
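One way to realize the adjustable radius-to-direction correspondence is to rotate the drag vector by a user-set angle before mapping it to the robot's X/Y axes. The sketch below is illustrative only; the function name and the choice of a plain rotation matrix are assumptions, not taken from the patent:

```python
import math

def rotate_drag(dx, dy, theta):
    """Rotate a drag vector by theta radians before mapping it to the X/Y
    axes of the robot frame, so the marked radii of the circular area can
    be re-aligned with the user's seating direction."""
    cos_t, sin_t = math.cos(theta), math.sin(theta)
    return (dx * cos_t - dy * sin_t, dx * sin_t + dy * cos_t)

# With a 180-degree offset, dragging "up" on screen maps to the
# opposite direction in the robot frame, as in the example above.
rx, ry = rotate_drag(0.0, 1.0, math.pi)
```

Setting `theta` once (via the button, slider bar, or input box described above) removes the mental direction conversion from every subsequent drag.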
Illustratively, it as shown in Fig. 2, the operating area of user interface further includes second area, is wrapped in the second area Include the line segment of the Z axis corresponding to cartesian cartesian coordinate system, i.e. vertical line part in Fig. 2.The user interface further includes being located at Third on the line segment can operational controls, for carrying out drag operation by user to control the end effector along the Z axis Mobile corresponding distance.Dragged up and down along Z axis the third can operational controls, i.e., controllable end effector is in Z-direction Upper movement, for example, being raised and lowered along the vertical direction.
Based on third can operational controls provide operating function, user, can when operational tip actuator is raised and lowered With without measurement and fixed program in advance, operating process is simple, visualizes, the experience of user is greatly improved.
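A linear mapping from the slider position on the line segment to a Z displacement is one plausible realization of the third operable control; the patent does not specify the mapping, so the function below is an assumed sketch with hypothetical names:

```python
def slider_to_z(slider_pos, slider_len, z_range):
    """Map a position of the third operable control along the Z-axis line
    segment to a Z displacement, assuming a simple linear mapping."""
    return (slider_pos / slider_len) * z_range

# A quarter of the slider travel maps to a quarter of the Z range.
dz = slider_to_z(25.0, 100.0, 200.0)
```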
Illustratively, the end effector of the robot is a jaw that can be used to grasp or grip objects. For example, in one application scenario, the jaw of the robot is controlled to move an object from point A to point B. The jaw may first be moved to point A; then the opening of the jaw is controlled to a certain angle, the opened jaw is placed on the object, and the jaw is closed until it can grasp the object. The jaw is then moved to point B, where it is opened to release the object. In the above process, the actions of the jaw include grasping and releasing, both of which require controlling the opening angle of the jaw. In the user interface shown in Fig. 2, the fourth operable control located in the third area is clicked by the user to control the opening angle of the jaw. The "+" button of the fourth operable control is used to increase the angle of the jaw, and the "-" button to decrease it. The fourth operable control thus realizes incremental control of the opening angle of the jaw, i.e. a fourth dimension of the robot's end effector beyond the three spatial coordinates, which facilitates robot control by the user.
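The incremental "+"/"-" control of the jaw angle can be sketched as below; the class name, step size and angle limits are assumed for illustration and are not specified by the patent:

```python
class Jaw:
    """Sketch of incremental open/close control of the jaw angle via the
    "+" and "-" buttons described above; limits and step are assumptions."""
    def __init__(self, angle=0.0, step=5.0, min_angle=0.0, max_angle=90.0):
        self.angle = angle
        self.step = step
        self.min_angle = min_angle
        self.max_angle = max_angle

    def plus(self):
        # "+" button: widen the opening angle, clamped to the maximum
        self.angle = min(self.angle + self.step, self.max_angle)

    def minus(self):
        # "-" button: narrow the opening angle, clamped to the minimum
        self.angle = max(self.angle - self.step, self.min_angle)

jaw = Jaw()
jaw.plus(); jaw.plus(); jaw.minus()  # two widening clicks, one narrowing
```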
Illustratively, as shown in the upper left corner of Fig. 2, the user interface further includes a first display area for displaying the current pose of the end effector of the robot and the axes of the Cartesian coordinate system based on the current pose. The first display area shows the current pose of the I-shaped ergonomic stepless damped joystick and the Cartesian coordinate system in which it is located. Through the first display area, the user can intuitively understand the orientation of the Cartesian coordinate system of the robot, making it easier to control the end effector as desired.
Illustratively, the user interface further includes a second display area for displaying the position information of the end effector of the robot. The position information may be the coordinate values in the three-dimensional Cartesian coordinate system. For example, as shown in Fig. 2, the second display area shows the coordinate values of the jaw of the robot in a certain state: X = 231.362; Y = 0.38932; Z = 512.263. It can be understood that these may be the coordinate values of the center point of the jaw. For a four-tooth jaw, the center point of the jaw is the center of the rectangle or square formed by the tips of the four teeth.
The position information shown in the second display area helps the user follow the position changes of the end effector in real time during its motion, which is conducive to better control of the end effector.
Illustratively, the robot may be controlled in a variety of different modes.
In one example, in response to the user operation in the operating area, the end effector is controlled in real time to perform the action corresponding to the operation and related to the current position information of the end effector. Specifically, for example, if the first operable control is associated in real time with the movement of the end effector of the robot, then while the first operable control is being dragged the end effector moves at the same time; that is, as soon as the first operable control moves, the end effector follows it in the direction of its movement. Real-time control allows the user to see the control effect at any time, so as to decide the next control step.
In another example, the end effector is controlled to perform the action corresponding to the operation and related to its current position information only after the user completes the operation in the operating area. In this mode, the operation of the first operable control and the movement of the end effector of the robot are not associated in real time. For example, only after the first operable control has been dragged to some position and released does the end effector begin to move, rather than following in real time.
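The two association modes can be contrasted with a small sketch; the class, mode names, and event methods are hypothetical and chosen only to illustrate the real-time versus on-release behavior described above:

```python
# Sketch of the two association modes: in "realtime" mode every drag event
# moves the effector immediately; in "deferred" mode a single motion is
# issued only when the control is released. Illustrative only.

class Controller:
    def __init__(self, mode="realtime"):
        self.mode = mode
        self.moves = []          # motion commands issued to the robot
        self._pending = None     # latest drag target in deferred mode

    def on_drag(self, delta):
        if self.mode == "realtime":
            self.moves.append(delta)   # follow the control in real time
        else:
            self._pending = delta      # remember only the latest target

    def on_release(self):
        if self.mode == "deferred" and self._pending is not None:
            self.moves.append(self._pending)
            self._pending = None

rt = Controller("realtime")
for d in [(1, 0), (2, 0), (3, 0)]:
    rt.on_drag(d)
rt.on_release()

df = Controller("deferred")
for d in [(1, 0), (2, 0), (3, 0)]:
    df.on_drag(d)
df.on_release()
```

The real-time controller issues one command per drag event, while the deferred controller issues a single command at release, matching the two examples above.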
Illustratively, the operating area of the user interface also provides an operation for emergency stop of the robot; for example, the button in the upper right corner of Fig. 2 represents the emergency stop operation.
According to a further embodiment of the present invention, a robot control system is further provided. The robot control system includes a display and a processor. The display is configured to display a user interface, wherein the user interface includes an operating area. The processor is configured to obtain position information of the end effector of the robot, and, in response to a user operation in the operating area, to control the end effector to perform a corresponding action related to the position information. The processor may specifically be configured to execute the corresponding steps of the above robot control method according to the embodiments of the present invention.
In addition, according to another aspect of the present invention, a storage medium is provided, on which program instructions are stored. When the program instructions are run by a computer or processor, they cause the computer or processor to execute the corresponding steps of the above robot control method according to the embodiments of the present invention. The storage medium may include, for example, a storage component of a tablet computer, a hard disk of a personal computer, read-only memory (ROM), erasable programmable read-only memory (EPROM), portable compact disc read-only memory (CD-ROM), USB memory, or any combination of the above storage media. The computer-readable storage medium may be any combination of one or more computer-readable storage media.
Those of ordinary skill in the art, by reading the related description of the robot control method above, can understand the specific implementations of the above robot control system and storage medium; for brevity, details are not repeated here.
According to the robot control method, system and storage medium of the embodiments of the present invention, without adding hardware devices, a joystick-like operation mode is used and the operation state is displayed in real time in an instrument-panel-like manner, realizing simple and flexible control of the robot, solving the problem of repeated waste of cost and human resources in most current robot control, and greatly improving the user experience.
Although example embodiments have been described here with reference to the accompanying drawings, it should be understood that the above example embodiments are merely exemplary and are not intended to limit the scope of the present invention thereto. Those of ordinary skill in the art can make various changes and modifications therein without departing from the scope and spirit of the present invention. All such changes and modifications are intended to be included within the scope of the present invention as claimed in the appended claims.
Those of ordinary skill in the art may realize that the units and algorithm steps described in connection with the embodiments disclosed herein can be implemented with electronic hardware, or with a combination of computer software and electronic hardware. Whether these functions are implemented in hardware or software depends on the specific application and design constraints of the technical solution. Skilled artisans may use different methods to implement the described functions for each specific application, but such implementation should not be considered beyond the scope of the present invention.
In the several embodiments provided in this application, it should be understood that the disclosed devices and methods may be implemented in other ways. For example, the device embodiments described above are merely illustrative; for example, the division of the units is only a logical function division, and there may be other division manners in actual implementation, e.g., multiple units or components may be combined or integrated into another device, or some features may be ignored or not executed.
In the specification provided here, numerous specific details are set forth. However, it is understood that embodiments of the present invention can be practiced without these specific details. In some instances, well-known methods, structures and techniques have not been shown in detail so as not to obscure the understanding of this specification.
Similarly, it should be understood that, in order to streamline the disclosure and aid in understanding one or more of the various inventive aspects, in the description of exemplary embodiments of the invention, various features of the invention are sometimes grouped together in a single embodiment, figure, or description thereof. However, this method of disclosure should not be interpreted as reflecting an intention that the claimed invention requires more features than are expressly recited in each claim. Rather, as the corresponding claims reflect, the inventive point lies in that the corresponding technical problem can be solved with fewer than all features of a single disclosed embodiment. Thus, the claims following the detailed description are hereby expressly incorporated into this detailed description, with each claim standing on its own as a separate embodiment of the invention.
Those skilled in the art will understand that, except where such features are mutually exclusive, all features disclosed in this specification (including the accompanying claims, abstract and drawings) and all processes or units of any method or device so disclosed may be combined in any combination. Unless expressly stated otherwise, each feature disclosed in this specification (including the accompanying claims, abstract and drawings) may be replaced by an alternative feature serving the same, equivalent or similar purpose.
In addition, those skilled in the art will understand that, although some embodiments described herein include certain features that are included in other embodiments but not others, combinations of features of different embodiments are meant to fall within the scope of the invention and to form different embodiments. For example, in the claims, any one of the claimed embodiments may be used in any combination.
The various component embodiments of the invention may be implemented in hardware, in software modules running on one or more processors, or in a combination thereof. Those skilled in the art will understand that a microprocessor or a digital signal processor (DSP) may be used in practice to implement some or all of the functions of some modules in the robot control system according to the embodiments of the invention. The invention may also be implemented as a program of a device (for example, a computer program and a computer program product) for executing part or all of the methods described herein. Such a program implementing the invention may be stored on a computer-readable medium, or may be in the form of one or more signals. Such signals may be downloaded from an Internet website, provided on a carrier signal, or provided in any other form.
It should be noted that the above embodiments illustrate rather than limit the invention, and that those skilled in the art may design alternative embodiments without departing from the scope of the appended claims. In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word "comprising" does not exclude the presence of elements or steps not listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The invention can be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In a unit claim enumerating several devices, several of these devices may be embodied by one and the same item of hardware. The use of the words first, second, and third does not indicate any ordering; these words may be interpreted as names.
The above is merely the specific embodiments of the present invention or descriptions of the specific embodiments, and the protection scope of the present invention is not limited thereto. Any person skilled in the art can easily conceive of changes or substitutions within the technical scope disclosed by the present invention, which shall be covered by the protection scope of the present invention. The protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (12)

1. A robot control method, wherein the robot includes an end effector, the method comprising:
obtaining position information of the end effector;
displaying a user interface, wherein the user interface includes an operating area; and
in response to a user's operation in the operating area, controlling the end effector to execute a corresponding action related to the position information.
2. The method of claim 1, wherein obtaining the position information of the end effector includes:
obtaining coordinate values of the end effector in a Cartesian coordinate system;
the operating area includes a first area, the first area corresponding to a first plane in which the end effector is located, the first plane being parallel to the plane spanned by the X-axis and Y-axis of the Cartesian coordinate system;
the user interface further includes a first operable control located in the first area, to be operated by the user to control the end effector to execute the action in the first plane.
3. The method of claim 2, wherein the first area is a circular area, and the first operable control is located at the center point of the circular area, for a drag operation to be performed by the user;
said controlling the end effector to execute the corresponding action related to the position information in response to the user's operation in the operating area includes:
determining, in the first plane, a moving direction corresponding to the direction of the drag operation;
determining a moving distance of the end effector according to the distance of the drag operation; and
controlling the end effector to move the moving distance in the moving direction.
4. The method of claim 3, wherein determining the moving distance of the end effector according to the distance of the drag operation includes:
determining the moving distance according to the following formula: Dv = (Df / R) * δ,
wherein Dv indicates the moving distance, Df indicates the distance of the drag operation, R indicates the radius of the circular area, and δ indicates the maximum step size.
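The mapping in claim 4 can be checked numerically: a drag spanning the full radius (Df = R) yields one maximum step δ, and shorter drags scale linearly. A minimal sketch (the function name `moving_distance` is illustrative, not from the patent):

```python
def moving_distance(drag_distance: float, radius: float, max_step: float) -> float:
    """Dv = (Df / R) * δ: the end effector's moving distance scales
    linearly with how far the control is dragged within the circular area.

    drag_distance -- Df, distance of the drag operation
    radius        -- R, radius of the circular area
    max_step      -- δ, maximum step size
    """
    return (drag_distance / radius) * max_step
```

So dragging halfway to the edge of the circular area produces half the maximum step, regardless of the on-screen size of the area.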
5. The method of claim 3 or 4, wherein the user interface further includes a second operable control, to be operated by the user to adjust the correspondence between each radius in the circular area and a direction in the first plane.
6. The method of any one of claims 2 to 4, wherein the user interface further includes a first display area, for displaying the current pose of the end effector and the axes of the Cartesian coordinate system based on the current pose.
7. The method of any one of claims 2 to 4, wherein the operating area further includes a second area, the second area including a line segment corresponding to the Z-axis of the Cartesian coordinate system;
the user interface further includes a third operable control located on the line segment, for a drag operation to be performed by the user to control the end effector to move a corresponding distance along the Z-axis.
8. The method of any one of claims 1 to 4, wherein:
the end effector is a clamping jaw;
the operating area includes a third area; and
the user interface further includes a fourth operable control located in the third area, for a click operation to be performed by the user to control the opening/closing angle of the clamping jaw.
9. The method of any one of claims 1 to 4, wherein the user interface further includes a second display area, for displaying the position information of the end effector.
10. The method of any one of claims 1 to 4, wherein said controlling the end effector to execute the corresponding action related to the position information in response to the user's operation in the operating area includes:
in response to the user's operation in the operating area, controlling the end effector in real time to execute the corresponding action related to the position information; or
after the user completes the operation in the operating area, controlling the end effector to execute the corresponding action related to the position information.
11. A robot control system, including a display and a processor, wherein:
the display is for displaying a user interface, wherein the user interface includes an operating area; and
the processor is for obtaining position information of an end effector of a robot, and for controlling, in response to a user's operation in the operating area, the end effector to execute a corresponding action related to the position information.
12. A storage medium, on which program instructions are stored, the program instructions, when run, being for executing the robot control method of any one of claims 1 to 10.
CN201910037228.0A 2019-01-15 2019-01-15 Robot control method, system and storage medium Pending CN109822565A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910037228.0A CN109822565A (en) 2019-01-15 2019-01-15 Robot control method, system and storage medium


Publications (1)

Publication Number Publication Date
CN109822565A true CN109822565A (en) 2019-05-31

Family

ID=66861579

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910037228.0A Pending CN109822565A (en) 2019-01-15 2019-01-15 Robot control method, system and storage medium

Country Status (1)

Country Link
CN (1) CN109822565A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111195909A (en) * 2019-12-27 2020-05-26 深圳市优必选科技股份有限公司 Steering engine control method and device for robot, terminal and computer storage medium

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH06344294A (en) * 1993-06-07 1994-12-20 Ishikawajima Harima Heavy Ind Co Ltd Remote control method and device for robot
CN104364723A (en) * 2012-04-05 2015-02-18 里斯集团控股有限责任两合公司 Method for operating an industrial robot
CN105171756A (en) * 2015-07-20 2015-12-23 缪学良 Method for controlling remote robot through combination of videos and two-dimensional coordinate system
CN105739437A (en) * 2016-04-21 2016-07-06 奇弩(北京)科技有限公司 Method for controlling robot through intelligent terminal
CN106493739A (en) * 2016-11-21 2017-03-15 电子科技大学中山学院 Mechanical control device of capacitive touch screen equipment
CN106997175A (en) * 2016-10-21 2017-08-01 遨博(北京)智能科技有限公司 A kind of robot simulation control method and device
CN107297740A (en) * 2016-04-14 2017-10-27 广明光电股份有限公司 the control system of robot
CN107322571A (en) * 2017-08-11 2017-11-07 广州亮点装备技术有限公司 A kind of connection in series-parallel drags teaching robot
CN107696036A (en) * 2017-08-21 2018-02-16 北京精密机电控制设备研究所 A kind of dragging teaching machine of apery mechanical arm
CN108145709A (en) * 2016-12-06 2018-06-12 韩华泰科株式会社 The method and apparatus for controlling robot
US10001912B2 (en) * 2014-10-01 2018-06-19 Denso Wave Incorporated Robot operation apparatus, robot system, and robot operation program


Similar Documents

Publication Publication Date Title
US11723734B2 (en) User-interface control using master controller
CN107106245B (en) Interaction between user interface and master controller
JP5784670B2 (en) Method, apparatus, and system for automated motion for medical robots
CN105479467B (en) Robotic manipulation device, robot system and robot manipulation's program
JP6007194B2 (en) Surgical robot system for performing surgery based on displacement information determined by user designation and its control method
US6222465B1 (en) Gesture-based computer interface
WO2017033365A1 (en) Remote control robot system
CN105960623B (en) For controlling the mancarried device and its method of robot
EP2617530A1 (en) Master control input device and master-slave manipulator
WO2011065034A1 (en) Method for controlling action of robot, and robot system
JP2015182213A (en) Robot control apparatus, robot, robot system, instruction method, and program
JP2016519813A (en) 3D input device with complementary rotation controller
CN109219413A (en) Multi input robotic surgical system control program
KR20170024769A (en) Robot control apparatus
CN109648568A (en) Robot control method, system and storage medium
CN108025434A (en) Robots arm with input element
CN109822565A (en) Robot control method, system and storage medium
Chan et al. Towards a multimodal system combining augmented reality and electromyography for robot trajectory programming and execution
CN109822569A (en) Robot control method, system and storage medium
CN107303673A (en) Robot
JP7386451B2 (en) Teaching system, teaching method and teaching program
EP1182535A1 (en) Haptic terminal
Ismail et al. Gantry robot kinematic analysis user interface based on visual basic and MATLAB
CN110026980A (en) A kind of control method of mechanical arm controlling terminal
CN109416588A (en) The method interacted for operator and technical object

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20191211

Address after: No.1705, building 8, Qianhai preeminent Financial Center (phase I), unit 2, guiwan District, Nanshan street, Qianhai Shenzhen Hong Kong cooperation zone, Shenzhen City, Guangdong Province

Applicant after: Mga Technology (Shenzhen) Co., Ltd

Address before: 102208 1, unit 1, 1 hospital, lung Yuan middle street, Changping District, Beijing 1109

Applicant before: Beijing magnesium Robot Technology Co., Ltd.

CB02 Change of applicant information

Address after: 518052 1705, building 8, Qianhai excellence Financial Center (phase I), unit 2, guiwan area, Nanshan street, Qianhai Shenzhen Hong Kong cooperation zone, Shenzhen City, Guangdong Province

Applicant after: Shenzhen mga Technology Co.,Ltd.

Address before: 1705, building 8, Qianhai excellence Financial Center (phase I), unit 2, guiwan area, Nanshan street, Qianhai Shenzhen Hong Kong cooperation zone, Shenzhen, Guangdong 518000

Applicant before: Mga Technology (Shenzhen) Co.,Ltd.

RJ01 Rejection of invention patent application after publication

Application publication date: 20190531