CN109732593B - Remote control method and device for robot and terminal equipment - Google Patents


Info

Publication number
CN109732593B
Authority
CN
China
Prior art keywords
motion
robot
coordinates
coordinate system
target
Prior art date
Legal status (assumption; not a legal conclusion)
Active
Application number
CN201811624739.4A
Other languages
Chinese (zh)
Other versions
CN109732593A (en)
Inventor
苏细祥
刘主福
刘宇飞
杜志强
Current Assignee
Shenzhen Yuejiang Technology Co Ltd
Original Assignee
Shenzhen Yuejiang Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Yuejiang Technology Co Ltd filed Critical Shenzhen Yuejiang Technology Co Ltd
Priority to CN201811624739.4A priority Critical patent/CN109732593B/en
Publication of CN109732593A publication Critical patent/CN109732593A/en
Application granted granted Critical
Publication of CN109732593B publication Critical patent/CN109732593B/en

Abstract

The invention relates to the technical field of robot control and provides a remote control method, a device and a terminal device for a robot. The method comprises the following steps: acquiring motion information of a target, drawing a corresponding motion track according to the motion information, converting the motion track into space point coordinates in the world coordinate system of the robot, generating a motion instruction according to the space point coordinates, sending the motion instruction to the robot, controlling the robot to move according to the motion instruction, acquiring the motion point coordinates during the motion of the robot, and simulating the motion point coordinates by a simulation technology so as to display the motion posture of the robot. The invention converts the motion track of the target object into a motion instruction that controls the robot to execute the corresponding operation, and reproduces the motion posture of the robot by a simulation technology, thereby making human-computer interaction more humanized and improving both the flexibility of human-computer interaction and the efficiency of controlling the robot.

Description

Remote control method and device for robot and terminal equipment
Technical Field
The invention belongs to the technical field of robot control, and particularly relates to a remote control method and device of a robot and terminal equipment.
Background
In recent years, with the rapid development of modern industry in its subdivided fields, robots and robot arms have played an increasingly important role in various fields.
With the development of science and technology, robots have become more and more powerful, with stable performance, high efficiency and high precision, and can replace manual labor in a large amount of repetitive work. Users increasingly demand intelligent robots, that is, robots that are more humanized and more interesting to operate.
At present, human-computer interaction with a robot requires the user to set operation instructions through devices such as a mouse, a keyboard or an offline program, so as to control the robot to carry out the user's operation intention.
Disclosure of Invention
In view of this, embodiments of the present invention provide a remote control method and apparatus for a robot, and a terminal device, so as to solve the problems in the prior art that the human-computer interaction mode of a robot is inflexible, its operation is not humanized, and it cannot meet users' requirements.
A first aspect of an embodiment of the present invention provides a remote control method for a robot, including:
acquiring motion information of a target, and drawing a corresponding motion track according to the motion information;
converting the motion trail into space point coordinates under a world coordinate system of the robot;
generating a motion instruction according to the space point coordinates, and sending the motion instruction to the robot;
controlling the robot to move according to the motion instruction;
and acquiring the coordinates of the motion points in the motion process of the robot, and simulating the coordinates of the motion points by a simulation technology to display the motion posture of the robot.
Optionally, obtaining motion information of the target, and drawing a corresponding motion trajectory according to the motion information includes:
when the target is detected to start moving, acquiring the movement information of the target at preset time intervals; wherein the motion information comprises relative coordinate points in a relative coordinate system in the motion process of the target; the relative coordinate system is a world coordinate system established by taking the coordinates of the starting point of the target when the target starts to move as the origin;
and drawing a corresponding motion track according to the motion information.
Optionally, the converting the motion trajectory into a space point coordinate under a world coordinate system of the robot includes:
fitting the motion coordinate points according to a linear equation so as to convert the motion coordinate points into space point coordinates under a world coordinate system of the robot; the world coordinate system is established by taking the coordinates of the starting point of the robot as an origin.
Optionally, the obtaining a motion point coordinate in the motion process of the robot, and simulating the motion point coordinate by a simulation technique to display the motion posture of the robot includes:
acquiring, at preset time intervals, the coordinates of the motion points of the robot in the world coordinate system during the motion process; and
simulating the coordinates of the motion points by a simulation technology, and acquiring and displaying the motion posture of the robot.
A second aspect of an embodiment of the present invention provides a remote control apparatus for a robot, including:
the acquisition module is used for acquiring the motion information of the target and drawing a corresponding motion track according to the motion information;
the conversion module is used for converting the motion trail into space point coordinates under a world coordinate system of the robot;
the sending module is used for generating a motion instruction according to the space point coordinates and sending the motion instruction to the robot;
the control module is used for controlling the robot to move according to the motion instruction;
and the simulation module is used for acquiring the coordinates of the motion points in the motion process of the robot and simulating the coordinates of the motion points by a simulation technology so as to display the motion posture of the robot.
A third aspect of an embodiment of the present invention provides a terminal device, including: a memory, a processor and a computer program stored in the memory and executable on the processor, the processor implementing the steps of the method as described above when executing the computer program.
A fourth aspect of embodiments of the present invention provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the method as described above.
According to the embodiment of the invention, the motion track of the target object is converted to generate the motion instruction to control the robot to execute the corresponding operation, and the motion posture of the robot is reproduced through a simulation technology, so that the human-computer interaction is more humanized, and the flexibility of the human-computer interaction and the efficiency of controlling the robot are improved.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings needed to be used in the embodiments or the prior art descriptions will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without inventive exercise.
Fig. 1 is a schematic flowchart of a remote control method for a robot according to an embodiment of the present invention;
fig. 2 is a schematic flow chart of a remote control method of a robot according to a second embodiment of the present invention;
fig. 3 is a schematic flow chart of a remote control method of a robot according to a third embodiment of the present invention;
fig. 4 is a schematic flowchart of a remote control method for a robot according to a fourth embodiment of the present invention;
fig. 5 is a schematic structural diagram of a remote control device of a robot according to a fifth embodiment of the present invention;
fig. 6 is a schematic diagram of a terminal device according to a sixth embodiment of the present invention.
Detailed Description
In order to make the technical solutions of the present invention better understood by those skilled in the art, the technical solutions in the embodiments of the present invention will be clearly described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some embodiments of the present invention, but not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The terms "comprises" and "comprising," and any variations thereof, in the description and claims of this invention and the above-described drawings are intended to cover non-exclusive inclusions. For example, a process, method, or system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements listed, but may alternatively include other steps or elements not listed, or inherent to such process, method, article, or apparatus. Furthermore, the terms "first," "second," and "third," etc. are used to distinguish between different objects and are not used to describe a particular order.
In order to explain the technical means of the present invention, the following description will be given by way of specific examples.
Example one
As shown in fig. 1, the present embodiment provides a remote control method for a robot, which may be applied to a terminal device such as a mobile phone, a PC, a tablet computer, and the like. The remote control method for the robot provided by the embodiment comprises the following steps:
s101, obtaining motion information of a target, and drawing a corresponding motion track according to the motion information.
In specific application, the motion information of the target is acquired by a recognition device, and a corresponding motion track is drawn according to the motion information. The recognition device includes, but is not limited to, a somatosensory recognition device and a gesture recognition device; different types of motion information can be acquired by different recognition devices. For example, the motion information of the target may be motion information acquired while the target moves wearing a somatosensory recognition device, or gesture motion information acquired when the target touches a gesture recognition device.
And S102, converting the motion trail into a space point coordinate under a world coordinate system of the robot.
In a specific application, the motion trail is converted into the space point coordinates of the robot in the world coordinate system. The world coordinate system of the robot is a world coordinate system established by taking the initial point coordinate of the robot as an origin.
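The patent does not specify the conversion itself; as a minimal sketch, assuming the target's relative frame and the robot's world frame share axis orientation and differ only by a translation (the robot's starting point) plus an optional scale factor, the mapping could look like this (the function name and the `scale` parameter are illustrative, not from the text):

```python
import numpy as np

def to_robot_world(track_points, robot_origin, scale=1.0):
    """Map points from the target's relative frame (origin at the target's
    start point) into the robot's world frame (origin at the robot's start
    point). Assumes the two frames share axis orientation and differ only
    by a translation and a uniform scale factor."""
    pts = np.asarray(track_points, dtype=float)
    return scale * pts + np.asarray(robot_origin, dtype=float)

# A track recorded relative to the target's start point:
track = [(0, 0, 0), (10, 0, 5), (20, 5, 5)]
world = to_robot_world(track, robot_origin=(100, 200, 50), scale=0.5)
```

A rotation between the two frames would add a rotation-matrix multiply before the translation; it is omitted here for brevity.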
S103, generating a motion instruction according to the space point coordinates, and sending the motion instruction to the robot.
In specific application, a corresponding motion instruction is generated according to the space point coordinates, and the motion instruction is sent to the robot.
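The format of the motion instruction and the transport to the robot are not specified in the text; the following sketch assumes a hypothetical JSON-over-TCP protocol (the `cmd`/`points` schema and the newline framing are invented for illustration):

```python
import json
import socket

def make_motion_instruction(space_points):
    """Pack world-frame space point coordinates into a motion instruction.
    The JSON schema here is illustrative only."""
    return json.dumps({
        "cmd": "move",
        "points": [list(p) for p in space_points],
    }).encode("utf-8")

def send_instruction(instruction, host, port):
    """Send the instruction to the robot controller over TCP
    (transport and framing are assumptions, not from the patent)."""
    with socket.create_connection((host, port)) as sock:
        sock.sendall(instruction + b"\n")

msg = make_motion_instruction([(105.0, 200.0, 52.5)])
```

A real controller would also need acknowledgements and error handling, which the patent does not discuss.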
And S104, controlling the robot to move according to the motion instruction.
In specific application, the robot is controlled, according to the motion instruction, to execute the corresponding motion operation.
And S105, obtaining the coordinates of the motion points in the motion process of the robot, and simulating the coordinates of the motion points by a simulation technology to display the motion posture of the robot.
In specific application, the coordinates of the motion points during the motion of the robot are acquired, and the coordinates are processed by a 3D simulation technology to simulate and display the motion posture of the robot. 3D simulation, i.e. three-dimensional simulation, is a technology that uses computer technology to generate a realistic virtual environment with multiple perceptions such as sight, hearing, touch and taste, with which a user can interact through various sensing devices using natural skills. In one embodiment, the above method may be applied to an actuator of the robot, such as a robot arm or the actuator tip of a robot hand comprised by the robot.
In the embodiment, the motion track of the target object is converted to generate the motion instruction to control the robot to execute the corresponding operation, and the motion posture of the robot is reproduced through the simulation technology, so that the human-computer interaction is more humanized, and the flexibility of the human-computer interaction and the efficiency of controlling the robot are improved.
Example two
As shown in fig. 2, this embodiment is a further description of the method steps in the first embodiment. In this embodiment, step S101 includes
S1011, when the target is detected to start moving, obtaining the movement information of the target at preset time intervals; wherein the motion information comprises relative coordinate points in a relative coordinate system in the motion process of the target; the relative coordinate system is a world coordinate system established by taking the coordinates of the starting point of the target when the target starts to move as the origin.
In a specific application, if the target is detected to start moving, the operation of acquiring the real-time motion information of the target is repeatedly executed at a preset time interval until the motion of the target ends. The motion information comprises relative coordinate points of the target in a relative coordinate system during the current motion; the relative coordinate system is a world coordinate system established by taking the coordinates of the starting point of the target when it starts to move as the origin. The preset time interval may be set according to actual conditions; for example, if the preset time interval is set to 0.5 s, the real-time motion information of the target is acquired every 0.5 s.
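The fixed-interval sampling described above can be sketched as a polling loop; `read_relative_point` and `target_is_moving` are stand-ins for the recognition-device API, which the patent leaves unspecified:

```python
import time

def sample_motion(read_relative_point, target_is_moving, interval=0.5):
    """Poll the recognition device every `interval` seconds while the
    target is moving, collecting relative coordinate points. Both
    callables are hypothetical device hooks, not from the patent."""
    points = []
    while target_is_moving():
        points.append(read_relative_point())
        time.sleep(interval)
    return points

# Simulated device for illustration: the target yields three points, then stops.
_samples = iter([(0, 0, 0), (1, 0, 0), (2, 1, 0)])
_checks = [3]

def _fake_read():
    return next(_samples)

def _fake_moving():
    _checks[0] -= 1
    return _checks[0] >= 0

track = sample_motion(_fake_read, _fake_moving, interval=0.01)
```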
And S1012, drawing a corresponding motion track according to the motion information.
In specific application, a corresponding motion track is drawn according to the motion information so as to convert the motion information into a motion instruction for controlling the robot.
According to the embodiment, the real-time motion information of the target is acquired for multiple times, and the corresponding motion track is drawn according to the real-time motion information of the target, so that the authenticity of the motion information is improved, and the efficiency of controlling the robot by a user is improved.
EXAMPLE III
As shown in fig. 3, this embodiment is a further description of the method steps in the first embodiment. In this embodiment, step S102 includes:
s1021, fitting the motion coordinate points according to a linear equation so as to convert the motion coordinate points into space point coordinates under a world coordinate system of the robot; the world coordinate system is established by taking the coordinates of the starting point of the robot as an origin.
In a specific application, the motion coordinate points in the motion information are fitted according to a linear equation so as to convert them into space point coordinates in the world coordinate system of the robot. The world coordinate system is established by taking the coordinates of the starting point of the robot as the origin. The starting point coordinates of the robot are its coordinates at the moment it acquires the motion instruction (i.e., before the robot executes the motion operation corresponding to the instruction, while it still keeps the state it was in before acquiring the instruction).
In this embodiment, the linear equation is:
y=ax+b;
wherein x is the coordinate value of a motion point along one of the three axes (i.e., the X axis, the Y axis or the Z axis) of the relative coordinate system, a is the coefficient applied to that coordinate value, and b is a constant. The values of a and b can be obtained by acquiring a preset number of motion point coordinates, fitting their coordinate values along the X axis, Y axis or Z axis of the relative coordinate system, and averaging the results.
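The text states the fitting procedure only loosely; one reasonable reading is an ordinary least-squares fit of y = ax + b for each axis, e.g. with `numpy.polyfit` (this concrete choice of fitting method is an assumption):

```python
import numpy as np

def fit_line(xs, ys):
    """Least-squares fit of y = a*x + b to sampled coordinate values
    along one axis. numpy.polyfit returns coefficients from highest
    degree down, so deg=1 yields (a, b)."""
    a, b = np.polyfit(xs, ys, deg=1)
    return a, b

# Samples lying exactly on y = 2x + 1:
xs = np.array([0.0, 1.0, 2.0, 3.0])
ys = 2.0 * xs + 1.0
a, b = fit_line(xs, ys)
```

With noisy samples the same call returns the least-squares estimates of a and b rather than exact values.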
According to the embodiment, the motion information of the target is fitted through the linear equation and converted into the space point coordinates of the robot, so that the motion instruction generation efficiency is improved, and the efficiency of the robot in executing corresponding operations is controlled.
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present invention.
Example four
As shown in fig. 4, this embodiment is a further description of the method steps in the first embodiment. In this embodiment, step S105 includes:
s1051, obtaining the coordinates of the moving points in the world coordinate system in the moving process of the robot according to a preset time interval.
In the specific application, the operation of acquiring the coordinates of the motion points of the robot in its world coordinate system during the motion is executed once at each preset time interval until the motion of the target ends. The preset time interval may be set according to actual conditions.
And S1052, simulating the coordinates of the motion points through a simulation technology, and acquiring and displaying the motion attitude of the robot.
In a specific application, the coordinates of the motion points are processed by a 3D simulation technology, and the motion posture of the robot is simulated and displayed on a display screen (that is, the motion posture of the robot is reproduced).
In one embodiment, a three-dimensional model of the robot may be designed and acquired in advance, any motion unit of the robot (for example, any two joints and a motion axis between the two joints, such as any one finger knuckle of the robot) may be used as an independent simulation object, and the motion unit may be simulated and reproduced by acquiring coordinate parameter changes (for example, rotation coordinate changes and translation coordinate changes) of the motion unit to simulate the motion posture of the robot.
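As one plausible reading of simulating a motion unit from its coordinate parameter changes, the sketch below poses a unit's model points by applying a rotation matrix and a translation; the single Z-axis rotation is an assumption for illustration (a real joint would use its own axis):

```python
import numpy as np

def rot_z(theta):
    """Rotation matrix about the Z axis by angle theta (radians)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])

def pose_motion_unit(model_points, theta, translation):
    """Update one motion unit of the robot model by applying its
    rotation change and translation change to its model points."""
    pts = np.asarray(model_points, dtype=float)
    return pts @ rot_z(theta).T + np.asarray(translation, dtype=float)

# A unit-length link along X, rotated 90 degrees and lifted by 1:
link = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
posed = pose_motion_unit(link, np.pi / 2, (0.0, 0.0, 1.0))
```

Chaining such per-unit transforms along the kinematic chain reproduces the full posture frame by frame.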
According to the embodiment, the motion point coordinates of the robot are acquired in real time, and the motion posture of the robot is reproduced through a simulation technology, so that the flexibility and the interestingness of human-computer interaction are improved.
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present invention.
EXAMPLE five
As shown in fig. 5, the present embodiment provides a remote control device 100 of a robot, which is used to execute the method steps in the first embodiment. The present embodiment provides a remote control device 100 for a robot, including:
the acquisition module 101 is configured to acquire motion information of a target and draw a corresponding motion trajectory according to the motion information;
the conversion module 102 is configured to convert the motion trajectory into a space point coordinate in a world coordinate system of the robot;
the sending module 103 is configured to generate a motion instruction according to the spatial point coordinates, and send the motion instruction to the robot;
a control module 104, configured to control the robot to move according to the motion instruction;
and the simulation module 105 is configured to acquire a coordinate of a motion point in the motion process of the robot, and simulate the coordinate of the motion point by using a simulation technology to display a motion posture of the robot.
In one embodiment, the obtaining module 101 includes:
the detection unit is used for acquiring the motion information of the target at preset time intervals when the target starts to move is detected; wherein the motion information comprises relative coordinate points in a relative coordinate system in the motion process of the target; the relative coordinate system is a world coordinate system established by taking the coordinates of the starting point of the target when the target starts to move as the origin;
and the drawing unit is used for drawing the corresponding motion trail according to the motion information.
In one embodiment, the conversion module 102 includes:
the conversion unit is used for fitting the motion coordinate points according to a linear equation so as to convert the motion coordinate points into space point coordinates under a world coordinate system of the robot; the world coordinate system is established by taking the coordinates of the starting point of the robot as an origin.
In one embodiment, the simulation module 105 includes:
and the acquisition unit is used for acquiring the coordinates of the moving points in the world coordinate system in the moving process of the robot according to a preset time interval.
And the simulation unit is used for simulating the coordinates of the motion points through a simulation technology, acquiring and displaying the motion attitude of the robot.
In the embodiment, the motion track of the target object is converted to generate the motion instruction to control the robot to execute the corresponding operation, and the motion posture of the robot is reproduced through the simulation technology, so that the human-computer interaction is more humanized, and the flexibility of the human-computer interaction and the efficiency of controlling the robot are improved.
EXAMPLE six
Fig. 6 is a schematic diagram of the terminal device provided in this embodiment. As shown in fig. 6, the terminal device 6 of this embodiment includes: a processor 60, a memory 61, and a computer program 62 stored in the memory 61 and executable on the processor 60, such as a remote control program for a robot. When executing the computer program 62, the processor 60 implements the steps in each of the above embodiments of the remote control method for a robot, such as steps S101 to S105 shown in fig. 1. Alternatively, when executing the computer program 62, the processor 60 implements the functions of the modules/units in the above device embodiments, such as the functions of the modules 101 to 105 shown in fig. 5.
Illustratively, the computer program 62 may be partitioned into one or more modules/units that are stored in the memory 61 and executed by the processor 60 to implement the present invention. The one or more modules/units may be a series of computer program instruction segments capable of performing specific functions, which are used to describe the execution of the computer program 62 in the terminal device 6. For example, the computer program 62 may be divided into an acquisition module, a conversion module, a transmission module, a control module, and a simulation module, each module having the following specific functions:
the acquisition module is used for acquiring the motion information of the target and drawing a corresponding motion track according to the motion information;
the conversion module is used for converting the motion trail into space point coordinates under a world coordinate system of the robot;
the sending module is used for generating a motion instruction according to the space point coordinates and sending the motion instruction to the robot;
the control module is used for controlling the robot to move according to the motion instruction;
and the simulation module is used for acquiring the coordinates of the motion points in the motion process of the robot and simulating the coordinates of the motion points by a simulation technology so as to display the motion posture of the robot.
The terminal device 6 may be a desktop computer, a notebook, a palm computer, a cloud server, or another computing device. The terminal device may include, but is not limited to, a processor 60 and a memory 61. Those skilled in the art will appreciate that fig. 6 is merely an example of the terminal device 6 and does not constitute a limitation of it; the terminal device may include more or fewer components than those shown, or a combination of some components, or different components. For example, it may also include input/output devices, network access devices, buses, etc.
The processor 60 may be a Central Processing Unit (CPU), another general-purpose processor, a Digital Signal Processor (DSP), an Application-Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, etc. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
The memory 61 may be an internal storage unit of the terminal device 6, such as a hard disk or a memory of the terminal device 6. The memory 61 may also be an external storage device of the terminal device 6, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital Card (SD), a Flash memory Card (Flash Card), and the like, which are provided on the terminal device 6. Further, the memory 61 may also include both an internal storage unit and an external storage device of the terminal device 6. The memory 61 is used for storing the computer program and other programs and data required by the terminal device. The memory 61 may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
In the embodiments provided in the present invention, it should be understood that the disclosed apparatus/terminal device and method may be implemented in other ways. For example, the above-described embodiments of the apparatus/terminal device are merely illustrative, and for example, the division of the modules or units is only one logical division, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated modules/units, if implemented in the form of software functional units and sold or used as separate products, may be stored in a computer readable storage medium. Based on such understanding, all or part of the flow of the method according to the embodiments of the present invention may also be implemented by a computer program, which may be stored in a computer-readable storage medium, and when the computer program is executed by a processor, the steps of the method embodiments may be implemented. Wherein the computer program comprises computer program code, which may be in the form of source code, object code, an executable file or some intermediate form, etc. The computer-readable medium may include: any entity or device capable of carrying the computer program code, recording medium, usb disk, removable hard disk, magnetic disk, optical disk, computer Memory, Read-Only Memory (ROM), Random Access Memory (RAM), electrical carrier wave signals, telecommunications signals, software distribution medium, and the like. It should be noted that the computer readable medium may contain content that is subject to appropriate increase or decrease as required by legislation and patent practice in jurisdictions, for example, in some jurisdictions, computer readable media does not include electrical carrier signals and telecommunications signals as is required by legislation and patent practice.
The above-mentioned embodiments are only used to illustrate the technical solutions of the present invention, not to limit them. Although the present invention has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not cause the essence of the corresponding technical solutions to depart from the spirit and scope of the embodiments of the present invention, and are intended to be included within the scope of the present invention.

Claims (10)

1. A method for remote control of a robot, comprising:
acquiring motion information of a target, and drawing a corresponding motion track according to the motion information;
converting the motion trail into space point coordinates under a world coordinate system of the robot;
generating a motion instruction according to the space point coordinates, and sending the motion instruction to the robot;
controlling the robot to move according to the motion instruction;
and acquiring the coordinates of the motion points in the motion process of the robot, and simulating the coordinates of the motion points by a simulation technology to display the motion posture of the robot.
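By way of illustration only, and not as part of the claims, the five steps of claim 1 might be sketched as a single control cycle in Python. Every function name, the translation-based frame conversion, and the instruction format below are assumptions of this sketch; the claim itself does not specify an implementation or an API.

```python
# Hypothetical sketch of the claimed pipeline: acquire relative-frame
# trajectory points, convert them to the robot's world frame, send a
# motion instruction, then read back and display motion points.

def to_world(relative_point, robot_origin):
    # Step 2 (assumed): the relative frame's origin (the target's start)
    # is mapped onto the robot's starting point, so with aligned axes the
    # conversion reduces to a translation by the robot's origin.
    return tuple(r + o for r, o in zip(relative_point, robot_origin))

def make_instruction(world_points):
    # Step 3 (assumed wire format): pack the space-point coordinates
    # into a motion command dictionary.
    return {"cmd": "move", "points": world_points}

def control_cycle(relative_points, robot_origin, send, read_back, display):
    # Steps 2-5: convert, send, then read back motion points for display.
    world_points = [to_world(p, robot_origin) for p in relative_points]
    send(make_instruction(world_points))   # steps 3-4
    display(read_back())                   # step 5: simulate/display posture
    return world_points
```

Here `send`, `read_back`, and `display` stand in for the robot link and the simulation front end, which the claim leaves unspecified.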
2. The remote control method of a robot according to claim 1, wherein obtaining motion information of a target and drawing a corresponding motion trajectory according to the motion information comprises:
when the target is detected to start moving, acquiring the movement information of the target at preset time intervals; wherein the motion information comprises relative coordinate points in a relative coordinate system in the motion process of the target; the relative coordinate system is a world coordinate system established by taking the coordinates of the starting point of the target when the target starts to move as the origin;
and drawing a corresponding motion track according to the motion information.
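As an illustrative aside (not part of the claim), the relative coordinate system of claim 2, whose origin is the target's starting point, can be obtained from raw position samples by subtracting the first sample. The function name is hypothetical.

```python
def to_relative(samples):
    """Express sampled target positions in the relative coordinate
    system of claim 2: origin at the target's starting point."""
    x0, y0, z0 = samples[0]
    return [(x - x0, y - y0, z - z0) for x, y, z in samples]
```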
3. The remote control method of a robot according to claim 2, wherein converting the motion trajectory into spatial point coordinates in a world coordinate system of the robot comprises:
fitting the motion coordinate points according to a linear equation so as to convert the motion coordinate points into space point coordinates under a world coordinate system of the robot; the world coordinate system is established by taking the coordinates of the starting point of the robot as an origin.
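The claim does not give a formula for "fitting according to a linear equation"; one conventional reading is an ordinary least-squares line fit through the sampled coordinate points, sketched below for the planar case. This is an illustration only, not the patented method.

```python
def fit_line(points):
    """Least-squares fit of y = a*x + b through (x, y) coordinate
    points -- one possible reading of the linear fitting in claim 3."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)  # slope
    b = (sy - a * sx) / n                          # intercept
    return a, b
```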
4. The remote control method of a robot according to claim 1, wherein acquiring the coordinates of motion points during the motion of the robot and simulating the motion point coordinates by a simulation technique to display the motion posture of the robot comprises:
acquiring the coordinates of a moving point under the world coordinate system in the moving process of the robot according to a preset time interval;
and simulating the coordinates of the motion points by a simulation technology, and acquiring and displaying the motion attitude of the robot.
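Again purely as illustration, the periodic acquisition in claim 4 amounts to polling the robot's world-frame coordinates at a preset time interval; `read_pose` below is a hypothetical callback standing in for whatever feedback channel the robot provides.

```python
import time

def sample_motion_points(read_pose, interval_s, count):
    """Poll the robot's current world-frame motion-point coordinates
    at a preset time interval, as described in claim 4."""
    points = []
    for _ in range(count):
        points.append(read_pose())
        time.sleep(interval_s)
    return points
```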
5. A remote control device for a robot, comprising:
the acquisition module is used for acquiring the motion information of the target and drawing a corresponding motion track according to the motion information;
the conversion module is used for converting the motion trail into space point coordinates under a world coordinate system of the robot;
the sending module is used for generating a motion instruction according to the space point coordinates and sending the motion instruction to the robot;
the control module is used for controlling the robot to move according to the motion instruction;
and the simulation module is used for acquiring the coordinates of the motion points in the motion process of the robot and simulating the coordinates of the motion points by a simulation technology so as to display the motion posture of the robot.
6. The remote control device of a robot of claim 5, wherein the acquisition module comprises:
the detection unit is used for acquiring the motion information of the target at preset time intervals when the target starts to move is detected; wherein the motion information comprises relative coordinate points in a relative coordinate system in the motion process of the target; the relative coordinate system is a world coordinate system established by taking the coordinates of the starting point of the target when the target starts to move as the origin;
and the drawing unit is used for drawing the corresponding motion trail according to the motion information.
7. The remote control device of a robot of claim 6, wherein the translation module comprises:
the conversion unit is used for fitting the motion coordinate points according to a linear equation so as to convert the motion coordinate points into space point coordinates under a world coordinate system of the robot; the world coordinate system is established by taking the coordinates of the starting point of the robot as an origin.
8. The remote control device of a robot of claim 5, wherein the simulation module comprises:
the acquisition unit is used for acquiring the coordinates of the moving points in the world coordinate system in the moving process of the robot according to a preset time interval;
and the simulation unit is used for simulating the coordinates of the motion points through a simulation technology, acquiring and displaying the motion attitude of the robot.
9. A terminal device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the steps of the method according to any of claims 1 to 4 when executing the computer program.
10. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the steps of the method according to any one of claims 1 to 4.
CN201811624739.4A 2018-12-28 2018-12-28 Remote control method and device for robot and terminal equipment Active CN109732593B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811624739.4A CN109732593B (en) 2018-12-28 2018-12-28 Remote control method and device for robot and terminal equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811624739.4A CN109732593B (en) 2018-12-28 2018-12-28 Remote control method and device for robot and terminal equipment

Publications (2)

Publication Number Publication Date
CN109732593A CN109732593A (en) 2019-05-10
CN109732593B true CN109732593B (en) 2021-04-23

Family

ID=66361924

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811624739.4A Active CN109732593B (en) 2018-12-28 2018-12-28 Remote control method and device for robot and terminal equipment

Country Status (1)

Country Link
CN (1) CN109732593B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110587596A (en) * 2019-07-30 2019-12-20 武汉恒新动力科技有限公司 Multi-axis configuration device remote control method and device, terminal equipment and storage medium
CN111158364B (en) * 2019-12-30 2024-02-09 深圳市优必选科技股份有限公司 Robot repositioning method and device and terminal equipment
CN111736607B (en) * 2020-06-28 2023-08-11 上海黑眸智能科技有限责任公司 Robot motion guiding method, system and terminal based on foot motion
CN115476366B (en) * 2021-06-15 2024-01-09 北京小米移动软件有限公司 Control method, device, control equipment and storage medium for foot robot
CN113534807B (en) * 2021-07-21 2022-08-19 北京优锘科技有限公司 Method, device, equipment and storage medium for realizing robot inspection visualization

Family Cites Families (6)

Publication number Priority date Publication date Assignee Title
JPH03287394A (en) * 1990-03-30 1991-12-18 Nec Corp Remote handling device
DE102008042612A1 (en) * 2008-10-06 2010-04-08 Kuka Roboter Gmbh Industrial robots and path planning method for controlling the movement of an industrial robot
CN104007664B (en) * 2014-05-20 2017-08-29 中广核研究院有限公司 A kind of nuclear power station climbing robot three dimensional visual simulation skimulated motion method
CN106041928B (en) * 2016-06-24 2018-03-20 东南大学 A kind of robot manipulating task task generation method based on part model
CN106142083B (en) * 2016-07-21 2018-03-16 河北工业大学 The method of the three-dimensional motion emulation of high-altitude curtain wall mounting robot
CN108673505A (en) * 2018-05-28 2018-10-19 南昌大学 A kind of mechanical arm tail end precise motion control method

Also Published As

Publication number Publication date
CN109732593A (en) 2019-05-10

Similar Documents

Publication Publication Date Title
CN109732593B (en) Remote control method and device for robot and terminal equipment
Kumar et al. Hand data glove: a wearable real-time device for human-computer interaction
CN111694429A (en) Virtual object driving method and device, electronic equipment and readable storage
US10751877B2 (en) Industrial robot training using mixed reality
Haban et al. Global events and global breakpoints in distributed systems
Seo et al. Direct hand touchable interactions in augmented reality environments for natural and intuitive user experiences
CN111208783B (en) Action simulation method, device, terminal and computer storage medium
CN111815754A (en) Three-dimensional information determination method, three-dimensional information determination device and terminal equipment
US20190050132A1 (en) Visual cue system
Ng et al. GARDE: a gesture-based augmented reality design evaluation system
CN110568929B (en) Virtual scene interaction method and device based on virtual keyboard and electronic equipment
US10162737B2 (en) Emulating a user performing spatial gestures
CN111113429B (en) Action simulation method, action simulation device and terminal equipment
CN113119104B (en) Mechanical arm control method, mechanical arm control device, computing equipment and system
Jahani et al. Exploring a user-defined gesture vocabulary for descriptive mid-air interactions
CN108628455B (en) Virtual sand painting drawing method based on touch screen gesture recognition
Tran et al. Easy-to-use virtual brick manipulation techniques using hand gestures
Besnea et al. Experiments regarding implementation of a virtual training environment for automotive industry
Liu et al. COMTIS: Customizable touchless interaction system for large screen visualization
CN115481489A (en) System and method for verifying suitability of body-in-white and production line based on augmented reality
CN115115814A (en) Information processing method, information processing apparatus, readable storage medium, and electronic apparatus
CN114299203A (en) Processing method and device of virtual model
CN113496168A (en) Sign language data acquisition method, sign language data acquisition equipment and storage medium
CN112911266A (en) Implementation method and system of Internet of things practical training system based on augmented reality technology
CN110216676B (en) Mechanical arm control method, mechanical arm control device and terminal equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant