CN116125887A - Robot remote control platform based on virtual reality and design method thereof - Google Patents
- Publication number
- CN116125887A CN116125887A CN202310073689.XA CN202310073689A CN116125887A CN 116125887 A CN116125887 A CN 116125887A CN 202310073689 A CN202310073689 A CN 202310073689A CN 116125887 A CN116125887 A CN 116125887A
- Authority
- CN
- China
- Prior art keywords
- robot
- control
- field
- central controller
- virtual
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B19/00—Programme-control systems
- G05B19/02—Programme-control systems electric
- G05B19/04—Programme control other than numerical control, i.e. in sequence controllers or logic controllers
- G05B19/042—Programme control other than numerical control, i.e. in sequence controllers or logic controllers using digital processors
- G05B19/0423—Input/output
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/20—Pc systems
- G05B2219/23—Pc programming
- G05B2219/23051—Remote control, enter program remote, detachable programmer
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02P—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
- Y02P90/00—Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
- Y02P90/02—Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]
Landscapes
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- Automation & Control Theory (AREA)
- Manipulator (AREA)
Abstract
The invention relates to the technical field of robot control, in particular to a virtual reality based robot remote control platform and a design method thereof. The platform comprises a control end device, a field end device and a central controller. The control end device comprises a virtual field presentation device and a control end trajectory capture device; the trajectory capture device acquires position information of the end of a human arm, or of the end of an object held by a human hand, and transmits it to the central controller. The field end device comprises a robot and a field environment and robot pose capture device. The central controller comprises a field environment and robot pose analysis module, a virtual environment construction and display module, a control end trajectory analysis module, a robot trajectory calculation and control instruction generation module, a logic control module and a communication control module. The invention can virtually present the field operation environment in real time and generate robot control instructions to remotely control the robot in real time, so that the robot accurately reproduces the operator's control actions.
Description
Technical Field
The invention relates to the technical field of robot control, in particular to a robot remote control platform based on virtual reality and a design method thereof.
Background
With the continuous development of robot technology, robots have become increasingly important in industrial production, building decoration and daily life. However, today's robots execute pre-programmed trajectories, which cannot be flexibly changed in special or emergency situations. In construction work with an artistic component, a fixed trajectory cannot reflect the uniqueness of the work; for some large-scale artistic creations, pre-programmed robots are not fully adequate, while manual work is dangerous and slow. Real-time remote control is therefore required for robot applications in such environments. Unlike control by an operator standing beside the robot, real-time remote control is more difficult because the operator cannot obtain all the environment and state information of the robot's work site. When an artist creates remotely, making the robot accurately reproduce the artist's motion is the difficulty to be solved; there is thus an urgent need for a way for remote operators to grasp all the environment and state information of the robot's work site.
Disclosure of Invention
In order to solve the above technical problems in the prior art, the invention provides a virtual reality based robot remote control platform and a design method thereof, which can virtually present the on-site operation environment in real time and generate robot control instructions to remotely control the robot in real time, so that the robot accurately reproduces the operator's control actions.
A first object of the present invention is to provide a robot remote control platform based on virtual reality.
The second object of the present invention is to provide a design method of a remote control platform of a robot based on virtual reality.
In order to achieve the first object, the present invention provides the following technical solutions:
the robot remote control platform based on virtual reality is characterized by comprising control end equipment, field end equipment and a central controller, wherein the control end equipment and the field end equipment are connected with the central controller;
the control end device comprises a virtual field presentation device and a control end trajectory capture device, wherein the virtual field presentation device is used for presenting a virtual scene generated by the central controller, and the control end trajectory capture device is used for acquiring position information of the end of a human arm, or of the end of an object held by a human hand, and transmitting the position information to the central controller;
the field end device comprises a robot and a field environment and robot pose capture device, wherein the field environment and robot pose capture device is used for shooting field environment images and robot pose images in real time and sending them to the central controller;
the central controller comprises a field environment and robot pose analysis module, a virtual environment construction and display module, a control end trajectory analysis module, a robot trajectory calculation and control instruction generation module, a logic control module and a communication control module.
Specifically, the field environment and robot pose analysis module is used for synthesizing and analyzing the field environment and robot pose images shot by the field end cameras and converting them into information suitable for virtual environment display;
the virtual environment construction and display module is used for converting the content to be displayed for the virtual reality projection system;
the control end trajectory analysis module is used for processing the sensor position information collected at the control end, calculating the spatial coordinates of each sensor point, fitting the points into a trajectory, and thereby analyzing the control end motion trajectory;
the robot trajectory calculation and control instruction generation module is used for calculating the robot motion trajectory by combining the control end motion trajectory with the field environment parameters, and then generating robot control instructions according to the robot motion trajectory;
the logic control module is used for completing the logic sequence control of the whole system and handling emergencies and the like; the communication control module is used for controlling the data interaction between the central controller and the control end device and between the central controller and the field end device.
In order to achieve the second object, the present invention provides the following technical solutions:
the design method of the robot remote control platform based on the virtual reality is characterized by comprising the following steps:
S1, erecting the field end equipment at the field operation end, and erecting the control end equipment and the central controller at the control end;
S2, acquiring field end conditions through the field end equipment; the central controller completes the field environment simulation calculation, and a virtual scene is presented through the virtual field presentation device;
S3, capturing the limb actions of the operator or the end trajectory of a held object; the central controller completes the trajectory analysis and calculation and sends control instructions to the field end robot;
S4, the robot moves according to the control instructions; the robot pose and field conditions are collected in real time, the central controller completes the field environment simulation calculation, and the virtual scene is presented through the virtual field presentation device;
S5, steps S2-S4 are cycled continuously until the field end robot completes its actions.
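The S2-S4 cycle above can be sketched as a simple sense-render-command loop. The component interfaces below (`capture`, `build_virtual_scene`, `present`, `read_positions`, `plan`, `execute`, `task_done`) are hypothetical stand-ins for the platform's devices, not names taken from the patent; this is a structural sketch, not the claimed implementation.

```python
import time


class RemoteControlLoop:
    """Minimal sketch of the S2-S4 cycle: sense the site, render it in VR,
    read the operator's end-point trajectory, and command the robot."""

    def __init__(self, site, headset, tracker, controller, robot):
        self.site = site              # field-end cameras (S2/S4 acquisition)
        self.headset = headset        # virtual field presentation device
        self.tracker = tracker        # control-end trajectory capture device
        self.controller = controller  # central controller (analysis modules)
        self.robot = robot            # field-end robot

    def run(self, period_s=1 / 250):
        # S5: keep cycling until the field-end robot has finished its task.
        while not self.robot.task_done():
            frames = self.site.capture()                    # S2: site images
            scene = self.controller.build_virtual_scene(frames)
            self.headset.present(scene)                     # S2: show VR scene
            points = self.tracker.read_positions()          # S3: sensor samples
            command = self.controller.plan(points, frames)  # S3: trajectory -> command
            self.robot.execute(command)                     # S4: robot moves
            time.sleep(period_s)
```

A usage sketch would wire one object per role; in a real system each role runs asynchronously, but a single synchronous loop is enough to show the data flow.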
Specifically, the step S2 includes: shooting field environment images and robot pose images in real time through high-speed high-definition cameras, synthesizing the multi-angle images into a three-dimensional space model, converting the real-time state of the operation field end into the three-dimensional space model by the central controller, and presenting the virtual scene through the virtual field presentation device.
Specifically, the step S3 includes: transmitting the spatial coordinate information of the end of the human body, or of the object it controls, to the central controller in real time through a sensor mounted at that end, and calculating the motion trajectory information of the end from the spatial coordinate information.
Compared with the prior art, the technical scheme at least has the following beneficial effects:
The invention provides a virtual reality based robot remote control platform and a design method thereof. At the control end, the control end trajectory capture device collects position information of the end of a human arm or of an object held by a human hand; the spatial coordinates of the end are transmitted to the central controller in real time, and the motion trajectory of the end is calculated from those coordinates. In this way, real-time virtual presentation of the field operation environment is achieved, and robot control instructions are generated to remotely control the robot in real time.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings that are required in the embodiments or the description of the prior art will be briefly described, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and other drawings may be obtained according to the structures shown in these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a schematic structural diagram of a remote control platform of a robot based on virtual reality in an embodiment of the invention;
fig. 2 is a schematic diagram of a functional module of a central controller in an embodiment of the present invention.
Detailed Description
The technical solution of the present invention will be described in further detail below with reference to the accompanying drawings and examples, it being apparent that the described examples are some, but not all, examples of the present invention, and embodiments of the present invention are not limited thereto. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
Example 1:
In the robot remote control platform provided by the invention, within a virtual environment that simulates the field operation end, the central controller captures the motion trajectory of the control end human body, or of an object end controlled by the human body, combines it with the actual field end conditions, and generates robot control instructions; the instructions are sent to the field operation end robot, which completes the specified actions accordingly. Throughout the process, the field operation end robot follows the control end motion trajectory in real time, and the robot state and environment changes at the field operation end are likewise presented in the virtual environment in real time. The central controller completes the control end trajectory analysis and calculates and generates the field end robot control instructions; it completes the acquisition of the field environment, robot pose and other field end conditions and generates the virtual scene; it is also responsible for the logic control, communication control and other aspects of the whole system.
Fig. 1 is a schematic structural diagram of a remote control platform of a robot based on virtual reality according to an embodiment of the present invention, including a control end device, a field end device, and a central controller, where:
The control end device comprises a virtual field presentation device and a control end trajectory capture device. The virtual field presentation device presents, through a virtual reality projection system, the virtual scene computed by the central controller. The operator can enter the virtual scene and make actions, and the robot is controlled to follow with corresponding movements.
The control end trajectory capture device comprises sensors and corresponding sensor receivers. A sensor is mounted at the end of the human arm or at the end of an object held by the human hand; the receivers are arranged at several positions around the movement space, acquire the sensor position information in real time, and transmit it to the central controller. The sensing is based on laser interference: a laser beam emitted by a receiver is reflected by the sensor and received again by the receiver, yielding the distance between sensor and receiver; by combining several receivers, the spatial position of the sensor can be calculated. The main performance requirements for the sensor are: a measuring range of not less than 3 m, a spatial position resolution of not less than 0.1 micron, and a sampling rate of not less than 256 points per second.
The field end device includes the robot and the field environment and robot pose capture device. The robot includes a robot body, a robot control cabinet and other equipment. The control cabinet houses a controller, servo drivers and input/output modules; the controller receives motion instructions from the central controller and, according to them, controls the robot body to complete the corresponding actions through the servo drivers and input/output modules.
The robot body is the mechanical part of the robot, its supporting base and executing mechanism, comprising the transmission parts, body, arms, wrists, etc. The robot further includes a tool mounted at the robot end, used to hold the objects the robot needs to complete its tasks (such as a pen used for drawing); the tool can be changed according to the task being executed. The robot mounting device is the equipment or mechanism for installing and fixing the robot body. The robot moving device drives the robot body to move, enlarging its range of motion when the body's own reach is insufficient for the task.
The robot body, the robot control cabinet, the end-mounted tool, and the robot mounting and moving devices are all part of the robot; the central controller is connected to the robot control cabinet for data exchange and sends control instructions to it.
The field environment and robot pose capture device comprises several high-speed high-definition cameras arranged at the operation site, which shoot the field environment and robot pose images in real time and send them to the central controller. A high-speed high-definition camera can rapidly and repeatedly sample a fast-moving target in a short time; high-speed camera technology offers real-time target capture, fast image recording, instant playback and clear, intuitive images, capturing moving images at frame rates above 250 frames per second.
As shown in fig. 2, the central controller comprises a field environment and robot pose analysis module, a virtual environment construction and display module, a control end trajectory analysis module, a robot trajectory calculation and control instruction generation module, a logic control module and a communication control module.
The field environment and robot pose analysis module synthesizes and analyzes the field environment and robot pose images shot by the field end cameras and converts them into information suitable for virtual environment display. The virtual environment construction and display module converts the content to be displayed for the virtual reality projection system. The control end trajectory analysis module processes the sensor position information collected at the control end, calculates the spatial coordinates of each sensor point, fits the points into a trajectory, and thereby analyzes the control end motion trajectory. The robot trajectory calculation and control instruction generation module combines the control end motion trajectory with the field environment parameters, calculates the robot motion trajectory, and generates robot control instructions from it. The logic control module completes the logic sequence control of the whole system and handles emergencies and the like. The communication control module controls the data interaction between the central controller and the control end device and between the central controller and the field end device.
In summary, the field environment and robot pose capture device uses high-speed high-definition photography combined with multi-angle image synthesis and three-dimensional modeling to convert the real-time state of the operation field end into a three-dimensional model in the computer, and the virtual scene is presented at the control end through the virtual field presentation device. At the control end, the control end trajectory capture device collects position information of the end of the human arm or of an object held by the human hand; the spatial coordinates of the end are transmitted to the central controller in real time, and the end's motion trajectory is calculated from them.
Example 2:
In a virtual environment that simulates the field operation end, the central controller captures the motion trajectory of the control end human body, or of an object end controlled by it, combines it with the actual field end conditions, and generates robot control instructions; the instructions are sent to the field operation end robot, which completes the specified actions accordingly. Throughout the process, the field operation end robot follows the control end motion trajectory in real time, and the robot state and environment changes at the field operation end are likewise presented in the virtual environment in real time.
Based on the robot remote control platform based on virtual reality of embodiment 1, the embodiment also provides a design method of the robot remote control platform based on virtual reality, comprising the following steps:
s1, erecting field end equipment at a field operation end, and erecting control end equipment and a central controller at a control end.
S2, acquiring field end conditions through field end equipment, completing field environment simulation calculation by a central controller, and presenting a virtual scene through virtual field presentation equipment.
Specifically, field environment images and robot pose images are shot in real time by the high-speed high-definition cameras, the multi-angle images are synthesized into a three-dimensional space model, the central controller converts the real-time state of the operation field end into the three-dimensional space model, and the virtual scene is presented through the virtual field presentation device.
S3, capturing the limb actions of the operator or the end trajectory of a held object; the central controller completes the trajectory analysis and calculation and sends control instructions to the field end robot.
Specifically, the spatial coordinate information of the end of the human body, or of the object it controls, is transmitted to the central controller in real time through a sensor mounted at that end, and the motion trajectory information of the end is calculated from the spatial coordinate information.
Calculating the motion trajectory information of the end from its spatial coordinate information specifically comprises: marking a number of points in space according to the received spatial coordinates of the end, filling in each pair of adjacent points into a line through an interpolation algorithm, and applying smooth transition processing at the corners.
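The patent names an interpolation step and corner smoothing but does not specify the algorithms. A minimal sketch under that description: linear interpolation fills points into a line, and Chaikin corner cutting (one common choice, assumed here, not named in the patent) provides the smooth transition at corners.

```python
import math


def densify(points, step=1.0):
    """Fill extra samples between each pair of adjacent 3-D points by linear
    interpolation, so no two consecutive samples are farther apart than step."""
    out = [points[0]]
    for p, q in zip(points, points[1:]):
        n = max(1, math.ceil(math.dist(p, q) / step))
        for k in range(1, n + 1):
            t = k / n
            out.append(tuple(a + t * (b - a) for a, b in zip(p, q)))
    return out


def chaikin_smooth(points, passes=2):
    """Smooth corners by Chaikin corner cutting: each segment contributes two
    new points at 1/4 and 3/4 of its length; the endpoints are preserved."""
    for _ in range(passes):
        out = [points[0]]
        for p, q in zip(points, points[1:]):
            out.append(tuple(0.75 * a + 0.25 * b for a, b in zip(p, q)))
            out.append(tuple(0.25 * a + 0.75 * b for a, b in zip(p, q)))
        out.append(points[-1])
        points = out
    return points
```

For example, `chaikin_smooth(densify(samples, step=0.01))` rounds the sharp corner of an L-shaped sample sequence while keeping its start and end points fixed.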
The central controller completes the control end trajectory analysis and calculates and generates the field end robot control instructions; it completes the acquisition of the field environment, robot pose and other field end conditions and generates the virtual scene; it is also responsible for the logic control, communication control and other aspects of the whole system.
S4, the robot moves according to the control instructions; the robot pose and field conditions are collected in real time, the central controller completes the field environment simulation calculation, and the virtual scene is presented at the control end.
Steps S2, S3 and S4 are cycled continuously until the field end robot completes its actions.
The above examples are preferred embodiments of the present invention, but the embodiments of the present invention are not limited thereto. Any other change, modification, substitution, combination or simplification that does not depart from the spirit and principle of the present invention is an equivalent replacement and is included in the protection scope of the present invention.
Claims (10)
1. The robot remote control platform based on virtual reality is characterized by comprising control end equipment, field end equipment and a central controller, wherein the control end equipment and the field end equipment are connected with the central controller;
the control end device comprises a virtual field presentation device and a control end trajectory capture device, wherein the virtual field presentation device is used for presenting a virtual scene generated by the central controller, and the control end trajectory capture device is used for acquiring position information of the end of a human arm, or of the end of an object held by a human hand, and transmitting the position information to the central controller;
the field end device comprises a robot and a field environment and robot pose capture device, wherein the field environment and robot pose capture device is used for shooting field environment images and robot pose images in real time and sending them to the central controller;
the central controller comprises a field environment and robot pose analysis module, a virtual environment construction and display module, a control end trajectory analysis module, a robot trajectory calculation and control instruction generation module, a logic control module and a communication control module.
2. The virtual reality based robot remote control platform according to claim 1, wherein the field environment and robot pose analysis module is configured to synthesize and analyze the field environment and robot pose images shot by the field end cameras and convert them into information suitable for virtual environment display;
the virtual environment construction and display module is configured to convert the content to be displayed for the virtual reality projection system;
the control end trajectory analysis module is configured to process the sensor position information collected at the control end, calculate the spatial coordinates of each sensor point, fit the points into a trajectory, and thereby analyze the control end motion trajectory;
the robot trajectory calculation and control instruction generation module is configured to calculate the robot motion trajectory by combining the control end motion trajectory with the field environment parameters, and then generate robot control instructions according to the robot motion trajectory;
the logic control module is configured to complete the logic sequence control of the whole system and handle emergencies and the like; the communication control module is configured to control the data interaction between the central controller and the control end device and between the central controller and the field end device.
3. The virtual reality based robot remote control platform according to claim 2, wherein the control end trajectory capture device comprises a sensor and corresponding sensor receivers, the sensor is mounted at the end of a human arm or the end of an object held by a human hand, the sensor receivers are arranged at a plurality of positions in the movement space, and the sensor receivers acquire the sensor position information in real time and transmit it to the central controller.
4. The virtual reality based robot remote control platform of claim 2, wherein the field environment and robot pose capture device comprises a plurality of high-speed high-definition cameras disposed at the operation site.
5. The robot remote control platform based on virtual reality according to claim 2, wherein the robot comprises a robot body and a robot control cabinet, a controller, a servo driver and an input/output module are arranged in the robot control cabinet, and the controller receives a motion command sent by the central controller and then controls the robot body to complete corresponding actions through the servo driver and the input/output module according to the motion command.
6. The virtual reality based robot remote control platform of claim 5, wherein the robot further comprises a tool mounted at the robot end and a robot moving device; the tool mounted at the robot end is installed at the end of the robot body and used to hold the objects the robot needs to complete its tasks, and can be changed according to the task being executed; the robot moving device is used to drive the robot body to move, enlarging the robot body's range of motion when it is insufficient for completing the task.
7. The design method of the robot remote control platform based on the virtual reality is characterized by comprising the following steps:
S1, erecting the field end equipment at the field operation end, and erecting the control end equipment and the central controller at the control end;
S2, acquiring field end conditions through the field end equipment; the central controller completes the field environment simulation calculation, and a virtual scene is presented through the virtual field presentation device;
S3, capturing the limb actions of the operator or the end trajectory of a held object; the central controller completes the trajectory analysis and calculation and sends control instructions to the field end robot;
S4, the robot moves according to the control instructions; the robot pose and field conditions are collected in real time, the central controller completes the field environment simulation calculation, and the virtual scene is presented through the virtual field presentation device;
S5, steps S2-S4 are cycled continuously until the field end robot completes its actions.
8. The method for designing a virtual reality based robot remote control platform according to claim 7, wherein the step S2 comprises: shooting field environment images and robot pose images in real time through high-speed high-definition cameras, synthesizing the multi-angle images into a three-dimensional space model, converting the real-time state of the operation field end into the three-dimensional space model by the central controller, and presenting the virtual scene through the virtual field presentation device.
9. The design method of the virtual-reality-based robot remote control platform according to claim 7, wherein step S3 comprises: transmitting the spatial coordinates of the end of the human body, or of the object controlled by the human body, to the central controller in real time through a sensor mounted at that end, and computing the motion trajectory of the end from the stream of spatial coordinates.
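On the controller side, the streamed samples have to be accumulated before trajectory analysis can run. A minimal sketch of such a buffer follows; `TrajectoryBuffer` and its methods are hypothetical helpers, not structures named in the patent:

```python
from dataclasses import dataclass, field

@dataclass
class TrajectoryBuffer:
    """Accumulate (timestamp, position) samples streamed in real time
    from a sensor at the human-body or object end."""
    samples: list = field(default_factory=list)  # (t, (x, y, z)) tuples

    def push(self, t, xyz):
        """Record one spatial-coordinate sample received at time t."""
        self.samples.append((t, tuple(xyz)))

    def path_length(self):
        """Total distance travelled by the tracked end point so far."""
        pts = [p for _, p in self.samples]
        return sum(
            sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5
            for p, q in zip(pts, pts[1:])
        )
```

The central controller would hand the accumulated `samples` to the trajectory analysis of claim 10.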
10. The design method of the virtual-reality-based robot remote control platform according to claim 9, wherein computing the motion trajectory of the human body or of the object end controlled by the human body from the spatial coordinates of the object end comprises: marking a plurality of points in space according to the received spatial coordinates of the object end, connecting each pair of adjacent points into a line by an interpolation algorithm, and applying smoothing to the corners.
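The claim names interpolation and corner smoothing without fixing the algorithms. As a sketch only: linear interpolation can fill adjacent points into a line, and Chaikin corner cutting is one common way to smooth the corners (the patent does not say which smoothing it uses):

```python
def densify(points, samples_per_segment=10):
    """Fill each pair of adjacent marked points into a line by
    linear interpolation."""
    out = []
    for p, q in zip(points, points[1:]):
        for i in range(samples_per_segment):
            t = i / samples_per_segment
            out.append(tuple(a + t * (b - a) for a, b in zip(p, q)))
    out.append(tuple(points[-1]))
    return out

def chaikin(points, iterations=2):
    """Chaikin corner cutting: replace each segment with points at its
    1/4 and 3/4 marks, keeping the endpoints, so sharp corners are
    progressively rounded."""
    for _ in range(iterations):
        smoothed = [points[0]]
        for p, q in zip(points, points[1:]):
            smoothed.append(tuple(0.75 * a + 0.25 * b for a, b in zip(p, q)))
            smoothed.append(tuple(0.25 * a + 0.75 * b for a, b in zip(p, q)))
        smoothed.append(points[-1])
        points = smoothed
    return points
```

Running `chaikin` on the densified polyline yields a trajectory with smoothly rounded corners that the robot can follow.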
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310073689.XA CN116125887A (en) | 2023-02-07 | 2023-02-07 | Robot remote control platform based on virtual reality and design method thereof |
Publications (1)
Publication Number | Publication Date |
---|---|
CN116125887A true CN116125887A (en) | 2023-05-16 |
Family
ID=86307760
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202310073689.XA Pending CN116125887A (en) | 2023-02-07 | 2023-02-07 | Robot remote control platform based on virtual reality and design method thereof |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN116125887A (en) |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11103997B2 (en) | Software interface for authoring robotic manufacturing process | |
CN111633644A (en) | Industrial robot digital twin system combined with intelligent vision and operation method thereof | |
US9623559B2 (en) | Systems and methods for instructing robotic operation | |
EP3145682B1 (en) | Systems and methods for time-based parallel robotic operation | |
CN102848389B (en) | Realization method for mechanical arm calibrating and tracking system based on visual motion capture | |
EP1435737A1 (en) | An augmented reality system and method | |
CN109262609A (en) | Mechanical arm tele-control system and method based on virtual reality technology | |
CN110142770B (en) | Robot teaching system and method based on head-mounted display device | |
CN110125944B (en) | Mechanical arm teaching system and method | |
JP2019188477A (en) | Robot motion teaching device, robot system, and robot control device | |
CN113829343A (en) | Real-time multi-task multi-person man-machine interaction system based on environment perception | |
CN104298244A (en) | Industrial robot three-dimensional real-time and high-precision positioning device and method | |
CN107671838B (en) | Robot teaching recording system, teaching process steps and algorithm flow thereof | |
CN111739170B (en) | Visual platform of industrial robot workstation | |
CN210361314U (en) | Robot teaching device based on augmented reality technology | |
Sun et al. | A remote controlled mobile robot based on wireless transmission | |
CN111633653A (en) | Mechanical arm control system and method based on visual positioning | |
CN116125887A (en) | Robot remote control platform based on virtual reality and design method thereof | |
JPH11338532A (en) | Teaching device | |
CN113282173B (en) | Double-arm robot remote real-time control system and method based on virtual reality | |
CN210589293U (en) | Arm teaching device | |
Hu et al. | Manipulator arm interactive control in unknown underwater environment | |
CN114523469B (en) | ROS-based manipulator motion planning and simulation system | |
Sukumar et al. | Augmented reality-based tele-robotic system architecture for on-site construction | |
US20210229286A1 (en) | Cyber-physical system-based remote control framework for robots |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||