CN115179256B - Remote teaching method and system

Info

Publication number
CN115179256B
CN115179256B
Authority
CN
China
Prior art keywords
teaching
main control
data
teaching data
equipment
Prior art date
Legal status
Active
Application number
CN202210658818.7A
Other languages
Chinese (zh)
Other versions
CN115179256A (en)
Inventor
鲁仁全
李一亮
孟伟
程志键
任鸿儒
Current Assignee
Guangdong University of Technology
Peng Cheng Laboratory
Original Assignee
Guangdong University of Technology
Peng Cheng Laboratory
Priority date
Filing date
Publication date
Application filed by Guangdong University of Technology and Peng Cheng Laboratory
Priority to CN202210658818.7A
Publication of CN115179256A
Application granted
Publication of CN115179256B

Classifications

    • B25J9/0081: Programme-controlled manipulators with master teach-in means
    • B25J9/161: Programme controls characterised by the control system, structure, architecture; hardware, e.g. neural networks, fuzzy logic, interfaces, processor
    • B25J9/163: Programme controls characterised by the control loop; learning, adaptive, model based, rule based expert control
    • Y02P90/02: Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • Fuzzy Systems (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Manipulator (AREA)

Abstract

The invention belongs to the technical field of robots, and discloses a remote teaching method and a remote teaching system. The method comprises the following steps: a cloud server acquires image data of the environment in which a controlled device is located; the image data are sent to one or more corresponding main control devices, so that the main control devices generate virtual simulation environments, display them through wearable virtual reality devices, collect teaching action information through teaching handles, and generate and feed back teaching data; and when teaching data fed back by a plurality of main control devices are received, target teaching data are selected from the fed-back teaching data and sent to the controlled device, so that the controlled device completes the corresponding teaching actions. In this way, the production requirements of multiple users are met, the teaching scheme of the manipulator can be changed promptly when production needs change, the ability to teach workpieces with complex and variable structures is improved compared with a traditional teach pendant, and on-site supervision by technicians is not required.

Description

Remote teaching method and system
Technical Field
The invention relates to the technical field of robots, in particular to a remote teaching method and a remote teaching system.
Background
The development of the existing discrete manufacturing industry faces several bottlenecks: 1. Compared with the process industry, discrete manufacturing depends heavily on worker experience: product specifications are varied, production processes are complex, key processes rely on manual operation, product consistency is poor, and the defect rate is high. 2. Collaborative manufacturing capability is weak and production equipment redundancy is high: discrete manufacturing enterprises operate many kinds of machine tools and equipment of mixed brands and ages with strong heterogeneity, making cooperative control difficult. Therefore, to accelerate the move of manufacturing enterprises from the middle and low end to the high end, industrial control equipment oriented to the intelligent manufacturing process needs to be studied: the experience of skilled operators is extracted into knowledge rules and embedded into intelligent manipulators, and the intelligence level of the equipment is finally raised through the cooperative control of multiple manipulators, so that the existing production process is optimized and improved, and cost reduction, quality improvement, efficiency improvement and transformation and upgrading of manufacturing enterprises are realized.
However, trajectory planning for conventional industrial manipulators is generally performed with a teach pendant: at the production site, a technician first performs the corresponding production teaching on the manipulator through the teach pendant, and the robot then performs the corresponding production actions according to the taught content. For large workpieces with complex and variable structures, the traditional teach pendant is laborious: it is inefficient and time-consuming, requires technicians to supervise on site, and places high technical demands on the operator. Moreover, because the production site is far from the user, production changes cannot be made promptly according to the user's production needs, which delays production and reduces timeliness. In addition, one manipulator allows only one teaching device, so production changes cannot be made in time according to the production requirements of multiple users.
The foregoing is provided merely for the purpose of facilitating understanding of the technical solutions of the present invention and is not intended to represent an admission that the foregoing is prior art.
Disclosure of Invention
The invention mainly aims to provide a remote teaching method and a remote teaching system, and aims to solve the technical problem of the low efficiency of the traditional teach pendant.
In order to achieve the above purpose, the present invention provides a remote teaching method applied to a cloud server, the method comprising the following steps:
Acquiring image data of an environment where controlled equipment is located;
The image data are sent to one or more corresponding main control devices, so that the main control devices generate virtual simulation environments according to the image data, the virtual simulation environments are displayed through wearable virtual reality devices, teaching action information input by a user is collected through a teaching handle, and teaching data are generated and fed back according to the teaching action information;
And when the teaching data fed back by the plurality of main control devices are received, selecting target teaching data from the teaching data fed back by the plurality of main control devices and sending the target teaching data to the controlled device, so that the controlled device completes corresponding teaching actions according to the target teaching data.
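For orientation, the following minimal Python sketch outlines the cloud-server side of these three steps; the class, method and module names are hypothetical placeholders for the components described in the embodiments below, not an implementation specified by the invention.

```python
# Minimal sketch of the cloud-server flow above; all names are hypothetical.

class CloudServer:
    def __init__(self, controlled_device, master_devices):
        self.controlled_device = controlled_device   # remote manipulator side
        self.master_devices = master_devices         # one or more teaching ends
        self.cached_teaching_data = []

    def run_teaching_round(self):
        # Step 1: acquire image data of the controlled device's environment.
        image_data = self.controlled_device.capture_environment()

        # Step 2: forward the image data to every logged-in main control device;
        # each builds a virtual simulation environment, collects teaching actions
        # through the VR headset and teaching handle, and feeds back teaching data.
        for device in self.master_devices:
            device.send_image_data(image_data)
        self.cached_teaching_data = [d.receive_teaching_data() for d in self.master_devices]

        # Step 3: select target teaching data and issue it to the controlled device.
        target = self.select_target(self.cached_teaching_data)
        self.controlled_device.execute(target)

    def select_target(self, teaching_data_list):
        # Placeholder; the optional refinements below select by priority,
        # motion-logic validity and complexity.
        return teaching_data_list[0]
```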
Optionally, the selecting target teaching data from the teaching data fed back by the multiple master control devices to send to the controlled device includes:
acquiring the priority corresponding to each main control device;
Selecting the master control equipment with the highest priority as a target master control equipment;
And sending the teaching data fed back by the target main control equipment to the controlled equipment as target teaching data.
Optionally, after the obtaining the priorities corresponding to the master control devices, the method further includes:
Sequencing teaching data fed back by a plurality of main control devices according to the priority;
When the teaching data fed back by the plurality of master control devices are received, selecting target teaching data from the teaching data fed back by the plurality of master control devices and sending the target teaching data to the controlled device, wherein the method further comprises the following steps:
And when action completion information fed back by the controlled equipment is received, sending next teaching data to the controlled equipment according to the sequencing order.
Optionally, after the master device with the highest priority is selected as the target master device, the method further includes:
when the target main control equipment is multiple, determining the complexity degree of teaching data fed back by the multiple target main control equipment;
And selecting the teaching data with the lowest complexity as target teaching data and sending the target teaching data to the controlled equipment.
Optionally, before the target teaching data is selected from the teaching data fed back by the plurality of master control devices and sent to the controlled device, the method further includes:
Judging whether the teaching data fed back by each main control device meets the motion logic corresponding to the controlled device;
deleting the teaching data which does not meet the motion logic, and prompting the corresponding main control equipment to carry out teaching operation again.
Optionally, after the sending the image data to the corresponding one or more master devices, the method further includes:
when a first main control device requests auxiliary teaching aiming at a second main control device, sending an authority request to the second main control device;
After the permission request passes, synchronizing first teaching data corresponding to the second main control equipment to the first main control equipment for display;
when the first main control equipment requests control, a corresponding control right switching request is sent to the second main control equipment;
after the control right switching request passes, receiving second teaching data acquired by the first main control equipment;
synchronizing the second teaching data to the first main control equipment for display;
after the teaching task is completed, receiving third teaching data sent by the first main control equipment;
And sending the third teaching data to the controlled device so that the controlled device can complete corresponding teaching actions according to the third teaching data.
Optionally, before the sending the image data to the corresponding one or more master devices, the method further includes:
when receiving a connection request of one or more master control devices for a controlled device, verifying one or more master control devices;
and taking the master control equipment which passes the verification as the current login equipment corresponding to the controlled equipment.
In addition, in order to achieve the above object, the present invention also provides a remote teaching method applied to a master control device, the remote teaching method comprising:
Acquiring image data of an environment where controlled equipment is located from a cloud server;
generating a virtual simulation environment according to the image data;
displaying the virtual simulation environment through wearable virtual reality equipment, and collecting teaching action information input by a user through a teaching handle;
Generating teaching data according to the teaching action information;
and sending the teaching data to the cloud server so that the cloud server caches the teaching data.
Optionally, the generating a virtual simulation environment according to the image data includes:
Extracting features of the image data to obtain corresponding feature information;
establishing a stereo matching relationship between the image pairs according to the characteristic information;
And carrying out three-dimensional space point cloud reconstruction according to the characteristic information, the three-dimensional matching relation and the calibrated internal and external parameters corresponding to the image acquisition equipment to obtain a virtual simulation environment.
In addition, in order to achieve the above object, the present invention also proposes a remote teaching system, which is characterized in that the remote teaching system at least includes the cloud server as described in any one of the above and the master control device as described in any one of the above;
the cloud server is used for acquiring image data of the environment where the controlled equipment is located and sending the image data to one or more corresponding main control equipment;
Each main control device is used for generating a virtual simulation environment according to the image data, displaying the virtual simulation environment through a wearable virtual reality device, collecting teaching action information input by a user through a teaching handle, generating teaching data according to the teaching action information, and sending the teaching data to the cloud server;
And the cloud server is also used for selecting target teaching data from the teaching data fed back by the plurality of main control equipment and sending the target teaching data to the controlled equipment when the teaching data fed back by the plurality of main control equipment are received, so that the controlled equipment completes corresponding teaching actions according to the target teaching data.
According to the invention, the cloud server acquires image data of the environment in which the controlled device is located; the image data are sent to one or more corresponding main control devices, so that the main control devices generate virtual simulation environments according to the image data, display them through wearable virtual reality devices, collect teaching action information input by a user through a teaching handle, and generate and feed back teaching data according to the teaching action information; and when teaching data fed back by a plurality of main control devices are received, target teaching data are selected from the fed-back teaching data and sent to the controlled device, so that the controlled device completes the corresponding teaching actions according to the target teaching data. In this way, the production requirements of multiple users are met, the teaching scheme of the manipulator can be changed promptly when production needs change, the ability to teach workpieces with complex and variable structures is improved compared with a traditional teach pendant, and on-site supervision by technicians is not required.
Drawings
FIG. 1 is a schematic diagram of a remote teaching device of a hardware operating environment according to an embodiment of the present invention;
FIG. 2 is a schematic flow chart of a first embodiment of the remote teaching method of the present invention;
FIG. 3 is a flow chart of a second embodiment of the remote teaching method of the present invention;
FIG. 4 is a schematic flow chart of a third embodiment of the remote teaching method of the present invention;
FIG. 5 is a block diagram of a first embodiment of a remote teaching system according to the present invention.
The achievement of the objects, functional features and advantages of the present invention will be further described with reference to the accompanying drawings, in conjunction with the embodiments.
Detailed Description
It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the invention.
Referring to fig. 1, fig. 1 is a schematic structural diagram of a remote teaching device of a hardware running environment according to an embodiment of the present invention.
As shown in fig. 1, the remote teaching device may include: a processor 1001, such as a central processing unit (CPU), a communication bus 1002, a user interface 1003, a network interface 1004, and a memory 1005. The communication bus 1002 is used to enable connected communication between these components. The user interface 1003 may include a display and an input unit such as a keyboard; optionally, the user interface 1003 may further include a standard wired interface and a wireless interface. The network interface 1004 may optionally include a standard wired interface and a wireless interface (e.g., a Wireless-Fidelity (Wi-Fi) interface). The memory 1005 may be a high-speed random access memory (RAM) or a stable non-volatile memory (NVM), such as a disk memory. Optionally, the memory 1005 may also be a storage device separate from the processor 1001.
Those skilled in the art will appreciate that the configuration shown in fig. 1 does not limit the remote teaching device, which may include more or fewer components than shown, combine certain components, or have a different arrangement of components.
As shown in fig. 1, the memory 1005, as a type of computer storage medium, may include an operating system, a network communication module, a user interface module, and a remote teaching program.
In the remote teaching device shown in fig. 1, the network interface 1004 is mainly used for data communication with a network server, and the user interface 1003 is mainly used for data interaction with a user. The processor 1001 and the memory 1005 are provided in the remote teaching device, which invokes the remote teaching program stored in the memory 1005 through the processor 1001 and performs the remote teaching method provided by the embodiments of the present invention.
The embodiment of the invention provides a remote teaching method, referring to fig. 2, fig. 2 is a schematic flow chart of a first embodiment of the remote teaching method of the invention.
In this embodiment, the remote teaching method is applied to a cloud server, and includes the following steps:
step S10: and acquiring image data of the environment where the controlled device is located.
It can be understood that the controlled device includes a manipulator (or another robot) and an image acquisition module. The image acquisition module may be a camera or other image capturing device and is configured to acquire working image data of the production site of the manipulator and upload it to the cloud server, which processes the working image data and sends it to the main control devices. In a specific implementation, the image acquisition module adopts an OAK-D binocular camera module (OpenCV AI Kit with Depth).
Step S20: and sending the image data to one or more corresponding main control devices, so that the main control devices generate virtual simulation environments according to the image data, display the virtual simulation environments through wearable virtual reality devices, collect teaching action information input by a user through a teaching handle, and generate and feed back teaching data according to the teaching action information.
It should be noted that the teaching end of this embodiment includes a main control device, a wearable virtual reality device (for example, a wearable VR headset) and a teaching handle. The main control device is connected with the wearable virtual reality device, the other end of the wearable virtual reality device is connected with the teaching handle, and the other end of the main control device is wirelessly connected with the controlled device through a 5G communication module; the 5G communication module is connected with an industrial gateway, and the cloud server connects the main control device and the controlled device through the 5G module. In a specific implementation, the 5G communication module is a Huashi MH500 module, the industrial gateway is an NVIDIA Jetson Nano, and the cloud server is a Huashi cloud server.
It should be understood that after the main control device obtains the image data of the environment in which the controlled device is located from the cloud server, it provides a virtual simulation scene to the wearable virtual reality device; the user views the environment of the controlled device in real time and perceives and controls the remote controlled device from a first-person perspective, thereby realizing remote production teaching of the controlled device. Meanwhile, because a posture sensing device is arranged in the wearable VR headset, the flexibility of viewing-angle rotation is greatly improved, and the image information around the controlled device is acquired through the image acquisition module, so that a remote user can perform the teaching work of the controlled device as if on the production site, while the motion of the controlled end is fed back to the remote end. The user wears the wearable VR headset, the teaching handle collects the user's teaching action information, and after the main control device processes the data, the industrial gateway converts it to the corresponding transmission protocol and the 5G communication module uploads the data to the cloud server for the corresponding algorithmic processing; the result is then transmitted by the 5G communication module to the controlled device, thereby controlling the controlled device to perform the teaching action. The information throughput of the 5G communication module, up to 200 Mbps, is fully utilized, which alleviates the problems of low transmission speed and high delay of VR data; meanwhile, when facing a workpiece with a complex structure, natural and efficient human-machine interaction can be realized through the head-mounted VR device and the control handle, so that teaching of complex workpieces is accomplished efficiently.
It should be noted that the teaching end of this embodiment may have a plurality of main control devices, each connected with the cloud server, so that teaching of the controlled device can still be carried out quickly and efficiently when a technician cannot reach the production workshop. Meanwhile, the cloud server allows a plurality of teaching devices to log in, so during the teaching process a relevant expert can remotely provide teaching guidance to another main control device, which greatly improves the teaching efficiency and success probability of the manipulator.
Further, before the step S20, the method further includes: when receiving a connection request of one or more master control devices for a controlled device, verifying one or more master control devices; and taking the master control equipment which passes the verification as the current login equipment corresponding to the controlled equipment.
It should be understood that a user logs in to the cloud server through the main control device, and the login management module of the cloud server confirms and verifies the user identity. After the verification passes, the login is confirmed as successful, that is, the main control device that passes the verification is taken as the current login device corresponding to the controlled device; the operating-environment data of the controlled device is then subjected to preliminary processing such as filtering and integration and transmitted to the currently logged-in main control device through the 5G communication network.
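A minimal sketch of this verification step, assuming a simple credential table (the invention does not specify the authentication scheme, so the names and the check below are illustrative only):

```python
# Hypothetical login-management check: only main control devices whose credentials
# verify successfully become "current login devices" of the controlled device.

registered_users = {"expert01": "s3cret-token", "engineer02": "another-token"}  # assumed store

def verify_master_devices(connection_requests):
    """connection_requests: list of (device_id, user_name, credential) tuples."""
    logged_in = []
    for device_id, user, credential in connection_requests:
        if registered_users.get(user) == credential:
            logged_in.append(device_id)   # verification passed: becomes a login device
        # otherwise the connection request is rejected
    return logged_in
```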
Step S30: and when the teaching data fed back by the plurality of main control devices are received, selecting target teaching data from the teaching data fed back by the plurality of main control devices and sending the target teaching data to the controlled device, so that the controlled device completes corresponding teaching actions according to the target teaching data.
It should be noted that the cloud server caches the teaching data fed back by the plurality of main control devices and selects among them according to a preset selection strategy. The preset selection strategy may be to select by the priority of the main control devices, taking the data fed back by the device with the highest priority as the target teaching data; optionally, the preset selection strategy may be to select the teaching data with the smallest data amount as the target teaching data; alternatively, the preset selection strategy may be to randomly select one piece of teaching data as the target teaching data. This embodiment is not limited thereto.
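The three example selection strategies can be sketched as follows; the data shape (a list of dictionaries with "priority_level" and "payload" keys) is an assumption made only for illustration:

```python
import random

def select_by_priority(teaching_data):
    # teaching_data: list of dicts with "priority_level" (1 = highest) and "payload".
    return min(teaching_data, key=lambda d: d["priority_level"])

def select_by_smallest_size(teaching_data):
    # Smallest serialized data amount wins.
    return min(teaching_data, key=lambda d: len(d["payload"]))

def select_randomly(teaching_data):
    return random.choice(teaching_data)
```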
In a specific implementation, the cloud server includes at least a login management module, a data cache module and a data processing module. After all devices in the system are connected, a user logs in to the cloud server through the main control device; the login management module confirms and verifies the user identity, and after the verification passes the login is confirmed as successful. The remote controlled device uploads image data of the environment in which the manipulator is located to the cloud server, the data processing module performs preliminary processing such as filtering and integration, and the data is transmitted to the main control device through the 5G communication network. After receiving the image data, the main control device constructs a three-dimensional spatial environment of the manipulator and its production environment and displays it in the wearable VR headset. During teaching, the teaching personnel plan the teaching actions according to the production needs of the remote manipulator: the virtual simulation environment is displayed in the VR headset, the teaching personnel control the simulated manipulator through the teaching handle, and the main control device records the movement tracks of all moving joints of the simulated manipulator. After teaching is completed, the main control device arranges and compresses the teaching data, the industrial gateway performs 5G protocol conversion, and the teaching data is uploaded to the cloud server through the 5G communication module.
The main control device and the industrial gateway, as well as the controlled device and the 5G communication module, both use the RS-485 communication protocol, whose maximum data transmission rate is 10 Mbps and which has good noise immunity, long transmission distance and multi-station capability; the industrial gateway, the 5G communication module and the cloud server use the TCP/IP communication protocol, which is highly reliable and ensures that the transmitted data is correct, not lost and not disordered. After the cloud server obtains the teaching-end data, the cache module caches the teaching data, and the data processing module of the cloud server judges whether the teaching data contains physical logic errors with respect to the motion of the manipulator; if errors exist, this is fed back to the main control device so that the teaching operation is performed again. The cloud server then selects suitable target teaching data from the cached teaching data and sends it to the controlled device. After receiving a complete teaching scheme, the remote controlled device performs production verification; after the verification passes, it completes the corresponding teaching actions according to the received target teaching data, and the image acquisition module feeds back the image information.
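As an illustration of the recording and compression performed at the main control device before upload (the invention does not fix a data format, so the record structure and the use of zlib below are assumptions):

```python
import json
import zlib

def record_trajectory(joint_samples):
    """joint_samples: list of per-timestep joint-angle lists captured while the
    user drives the simulated manipulator with the teaching handle."""
    return {"joints": joint_samples, "sample_count": len(joint_samples)}

def compress_for_upload(teaching_record):
    # "Arrange and compress" before handing the data to the industrial gateway
    # and 5G module; zlib is only an illustrative choice of compressor.
    raw = json.dumps(teaching_record).encode("utf-8")
    return zlib.compress(raw)

# Example: three timesteps of a six-joint manipulator (made-up numbers).
packet = compress_for_upload(record_trajectory(
    [[0.0, 0.1, 0.0, 0.0, 0.2, 0.0],
     [0.1, 0.2, 0.0, 0.1, 0.2, 0.0],
     [0.2, 0.3, 0.1, 0.1, 0.3, 0.1]]))
```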
If the manipulator successfully completes the production task, teaching ends; if the manipulator cannot complete the production task normally, the cloud server feeds the relevant information back to the main control device and prompts it to perform the teaching operation again, until teaching is finished.
Further, before the target teaching data is selected from the teaching data fed back by the plurality of master control devices and sent to the controlled device, the method further includes: judging whether the teaching data fed back by each main control device meets the motion logic corresponding to the controlled device; deleting the teaching data which does not meet the motion logic, and prompting the corresponding main control equipment to carry out teaching operation again.
It can be understood that the cloud server is provided with an action simulation function that includes a rotation-angle threshold for each joint of the manipulator. When teaching data are received, the teaching actions corresponding to the teaching data are simulated; if none of the manipulator joint rotation angles produced by the teaching actions exceeds its rotation-angle threshold, the teaching data are judged to satisfy the motion logic corresponding to the controlled device. If the rotation angle of any joint of the manipulator exceeds its rotation-angle threshold, the teaching data are judged not to satisfy the motion logic corresponding to the controlled device, the teaching data are deleted, and the result is fed back to the main control device so that the teaching operation is performed again.
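A minimal sketch of this joint-limit check, assuming the teaching data is a sequence of joint-angle vectors and the per-joint rotation thresholds are known:

```python
def satisfies_motion_logic(teaching_trajectory, joint_limits):
    """teaching_trajectory: list of joint-angle vectors (one per time step);
    joint_limits: maximum allowed rotation angle per joint, in the same unit."""
    for step in teaching_trajectory:
        for angle, limit in zip(step, joint_limits):
            if abs(angle) > limit:
                return False   # this joint would exceed its rotation threshold
    return True

def filter_teaching_data(all_teaching_data, joint_limits, prompt_reteach):
    # all_teaching_data: list of (device_id, trajectory); prompt_reteach is the
    # callback that asks the corresponding main control device to teach again.
    valid = []
    for device_id, trajectory in all_teaching_data:
        if satisfies_motion_logic(trajectory, joint_limits):
            valid.append((device_id, trajectory))
        else:
            prompt_reteach(device_id)
    return valid
```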
Further, after the image data is sent to the corresponding one or more master control devices, the method further includes: when a first main control device requests auxiliary teaching aiming at a second main control device, sending an authority request to the second main control device; after the permission request passes, synchronizing first teaching data corresponding to the second main control equipment to the first main control equipment for display; when the first main control equipment requests control, a corresponding control right switching request is sent to the second main control equipment; after the control right switching request passes, receiving second teaching data acquired by the first main control equipment; synchronizing the second teaching data to the first main control equipment for display; after the teaching task is completed, receiving third teaching data sent by the first main control equipment; and sending the third teaching data to the controlled device so that the controlled device can complete corresponding teaching actions according to the third teaching data.
It should be noted that this embodiment also provides an auxiliary teaching function. When user A logs in and teaches, user B may log in at the same time; when user B requests auxiliary teaching, user A may choose whether to share the teaching process with user B synchronously. After obtaining user A's consent, user B can watch user A's teaching process in real time and communicate with user A in real time to assist the teaching.
In a specific implementation, a plurality of users can log in and control the controlled device at the same time. The second main control device can send an auxiliary teaching request to the first main control device through the cloud server, and after the first main control device agrees, the teaching data of the first and second main control devices are updated and displayed synchronously; both main control devices can then perform teaching control on the same manipulator. Assuming the first main control device is carrying out manipulator production teaching, the second main control device can observe the current teaching actions of the manipulator shared by the first main control device through the wearable VR headset; if it needs to teach the manipulator itself, it sends a control request to the first main control device through the cloud server, and after the first main control device agrees to the request, the second main control device obtains the teaching control right of the manipulator. Similarly, if the second main control device is performing manipulator production teaching, the first main control device can also obtain the teaching control right of the manipulator after sending a control request, thereby realizing the switching of the manipulator control right. After the teaching task is completed, the main control device that requested auxiliary teaching uploads the teaching data to the cloud server, and the cloud server issues the teaching data to the remote controlled manipulator for execution.
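The permission and control-right flow can be summarised with a small state holder; the approval flags below stand in for the cloud server's request and consent messaging, which the invention does not detail:

```python
class TeachingSession:
    """Tracks which main control device currently holds the teaching control right."""

    def __init__(self, controlling_device):
        self.controlling_device = controlling_device   # e.g. "master_A"
        self.observers = set()                         # devices granted auxiliary viewing

    def request_auxiliary(self, requester, approved_by_controller):
        # The controlling device decides whether to share its teaching process.
        if approved_by_controller:
            self.observers.add(requester)
        return approved_by_controller

    def request_control(self, requester, approved_by_controller):
        # After approval the teaching control right switches to the requester;
        # the former controller keeps watching the shared process as an observer.
        if approved_by_controller:
            self.observers.discard(requester)
            self.observers.add(self.controlling_device)
            self.controlling_device = requester
        return self.controlling_device
```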
In this embodiment, the cloud server acquires image data of the environment in which the controlled device is located; the image data are sent to one or more corresponding main control devices, so that the main control devices generate virtual simulation environments according to the image data, display them through wearable virtual reality devices, collect teaching action information input by a user through a teaching handle, and generate and feed back teaching data according to the teaching action information; when teaching data fed back by a plurality of main control devices are received, target teaching data are selected from the fed-back teaching data and sent to the controlled device, so that the controlled device completes the corresponding teaching actions according to the target teaching data. In this way, the production requirements of multiple users are met, the teaching scheme of the manipulator can be changed promptly when production needs change, the ability to teach workpieces with complex and variable structures is improved compared with a traditional teach pendant, and on-site supervision by technicians is not required.
Referring to fig. 3, fig. 3 is a schematic flow chart of a second embodiment of the remote teaching method of the present invention.
Based on the first embodiment, the step S30 of the remote teaching method of the present embodiment includes:
Step S301: and when the teaching data fed back by the plurality of main control devices are received, acquiring the priority corresponding to each main control device.
It can be understood that the cloud server of this embodiment further includes a priority management module. When each main control device logs in to the cloud server, the priority management module confirms the user priority: the first level is the highest, the second level is next, and so on, and the priorities are stored in order from high to low. The cloud server preferentially processes the teaching data uploaded by the main control device with the higher priority. Optionally, the user priority is determined based on the user identity in the user information; for example, an expert identity has a higher priority than an engineer identity.
Step S302: and selecting the master control equipment with the highest priority as the target master control equipment.
Step S303: and sending the teaching data fed back by the target main control equipment to the controlled equipment as target teaching data, so that the controlled equipment completes corresponding teaching actions according to the target teaching data.
When a plurality of users upload teaching data at the same time, the data cache module caches the data simultaneously, and the priority management module, according to the priority of each main control device, passes the teaching data fed back by the main control device with the higher priority to the data processing module to be processed and issued to the remote controlled device.
Further, after the obtaining the priorities corresponding to the master control devices, the method further includes: sequencing teaching data fed back by a plurality of main control devices according to the priority;
When the teaching data fed back by the plurality of master control devices are received, selecting target teaching data from the teaching data fed back by the plurality of master control devices and sending the target teaching data to the controlled device, wherein the method further comprises the following steps:
And when action completion information fed back by the controlled equipment is received, sending next teaching data to the controlled equipment according to the sequencing order.
It should be understood that the teaching data fed back by the main control device with the highest priority is sent to the controlled device so that the controlled device completes the corresponding teaching action. After execution is finished, action-completion information is fed back to the cloud server, and the cloud server then sends the teaching data fed back by the main control device with the next lower priority to the controlled device according to the sorted order, and so on, until all the teaching data have been issued.
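A minimal sketch of this priority-ordered dispatch, assuming each cached item carries a numeric priority level where level 1 is the highest, as in the embodiment above:

```python
from collections import deque

def build_dispatch_queue(cached_teaching_data):
    """cached_teaching_data: list of (priority_level, teaching_data) pairs,
    where level 1 is the highest priority."""
    ordered = sorted(cached_teaching_data, key=lambda item: item[0])
    return deque(data for _, data in ordered)

def on_action_completed(queue, send_to_controlled_device):
    # Called when the controlled device feeds back action-completion information;
    # the next piece of teaching data in the sorted order is then issued.
    if queue:
        send_to_controlled_device(queue.popleft())
```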
Further, after the master device with the highest priority is selected as the target master device, the method further includes: when the target main control equipment is multiple, determining the complexity degree of teaching data fed back by the multiple target main control equipment; and selecting the teaching data with the lowest complexity as target teaching data and sending the target teaching data to the controlled equipment.
It should be noted that, if multiple users log in and exercise control at the same time and they have the same priority, the data processing module determines the complexity of the teaching data fed back by each main control device, preferentially processes the teaching data with the lower complexity and sends it to the remote controlled device; after the controlled device completes the corresponding production task, the cloud server continues to process the teaching data with the relatively higher complexity.
In a specific implementation, the cloud server can determine the complexity of teaching data according to its data size and preferentially process the teaching data with the smaller data size. Preferably, the cloud server simulates each piece of teaching data through the action simulation function, determines the total joint rotation angle required to execute it, determines the complexity of the teaching data according to this total rotation angle, and preferentially processes the teaching data with the smallest total joint rotation angle.
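Two illustrative complexity measures matching the description above, serialized data size and total joint rotation, might look as follows (the exact metric is not fixed by the invention):

```python
def complexity_by_size(teaching_payload: bytes) -> int:
    # Simple proxy: the smaller the serialized teaching data, the lower the complexity.
    return len(teaching_payload)

def complexity_by_rotation(teaching_trajectory) -> float:
    # Preferred measure in the embodiment above: the total rotation of all joints
    # over the whole trajectory, summed from the absolute per-step angle changes.
    total = 0.0
    previous = teaching_trajectory[0]
    for step in teaching_trajectory[1:]:
        total += sum(abs(a - b) for a, b in zip(step, previous))
        previous = step
    return total

def pick_least_complex(candidates, measure):
    # candidates: list of (device_id, teaching_data); the least complex item
    # is processed and issued first.
    return min(candidates, key=lambda item: measure(item[1]))
```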
In this embodiment, the cloud server acquires image data of the environment in which the controlled device is located; the image data are sent to one or more corresponding main control devices, so that the main control devices generate virtual simulation environments according to the image data, display them through wearable virtual reality devices, collect teaching action information input by a user through a teaching handle, and generate and feed back teaching data according to the teaching action information; when teaching data fed back by a plurality of main control devices are received, the priority corresponding to each main control device is acquired, the main control device with the highest priority is selected as the target main control device, and the teaching data fed back by the target main control device is sent to the controlled device as the target teaching data, so that the controlled device completes the corresponding teaching actions according to the target teaching data. In this way, the teaching data is issued according to the priority of the main control devices, which avoids the action confusion caused by multiple users controlling the manipulator simultaneously, meets the production requirements of multiple users, allows the teaching scheme to be changed promptly when production needs change, improves the ability to teach workpieces with complex and variable structures compared with a traditional teach pendant, and does not require on-site supervision by technicians.
In addition, referring to fig. 4, fig. 4 is a flow chart of a third embodiment of the remote teaching method of the present invention, and the embodiment of the present invention provides a remote teaching method applied to a master control device, including the following steps:
step S01: and acquiring image data of the environment where the controlled device is located from the cloud server.
Step S02: and generating a virtual simulation environment according to the image data.
Step S03: and displaying the virtual simulation environment through the wearable virtual reality equipment, and collecting teaching action information input by a user through a teaching handle.
Step S04: and generating teaching data according to the teaching action information.
Step S05: and sending the teaching data to the cloud server so that the cloud server caches the teaching data.
Specifically, the step S02 includes: extracting features of the image data to obtain corresponding feature information; establishing a stereo matching relationship between the image pairs according to the characteristic information; and carrying out three-dimensional space point cloud reconstruction according to the characteristic information, the three-dimensional matching relation and the calibrated internal and external parameters corresponding to the image acquisition equipment to obtain a virtual simulation environment.
It should be understood that the image acquisition module is configured as three cameras centred on the workspace of the controlled device and placed 120° apart, so that three pictures of the workspace of the controlled device are acquired each time. In a specific implementation, an object in space is reconstructed from the images captured by the cameras, and a simple linear relation is assumed between the image captured by a camera and the object in three-dimensional space: image = M · object, where the matrix M can be seen as the geometric model of camera imaging, and the parameters in M are the internal and external parameters obtained in the camera calibration process.
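Written out in the standard pinhole-camera form (the description only states the linear relation image = M · object, so the decomposition below is the conventional one rather than something specific to this invention):

```latex
% Standard pinhole projection (amsmath notation); M = K [R | t] combines the
% intrinsic matrix K from calibration with the extrinsic rotation R and translation t.
s \begin{pmatrix} u \\ v \\ 1 \end{pmatrix}
  = \underbrace{K \begin{pmatrix} R & t \end{pmatrix}}_{M}
    \begin{pmatrix} X \\ Y \\ Z \\ 1 \end{pmatrix}
```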
The features extracted in this embodiment include feature points, feature lines and regions; preferably, feature points are used as the matching primitives. In a specific implementation, the feature extraction approach is closely tied to the matching strategy, and in one implementation the Speeded-Up Robust Features (SURF) algorithm is used to perform feature extraction on the images acquired by the image acquisition module.
It should be understood that stereo matching refers to establishing a correspondence between an image pair according to the extracted features, that is, a one-to-one correspondence between the imaging points of the same physical spatial point in two different images. In one implementation, the extracted image features are stereo-matched using the Fast Library for Approximate Nearest Neighbors (FLANN).
After a sufficiently accurate matching result is obtained, the internal and external parameters calibrated for the cameras are combined, and the reconstruction of the three-dimensional point cloud of the remote controlled device and its production environment is completed using the Python OpenCV library, generating the virtual simulation environment.
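A condensed sketch of this pipeline for one image pair using the Python OpenCV library (with three cameras the same steps apply to each pair). Note that SURF requires the non-free opencv-contrib modules, and the projection matrices P1 and P2 are assumed to have been built from the calibrated parameters; treat this as illustrative rather than the exact implementation:

```python
import cv2
import numpy as np

def detect_and_match(img_left, img_right):
    # SURF runs on grayscale images and requires opencv-contrib-python
    # built with the non-free modules enabled.
    if img_left.ndim == 3:
        img_left = cv2.cvtColor(img_left, cv2.COLOR_BGR2GRAY)
    if img_right.ndim == 3:
        img_right = cv2.cvtColor(img_right, cv2.COLOR_BGR2GRAY)

    surf = cv2.xfeatures2d.SURF_create(hessianThreshold=400)
    kp1, des1 = surf.detectAndCompute(img_left, None)
    kp2, des2 = surf.detectAndCompute(img_right, None)

    # FLANN-based stereo matching (KD-tree index) with Lowe's ratio test.
    flann = cv2.FlannBasedMatcher(dict(algorithm=1, trees=5), dict(checks=50))
    matches = flann.knnMatch(des1, des2, k=2)
    good = [p[0] for p in matches if len(p) == 2 and p[0].distance < 0.7 * p[1].distance]

    pts1 = np.float32([kp1[m.queryIdx].pt for m in good])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in good])
    return pts1, pts2

def reconstruct_point_cloud(img_left, img_right, P1, P2):
    """P1, P2: 3x4 projection matrices assembled from the calibrated internal
    and external parameters of the two cameras (P = K [R | t])."""
    pts1, pts2 = detect_and_match(img_left, img_right)
    points_h = cv2.triangulatePoints(P1, P2, pts1.T, pts2.T)   # 4xN homogeneous
    return (points_h[:3] / points_h[3]).T                      # Nx3 point cloud
```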
In this embodiment, the main control device acquires image data of the environment in which the controlled device is located from the cloud server; generates a virtual simulation environment according to the image data; displays the virtual simulation environment through the wearable virtual reality device and collects teaching action information input by the user through the teaching handle; generates teaching data according to the teaching action information; and sends the teaching data to the cloud server so that the cloud server caches it. When the cloud server receives teaching data fed back by a plurality of main control devices, it selects target teaching data from the fed-back teaching data and sends it to the controlled device, so that the controlled device completes the corresponding teaching actions according to the target teaching data. In this way, remote teaching of the manipulator is realized, the production requirements of multiple users are met, the teaching scheme can be changed promptly when production needs change, the ability to teach workpieces with complex and variable structures is improved compared with a traditional teach pendant, and on-site supervision by technicians is not required.
In addition, technical details not described in detail in the present embodiment may refer to the remote teaching methods provided in the first embodiment and the second embodiment of the remote teaching method of the present invention, and are not described herein.
Referring to fig. 5, fig. 5 is a block diagram illustrating a first embodiment of a remote teaching system according to the present invention.
As shown in fig. 5, the remote teaching system according to the embodiment of the present invention at least includes the cloud server 10 as described in any one of the above and the master control device 20 as described in any one of the above;
the cloud server 10 is configured to obtain image data of an environment where the controlled device is located, and send the image data to the corresponding one or more main control devices 20;
Each of the master control devices 20 is configured to generate a virtual simulation environment according to the image data, display the virtual simulation environment through a wearable virtual reality device, collect teaching action information input by a user through a teaching handle, generate teaching data according to the teaching action information, and send the teaching data to the cloud server 10;
The cloud server 10 is further configured to select, when receiving the teaching data fed back by the plurality of master control devices 20, target teaching data from the teaching data fed back by the plurality of master control devices 20, and send the target teaching data to the controlled device, so that the controlled device completes a corresponding teaching action according to the target teaching data.
It should be understood that the foregoing is illustrative only and is not limiting, and that in specific applications, those skilled in the art may set the invention as desired, and the invention is not limited thereto.
In this embodiment, the cloud server acquires image data of the environment in which the controlled device is located; the image data are sent to one or more corresponding main control devices, so that the main control devices generate virtual simulation environments according to the image data, display them through wearable virtual reality devices, collect teaching action information input by a user through a teaching handle, and generate and feed back teaching data according to the teaching action information; when teaching data fed back by a plurality of main control devices are received, target teaching data are selected from the fed-back teaching data and sent to the controlled device, so that the controlled device completes the corresponding teaching actions according to the target teaching data. In this way, the production requirements of multiple users are met, the teaching scheme of the manipulator can be changed promptly when production needs change, the ability to teach workpieces with complex and variable structures is improved compared with a traditional teach pendant, and on-site supervision by technicians is not required.
It should be noted that the above-described working procedure is merely illustrative, and does not limit the scope of the present invention, and in practical application, a person skilled in the art may select part or all of them according to actual needs to achieve the purpose of the embodiment, which is not limited herein.
In addition, technical details not described in detail in this embodiment may refer to the remote teaching method provided in any embodiment of the present invention, which is not described herein.
Furthermore, it should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or system that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or system. Without further limitation, an element defined by the phrase "comprising one … …" does not exclude the presence of other like elements in a process, method, article, or system that comprises the element.
The foregoing embodiment numbers of the present invention are merely for the purpose of description, and do not represent the advantages or disadvantages of the embodiments.
From the above description of the embodiments, it will be clear to those skilled in the art that the methods of the above embodiments may be implemented by software plus a necessary general hardware platform, and of course also by hardware, although in many cases the former is the preferred implementation. Based on such understanding, the technical solution of the present invention, or the part contributing to the prior art, may be embodied in the form of a software product stored in a storage medium (such as a Read-Only Memory (ROM)/RAM, magnetic disk, or optical disk) and including several instructions for causing a terminal device (which may be a mobile phone, a computer, a server, a network device, etc.) to perform the methods according to the embodiments of the present invention.
The foregoing description is only of the preferred embodiments of the present invention, and is not intended to limit the scope of the invention, but rather is intended to cover any equivalents of the structures or equivalent processes disclosed herein or in the alternative, which may be employed directly or indirectly in other related arts.

Claims (7)

1. The remote teaching system is characterized by comprising a cloud server and a master control device;
the cloud server is used for acquiring image data of the environment where the controlled equipment is located and sending the image data to one or more corresponding main control equipment;
Each main control device is used for generating a virtual simulation environment according to the image data, displaying the virtual simulation environment through a wearable virtual reality device, collecting teaching action information input by a user through a teaching handle, generating teaching data according to the teaching action information, and sending the teaching data to the cloud server;
The cloud server is further configured to select target teaching data from the teaching data fed back by the plurality of master control devices and send the target teaching data to the controlled device when the teaching data fed back by the plurality of master control devices are received, so that the controlled device completes corresponding teaching actions according to the target teaching data;
The remote teaching method applied to the cloud server comprises the following steps:
Acquiring image data of an environment where controlled equipment is located;
The image data are sent to one or more corresponding main control devices, so that the main control devices generate virtual simulation environments according to the image data, the virtual simulation environments are displayed through wearable virtual reality devices, teaching action information input by a user is collected through a teaching handle, and teaching data are generated and fed back according to the teaching action information;
When teaching data fed back by a plurality of main control devices are received, selecting target teaching data from the teaching data fed back by the plurality of main control devices and sending the target teaching data to the controlled device, so that the controlled device completes corresponding teaching actions according to the target teaching data;
the selecting target teaching data from the teaching data fed back by the plurality of main control devices and sending the target teaching data to the controlled device comprises the following steps:
acquiring the priority corresponding to each main control device;
Selecting the master control equipment with the highest priority as a target master control equipment;
The teaching data fed back by the target main control equipment is used as target teaching data to be sent to the controlled equipment;
after the priority corresponding to each master control device is obtained, the method further comprises:
Sequencing teaching data fed back by a plurality of main control devices according to the priority;
When the teaching data fed back by the plurality of master control devices are received, selecting target teaching data from the teaching data fed back by the plurality of master control devices and sending the target teaching data to the controlled device, wherein the method further comprises the following steps:
And when action completion information fed back by the controlled equipment is received, sending next teaching data to the controlled equipment according to the sequencing order.
2. The remote teaching system according to claim 1, wherein after selecting the master device with the highest priority as the target master device, further comprising:
when the target main control equipment is multiple, determining the complexity degree of teaching data fed back by the multiple target main control equipment;
And selecting the teaching data with the lowest complexity as target teaching data and sending the target teaching data to the controlled equipment.
3. The remote teaching system according to claim 1, wherein before selecting target teaching data from the teaching data fed back by the plurality of master control devices and sending the target teaching data to the controlled device, the method further comprises:
judging whether the teaching data fed back by each master control device satisfies the motion logic corresponding to the controlled device;
and deleting the teaching data that does not satisfy the motion logic, and prompting the corresponding master control device to perform the teaching operation again.
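Claim 3's motion-logic check, which discards teaching data the controlled device cannot execute and asks the offending master control device to teach again, might look like the sketch below. Here the "motion logic" is approximated by joint-limit and maximum-step checks; the real constraints would come from the controlled device's kinematic model, which the patent does not detail.

```python
def satisfies_motion_logic(trajectory, joint_limits, max_step):
    """trajectory: list of joint-value tuples; joint_limits: list of (low, high) per joint."""
    for waypoint in trajectory:
        for value, (low, high) in zip(waypoint, joint_limits):
            if not (low <= value <= high):
                return False                       # waypoint outside the reachable range
    for prev, curr in zip(trajectory, trajectory[1:]):
        if any(abs(c - p) > max_step for p, c in zip(prev, curr)):
            return False                           # jump between waypoints too large
    return True

def filter_teaching_data(submissions, joint_limits, max_step):
    """Keep executable teaching data; collect the devices that must teach again."""
    valid, reteach = [], []
    for device_id, trajectory in submissions:
        if satisfies_motion_logic(trajectory, joint_limits, max_step):
            valid.append((device_id, trajectory))
        else:
            reteach.append(device_id)              # prompt these master control devices to re-teach
    return valid, reteach
```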
4. The remote teaching system according to claim 1, wherein after sending the image data to the corresponding one or more master control devices, the method further comprises:
when a first master control device requests assisted teaching with respect to a second master control device, sending a permission request to the second master control device;
after the permission request is approved, synchronizing first teaching data corresponding to the second master control device to the first master control device for display;
when the first master control device requests control, sending a corresponding control-right switching request to the second master control device;
after the control-right switching request is approved, receiving second teaching data collected by the first master control device;
synchronizing the second teaching data to the first master control device for display;
after the teaching task is completed, receiving third teaching data sent by the first master control device;
and sending the third teaching data to the controlled device, so that the controlled device completes the corresponding teaching action according to the third teaching data.
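The assisted-teaching flow of claim 4 (permission request, mirrored display, control-right switch, final teaching data) is essentially a small state machine on the cloud server. A simplified sketch follows; the state names and method names are invented for illustration and are not part of the claim.

```python
class AssistSession:
    """One assisted-teaching session between two master control devices.
    States and method names are illustrative, not taken from the claim."""

    def __init__(self, first_device, second_device, send_to_controlled_device):
        self.first_device = first_device                      # device requesting assistance / control
        self.second_device = second_device                    # device currently teaching
        self.send_to_controlled_device = send_to_controlled_device
        self.state = "PERMISSION_REQUESTED"                   # permission request sent to the second device

    def on_permission_granted(self, first_teaching_data, display_on):
        # Mirror the second device's teaching data to the first device for display.
        display_on(self.first_device, first_teaching_data)
        self.state = "VIEWING"

    def on_control_granted(self):
        # Control-right switching request approved by the second device.
        self.state = "CONTROLLING"

    def on_second_teaching_data(self, second_teaching_data, display_on):
        # Teaching data collected by the first device is synchronized back for display.
        display_on(self.first_device, second_teaching_data)

    def on_task_complete(self, third_teaching_data):
        # Forward the final teaching data so the controlled device can execute it.
        self.send_to_controlled_device(third_teaching_data)
        self.state = "DONE"
```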
5. The remote teaching system according to claim 1, wherein before sending the image data to the corresponding one or more master control devices, the method further comprises:
when a connection request of one or more master control devices for the controlled device is received, verifying the one or more master control devices;
and taking the master control devices that pass the verification as the current login devices corresponding to the controlled device.
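Claim 5's login step, verifying each master control device before treating it as a current login device of the controlled device, could be sketched with a shared-secret token check. The HMAC scheme below is purely an assumption; the patent only requires that some verification take place.

```python
import hashlib
import hmac

SECRET_KEY = b"shared-secret-for-illustration-only"   # assumption, not from the patent

def expected_token(device_id: str) -> str:
    return hmac.new(SECRET_KEY, device_id.encode(), hashlib.sha256).hexdigest()

def verify_connection_requests(requests, logged_in):
    """requests: iterable of (device_id, token, controlled_id) tuples.
    logged_in: dict mapping controlled_id -> set of verified device_ids."""
    for device_id, token, controlled_id in requests:
        if hmac.compare_digest(token, expected_token(device_id)):
            # Verified devices become current login devices of the controlled device.
            logged_in.setdefault(controlled_id, set()).add(device_id)
    return logged_in
```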
6. A remote teaching method applied to the master control device in the remote teaching system of claim 1, characterized in that the remote teaching method comprises:
acquiring, from the cloud server, image data of the environment in which the controlled device is located;
generating a virtual simulation environment from the image data;
displaying the virtual simulation environment through a wearable virtual reality device, and collecting teaching action information input by a user through a teaching handle;
generating teaching data from the teaching action information;
and sending the teaching data to the cloud server, so that the cloud server caches the teaching data.
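On the master-control-device side, claim 6 reduces to a fetch / simulate / collect / upload cycle. A schematic sketch follows; the `cloud`, `vr_headset`, and `teach_handle` objects and their methods are placeholders, since the patent does not specify any transport or device API.

```python
def encode_teaching_data(actions):
    # Placeholder encoding: a timestamped list of teaching-handle samples.
    return [{"t": i, "action": a} for i, a in enumerate(actions)]

def run_master_device(cloud, vr_headset, teach_handle, build_virtual_environment):
    """One teaching cycle on a master control device (all device APIs are placeholders)."""
    image_data = cloud.fetch_image_data()              # image data from the cloud server
    sim_env = build_virtual_environment(image_data)    # e.g. the reconstruction sketch under claim 7
    vr_headset.display(sim_env)                        # show the scene on the wearable VR device
    actions = teach_handle.collect_actions()           # teaching action information from the handle
    cloud.upload_teaching_data(encode_teaching_data(actions))  # cloud server caches the teaching data
```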
7. The remote teaching method according to claim 6, wherein generating a virtual simulation environment from the image data comprises:
performing feature extraction on the image data to obtain corresponding feature information;
establishing a stereo matching relationship between image pairs according to the feature information;
and performing three-dimensional point cloud reconstruction according to the feature information, the stereo matching relationship, and the calibrated intrinsic and extrinsic parameters of the image acquisition device, to obtain the virtual simulation environment.
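The reconstruction pipeline of claim 7 (feature extraction, stereo matching between image pairs, point cloud reconstruction using the calibrated intrinsic and extrinsic parameters) maps onto standard computer-vision primitives. A hedged OpenCV sketch follows; the patent does not name a feature detector or matcher, so ORB with brute-force Hamming matching is an assumption.

```python
import cv2
import numpy as np

def reconstruct_point_cloud(img_left, img_right, P_left, P_right):
    """Sparse 3-D reconstruction from one calibrated image pair.
    P_left, P_right: 3x4 projection matrices built from the calibrated
    intrinsic and extrinsic parameters of the image acquisition devices."""
    # 1. Feature extraction (ORB chosen only for illustration).
    orb = cv2.ORB_create(nfeatures=2000)
    kp1, des1 = orb.detectAndCompute(img_left, None)
    kp2, des2 = orb.detectAndCompute(img_right, None)

    # 2. Stereo matching between the image pair.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)

    pts1 = np.float32([kp1[m.queryIdx].pt for m in matches]).T   # 2xN pixel coordinates
    pts2 = np.float32([kp2[m.trainIdx].pt for m in matches]).T

    # 3. Triangulate matched features into a 3-D point cloud.
    pts4d = cv2.triangulatePoints(P_left, P_right, pts1, pts2)
    return (pts4d[:3] / pts4d[3]).T                              # Nx3 points
```

A denser environment model could be obtained by repeating this over many image pairs and fusing the results, but the sparse version above already covers the three claimed steps.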
CN202210658818.7A 2022-06-09 2022-06-09 Remote teaching method and system Active CN115179256B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210658818.7A CN115179256B (en) 2022-06-09 2022-06-09 Remote teaching method and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210658818.7A CN115179256B (en) 2022-06-09 2022-06-09 Remote teaching method and system

Publications (2)

Publication Number Publication Date
CN115179256A (en) 2022-10-14
CN115179256B (en) 2024-04-26

Family

ID=83512859

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210658818.7A Active CN115179256B (en) 2022-06-09 2022-06-09 Remote teaching method and system

Country Status (1)

Country Link
CN (1) CN115179256B (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106409033A (en) * 2016-12-06 2017-02-15 北京奇虎科技有限公司 Remote teaching assisting system and remote teaching method and device for system
CN107263449A (en) * 2017-07-05 2017-10-20 中国科学院自动化研究所 Robot remote teaching system based on virtual reality
CN108472810A (en) * 2016-01-29 2018-08-31 三菱电机株式会社 Robot teaching apparatus and robot control program's generation method
CN110238831A (en) * 2019-07-23 2019-09-17 青岛理工大学 Robot teaching system and method based on RGB-D image and teaching machine

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10165041B2 (en) * 2016-10-13 2018-12-25 Equalearning Corp. System and method for uninterrupted learning

Also Published As

Publication number Publication date
CN115179256A (en) 2022-10-14

Similar Documents

Publication Publication Date Title
CN109313484B (en) Virtual reality interaction system, method and computer storage medium
CN106403942B (en) Personnel indoor inertial positioning method based on substation field depth image identification
CN106744111A (en) Elevator repair and maintenance management system and method based on bluetooth and Mobile Data Communication Technology
JP6430079B1 (en) Monitoring system and monitoring method
JP2019513246A (en) Training method of random forest model, electronic device and storage medium
KR20200068075A (en) Remote guidance apparatus and method capable of handling hyper-motion step based on augmented reality and machine learning
CN102375972A (en) Distributive augmented reality platform based on mobile equipment
CN107077651A (en) Robot cooperation method, device, robot and computer program product
CN107071297A (en) A kind of virtual reality system that logical computer room displaying is believed for electric power
CN108415386A (en) Augmented reality system and its working method for intelligent workshop
CN110977981A (en) Robot virtual reality synchronization system and synchronization method
CN114662714A (en) Machine room operation and maintenance management system and method based on AR equipment
CN113752264A (en) Mechanical arm intelligent equipment control method and system based on digital twins
CN111246181B (en) Robot monitoring method, system, equipment and storage medium
CN108776444A (en) Augmented reality man-machine interactive system suitable for CPS automatic control systems
CN115346413A (en) Assembly guidance method and system based on virtual-real fusion
CN115179256B (en) Remote teaching method and system
US11258939B2 (en) System, method and apparatus for networking independent synchronized generation of a series of images
CN111652659A (en) VR product evaluation system based on big data
CN112558761A (en) Remote virtual reality interaction system and method for mobile terminal
CN115213890B (en) Grabbing control method, grabbing control device, grabbing control server, electronic equipment and storage medium
CN112947752B (en) Collaborative human-computer interaction control method based on intelligent equipment
CN112192564B (en) Remote control method, device, equipment and storage medium for robot
CN111660294B (en) Augmented reality control system of hydraulic heavy-duty mechanical arm
WO2019037073A1 (en) Method, device and sever for data synchronization

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant