CN108422435B - Remote monitoring and control system based on augmented reality - Google Patents

Remote monitoring and control system based on augmented reality

Info

Publication number
CN108422435B
CN108422435B (application CN201810236758.3A)
Authority
CN
China
Prior art keywords
manipulator
monitoring
module
robot body
augmented reality
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810236758.3A
Other languages
Chinese (zh)
Other versions
CN108422435A (en)
Inventor
陈成军
于浩
信寄遥
王天诺
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Yantai Longwen Auto Parts Co Ltd
Original Assignee
Yantai Longwen Auto Parts Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Yantai Longwen Auto Parts Co Ltd filed Critical Yantai Longwen Auto Parts Co Ltd
Priority to CN201810236758.3A priority Critical patent/CN108422435B/en
Publication of CN108422435A publication Critical patent/CN108422435A/en
Application granted granted Critical
Publication of CN108422435B publication Critical patent/CN108422435B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00Controls for manipulators
    • B25J13/006Controls for manipulators by means of a wireless system for controlling one or several manipulators
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J19/00Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J19/02Sensing devices
    • B25J19/04Viewing devices
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F30/00Computer-aided design [CAD]
    • G06F30/20Design optimisation, verification or simulation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00Three dimensional [3D] modelling, e.g. data description of 3D objects

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Robotics (AREA)
  • Geometry (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Graphics (AREA)
  • Software Systems (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Computer Hardware Design (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • Manipulator (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention relates to a remote monitoring and control system based on augmented reality, which comprises an interactive simulation control mode and a monitoring mode. In the interactive simulation control mode, a server receives depth images shot by an RGBD (red-green-blue-depth) camera on a mobile robot and synthesizes the depth images shot at different angles into a scene three-dimensional point cloud model; it then operates a manipulator simulation model in combination with the scene three-dimensional point cloud model, plans a path, drives the manipulator simulation model to move along the path, and finally superimposes the dynamic manipulator simulation model on a color image shot by the RGBD camera, realizing virtual-real combined simulation display and path planning. In the monitoring mode, monitoring data are collected by a remote data acquisition system, monitoring images of the monitored objects are obtained as the mobile robot patrols, and the server superimposes the monitoring data on the monitoring images, so that the display interface is more intuitive.

Description

Remote monitoring and control system based on augmented reality
Technical Field
The invention relates to a remote monitoring and control system based on augmented reality, and belongs to the field of intelligent manufacturing and computer measurement and control.
Background
Remote monitoring and control are now widely applied in production. In monitoring, however, current systems display monitoring videos and monitoring data separately, so monitoring personnel must mentally link the video content with the data, and the interface is not intuitive enough. In control, an operator can only estimate the three-dimensional information of the remote environment from a color image when planning a path for the manipulator and controlling the remote manipulator to complete its task, so control of the robot is not accurate enough.
The invention patent with publication number CN106863303A discloses the following technical scheme: the angular displacement of each joint is collected by an encoder mounted at each joint of the manipulator, the collected data are sent to a computer for processing, and the computer plans a manipulator path from the joint angular-displacement data and controls the manipulator to weld the seam along the path. This is an on-site path planning scheme and cannot meet current enterprise requirements for remote path planning.
Disclosure of Invention
In order to solve the above technical problems, the invention provides a remote monitoring and control system based on augmented reality, which realizes fused display of the video monitoring and data monitoring of remote inspection applications, and realizes local path planning and simulation of the robot's remote operation at the server.
The technical scheme of the invention is as follows:
a remote monitoring and control system based on augmented reality comprises an interactive simulation control mode, in which local path planning and simulation of the manipulator's remote operation are completed in a virtual-real combined manner by a server and a mobile robot located among the remote equipment and environment, specifically:
the mobile robot comprises a robot body, and a manipulator and an RGBD (red-green-blue-depth) camera which are arranged on the robot body;
the server comprises a remote environment reconstruction module, a manipulator operation simulation module and an augmented reality display module; the remote environment reconstruction module receives depth images shot by the RGBD camera and synthesizes the depth images shot at different angles into a scene three-dimensional point cloud model; the manipulator operation simulation module establishes a manipulator simulation model, then combines a scene three-dimensional point cloud model, operates the manipulator simulation model, plans a path, and finally drives the manipulator simulation model to move by using the path; and the augmented reality display module superposes the dynamic manipulator simulation model in a color image shot by the RGBD camera to realize virtual-real combined simulation display.
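The synthesis of depth views into a scene point cloud can be pictured with a minimal Python sketch. This is not code from the patent: the pinhole intrinsics `fx, fy, cx, cy` and the per-view camera-to-world poses are illustrative assumptions.

```python
import numpy as np

def depth_to_points(depth, fx, fy, cx, cy):
    """Back-project a depth image (in meters) into camera-frame 3D points
    using a pinhole model; zero-depth pixels are treated as invalid."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    pts = np.stack([x, y, depth], axis=-1).reshape(-1, 3)
    return pts[pts[:, 2] > 0]

def merge_views(depth_images, cam_to_world_poses, fx, fy, cx, cy):
    """Fuse depth images shot at different pan-tilt angles into one scene
    point cloud, given each view's 4x4 camera-to-world pose."""
    clouds = []
    for depth, T in zip(depth_images, cam_to_world_poses):
        pts = depth_to_points(depth, fx, fy, cx, cy)
        pts_h = np.hstack([pts, np.ones((len(pts), 1))])  # homogeneous coords
        clouds.append((pts_h @ T.T)[:, :3])               # into world frame
    return np.vstack(clouds)
```

In the patent's workflow the poses would come from the robot body's fed-back position and the pan-tilt's azimuth at each shot.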
Preferably, the collision detection is also completed in the interactive simulation control mode, specifically: the manipulator operation simulation module drives the manipulator simulation model to move by utilizing the path in the scene three-dimensional point cloud model, and checks whether the manipulator simulation model interferes with the scene three-dimensional point cloud model or not to realize virtual and real collision detection; and if the interference exists, optimizing the path, then sending the optimized path to the mobile robot, and if the interference does not exist, sending the path to the mobile robot, and controlling the movement of the manipulator by the mobile robot according to the path to complete the operation task.
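The virtual-real collision check, driving the simulated arm along the planned path and testing for interference with the scene cloud, can be sketched as follows. The `sample_arm` sampler and the clearance value are hypothetical stand-ins, not the patent's actual manipulator model.

```python
import numpy as np

def interferes(arm_pts, scene_pts, clearance=0.05):
    """Virtual-real collision check: does any sampled point of the arm
    model come within `clearance` of the scene point cloud?"""
    # Brute-force nearest-distance test; a KD-tree would replace this at scale.
    d2 = ((arm_pts[:, None, :] - scene_pts[None, :, :]) ** 2).sum(axis=-1)
    return bool((d2 < clearance ** 2).any())

def first_colliding_pose(path, sample_arm, scene_pts, clearance=0.05):
    """Drive the simulated arm through the planned path; return the index
    of the first pose that interferes with the scene, or None if clear.
    A colliding index would trigger path optimization before dispatch."""
    for i, q in enumerate(path):
        if interferes(sample_arm(q), scene_pts, clearance):
            return i
    return None
```

A clear path (None) would be sent to the mobile robot unchanged; a colliding one would be optimized first, as the clause above describes.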
Preferably, the mobile robot further comprises an image acquisition controller, a robot body controller and a manipulator controller; the image acquisition controller controls the camera pan-tilt, on which the RGBD camera is fixed, to move, and collects the color image and depth image from the RGBD camera; the robot body controller controls the robot body to move and feeds back the robot body's position and posture; the manipulator controller controls the motion of the manipulator;
the server is also provided with a pan-tilt teleoperation module, a robot body teleoperation module and a manipulator teleoperation module; the pan-tilt teleoperation module sends instructions to the image acquisition controller, which moves the camera pan-tilt accordingly and thereby changes the shooting angle of the RGBD camera; the robot body teleoperation module sends instructions to the robot body controller, which moves the robot body accordingly; the manipulator teleoperation module sends instructions to the manipulator controller, which moves the manipulator accordingly;
when the remote environment reconstruction module synthesizes the scene three-dimensional point cloud model, the robot body teleoperation module and the manipulator teleoperation module first send control instructions to the corresponding robot body controller and manipulator controller so that the robot body and the manipulator are held stationary; the pan-tilt teleoperation module then sends instructions to the image acquisition controller, which rotates the camera pan-tilt through multiple angles and shoots a depth image at each angle, so that depth images at different angles are obtained and synthesized into the scene three-dimensional point cloud model.
Preferably, the manipulator simulation model established by the manipulator operation simulation module comprises a manipulator three-dimensional virtual model and an inverse kinematics model; the trajectory and posture of the manipulator's end point are planned in an interactive simulation environment to obtain an end-point trajectory and posture sequence, which is taken as the input of the inverse kinematics model to obtain the angle sequence of each joint of the manipulator; the joint angle sequences are then read in order, driving each joint of the manipulator three-dimensional virtual model to move.
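The mapping from end-point trajectory to joint angle sequence can be illustrated with a closed-form inverse kinematics sketch for a planar two-link arm. This is a deliberately simplified stand-in: the patent's manipulator may have up to six degrees of freedom, and the unit link lengths here are assumptions.

```python
import math

def ik_2link(x, y, l1=1.0, l2=1.0):
    """Closed-form inverse kinematics for a planar two-link arm:
    end point (x, y) -> joint angles (theta1, theta2), one elbow branch."""
    c2 = (x * x + y * y - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    t2 = math.acos(max(-1.0, min(1.0, c2)))  # clamp against rounding error
    t1 = math.atan2(y, x) - math.atan2(l2 * math.sin(t2),
                                       l1 + l2 * math.cos(t2))
    return t1, t2

def joint_sequence(end_point_trajectory):
    """Map an end-point trajectory to the per-joint angle sequence that
    drives the virtual arm model, as the simulation module does."""
    return [ik_2link(x, y) for x, y in end_point_trajectory]
```

Reading out `joint_sequence` step by step is what animates the three-dimensional virtual model described above.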
Preferably, the augmented reality display module superposes the dynamic manipulator three-dimensional virtual model in a color image shot by the RGBD camera according to the current position and posture of the robot body and the azimuth information of the camera holder, and then displays the dynamic manipulator three-dimensional virtual model to realize virtual-real visual fusion.
Preferably, the current position and posture of the robot body are acquired by the robot body teleoperation module sending control instructions to the robot body controller; the azimuth information of the camera pan-tilt is acquired by the pan-tilt teleoperation module sending instructions to the image acquisition controller.
preferably, the remote monitoring and control system further includes a monitoring mode, which completes the fusion display of the monitoring data and the monitoring image in a virtual-real combined manner through a remote data acquisition system located in a remote device and an environment, the server and the mobile robot, specifically:
the remote data acquisition system acquires monitoring data of each monitored object and then sends the monitoring data to the server; each monitoring object is provided with a label;
the mobile robot receives a control instruction from the server, controls the robot body and the camera holder to move, enables the robot body to move to a monitored object, controls the camera holder to rotate, enables the RGBD camera to shoot a monitoring image containing label information and sends the monitoring image to the server;
the server also comprises an equipment identification module and a storage and fault diagnosis module;
the equipment identification module analyzes the monitoring image shot by the RGBD camera, segments and identifies the tag information in the image, and sends a query request containing the tag information;
the storage and fault diagnosis module receives and stores the monitoring data sent by the remote data acquisition system, diagnoses whether a monitored object has a fault, and sends alarm information if the monitored object has the fault; when the storage and fault diagnosis module receives an inquiry request from the equipment identification module, inquiring monitoring data corresponding to the label information according to the label information, and then sending the monitoring data to the augmented reality display module;
the augmented reality display module superimposes the monitoring data on the monitoring image shot by the RGBD camera, displayed in the area near the corresponding tag, so that the monitoring data and the monitoring image are displayed in superposition and the display interface is more intuitive.
Preferably, the remote data acquisition system comprises an industrial personal computer and sensors, wherein each sensor is connected to the industrial personal computer, each monitored object is provided with the sensor, and each sensor is correspondingly provided with one label;
preferably, the storage and fault diagnosis module receives and stores the monitoring data sent by the remote data acquisition system according to the corresponding relationship among the tag, the monitoring object and the monitoring data, compares the monitoring data with the corresponding alarm threshold value according to a preset alarm threshold value, diagnoses whether the monitoring object has a fault, and sends alarm information if the monitoring object has the fault; when the storage and fault diagnosis module receives an inquiry request from the equipment identification module, the monitoring data corresponding to the label information is inquired according to the label information and then sent to the augmented reality display module.
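The storage and fault diagnosis behavior, tag-keyed storage, threshold comparison, alarming, and tag-keyed query, can be sketched in Python. The tag names and threshold values below are illustrative, not from the patent.

```python
from collections import defaultdict

class MonitorStore:
    """Tag-keyed storage with threshold-based fault diagnosis, mirroring
    the described module: store readings, raise an alarm message past a
    preset threshold, and answer tag-keyed queries for the AR overlay."""

    def __init__(self, thresholds):
        self.thresholds = thresholds       # tag -> preset alarm threshold
        self.readings = defaultdict(list)  # tag -> stored monitoring data

    def store(self, tag, value):
        """Store a reading; return an alarm string if it exceeds the threshold."""
        self.readings[tag].append(value)
        if value > self.thresholds.get(tag, float("inf")):
            return f"ALARM: {tag} reading {value} exceeds threshold"
        return None

    def query(self, tag):
        """Latest reading for a tag, as requested by the identification module."""
        return self.readings[tag][-1] if self.readings[tag] else None
```

The `query` result is what the augmented reality display module would render next to the recognized tag.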
Preferably, the monitored images include pictures and videos.
The invention has the following beneficial effects:
1. In the remote monitoring and control system based on augmented reality, a depth image of the environment is obtained by the mobile robot located among the remote equipment and environment, a three-dimensional point cloud model of the remote equipment and environment is then computed and synthesized at the server end, and local path planning and simulation of the mobile robot's remote operation are realized at the server end through a virtual simulated manipulator model; the system is thus a virtual-real combined remote control system and improves the control accuracy of the mobile robot;
2. In the remote monitoring and control system based on augmented reality, monitoring data are acquired through the remote data acquisition system, a monitoring image of the monitored object is then acquired through the mobile robot, the tag information is identified at the server end to obtain the current state information of the monitored object, and the monitoring information is then superimposed on the monitoring image, realizing the fused display of data monitoring with the on-site video or image monitoring of remote inspection applications.
Drawings
FIG. 1 is a block diagram of the system of the present invention;
FIG. 2 is a flow chart of an interactive simulation control mode of the present invention;
FIG. 3 is a flow chart of the monitor mode of the present invention.
The reference numbers in the figures denote:
1. a robot body; 2. a manipulator; 3. an RGBD camera; 4. a camera pan-tilt; 5. a robot body controller; 6. a manipulator controller; 7. an image acquisition controller; 10. a robot body teleoperation module; 20. a manipulator teleoperation module; 30. a cradle head teleoperation module; 40. a remote environment reconstruction module; 50. a manipulator operation simulation module; 60. an augmented reality display module; 70. a device identification module; 80. a storage and fault diagnosis module; 100. an industrial personal computer; 200. a sensor; 300. and (4) a label.
Detailed Description
The invention is described in detail below with reference to the figures and the specific embodiments.
The first embodiment is as follows:
referring to fig. 1 and 2, an augmented reality-based remote monitoring and control system includes an interactive simulation control mode, which completes local path planning and simulation of remote operation of a manipulator 2 in a virtual-real combined manner through a server and a mobile robot located in a remote device and an environment, specifically:
the mobile robot comprises a robot body 1, a manipulator 2 and an RGBD camera 3, wherein the manipulator 2 and the RGBD camera 3 are arranged on the robot body 1;
the server comprises a remote environment reconstruction module 40, a manipulator operation simulation module 50 and an augmented reality display module 60; the remote environment reconstruction module 40 receives depth images shot by the RGBD camera 3, synthesizes the depth images shot at different angles into a scene three-dimensional point cloud model, the manipulator operation simulation module 50 establishes a manipulator simulation model, then operates the manipulator simulation model in combination with the scene three-dimensional point cloud model, plans a path, and finally drives the manipulator simulation model to move by using the path; the augmented reality display module 60 superimposes the dynamic manipulator simulation model on the color image shot by the RGBD camera 3, so as to realize virtual-real combined analog display. The manipulator 2 includes a multi-degree-of-freedom manipulator such as a six-degree-of-freedom manipulator, a three-degree-of-freedom manipulator, or a two-degree-of-freedom manipulator, and the more degrees of freedom, the more flexibility the manipulator.
The collision detection is also completed in the interactive simulation control mode, specifically: the manipulator operation simulation module 50 drives the manipulator simulation model to move in the scene three-dimensional point cloud model by using the path, and checks whether the manipulator simulation model interferes with the scene three-dimensional point cloud model to realize virtual and real collision detection; if the interference exists, optimizing the path, and then sending the optimized path to the mobile robot, so as to control the motion of the manipulator 2 on the mobile robot and complete the operation task of the manipulator 2; the optimized path may also be displayed in a simulated manner by the augmented reality display module 60.
The mobile robot also comprises an image acquisition controller 7, a robot body controller 5 and a manipulator controller 6; the image collector controls the camera holder 4 to move and collects a color image and a depth image on the RGBD camera 3, and the RGBD camera 3 is fixed on the camera holder 4; the robot body controller 5 controls the robot body 1 to move and feeds back the direction and the posture of the robot body 1; the manipulator controller 6 controls the movement of the manipulator 2;
the server is also provided with a holder teleoperation module 30, a robot body teleoperation module 10 and a manipulator teleoperation module 20; the holder teleoperation module 30 sends an instruction to the image acquisition controller 7 to control the camera holder 4 to move according to the instruction, so that the shooting angle of the RGBD camera 3 is changed; the robot body teleoperation module 10 sends an instruction to the robot body controller 5 to control the robot body 1 to move according to the instruction; the manipulator teleoperation module 20 sends an instruction to the manipulator controller 6 to control the manipulator 2 to move according to the instruction;
when the remote environment reconstruction module 40 synthesizes the scene three-dimensional point cloud model, the robot body teleoperation module 10 and the manipulator teleoperation module 20 first send control instructions to the corresponding robot body controller 5 and manipulator controller 6 so that the robot body 1 and the manipulator 2 are kept stationary; the pan-tilt teleoperation module 30 then sends instructions to the image acquisition controller 7, which rotates the camera pan-tilt 4 through multiple angles and shoots a depth image at each angle, so that depth images at different angles are obtained and synthesized into the scene three-dimensional point cloud model.
The manipulator simulation model established by the manipulator operation simulation module 50 includes a manipulator three-dimensional virtual model and an inverse kinematics model, and the trajectory and the posture of the terminal point of the manipulator 2 are planned in an interactive simulation environment (i.e., in combination with a scene three-dimensional point cloud model), so as to obtain a terminal point trajectory and a posture sequence, the terminal point trajectory and the posture sequence are used as the input of the inverse kinematics model, so as to obtain an angle sequence of each joint of the manipulator 2, read the angle sequence of each joint of the manipulator 2 in sequence, and drive each joint of the manipulator three-dimensional virtual model to move.
The augmented reality display module 60 superimposes the dynamic manipulator three-dimensional virtual model on the color image shot by the RGBD camera 3 according to the current position and posture of the robot body 1 and the azimuth information of the camera pan-tilt 4, and then displays the result to realize virtual-real visual fusion. The current position and posture of the robot body 1 are acquired by the robot body teleoperation module 10 sending control instructions to the robot body controller 5. The azimuth information of the camera pan-tilt 4 is acquired by the pan-tilt teleoperation module 30 sending instructions to the image acquisition controller 7. The augmented reality display module 60 obtains the orientation of the camera pan-tilt 4 in the world coordinate system (the absolute coordinate system of the system, with respect to whose origin the coordinates of all points are determined before any user coordinate system is established) from the current position and posture of the robot body 1 and the azimuth information of the camera pan-tilt 4, thereby realizing augmented reality registration, and superimposes the manipulator three-dimensional virtual model on the color image shot by the RGBD camera 3 according to the model's coordinate information.
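The registration step, chaining the robot body's world pose with the pan-tilt pose and projecting virtual-model points into the image, can be sketched as follows. The transforms and camera intrinsics are illustrative assumptions, not values from the patent.

```python
import numpy as np

def make_pose(R, t):
    """Assemble a 4x4 homogeneous transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, t
    return T

def rot_z(a):
    """Rotation about the z axis (e.g. the pan angle of the camera pan-tilt)."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def camera_in_world(T_world_body, T_body_cam):
    """Chain the robot body's world pose with the pan-tilt (camera) pose:
    the registration step that places the virtual model in the image."""
    return T_world_body @ T_body_cam

def project(pt_world, T_world_cam, fx, fy, cx, cy):
    """Project a world-frame point of the virtual model to pixel coordinates."""
    pt_cam = np.linalg.inv(T_world_cam) @ np.append(pt_world, 1.0)
    x, y, z = pt_cam[:3]
    return fx * x / z + cx, fy * y / z + cy
```

Projecting every vertex of the manipulator three-dimensional virtual model this way yields the overlay on the color image.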
Specifically, collision detection is also completed in the interactive simulation control mode, and the manipulator operation simulation module 50 drives the manipulator three-dimensional virtual model to move in the scene three-dimensional point cloud model by using the path, and checks whether the manipulator three-dimensional virtual model interferes with the scene three-dimensional point cloud model, so as to realize virtual and real collision detection; if the interference exists, optimizing the path, then sending the optimized path to the mobile robot, and receiving and controlling the motion of the manipulator 2 on the mobile robot by the manipulator controller 6 of the mobile robot to complete the operation task of the mobile robot.
The operation process of the interactive simulation control mode is as follows:
step 1, a manipulator operation simulation module 50 establishes a manipulator simulation model of a manipulator 2, wherein the manipulator simulation model comprises a manipulator three-dimensional virtual model and an inverse kinematics model;
step 2, keeping the robot body 1 and the manipulator 2 stationary, the remote environment reconstruction module 40 receives depth images shot by the RGBD camera 3 and synthesizes the depth images shot at different angles into a scene three-dimensional point cloud model;
step 3, interactively planning the end-point trajectory and posture of the manipulator three-dimensional virtual model in the interactive simulation environment to obtain an end-point trajectory and posture sequence;
step 4, taking the end-point trajectory and posture sequence as the input of the inverse kinematics model of the manipulator 2 and solving the angle sequence of each joint of the manipulator 2;
step 5, sequentially reading the angle sequence of each joint of the manipulator 2, and driving each joint of the three-dimensional virtual model of the manipulator to move; whether a manipulator three-dimensional virtual model interferes with a scene three-dimensional point cloud model or not is checked in the motion process, virtual and real collision detection is achieved, if collision exists, a path is optimized, and then the optimized path is sent to a manipulator controller 6 of the mobile robot, so that a manipulator 2 is controlled to complete an operation task;
meanwhile, a dynamic manipulator three-dimensional virtual model can be superposed in a color image according to the current position and posture of the robot body 1 and the azimuth information of the camera holder 4, and virtual and real simulation display is realized through the augmented reality display module 60.
The remote monitoring and control system based on augmented reality can calculate and synthesize a three-dimensional point cloud model of remote equipment and an environment at the server side, and further realize local path planning and simulation of remote operation of the mobile robot through a virtual simulation manipulator model at the server side, so that the system is a virtual-real combined remote control system, and the control accuracy of the mobile robot is improved.
The second embodiment is as follows:
Referring to fig. 1 and fig. 3, the difference between the present embodiment and the first embodiment is that the second embodiment can execute the monitoring mode while executing the interactive simulation control mode, and the interactive simulation control mode is not described again.
The monitoring mode is characterized in that fusion display of monitoring data and monitoring images is completed in a virtual-real combined mode through a remote data acquisition system positioned in remote equipment and an environment, the server and the mobile robot, and the monitoring images can be pictures and/or videos. Specifically, the method comprises the following steps:
the remote data acquisition system comprises an industrial personal computer 100 and sensors 200, wherein each sensor 200 is connected to the industrial personal computer 100, each monitored object is provided with a sensor 200, and each sensor 200 is correspondingly provided with a tag 300; the tag 300 may take a machine-readable form such as a two-dimensional code or a bar code;
the mobile robot receives control instructions from the server, controls the robot body 1 and the camera pan-tilt 4 so that the robot body 1 moves to a monitored object, and makes the RGBD camera 3 shoot a monitoring image (a color image) containing the tag information and send it to the server; in another embodiment, the RGBD camera 3 is fixed on the camera pan-tilt 4, and a control instruction from the server rotates the camera pan-tilt 4 so that the RGBD camera 3 shoots a monitoring image containing the tag information. The mobile robot can also receive control instructions from the server to control the motion of the manipulator 2. Specifically, the robot body teleoperation module 10 and the pan-tilt teleoperation module 30 of the server respectively send control instructions to the robot body controller 5 and the image acquisition controller 7 of the mobile robot; the robot body controller 5 moves the robot body 1 to the monitored object and the image acquisition controller 7 rotates the camera pan-tilt 4, so that the RGBD camera 3 shoots a monitoring image containing the tag information; in this way the mobile robot is remotely controlled to obtain a monitoring image containing the tag information and the running state of the current monitored object. The manipulator teleoperation module 20 of the server can also send control instructions to the manipulator controller 6 of the mobile robot, and the manipulator controller 6 controls the motion of the manipulator 2 to realize remote control of the manipulator 2;
the server further comprises a device identification module 70 and a storage and fault diagnosis module 80;
the device identification module 70 analyzes the monitoring image shot by the RGBD camera 3, segments and identifies the tag information in the image, and sends a query request containing the tag information;
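Segmenting the tag region before decoding can be illustrated with a crude Python stand-in; a real system would run a QR or barcode detector, and the threshold and tag ID below are illustrative assumptions.

```python
import numpy as np

def find_tag_region(gray, dark_thresh=64):
    """Crude tag segmentation: bounding box (x1, y1, x2, y2) of the
    darkest pixels. A production system would run a QR/barcode detector."""
    ys, xs = np.nonzero(gray < dark_thresh)
    if len(xs) == 0:
        return None
    return int(xs.min()), int(ys.min()), int(xs.max()), int(ys.max())

def make_query(tag_id, bbox):
    """Query request sent on to the storage and fault diagnosis module."""
    return {"tag": tag_id, "bbox": bbox}
```

The bounding box is also useful downstream: it tells the display module where the tag sits in the image.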
the storage and fault diagnosis module 80 receives and stores the monitoring data sent by the remote data acquisition system according to the corresponding relationship among the tag 300, the monitoring object and the monitoring data, compares the monitoring data with a corresponding alarm threshold value according to a preset alarm threshold value, diagnoses whether the monitoring object has a fault, and sends alarm information if the monitoring object has the fault; when the storage and fault diagnosis module 80 receives the query request from the device identification module 70, the monitoring data corresponding to the tag information is queried according to the tag information, and then is sent to the augmented reality display module 60;
the augmented reality display module 60 superimposes the monitoring data on the monitoring image shot by the RGBD camera 3, displayed in the area near the corresponding tag 300, so that the monitoring data and the monitoring image (namely, the color image of the monitored object) are displayed in superposition and the display interface is more intuitive.
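Placing the data readout "near the corresponding tag" can be sketched as a simple anchor computation; the padding and readout-box dimensions are illustrative assumptions, not values from the patent.

```python
def overlay_anchor(tag_bbox, img_w, img_h, pad=5, box_w=120, box_h=40):
    """Top-left corner for the monitoring-data readout: just to the right
    of the tag's bounding box, clamped so the box stays inside the image."""
    x1, y1, x2, y2 = tag_bbox
    x = min(x2 + pad, img_w - box_w)
    y = min(y1, img_h - box_h)
    return max(0, x), max(0, y)
```

The display module would then draw the queried monitoring data at this anchor on the color image.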
The operation process of the monitoring mode is as follows:
step 10, an operator operates the computer input device at the server end and sends control instructions to the manipulator controller 6, the robot body controller 5 and the image acquisition controller 7 through the manipulator teleoperation module 20, the robot body teleoperation module 10 and the pan-tilt teleoperation module 30 respectively, controlling the motion of the manipulator 2, the robot body 1 and the camera pan-tilt 4 located among the remote equipment and environment;
step 20, the equipment identification module receives the monitoring image shot by the RGBD camera 3, and divides and identifies the label information on the monitoring image;
step 30, inquiring the current information of the monitoring object corresponding to the label 300 from the storage and fault diagnosis module 80 according to the identified label information;
and step 40, sending the inquired information to the augmented reality display module 60, and displaying the information in the vicinity of the corresponding label 300 by overlapping the information on the monitoring image shot by the RGBD camera 3, so as to realize overlapping display of the monitoring information and the monitoring image.
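Steps 20 through 40 can be sketched as one monitoring cycle. The pre-decoded tag dictionary below stands in for the image segmentation and recognition of step 20, and all ids and readings are hypothetical.

```python
def monitoring_cycle(decoded_tags, data_store):
    """One pass of the monitoring mode: for each tag id recognized in the
    current monitoring image (step 20), query its latest monitoring data
    (step 30) and emit an overlay record for the augmented reality display
    module (step 40). decoded_tags maps tag_id -> bounding box;
    data_store maps tag_id -> monitoring data."""
    overlays = []
    for tag_id, box in decoded_tags.items():
        data = data_store.get(tag_id)   # query request keyed by tag id
        if data is None:
            continue                    # unknown tag: nothing to display
        overlays.append({"tag": tag_id,
                         "box": box,
                         "text": f"{tag_id}: {data}"})
    return overlays
```

Each overlay record carries the tag's bounding box so the display module can place the text near the tag in the color image, as described above.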
Through the augmented reality-based remote monitoring and control system, a remote mobile robot can be controlled locally at the server to acquire a monitoring image of a monitored object; the tag information is recognized at the server to obtain the current state of the monitored object, and the monitoring information is then superimposed on the monitoring image, realizing data monitoring and on-site video monitoring for remote inspection applications, and a fused display of the data monitoring and the image monitoring.
The above description is only an embodiment of the present invention, and not intended to limit the scope of the present invention, and all modifications of equivalent structures and equivalent processes performed by the present specification and drawings, or directly or indirectly applied to other related technical fields, are included in the scope of the present invention.

Claims (9)

1. An augmented reality-based remote monitoring and control system, characterized in that: the system comprises an interactive simulation control mode, wherein the interactive simulation control mode completes local path planning and simulation of remote operation of a manipulator (2) in a virtual-real combined manner through a server and a mobile robot located in remote equipment and an environment, specifically:
the mobile robot comprises a robot body (1), and a manipulator (2) and an RGBD (red-green-blue-depth) camera (3) which are arranged on the robot body (1);
the server comprises a remote environment reconstruction module (40), a manipulator operation simulation module (50) and an augmented reality display module (60); the remote environment reconstruction module (40) receives depth images shot by the RGBD camera (3) and synthesizes the depth images shot at different angles into a scene three-dimensional point cloud model; the manipulator operation simulation module (50) establishes a manipulator simulation model, operates the manipulator simulation model in combination with the scene three-dimensional point cloud model, plans a path, and finally drives the manipulator simulation model to move along the path; the augmented reality display module (60) superimposes the dynamic manipulator simulation model on the color image shot by the RGBD camera (3) to realize virtual-real combined simulation display and path planning;
the system further comprises a monitoring mode, which completes the fused display of monitoring data and monitoring images in a virtual-real combined manner through the server, a remote data acquisition system located in the remote equipment and environment, and the mobile robot, specifically:
the remote data acquisition system acquires monitoring data of each monitored object and then sends the monitoring data to the server; each monitored object is provided with a label (300);
during inspection, the mobile robot receives a control instruction from the server and controls the robot body (1) and the RGBD camera (3), so that the robot body (1) moves to a monitored object and the RGBD camera (3) shoots a monitoring image containing label information and sends it to the server;
the server further comprises a device identification module (70) and a storage and fault diagnosis module (80);
the device identification module (70) analyzes the monitoring image shot by the RGBD camera (3), segments and recognizes the label information in the image, and sends a query request containing the label information;
the storage and fault diagnosis module (80) receives and stores the monitoring data sent by the remote data acquisition system, diagnoses whether a monitored object has a fault, and sends alarm information if a fault exists; when the storage and fault diagnosis module (80) receives a query request from the device identification module (70), it queries the monitoring data corresponding to the label information and sends the data to the augmented reality display module (60);
the augmented reality display module (60) superimposes the monitoring data on the monitoring image shot by the RGBD camera (3) and displays it in the area near the corresponding label (300), realizing the superimposed display of the monitoring data and the monitoring image and making the display interface more intuitive.
2. The augmented reality-based remote monitoring and control system according to claim 1, wherein: collision detection is also completed in the interactive simulation control mode, specifically: the manipulator operation simulation module (50) drives the manipulator simulation model to move along the path in the scene three-dimensional point cloud model and checks whether the manipulator simulation model interferes with the scene three-dimensional point cloud model, realizing virtual-real collision detection; if interference exists, the path is optimized and the optimized path is sent to the mobile robot; if no interference exists, the path is sent to the mobile robot directly; the mobile robot controls the movement of the manipulator (2) according to the path to complete the operation task.
3. The augmented reality-based remote monitoring and control system of claim 2, wherein: the mobile robot is further provided with an image acquisition controller (7), a robot body controller (5) and a manipulator controller (6); the image acquisition controller (7) controls the camera pan-tilt (4), on which the RGBD camera (3) is fixed, to move, and collects the color images and depth images from the RGBD camera (3); the robot body controller (5) controls the robot body (1) to move and feeds back the orientation and posture of the robot body (1); the manipulator controller (6) controls the movement of the manipulator (2);
the server is further provided with a pan-tilt teleoperation module (30), a robot body teleoperation module (10) and a manipulator teleoperation module (20); the pan-tilt teleoperation module (30) sends an instruction to the image acquisition controller (7), which controls the camera pan-tilt (4) to move according to the instruction, thereby changing the shooting angle of the RGBD camera (3); the robot body teleoperation module (10) sends an instruction to the robot body controller (5) to control the robot body (1) to move according to the instruction; the manipulator teleoperation module (20) sends an instruction to the manipulator controller (6) to control the manipulator (2) to move according to the instruction;
when the remote environment reconstruction module (40) synthesizes the scene three-dimensional point cloud model, control instructions are first sent to the robot body controller (5) and the manipulator controller (6) through the robot body teleoperation module (10) and the manipulator teleoperation module (20), respectively, to keep the robot body (1) and the manipulator (2) fixed; an instruction is then sent to the image acquisition controller (7) through the pan-tilt teleoperation module (30), and the camera pan-tilt (4) is controlled to rotate through multiple angles according to the instruction, shooting a depth image at each angle, so that depth images from different angles are obtained and the scene three-dimensional point cloud model is synthesized.
4. The augmented reality-based remote monitoring and control system of claim 3, wherein: the manipulator simulation model established by the manipulator operation simulation module (50) comprises a manipulator three-dimensional virtual model and an inverse kinematics model; the end-point trajectory and posture of the manipulator (2) are planned in the interactive simulation environment to obtain an end-point trajectory and posture sequence, which is used as the input of the inverse kinematics model to obtain the angle sequence of each joint of the manipulator (2); the joint angle sequences of the manipulator (2) are then read in order to drive each joint of the manipulator three-dimensional virtual model to move.
5. The augmented reality-based remote monitoring and control system of claim 4, wherein: the augmented reality display module (60) superimposes the dynamic manipulator three-dimensional virtual model on the color image shot by the RGBD camera (3) according to the current position and posture of the robot body (1) and the orientation information of the camera pan-tilt (4), and then displays the result to realize virtual-real visual fusion.
6. The augmented reality-based remote monitoring and control system of claim 5, wherein: the current position and posture of the robot body (1) are acquired by sending control instructions to the robot body controller (5) through the robot body teleoperation module (10); the orientation information of the camera pan-tilt (4) is acquired by sending an instruction to the image acquisition controller (7) through the pan-tilt teleoperation module (30).
7. The augmented reality-based remote monitoring and control system of claim 6, wherein: the remote data acquisition system comprises an industrial personal computer (100) and sensors (200), wherein each sensor (200) is connected to the industrial personal computer (100), each monitoring object is provided with the sensor (200), and each sensor (200) is correspondingly provided with one label (300).
8. The augmented reality-based remote monitoring and control system of claim 7, wherein: the storage and fault diagnosis module (80) receives and stores the monitoring data sent by the remote data acquisition system according to the correspondence among the label (300), the monitored object and the monitoring data; it compares each item of monitoring data with its preset alarm threshold, diagnoses whether the monitored object has a fault, and sends alarm information if a fault exists; when the storage and fault diagnosis module (80) receives a query request from the device identification module (70), it queries the monitoring data corresponding to the label information and sends the data to the augmented reality display module (60).
9. The augmented reality-based remote monitoring and control system of claim 8, wherein: the monitored images include pictures and videos.
CN201810236758.3A 2018-03-21 2018-03-21 Remote monitoring and control system based on augmented reality Active CN108422435B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810236758.3A CN108422435B (en) 2018-03-21 2018-03-21 Remote monitoring and control system based on augmented reality

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810236758.3A CN108422435B (en) 2018-03-21 2018-03-21 Remote monitoring and control system based on augmented reality

Publications (2)

Publication Number Publication Date
CN108422435A CN108422435A (en) 2018-08-21
CN108422435B true CN108422435B (en) 2020-05-12

Family

ID=63159299

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810236758.3A Active CN108422435B (en) 2018-03-21 2018-03-21 Remote monitoring and control system based on augmented reality

Country Status (1)

Country Link
CN (1) CN108422435B (en)

Families Citing this family (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10803314B2 (en) 2018-10-10 2020-10-13 Midea Group Co., Ltd. Method and system for providing remote robotic control
US10816994B2 (en) 2018-10-10 2020-10-27 Midea Group Co., Ltd. Method and system for providing remote robotic control
US10678264B2 (en) 2018-10-10 2020-06-09 Midea Group Co., Ltd. Method and system for providing remote robotic control
US11192253B2 (en) * 2018-10-12 2021-12-07 Toyota Research Institute, Inc. Systems and methods for conditional robotic teleoperation
CN109483234B (en) * 2018-11-02 2020-06-09 北京卫星制造厂有限公司 Intelligent manufacturing system and method based on mobile robot
CN109434808A (en) * 2018-12-13 2019-03-08 上海菡为智能科技有限公司 A kind of cloud remote service Study of Intelligent Robot Control network system realization
CN111818115B (en) * 2019-04-12 2021-10-22 华为技术有限公司 Processing method, device and system
CN110047150B (en) * 2019-04-24 2023-06-16 大唐环境产业集团股份有限公司 Complex equipment operation on-site simulation system based on augmented reality
CN110238831B (en) * 2019-07-23 2020-09-18 青岛理工大学 Robot teaching system and method based on RGB-D image and teaching device
CN110434877A (en) * 2019-08-14 2019-11-12 纳博特南京科技有限公司 One kind being based on augmented reality and idiodynamic robot control method
CN111459274B (en) * 2020-03-30 2021-09-21 华南理工大学 5G + AR-based remote operation method for unstructured environment
CN111660294B (en) * 2020-05-18 2022-03-18 北京科技大学 Augmented reality control system of hydraulic heavy-duty mechanical arm
CN111754647A (en) * 2020-05-21 2020-10-09 江苏锐士安防科技有限公司 Intelligent security robot inspection method based on environment perception
CN114434437A (en) * 2020-10-30 2022-05-06 西门子(中国)有限公司 Remote control method and device for robot
CN112549034B (en) * 2020-12-21 2021-09-03 南方电网电力科技股份有限公司 Robot task deployment method, system, equipment and storage medium
CN112819966A (en) * 2021-01-05 2021-05-18 上海大学 Environment fusion system and method suitable for man-machine interaction operation of underwater remote control robot
CN113100934A (en) * 2021-04-06 2021-07-13 德智鸿(上海)机器人有限责任公司 Operation assisting method, device, computer equipment and storage medium
CN113276110B (en) * 2021-04-22 2022-12-16 国网浙江省电力有限公司嘉兴供电公司 Transformer substation operation robot control system and method based on AR technology
CN113618731A (en) * 2021-07-22 2021-11-09 中广核研究院有限公司 Robot control system

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107635133A (en) * 2017-11-01 2018-01-26 广州供电局有限公司 A kind of robot of data center inspection tour system based on augmented reality
CN107657682A (en) * 2017-10-30 2018-02-02 成都极致空觉科技有限公司 A kind of power transformation method for inspecting based on augmented reality
CN107765854A (en) * 2017-10-20 2018-03-06 国网湖北省电力公司检修公司 A kind of polling transmission line method based on augmented reality

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102528811B (en) * 2011-12-19 2014-06-18 上海交通大学 Mechanical arm positioning and obstacle avoiding system in Tokamak cavity
US9836117B2 (en) * 2015-05-28 2017-12-05 Microsoft Technology Licensing, Llc Autonomous drones for tactile feedback in immersive virtual reality
CN205405613U (en) * 2016-03-07 2016-07-27 广东理工学院 Robot is rebuild to indoor three -dimensional scene of building
CN206561409U (en) * 2017-01-20 2017-10-17 上海大学 A kind of VR shoots kinematic robot system
CN107309882B (en) * 2017-08-14 2019-08-06 青岛理工大学 A kind of robot teaching programming system and method
CN207087855U (en) * 2017-08-17 2018-03-13 苏州中德睿博智能科技有限公司 Mobile robot platform for the modeling of coal mine roadway three-dimensional live
CN107578487A (en) * 2017-09-19 2018-01-12 北京枭龙科技有限公司 A kind of cruising inspection system based on augmented reality smart machine

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107765854A (en) * 2017-10-20 2018-03-06 国网湖北省电力公司检修公司 A kind of polling transmission line method based on augmented reality
CN107657682A (en) * 2017-10-30 2018-02-02 成都极致空觉科技有限公司 A kind of power transformation method for inspecting based on augmented reality
CN107635133A (en) * 2017-11-01 2018-01-26 广州供电局有限公司 A kind of robot of data center inspection tour system based on augmented reality

Also Published As

Publication number Publication date
CN108422435A (en) 2018-08-21

Similar Documents

Publication Publication Date Title
CN108422435B (en) Remote monitoring and control system based on augmented reality
CN110587600B (en) Point cloud-based autonomous path planning method for live working robot
US7714895B2 (en) Interactive and shared augmented reality system and method having local and remote access
CN109397249B (en) Method for positioning and grabbing robot system by two-dimensional code based on visual identification
US8989876B2 (en) Situational awareness for teleoperation of a remote vehicle
CN109164829B (en) Flying mechanical arm system based on force feedback device and VR sensing and control method
CN111633644A (en) Industrial robot digital twin system combined with intelligent vision and operation method thereof
US20180190014A1 (en) Collaborative multi sensor system for site exploitation
JP3343682B2 (en) Robot operation teaching device and operation teaching method
CN111300384B (en) Registration system and method for robot augmented reality teaching based on identification card movement
WO2022000713A1 (en) Augmented reality self-positioning method based on aviation assembly
CN110977981A (en) Robot virtual reality synchronization system and synchronization method
CN111383348A (en) Method for remotely and synchronously controlling robot through virtual reality
CN116160440A (en) Remote operation system of double-arm intelligent robot based on MR remote control
CN110751734B (en) Mixed reality assistant system suitable for job site
Schwarz et al. Low-latency immersive 6D televisualization with spherical rendering
CN113276110B (en) Transformer substation operation robot control system and method based on AR technology
Wang et al. The design of an augmented reality system for urban search and rescue
JPH0421105A (en) Stereoscopic teaching device for manipulator
CN111283664B (en) Registration system and method for robot augmented reality teaching
JPH11338532A (en) Teaching device
CN113597362B (en) Method and control device for determining the relationship between a robot coordinate system and a mobile device coordinate system
JP2001062766A (en) User interface system for remote control of bipedal walking robot
JP2778376B2 (en) Camera viewpoint change method
CN107363831B (en) Teleoperation robot control system and method based on vision

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20200417

Address after: No. 63, Xingyu Road, Dongjiang street, Longkou City, Yantai City, Shandong Province

Applicant after: Yantai Longwen Auto Parts Co.,Ltd.

Address before: 266555 Jialing River Road 777, Qingdao economic and Technological Development Zone, Qingdao, Shandong

Applicant before: Qindao University of Technology

GR01 Patent grant
GR01 Patent grant
PE01 Entry into force of the registration of the contract for pledge of patent right

Denomination of invention: Remote monitoring and control system based on augmented reality

Effective date of registration: 20200618

Granted publication date: 20200512

Pledgee: Shandong Longkou Rural Commercial Bank Co.,Ltd.

Pledgor: Yantai Longwen Auto Parts Co.,Ltd.

Registration number: Y2020980003223

PC01 Cancellation of the registration of the contract for pledge of patent right

Date of cancellation: 20210520

Granted publication date: 20200512

Pledgee: Shandong Longkou Rural Commercial Bank Co.,Ltd.

Pledgor: Yantai Longwen Auto Parts Co.,Ltd.

Registration number: Y2020980003223

PE01 Entry into force of the registration of the contract for pledge of patent right

Denomination of invention: A remote monitoring and control system based on Augmented Reality

Effective date of registration: 20210531

Granted publication date: 20200512

Pledgee: Shandong Longkou Rural Commercial Bank Co.,Ltd.

Pledgor: Yantai Longwen Auto Parts Co.,Ltd.

Registration number: Y2021980004111

PC01 Cancellation of the registration of the contract for pledge of patent right

Date of cancellation: 20220624

Granted publication date: 20200512

Pledgee: Shandong Longkou Rural Commercial Bank Co.,Ltd.

Pledgor: Yantai Longwen Auto Parts Co.,Ltd.

Registration number: Y2021980004111
