Remote monitoring and control system based on augmented reality
Technical Field
The invention relates to a remote monitoring and control system based on augmented reality, and belongs to the field of intelligent manufacturing and computer measurement and control.
Background
Remote monitoring and control are now widely applied in production. In terms of monitoring, however, current systems display the monitoring video and the monitoring data separately, so monitoring personnel must mentally link the video content with the data, and the interface is not intuitive. In terms of control, an operator can only estimate the three-dimensional information of the remote environment from a color image in order to plan a path for the manipulator and thereby control the remote manipulator to complete an operation, so control of the robot is not accurate enough.
The invention patent with publication number CN106863303A discloses the following technical scheme: the angular displacement of each joint is collected by an encoder arranged at each joint of the manipulator, the collected data are sent to a computer for processing, and the computer plans a manipulator path from the joint angular-displacement data and controls the manipulator to complete welding of a seam along that path. This scheme performs path planning on the equipment site and cannot meet current enterprise requirements for remote path planning.
Disclosure of Invention
In order to solve the above technical problems, the invention provides a remote monitoring and control system based on augmented reality, which realizes fused display of video monitoring and data monitoring for remote inspection applications, and realizes local path planning and simulation of remote robot operation at the server.
The technical scheme of the invention is as follows:
a remote monitoring and control system based on augmented reality comprises an interactive simulation control mode, wherein the mode completes local path planning and simulation of manipulator remote operation in a virtual-real combined manner through a server and a mobile robot located at the remote equipment and environment, and specifically comprises the following:
the mobile robot comprises a robot body, and a manipulator and an RGBD (red-green-blue-depth) camera which are arranged on the robot body;
the server comprises a remote environment reconstruction module, a manipulator operation simulation module and an augmented reality display module; the remote environment reconstruction module receives depth images shot by the RGBD camera and synthesizes the depth images shot at different angles into a scene three-dimensional point cloud model; the manipulator operation simulation module establishes a manipulator simulation model, operates the manipulator simulation model in combination with the scene three-dimensional point cloud model, plans a path, and finally drives the manipulator simulation model to move along the path; and the augmented reality display module superimposes the dynamic manipulator simulation model on a color image shot by the RGBD camera to realize virtual-real combined simulation display.
Preferably, collision detection is also completed in the interactive simulation control mode, specifically: the manipulator operation simulation module drives the manipulator simulation model to move along the path in the scene three-dimensional point cloud model, and checks whether the manipulator simulation model interferes with the scene three-dimensional point cloud model, thereby realizing virtual-real collision detection; if interference exists, the path is optimized and the optimized path is sent to the mobile robot; if no interference exists, the path is sent to the mobile robot as is; the mobile robot then controls the movement of the manipulator according to the received path to complete the operation task.
Preferably, the mobile robot further comprises an image acquisition controller, a robot body controller and a manipulator controller; the image acquisition controller controls the camera pan-tilt to move and collects the color image and the depth image from the RGBD camera, the RGBD camera being fixed on the camera pan-tilt; the robot body controller controls the robot body to move and feeds back the position and posture of the robot body; the manipulator controller controls the motion of the manipulator;
the server is also provided with a pan-tilt teleoperation module, a robot body teleoperation module and a manipulator teleoperation module; the pan-tilt teleoperation module sends an instruction to the image acquisition controller to control the camera pan-tilt to move according to the instruction, thereby changing the shooting angle of the RGBD camera; the robot body teleoperation module sends an instruction to the robot body controller to control the robot body to move according to the instruction; the manipulator teleoperation module sends an instruction to the manipulator controller to control the manipulator to move according to the instruction;
when the remote environment reconstruction module synthesizes the scene three-dimensional point cloud model, control instructions are first sent through the robot body teleoperation module and the manipulator teleoperation module to the corresponding robot body controller and manipulator controller, so that the robot body and the manipulator are kept stationary; an instruction is then sent through the pan-tilt teleoperation module to the image acquisition controller, which controls the camera pan-tilt to rotate through multiple angles according to the instruction and shoots a depth image at each angle; the depth images at different angles thus obtained are synthesized into the scene three-dimensional point cloud model.
Preferably, the manipulator simulation model established by the manipulator operation simulation module comprises a manipulator three-dimensional virtual model and an inverse kinematics model; the trajectory and posture of the end point of the manipulator are planned in an interactive simulation environment to obtain an end-point trajectory and posture sequence; the end-point trajectory and posture sequence is used as the input of the inverse kinematics model to obtain the angle sequence of each joint of the manipulator; the joint angle sequences are then read in turn to drive each joint of the manipulator three-dimensional virtual model to move.
Preferably, the augmented reality display module superimposes the dynamic manipulator three-dimensional virtual model on a color image shot by the RGBD camera according to the current position and posture of the robot body and the azimuth information of the camera pan-tilt, and then displays the result to realize virtual-real visual fusion.
Preferably, the current position and posture of the robot body are acquired by sending a control instruction to the robot body controller through the robot body teleoperation module, and the azimuth information of the camera pan-tilt is acquired by sending an instruction to the image acquisition controller through the pan-tilt teleoperation module.
preferably, the remote monitoring and control system further includes a monitoring mode, which completes the fused display of monitoring data and monitoring images in a virtual-real combined manner through a remote data acquisition system located at the remote equipment and environment, the server and the mobile robot, specifically:
the remote data acquisition system acquires monitoring data of each monitored object and sends the monitoring data to the server; each monitored object is provided with a tag;
the mobile robot receives control instructions from the server to move the robot body to a monitored object and to rotate the camera pan-tilt, so that the RGBD camera shoots a monitoring image containing the tag information and sends the monitoring image to the server;
the server also comprises a device identification module and a storage and fault diagnosis module;
the device identification module analyzes the monitoring image shot by the RGBD camera, segments and identifies the tag information on the monitoring image, and sends a query request containing the tag information;
the storage and fault diagnosis module receives and stores the monitoring data sent by the remote data acquisition system, diagnoses whether a monitored object has a fault, and sends alarm information if it has; when the storage and fault diagnosis module receives a query request from the device identification module, it queries the monitoring data corresponding to the tag information and sends the data to the augmented reality display module;
the augmented reality display module superimposes the monitoring data on the monitoring image shot by the RGBD camera, displaying it in the area near the corresponding tag, thereby realizing the superimposed display of the monitoring data and the monitoring image and making the display interface more intuitive.
Preferably, the remote data acquisition system comprises an industrial personal computer and sensors, wherein each sensor is connected to the industrial personal computer, each monitored object is provided with a sensor, and each sensor is correspondingly provided with one tag.
preferably, the storage and fault diagnosis module receives and stores the monitoring data sent by the remote data acquisition system according to the correspondence among the tag, the monitored object and the monitoring data, compares the monitoring data with a preset alarm threshold, diagnoses whether the monitored object has a fault, and sends alarm information if it has; when the storage and fault diagnosis module receives a query request from the device identification module, the monitoring data corresponding to the tag information is queried and then sent to the augmented reality display module.
Preferably, the monitoring images include pictures and videos.
The invention has the following beneficial effects:
1. according to the remote monitoring and control system based on augmented reality, depth images of the environment are obtained by a mobile robot located at the remote equipment and environment, a three-dimensional point cloud model of the remote equipment and environment is then computed and synthesized at the server, and local path planning and simulation of the remote operation of the mobile robot are realized at the server through a virtual manipulator simulation model; the system is thus a virtual-real combined remote control system, and the control accuracy of the mobile robot is improved;
2. according to the remote monitoring and control system based on augmented reality, monitoring data are acquired by the remote data acquisition system, a monitoring image of the monitored object is acquired by the mobile robot, the tag information is identified at the server to determine the current state of the monitored object, and the monitoring information is then superimposed on the monitoring image, thereby realizing the fused display of data monitoring and on-site video or image monitoring for remote inspection applications.
Drawings
FIG. 1 is a block diagram of the system of the present invention;
FIG. 2 is a flow chart of an interactive simulation control mode of the present invention;
FIG. 3 is a flow chart of the monitor mode of the present invention.
The reference numbers in the figures denote:
1. a robot body; 2. a manipulator; 3. an RGBD camera; 4. a camera pan-tilt; 5. a robot body controller; 6. a manipulator controller; 7. an image acquisition controller; 10. a robot body teleoperation module; 20. a manipulator teleoperation module; 30. a pan-tilt teleoperation module; 40. a remote environment reconstruction module; 50. a manipulator operation simulation module; 60. an augmented reality display module; 70. a device identification module; 80. a storage and fault diagnosis module; 100. an industrial personal computer; 200. a sensor; 300. a tag.
Detailed Description
The invention is described in detail below with reference to the figures and the specific embodiments.
Embodiment 1:
referring to fig. 1 and 2, an augmented reality-based remote monitoring and control system includes an interactive simulation control mode, which completes local path planning and simulation of the remote operation of the manipulator 2 in a virtual-real combined manner through a server and a mobile robot located at the remote equipment and environment, specifically:
the mobile robot comprises a robot body 1, a manipulator 2 and an RGBD camera 3, wherein the manipulator 2 and the RGBD camera 3 are arranged on the robot body 1;
the server comprises a remote environment reconstruction module 40, a manipulator operation simulation module 50 and an augmented reality display module 60; the remote environment reconstruction module 40 receives depth images shot by the RGBD camera 3 and synthesizes the depth images shot at different angles into a scene three-dimensional point cloud model; the manipulator operation simulation module 50 establishes a manipulator simulation model, operates the manipulator simulation model in combination with the scene three-dimensional point cloud model, plans a path, and finally drives the manipulator simulation model to move along the path; the augmented reality display module 60 superimposes the dynamic manipulator simulation model on the color image shot by the RGBD camera 3, so as to realize virtual-real combined simulation display. The manipulator 2 may be a multi-degree-of-freedom manipulator such as a six-degree-of-freedom, three-degree-of-freedom or two-degree-of-freedom manipulator; the more degrees of freedom, the more flexible the manipulator.
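For illustration, the multi-angle synthesis performed by the remote environment reconstruction module 40 can be sketched as pinhole back-projection of each depth image followed by rotation into a common frame. This is a minimal sketch: the camera intrinsics (fx, fy, cx, cy), the single-axis pan model and the omission of registration refinement are simplifying assumptions, not details disclosed by the invention.

```python
import numpy as np

def depth_to_points(depth, fx, fy, cx, cy):
    # Back-project a depth image (metres) into camera-frame 3-D points
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    pts = np.stack([x, y, depth], axis=-1).reshape(-1, 3)
    return pts[pts[:, 2] > 0]          # discard invalid zero-depth pixels

def rot_y(theta):
    # Rotation about the vertical (pan) axis by theta radians
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]])

def fuse_views(views, fx, fy, cx, cy):
    # views: (depth_image, pan_angle) pairs shot while the robot is held still;
    # each view is rotated into the common frame and the clouds are stacked
    clouds = [depth_to_points(d, fx, fy, cx, cy) @ rot_y(a).T for d, a in views]
    return np.vstack(clouds)
```

A real implementation would additionally register the views against each other (e.g. by ICP) and filter sensor noise before using the cloud for planning.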
Collision detection is also completed in the interactive simulation control mode, specifically: the manipulator operation simulation module 50 drives the manipulator simulation model to move along the path in the scene three-dimensional point cloud model, and checks whether the manipulator simulation model interferes with the scene three-dimensional point cloud model, thereby realizing virtual-real collision detection; if interference exists, the path is optimized and the optimized path is sent to the mobile robot, which controls the motion of the manipulator 2 accordingly to complete the operation task of the manipulator 2; the optimized path may also be displayed in simulation by the augmented reality display module 60.
The mobile robot also comprises an image acquisition controller 7, a robot body controller 5 and a manipulator controller 6; the image acquisition controller 7 controls the camera pan-tilt 4 to move and collects the color image and the depth image from the RGBD camera 3, the RGBD camera 3 being fixed on the camera pan-tilt 4; the robot body controller 5 controls the robot body 1 to move and feeds back the position and posture of the robot body 1; the manipulator controller 6 controls the movement of the manipulator 2;
the server is also provided with a pan-tilt teleoperation module 30, a robot body teleoperation module 10 and a manipulator teleoperation module 20; the pan-tilt teleoperation module 30 sends an instruction to the image acquisition controller 7 to control the camera pan-tilt 4 to move according to the instruction, thereby changing the shooting angle of the RGBD camera 3; the robot body teleoperation module 10 sends an instruction to the robot body controller 5 to control the robot body 1 to move according to the instruction; the manipulator teleoperation module 20 sends an instruction to the manipulator controller 6 to control the manipulator 2 to move according to the instruction;
when the remote environment reconstruction module 40 synthesizes the scene three-dimensional point cloud model, the robot body teleoperation module 10 and the manipulator teleoperation module 20 first send control instructions to the corresponding robot body controller 5 and manipulator controller 6, so that the robot body 1 and the manipulator 2 are kept stationary; the pan-tilt teleoperation module 30 then sends instructions to the image acquisition controller 7, which controls the camera pan-tilt 4 to rotate through multiple angles and shoots a depth image at each angle; the depth images at different angles thus obtained are synthesized into the scene three-dimensional point cloud model.
The manipulator simulation model established by the manipulator operation simulation module 50 includes a manipulator three-dimensional virtual model and an inverse kinematics model. The trajectory and posture of the end point of the manipulator 2 are planned in an interactive simulation environment (i.e., in combination with the scene three-dimensional point cloud model) to obtain an end-point trajectory and posture sequence; this sequence is used as the input of the inverse kinematics model to obtain the angle sequence of each joint of the manipulator 2; the joint angle sequences are then read in turn to drive each joint of the manipulator three-dimensional virtual model to move.
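The chain "end-point trajectory → inverse kinematics → joint-angle sequence" can be illustrated with an analytic planar two-link arm. The link lengths and the elbow-down branch are assumptions for the sketch; a real manipulator 2 with more degrees of freedom would typically use a numerical or library solver.

```python
import numpy as np

def ik_2link(x, y, l1=0.5, l2=0.5):
    # Analytic inverse kinematics of a planar two-link arm (elbow-down branch)
    c2 = (x * x + y * y - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    c2 = np.clip(c2, -1.0, 1.0)        # guard rounding at the workspace edge
    q2 = np.arccos(c2)
    q1 = np.arctan2(y, x) - np.arctan2(l2 * np.sin(q2), l1 + l2 * np.cos(q2))
    return q1, q2

def fk_2link(q1, q2, l1=0.5, l2=0.5):
    # Forward kinematics, used to drive (and here to verify) the virtual model
    return (l1 * np.cos(q1) + l2 * np.cos(q1 + q2),
            l1 * np.sin(q1) + l2 * np.sin(q1 + q2))

def trajectory_to_joint_sequence(points):
    # End-point trajectory -> joint-angle sequence driving the virtual model
    return [ik_2link(px, py) for px, py in points]
```

Reading the resulting joint-angle sequence in order and applying it to the model frame by frame reproduces the planned end-point motion.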
The augmented reality display module 60 superimposes the dynamic manipulator three-dimensional virtual model on the color image shot by the RGBD camera 3 according to the current position and posture of the robot body 1 and the azimuth information of the camera pan-tilt 4, and then displays the result to realize virtual-real visual fusion. The current position and posture of the robot body 1 are acquired by sending a control instruction to the robot body controller 5 through the robot body teleoperation module 10; the azimuth information of the camera pan-tilt 4 is acquired by sending an instruction to the image acquisition controller 7 through the pan-tilt teleoperation module 30. From the current position and posture of the robot body 1 and the azimuth information of the camera pan-tilt 4, the augmented reality display module 60 obtains the orientation of the camera pan-tilt 4 in the world coordinate system (the absolute coordinate system of the system, in which the coordinates of all points are expressed before any user coordinate system is established), realizes augmented reality registration, and superimposes the manipulator three-dimensional virtual model on the color image shot by the RGBD camera 3 according to its coordinate information.
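Augmented reality registration as described above amounts to composing the camera pose in the world coordinate system and projecting the model vertices into the color image. The following is a simplified sketch; the planar pose model, pan-only rotation and camera mounted at the body origin are all assumptions made for illustration.

```python
import numpy as np

def rot_y(a):
    # Rotation about the vertical axis (body yaw and pan-tilt pan combined)
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]])

def camera_pose(body_pos, body_yaw, pan):
    # World pose of the camera: body yaw composed with the pan angle
    # (tilt omitted; camera assumed mounted at the body origin)
    return rot_y(body_yaw + pan), np.asarray(body_pos, float)

def project(points_w, R, t, fx, fy, cx, cy):
    # World -> camera -> pixel; rows of (p - t) @ R equal R.T @ (p - t)
    pc = (points_w - t) @ R
    return np.stack([fx * pc[:, 0] / pc[:, 2] + cx,
                     fy * pc[:, 1] / pc[:, 2] + cy], axis=-1)
```

The projected pixel coordinates give the position at which each vertex of the manipulator three-dimensional virtual model is drawn over the color image.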
Specifically, collision detection is also completed in the interactive simulation control mode: the manipulator operation simulation module 50 drives the manipulator three-dimensional virtual model to move along the path in the scene three-dimensional point cloud model, and checks whether the manipulator three-dimensional virtual model interferes with the scene three-dimensional point cloud model, thereby realizing virtual-real collision detection; if interference exists, the path is optimized and sent to the mobile robot, where the manipulator controller 6 receives it and controls the motion of the manipulator 2 to complete the operation task.
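Virtual-real collision detection between the posed arm and the scene point cloud can be sketched by modelling each link as a capsule and testing point-to-segment distances; the capsule model and the radius value are assumptions for illustration, not the invention's stated method.

```python
import numpy as np

def seg_point_dist(pts, a, b):
    # Shortest distance from each scene point (N,3) to the link segment a-b
    ab = b - a
    t = np.clip((pts - a) @ ab / (ab @ ab), 0.0, 1.0)
    return np.linalg.norm(pts - (a + t[:, None] * ab), axis=1)

def arm_collides(cloud, joint_positions, link_radius=0.05):
    # Interference = any scene point inside any link capsule of the posed arm
    js = [np.asarray(j, float) for j in joint_positions]
    return any(np.any(seg_point_dist(cloud, a, b) < link_radius)
               for a, b in zip(js[:-1], js[1:]))
```

Running this test at each step of the joint-angle sequence flags the poses at which the path must be optimized before being sent to the mobile robot.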
The operation process of the interactive simulation control mode is as follows:
step 1, the manipulator operation simulation module 50 establishes a manipulator simulation model of the manipulator 2, comprising a manipulator three-dimensional virtual model and an inverse kinematics model;
step 2, keeping the robot body 1 and the manipulator 2 stationary, the remote environment reconstruction module 40 receives the depth images shot by the RGBD camera 3 and synthesizes the depth images shot at different angles into a scene three-dimensional point cloud model;
step 3, interactively planning the end-point trajectory and posture of the manipulator three-dimensional virtual model in the interactive simulation environment to obtain an end-point trajectory and posture sequence;
step 4, taking the end-point trajectory and posture sequence as the input of the inverse kinematics model of the manipulator 2, and solving the angle sequence of each joint of the manipulator 2;
step 5, reading the angle sequence of each joint of the manipulator 2 in turn and driving each joint of the manipulator three-dimensional virtual model to move; during the motion, checking whether the manipulator three-dimensional virtual model interferes with the scene three-dimensional point cloud model, thereby realizing virtual-real collision detection; if a collision exists, the path is optimized, and the optimized path is sent to the manipulator controller 6 of the mobile robot to control the manipulator 2 to complete the operation task;
meanwhile, the dynamic manipulator three-dimensional virtual model can be superimposed on the color image according to the current position and posture of the robot body 1 and the azimuth information of the camera pan-tilt 4, and virtual-real simulation display is realized through the augmented reality display module 60.
The remote monitoring and control system based on augmented reality can compute and synthesize a three-dimensional point cloud model of the remote equipment and environment at the server, and realize local path planning and simulation of the remote operation of the mobile robot at the server through a virtual manipulator simulation model; the system is thus a virtual-real combined remote control system and improves the control accuracy of the mobile robot.
Embodiment 2:
Referring to fig. 1 and fig. 3, the present embodiment differs from the first embodiment in that the monitoring mode can be executed while the interactive simulation control mode is executed; the interactive simulation control mode is not described again here.
In the monitoring mode, the fused display of the monitoring data and the monitoring image is completed in a virtual-real combined manner through the remote data acquisition system located at the remote equipment and environment, the server and the mobile robot; the monitoring image may be a picture and/or a video. Specifically:
the remote data acquisition system comprises an industrial personal computer 100 and sensors 200, wherein each sensor 200 is connected to the industrial personal computer 100, each monitored object is provided with a sensor 200, and each sensor 200 is correspondingly provided with a tag 300; the tag 300 may take a machine-readable graphic form such as a two-dimensional code or a bar code;
the mobile robot receives control instructions from the server, controls the robot body 1 and the RGBD camera 3 so that the robot body 1 moves to a monitored object, makes the RGBD camera 3 shoot a monitoring image (a color image) containing the tag information, and sends the monitoring image to the server. In another embodiment, the RGBD camera 3 is fixed on the camera pan-tilt 4, and the rotation of the camera pan-tilt 4 is controlled by a control instruction from the server so that the RGBD camera 3 shoots a monitoring image containing the tag information. The mobile robot can also receive a control instruction from the server to control the motion of the manipulator 2. Specifically, the robot body teleoperation module 10 and the pan-tilt teleoperation module 30 of the server respectively send control instructions to the robot body controller 5 and the image acquisition controller 7 of the mobile robot; the robot body controller 5 controls the robot body 1 to move to the monitored object, and the image acquisition controller 7 controls the camera pan-tilt 4 to rotate, so that the RGBD camera 3 shoots a monitoring image containing the tag information; the mobile robot is thus remotely controlled to obtain a monitoring image containing the tag information and the running state of the current monitored object. The manipulator teleoperation module 20 of the server can also send a control instruction to the manipulator controller 6 of the mobile robot, and the manipulator controller 6 controls the motion of the manipulator 2 to realize remote control of the manipulator 2.
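The instruction traffic between the server's teleoperation modules and the on-robot controllers might be sketched as a small command-dispatch layer; the JSON field names used here are illustrative assumptions, not a protocol disclosed by the invention.

```python
import json

def make_command(target, action, params):
    # Serialize a teleoperation instruction: which controller it addresses,
    # what it should do, and with what arguments
    return json.dumps({"target": target, "action": action, "params": params})

def dispatch(command, controllers):
    # On the robot side, route the decoded instruction to the named controller
    # (e.g. "body", "manipulator", "pan_tilt") and return its reply
    msg = json.loads(command)
    return controllers[msg["target"]](msg["action"], msg["params"])
```

In a deployed system the same command envelope would travel over the network link between server and robot, with each controller registered under its own key.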
the server further comprises a device identification module 70 and a storage and fault diagnosis module 80;
the device identification module 70 analyzes the monitoring image shot by the RGBD camera 3, segments and identifies the tag information on the monitoring image, and sends a query request containing the tag information;
the storage and fault diagnosis module 80 receives and stores the monitoring data sent by the remote data acquisition system according to the correspondence among the tag 300, the monitored object and the monitoring data, compares the monitoring data with the corresponding preset alarm threshold, diagnoses whether the monitored object has a fault, and sends alarm information if it has; when the storage and fault diagnosis module 80 receives a query request from the device identification module 70, the monitoring data corresponding to the tag information is queried and then sent to the augmented reality display module 60;
the augmented reality display module 60 superimposes the monitoring data on the monitoring image shot by the RGBD camera 3 and displays it in the area near the corresponding tag 300, realizing the superimposed display of the monitoring data and the monitoring image (i.e., the color image of the monitored object) and making the display interface more intuitive.
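The behaviour of the storage and fault diagnosis module 80 described above (store readings per tag, compare against a preset threshold, answer queries by tag) can be sketched as follows; the tag names, threshold format and alarm string are assumptions for illustration.

```python
class StorageFaultDiagnosis:
    # Minimal sketch: readings stored per tag, alarmed against preset thresholds
    def __init__(self, thresholds):
        self.thresholds = thresholds      # tag -> (low, high) alarm limits
        self.history = {}                 # tag -> list of stored readings

    def store(self, tag, value):
        # Store a reading and diagnose it against the preset threshold
        self.history.setdefault(tag, []).append(value)
        low, high = self.thresholds[tag]
        if not low <= value <= high:
            return f"ALARM {tag}: {value} outside [{low}, {high}]"
        return None                       # within range, no alarm

    def query(self, tag):
        # Answer a query request carrying the tag information
        readings = self.history.get(tag)
        return readings[-1] if readings else None
```

The `query` result is what would be handed to the augmented reality display module 60 for superimposition near the corresponding tag 300.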
The operation process of the monitoring mode is as follows:
step 10, an operator operates the computer input device at the server end and sends control instructions to the manipulator controller 6, the robot body controller 5 and the image acquisition controller 7 through the manipulator teleoperation module 20, the robot body teleoperation module 10 and the pan-tilt teleoperation module 30 respectively, so as to control the motion of the manipulator 2, the robot body 1 and the camera pan-tilt 4 located at the remote equipment and environment;
step 20, the device identification module 70 receives the monitoring image shot by the RGBD camera 3 and segments and identifies the tag information on the monitoring image;
step 30, querying the storage and fault diagnosis module 80 for the current information of the monitored object corresponding to the tag 300 according to the identified tag information;
and step 40, sending the queried information to the augmented reality display module 60, which superimposes it on the monitoring image shot by the RGBD camera 3 and displays it in the vicinity of the corresponding tag 300, realizing the superimposed display of the monitoring information and the monitoring image.
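Steps 20 to 40 can be sketched as a small helper that turns detected tags and queried readings into draw commands placed near each tag; the detection tuple format and the fixed pixel offset are assumptions for illustration.

```python
def render_overlay(detections, latest, pad=8):
    # detections: list of (tag_id, (x, y, w, h)) from the identification module
    # latest: tag_id -> most recent reading returned by the server-side query
    cmds = []
    for tag_id, (x, y, w, h) in detections:
        value = latest.get(tag_id)
        text = f"{tag_id}: {value}" if value is not None else f"{tag_id}: no data"
        cmds.append({"text": text, "pos": (x + w + pad, y)})  # just right of tag
    return cmds
```

Each draw command would then be rendered onto the color image (e.g. as a text box anchored at `pos`) by the augmented reality display module 60.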
The remote monitoring and control system based on augmented reality can control the remote mobile robot locally at the server, acquire a monitoring image of the monitored object, identify the tag information at the server to obtain the current state of the monitored object, and then superimpose the monitoring information on the monitoring image, thereby realizing the fused display of data monitoring and on-site video or image monitoring for remote inspection applications.
The above description is only an embodiment of the present invention and is not intended to limit its scope; all equivalent structural or process modifications made using the contents of this specification and the drawings, whether applied directly or indirectly in other related technical fields, fall within the scope of the present invention.