CN111580519B - Quasi-real-time teleoperation system for lunar surface detection under earth-moon time delay - Google Patents

Quasi-real-time teleoperation system for lunar surface detection under earth-moon time delay

Info

Publication number
CN111580519B
Authority
CN
China
Prior art keywords
lunar
lunar surface
teleoperation
mobile robot
module
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010399002.8A
Other languages
Chinese (zh)
Other versions
CN111580519A (en)
Inventor
王鹏基
王勇
胡勇
胡海东
孙赫婕
徐拴锋
魏春岭
邢琰
毛晓艳
贾永
安思颖
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Institute of Control Engineering
Original Assignee
Beijing Institute of Control Engineering
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Institute of Control Engineering filed Critical Beijing Institute of Control Engineering
Priority to CN202010399002.8A
Publication of CN111580519A
Application granted
Publication of CN111580519B
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231 ... using optical position detecting means
    • G05D1/0246 ... using a video camera in combination with image processing means
    • G05D1/0251 ... extracting 3D information from a plurality of images taken from different locations, e.g. stereo vision
    • G05D1/0212 ... with means for defining a desired trajectory
    • G05D1/0214 ... in accordance with safety or protection criteria, e.g. avoiding hazardous areas
    • G05D1/0221 ... involving a learning process
    • G05D1/0223 ... involving speed control of the vehicle
    • G05D1/0257 ... using a radar
    • G05D1/0276 ... using signals provided by a source external to the vehicle

Abstract

A quasi-real-time teleoperation system for lunar surface detection under earth-moon time delay comprises a lunar surface task operation site, a ground control center and a space-ground transmission link. The lunar task operation site comprises a lunar mobile robot and the lunar environment; a plurality of measuring sensors are arranged on the lunar mobile robot; the lunar environment comprises the lunar topography and illumination conditions. The lunar task operation site receives teleoperation instructions uploaded in real time by the ground control center through the space-ground transmission link, drives the lunar mobile robot to move according to the instructions, and downloads the sensor measurement data on the lunar mobile robot to the ground control center via telemetry through the space-ground transmission link. The space-ground transmission link is responsible for the information interaction between the ground control center and the lunar task operation site. The ground control center is responsible for predicting and correcting the motion state of the lunar mobile robot under the earth-moon time delay, for ground teleoperation, and for instruction parsing and uploading; the generated teleoperation instructions reach the lunar task operation site through the space-ground transmission link and drive the lunar mobile robot to carry out movement detection in real time.

Description

Quasi-real-time teleoperation system for lunar surface detection under earth-moon time delay
Technical Field
The invention relates to a quasi-real-time teleoperation system for lunar surface detection under earth-moon time delay, and belongs to the field of manned lunar exploration.
Background
Teleoperation of a lunar mobile robot by ground operators is an important mode of lunar exploration. The lunar terrain environment is complex, unknown and highly uncertain, which poses a major challenge to the teleoperation task of the lunar mobile robot. Limited by the earth-moon transmission delay and the transmission bandwidth, on-orbit data cannot be transmitted to the ground in real time, and ground operators cannot directly control the robot in real time according to its motion state and the lunar terrain image data. Therefore, the unmanned lunar rover exploration missions carried out to date, such as the Soviet Lunokhod rovers and China's Yutu (Jade Rabbit) rover, have all adopted a 'move-wait-watch' phased-target teleoperation strategy: within the current movement cycle, ground personnel stitch the locally downloaded images around the rover and perform traversability analysis, teleoperation instructions are generated autonomously or manually, and the generated operation instructions are uploaded for execution only after being fully verified on the ground, so as to ensure the safety of lunar surface exploration. This 'stop-and-go' exploration mode allows the lunar rover to move only a few tens of meters within a lunar-day period (about one month), so the lunar surface exploration efficiency is very low. Such teleoperation is clearly inadequate for future wide-area exploration of unknown lunar environments, let alone for the construction of an unmanned lunar base.
Disclosure of Invention
The technical problem solved by the invention is as follows: aiming at the stop-and-go operation and low efficiency of current teleoperated rover exploration under earth-moon time delay, a quasi-real-time teleoperation system for lunar surface detection is provided, based on the combination of mobile robot state prediction and correction with virtual-real scene matching simulation. The system solves the problems of predicting the motion state of the lunar robot during the earth-moon delay, quasi-real-time matching of virtual and real scenes, and real-time virtual reality simulation of the lunar operation site, thereby ensuring the real-time performance and accuracy of earth-moon teleoperation and improving exploration efficiency.
The technical solution of the invention is as follows: a quasi-real-time teleoperation system for lunar surface detection under earth-moon time delay comprises a lunar surface task operation site, a ground control center and a space-ground transmission link, wherein:
lunar surface task operation site: comprises the lunar mobile robot and the lunar environment; the lunar mobile robot is provided with a plurality of measuring sensors, including a panoramic camera for wide-area lunar imaging, a binocular camera and a laser radar (lidar) for local lunar imaging, and an inertial measurement unit (IMU) for measuring the position and attitude of the robot; the lunar environment comprises the lunar topography and illumination conditions; the lunar task operation site receives teleoperation instructions uploaded in real time by the ground control center through the space-ground transmission link, drives the lunar mobile robot to move according to the instructions, and downloads the sensor measurement data on the lunar mobile robot to the ground control center via telemetry through the space-ground transmission link;
space-ground transmission link: responsible for the information interaction between the ground control center and the lunar surface task operation site, namely the remote-control uploading of ground control instructions and the telemetry downloading of measurement data from the lunar surface task operation site;
ground control center: and the generated teleoperation instruction reaches a lunar mission operation site through a heaven-earth transmission link to drive the lunar mobile robot to carry out movement detection in real time.
The ground control center comprises a teleoperation prediction correction subsystem, a mixed reality simulation subsystem and a man-machine interaction subsystem. The teleoperation prediction correction subsystem is responsible for predicting the future motion state of the lunar rover during the earth-moon time delay and for correcting the state prediction result according to the delay-downloaded IMU measurement data and image data; the prediction correction result serves as an input condition of the mixed reality simulation subsystem. The mixed reality simulation subsystem is responsible for constructing a virtual task scene of the lunar surface task operation site and for updating and displaying the virtual scene according to the prediction correction result of the robot motion state and the telemetry-downloaded image data; the constructed virtual lunar task scene serves as an input condition of the man-machine interaction subsystem. The man-machine interaction subsystem is responsible for ground teleoperation and instruction generation: ground operators continuously operate the handle according to the virtual lunar task scene to generate a sequence of teleoperation instructions, which are sent to the teleoperation prediction correction subsystem and, after instruction parsing, remote-control uploaded to the lunar surface task operation site.
The teleoperation prediction correction subsystem comprises a state prediction module, a positioning and attitude determination module, a prediction correction module and an instruction uploading module; wherein:
a state prediction module: establishes a motion state prediction model of the lunar mobile robot, comprising a kinematic model, a dynamic model and a mathematical model of lunar terrain undulation; in each teleoperation period Δt, the state prediction module takes the position and attitude of the lunar mobile robot at the current moment and the teleoperation instruction as inputs of the motion state prediction model, and obtains position and attitude prediction data of the lunar mobile robot by mathematical computation during the interval in which the lunar telemetry data has not yet been downloaded to the ground;
a positioning and attitude determination module: in each image downloading delay period T, takes the delay-downloaded binocular camera and lidar image data from the lunar surface as input, combines the sequence of IMU pose measurements telemetered within the interval T, and uses simultaneous localization and mapping (SLAM) to obtain the accurate position and attitude of the lunar mobile robot at the sensor imaging moment, which serves as an accurate initial value for prediction correction;
a prediction correction module: comprises a short-period prediction correction sub-module and a long-period prediction correction sub-module; the short-period prediction correction sub-module executes one operation in each teleoperation period Δt, and the long-period prediction correction sub-module executes one operation in each image downloading delay period T; wherein:
an instruction uploading module: converts the handle displacement and angle operation instructions generated by the ground operator's real-time operation of the handle into wheel speed driving instructions that the mobile robot can recognize, and remote-control uploads them to the lunar mobile robot through the space-ground transmission link.
The short-period prediction correction sub-module: in each teleoperation period Δt, the prediction correction module takes the delay-downloaded IMU data as the initial value and, using the prediction model of the state prediction module together with the teleoperation instructions, recomputes the state from the IMU measurement moment up to the current moment; the position and attitude data obtained in each teleoperation period Δt during this computation are the short-period prediction-corrected pose data;
The long-period prediction correction sub-module: the prediction correction module takes the accurate pose data of the mobile robot at the imaging moment, obtained by the positioning and attitude determination module, as the initial value and, using the prediction model of the state prediction module together with the teleoperation instructions, recomputes the state from the moment corresponding to the image data up to the current moment; the position and attitude data obtained in each teleoperation period Δt during this computation are the long-period prediction-corrected pose data.
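For illustration only, the following minimal Python sketch shows the re-propagation logic shared by the two correction sub-modules: a delayed but accurate pose (from delay-downloaded IMU data for the short period, or from the SLAM result at the imaging moment for the long period) is taken as the initial value, and the state prediction model is re-run over the recorded teleoperation instructions up to the current moment. The flat-terrain unicycle model, the pose representation and the function names are assumptions introduced here and are not part of the claimed implementation.

    import math
    from dataclasses import dataclass

    @dataclass
    class Pose:
        x: float    # position on the lunar surface, m
        y: float    # position on the lunar surface, m
        yaw: float  # heading, rad

    def predict_step(pose: Pose, cmd, dt: float) -> Pose:
        # One teleoperation period of the state prediction model.
        # cmd = (v, w): commanded forward speed (m/s) and yaw rate (rad/s).
        # A flat-terrain unicycle model stands in for the patent's kinematic,
        # dynamic and terrain-undulation models.
        v, w = cmd
        return Pose(pose.x + v * math.cos(pose.yaw) * dt,
                    pose.y + v * math.sin(pose.yaw) * dt,
                    pose.yaw + w * dt)

    def prediction_correction(anchor_pose: Pose, anchor_time: float,
                              cmd_history: dict, now: float, dt: float) -> Pose:
        # Re-propagate from a delayed accurate pose up to the current moment.
        # anchor_pose/anchor_time: delay-downloaded IMU pose (short period) or
        # SLAM pose at the imaging moment (long period).
        # cmd_history: maps teleoperation period index k (t = k*dt) to (v, w).
        pose, t = anchor_pose, anchor_time
        while t < now:
            cmd = cmd_history.get(int(t // dt), (0.0, 0.0))  # command active in this period
            pose = predict_step(pose, cmd, dt)
            t += dt
        return pose  # corrected current pose, passed to the mixed reality simulation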
The mixed reality simulation subsystem comprises a three-dimensional reconstruction module, a scene update module, an information generation module and an enhanced display module; wherein:
a three-dimensional reconstruction module: performs three-dimensional reconstruction of the lunar surface task operation site over a large area around the mobile robot, using the telemetry-downloaded lunar terrain image data and the known geometric structure data of the lunar mobile robot, to construct a virtual lunar task scene comprising the virtual lunar terrain and a virtual mobile robot; the large-area lunar terrain image data are obtained from a high-resolution camera on the lunar orbiter and from the panoramic camera, binocular camera and lidar mounted on the mobile robot;
a scene update module: updates and displays the virtual lunar task scene at the current moment in real time, according to the predicted and corrected pose data of the mobile robot obtained by the teleoperation prediction correction subsystem and the mounting orientations of the binocular camera and lidar on the robot body; meanwhile, it refines and updates the virtual lunar task scene of the corresponding area according to the delay-downloaded local fine lunar image data around the mobile robot;
an information generation module: while the virtual lunar task scene is obtained by three-dimensional reconstruction, performs three-dimensional measurement and recording of obstacle features such as protrusions and pits in the virtual scene, and analyzes and marks the hazard levels of the different obstacle features according to the overall dimensions and obstacle-crossing capability of the mobile robot; at the same time, a reference travel path is planned using the ground operators' knowledge of the virtual lunar task scene; the three-dimensional data and hazard levels of the obstacle features are called enhancement information and, together with the reference travel path, are collectively called guidance information;
an enhanced display module: accurately displays the enhancement information and the travel path information generated by the information generation module in the virtual lunar task scene.
The man-machine interaction subsystem comprises ground control personnel and an operating handle. The man-machine interaction subsystem establishes the interaction relation between the ground operators and the lunar surface task operation site through the three-dimensionally reconstructed virtual lunar task scene and the operating handle: according to the virtual lunar task scene updated in real time, the ground operators manually control the operating handle to realize continuous teleoperation of the virtual mobile robot and, thereby, of the real lunar mobile robot.
Compared with the prior art, the invention has the advantages that:
(1) Existing lunar rover teleoperation uses a serial control mode: ground personnel wait for the robot's on-orbit motion state and the lunar image data to be downloaded to the ground before image stitching, traversability analysis and generation of the teleoperation planning instructions. The invention adopts a parallel control mode: the motion pose of the robot is predicted while waiting for the on-orbit motion state and the lunar image data to be downloaded. In this way, after being accurately corrected against the real motion state and lunar image data downloaded to the ground, the robot pose prediction result can be used as the input of the ground virtual simulation, so that ground personnel can accurately watch the motion of the lunar robot in real time throughout the earth-moon delay.
(2) Existing lunar rover 'move-wait' teleoperation constructs a planar image of the local area around the rover by stitching several local terrain images, and ground personnel perform path planning and generate operation instructions from these local images. The invention adopts a virtual reality simulation technique based on large-area three-dimensional reconstruction: lunar images of different coverage and resolution, acquired by the sensors on the lander and on the lunar mobile robot, are fully exploited for three-dimensional reconstruction, so that the large-area scene and the local fine terrain complement each other, improving the global scope and accuracy of large-area task planning and teleoperation control by ground personnel.
(3) Existing lunar rover 'move-wait' teleoperation is a semi-open-loop mode: after uploading the teleoperation instructions, ground personnel monitor the on-orbit state off-line during the execution period (which lasts hours or even days), and teleoperation execution errors can only be corrected in the next execution period, which reduces operation efficiency and easily causes error accumulation. The invention adopts a real-time closed-loop correction mode for teleoperation errors: the motion state of the lunar mobile robot is predicted during the earth-moon delay, and the prediction error is corrected periodically according to the downloaded on-orbit data, so that the pose error is corrected in real time, the lunar mobile robot is controlled in quasi-real time, and the exploration efficiency and teleoperation control accuracy are improved.
Drawings
FIG. 1 is a block diagram of the composition and principle of the quasi-real-time teleoperation system under earth-moon time delay;
FIG. 2 is the quasi-real-time teleoperation workflow under earth-moon time delay.
Detailed Description
(I) General technical scheme
1. System composition and function
FIG. 1 shows the block diagram of the quasi-real-time teleoperation system under earth-moon time delay. The system consists of three parts: a lunar surface task operation site, a ground control center and a space-ground transmission link. Wherein:
the lunar surface task operation site mainly comprises a mobile robot for inspection and detection on a lunar surface, a lunar surface environment and the like, and belongs to the astronomical part in the system. The lunar mobile robot is provided with a panoramic camera for imaging a lunar surface in a large range, a binocular camera for imaging and navigating a local lunar surface around the robot, a laser radar for acquiring local lunar surface three-dimensional point cloud image data, an Inertial Measurement Unit (IMU) for acquiring motion state information such as the position and the posture of the robot in real time, and the like. The lunar surface environment mainly refers to the topography and illumination condition of lunar surface.
The space-ground transmission link is the measurement, control and communication link connecting the ground control center with the lunar surface task operation site, and is responsible for the information interaction between them, namely the remote-control uploading of ground control instructions and the telemetry downloading of data from the lunar surface task operation site.
The ground control center mainly comprises three subsystems: teleoperation prediction correction, mixed reality simulation and man-machine interaction; it forms the ground segment of the system.
The three components above are combined to form a closed-loop earth-moon teleoperation system. The ground control center is the key to achieving quasi-real-time teleoperation despite the earth-moon time delay; the composition and functions of its subsystems are explained in detail below.
(1) Teleoperation prediction correction subsystem
The teleoperation prediction and correction subsystem is the core of quasi-real-time teleoperation under earth-moon time delay. It consists of a mobile robot state prediction module, a positioning and attitude determination module, a prediction correction module and an instruction uploading module. The prediction quality of the mobile robot's motion state depends on the combined effect of positioning and attitude determination, prediction, and correction; degraded performance in any part directly affects the final prediction performance and thus the execution of the quasi-real-time teleoperation task.
a) State prediction: the robot's position and attitude under earth-moon time delay are predicted by extrapolation, based on a prediction model of the lunar mobile robot (kinematic and dynamic models including lunar terrain undulation).
b) Positioning and attitude determination: from the delay-downloaded robot position and attitude and the lunar terrain image data, the true position and attitude of the robot at the imaging moment are accurately computed with a simultaneous localization and mapping (SLAM) method; this is the key reference data for prediction correction. Because of the earth-moon time delay, this pose data cannot be acquired in real time and is available only for a limited number of frames.
c) Prediction correction: short-period prediction correction is realized with the delay-downloaded inertial measurement unit (IMU) data, and long-period prediction correction is realized by additionally using the pose information obtained by the SLAM technique, ensuring that the prediction error of the robot's motion state converges and that the accuracy meets the requirement.
d) Instruction uploading: the handle operation instructions (handle displacement, angle, etc.) generated by the operator's real-time operation are parsed into driving instructions such as wheel speeds that the mobile robot can recognize, and uploaded to the lunar mobile robot through the space-ground transmission link.
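As a minimal illustration of step d), the sketch below maps a handle reading to left/right wheel speeds, assuming a differential-drive chassis; the scaling constants, the wheel geometry and the function name are illustrative assumptions rather than values taken from the patent.

    def handle_to_wheel_speeds(displacement: float, angle: float,
                               v_max: float = 0.2,        # max forward speed, m/s (assumed)
                               w_max: float = 0.3,        # max yaw rate, rad/s (assumed)
                               track_width: float = 1.0,  # distance between wheel tracks, m (assumed)
                               wheel_radius: float = 0.15):
        # Convert a handle command into left/right wheel angular speeds (rad/s).
        # displacement and angle are normalised handle readings in [-1, 1].
        # Differential-drive kinematics: v = r*(wl+wr)/2, w = r*(wr-wl)/track_width.
        v = max(-1.0, min(1.0, displacement)) * v_max   # commanded forward speed
        w = max(-1.0, min(1.0, angle)) * w_max          # commanded yaw rate
        w_left = (v - w * track_width / 2.0) / wheel_radius
        w_right = (v + w * track_width / 2.0) / wheel_radius
        return w_left, w_right   # wheel-speed driving instruction to be uploaded

    # Example: half forward deflection with a slight right turn
    # print(handle_to_wheel_speeds(0.5, -0.2))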
(2) Mixed reality simulation subsystem
The mixed reality simulation subsystem mainly comprises a three-dimensional reconstruction module, a scene update module, an information generation module and an enhanced display module.
a) Three-dimensional reconstruction: an accurate three-dimensional reconstruction of the real scene of the lunar surface task operation site is performed, covering the large-area lunar terrain (the terrain data of the region to be explored are acquired by a lunar orbital station, optical cameras carried on the lander and the robot, a lidar, and the like) and the mobile robot, thereby digitizing the real lunar scene.
b) Scene update: on the one hand, the virtual lunar scene is updated according to the predicted and corrected robot pose data, combined with the mounting orientation of the binocular camera; on the other hand, the local virtual scene is updated according to the delay-downloaded binocular camera and lidar lunar imaging data.
c) Information generation: for the virtual lunar scene, enhancement information such as terrain feature sizes and hazard levels is generated, and planned path information is generated manually at the same time; together these are called guidance information (a minimal hazard-grading sketch is given after this list).
d) Enhanced display: the generated enhancement information and planned path information are accurately displayed in the virtual lunar scene, providing operators at the ground control center with an accurate and highly realistic mixed reality visual scene that facilitates real-time operation of the mobile robot.
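The hazard labelling in step c) could, in the simplest case, compare each reconstructed obstacle against the rover's obstacle-crossing capability. The three-level grading and the numeric threshold below are illustrative assumptions; the patent only specifies that hazard levels are derived from the robot's overall dimensions and obstacle-crossing capacity.

    def hazard_level(obstacle_height_m: float,
                     crossing_capability_m: float = 0.20):  # max obstacle the rover can cross (assumed)
        # Grade a bump or pit feature for the enhancement-information layer.
        h = abs(obstacle_height_m)          # pits are recorded as negative heights
        if h <= 0.5 * crossing_capability_m:
            return "low"     # negligible; may remain on the reference path
        if h <= crossing_capability_m:
            return "medium"  # crossable with caution, e.g. at reduced speed
        return "high"        # to be avoided when planning the reference path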
(3) Man-machine interaction subsystem
The man-machine interaction subsystem consists of the ground operators, an operating handle, and mixed reality equipment such as helmets (head-mounted displays). It establishes the interaction relation between the ground operators and the lunar surface task operation site through the virtual lunar scene and the operating handle: according to the virtual lunar scene updated in real time, the operators manually control the operating handle to realize quasi-real-time teleoperation of the lunar mobile robot.
2. Principle of system operation
First, the mixed reality simulation subsystem performs three-dimensional reconstruction of the lunar surface task operation site, constructs a digital virtual lunar scene, and generates the guidance information.
Next, the prediction and correction subsystem uploads the teleoperation instructions issued by the ground operators, predicts the motion state of the mobile robot on the ground before the instructions reach the lunar surface for execution, and performs short-period prediction correction with the delay-downloaded IMU pose measurements; after the lunar images are delay-downloaded, a more accurate robot pose is computed with the SLAM algorithm, and long-period prediction correction is carried out on that basis, further improving the prediction accuracy.
Meanwhile, the robot pose prediction results and the delay-downloaded image data are sent to the mixed reality simulation subsystem in real time and used to update the virtual lunar scene and display the guidance information, providing operators with a highly immersive virtual reality digital scene that changes synchronously (subject to the time delay) with the lunar surface task operation site.
Then, the ground operators continuously operate the handle according to this continuously changing virtual reality digital scene, generating a sequence of operation instructions for driving the lunar mobile robot.
Finally, the prediction and correction subsystem parses the sequence of operation instructions into driving instructions for the mobile robot and continuously uploads them through the space-ground transmission link to the mobile robot at the lunar surface task operation site, where they are executed after the transmission delay.
The system cyclically performs state prediction, positioning and attitude determination, prediction correction, three-dimensional reconstruction, scene update and display, instruction parsing and uploading, and so on, thereby forming a quasi-real-time closed-loop teleoperation system under earth-moon time delay.
(II) System workflow
FIG. 2 shows the basic workflow of the quasi-real-time teleoperation system for lunar surface detection under earth-moon time delay.
As shown in FIG. 2, the middle axis with an arrow is the time axis. The upper end of the axis is the 'space' side, i.e. the real operation site in the lunar environment, whose time is denoted t_s; the lower end is the 'ground' side, i.e. the virtual operation site of the ground control center, whose time is denoted t. The interval between the uploading of a teleoperation instruction and its execution on the lunar surface can be taken as one earth-moon transmission distance delay T_trans. The telemetry-downloaded sensor measurement data fall into two types, short-period delay data and long-period delay data. The short-period delay data are the motion state data of the mobile robot (position, attitude, etc.) measured by the inertial measurement unit (IMU); their delay is essentially the earth-moon transmission distance delay T_trans. The long-period delay data are image data, including visual images acquired by the binocular camera and three-dimensional point cloud data acquired by the lidar; their delay is the image downloading delay T, composed of the earth-moon transmission distance delay T_trans and the image data transmission time T_pic. Δt denotes the teleoperation period, i.e. the uploading interval of the ground teleoperation instructions.
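To make these relations concrete, the short sketch below computes when each kind of data becomes available on the ground (a command executes at upload time plus T_trans; IMU data reaches the ground one T_trans after measurement; image data reaches the ground T_trans plus T_pic after imaging). The numeric values are illustrative assumptions only, not taken from the patent; actual delays depend on the relay path and link bandwidth.

    # Illustrative timing constants (assumed values, not from the patent)
    T_TRANS = 3.0              # earth-moon transmission distance delay, s
    T_PIC = 27.0               # image data transmission time over the link, s
    T_IMAGE = T_TRANS + T_PIC  # long-period (image downloading) delay T
    DT = 1.0                   # teleoperation period, s

    def command_execution_time(t_upload: float) -> float:
        # Moment a teleoperation instruction uploaded at t_upload executes on the Moon.
        return t_upload + T_TRANS

    def imu_arrival_time(t_measured: float) -> float:
        # Moment short-period delay data (IMU) measured at t_measured reaches the ground.
        return t_measured + T_TRANS

    def image_arrival_time(t_imaged: float) -> float:
        # Moment long-period delay data (images) taken at t_imaged reaches the ground.
        return t_imaged + T_IMAGE

    # The first instruction uploaded at t = 0 executes at T_TRANS, and the first IMU
    # frame it produces reaches the ground at 2 * T_TRANS, as in the workflow below.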
The workflow of the quasi-real-time teleoperation system for lunar surface detection under earth-moon time delay is described in detail as follows:
(1) Initialization.
a) Initial image acquisition. At time t = 0, the lunar mobile robot is stationary at the lunar surface task operation site. A wide-area static panoramic image and local images of the lunar surface task operation site are acquired by the panoramic camera and binocular camera on the mobile robot and by the high-resolution camera on the lunar orbiter, and the image data are downloaded to the ground control center.
b) Three-dimensional reconstruction of the static scene. At time t = 0, the mixed reality simulation subsystem in the ground control center performs a static three-dimensional reconstruction of the lunar terrain and the mobile robot at the lunar surface task operation site, digitizing the real lunar scene, i.e. constructing the virtual lunar terrain and the virtual mobile robot (a minimal height-map sketch is given after this list).
c) Guidance information display. At time t = 0, teleoperation guidance information is generated for the three-dimensionally reconstructed virtual lunar task scene, including enhancement information such as terrain feature sizes and hazard level grades as well as planned path information generated autonomously or manually; the virtual lunar task scene and the guidance information are displayed on the ground control center display screen.
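A very coarse stand-in for the static reconstruction of step b) is to bin a downloaded lidar point cloud into a terrain height map, which can then carry the virtual terrain and the obstacle measurements. The grid resolution and the use of the per-cell maximum height are assumptions made for this sketch.

    import numpy as np

    def point_cloud_to_dem(points: np.ndarray, cell_size: float = 0.05):
        # Bin a lunar-surface point cloud (N x 3, metres) into a height map.
        # Each grid cell stores the highest terrain point observed in it; cells
        # never observed remain NaN. Returns the height grid and its origin in
        # the operation-site frame.
        xy_min = points[:, :2].min(axis=0)
        idx = np.floor((points[:, :2] - xy_min) / cell_size).astype(int)
        shape = idx.max(axis=0) + 1
        dem = np.full(tuple(shape), np.nan)
        for (i, j), z in zip(idx, points[:, 2]):
            if np.isnan(dem[i, j]) or z > dem[i, j]:
                dem[i, j] = z
        return dem, xy_min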
(2) Operations under short-period time delay.
Short-period delay refers to the earth-moon transmission distance delay T_trans. This interval is the downloading interval of the IMU measurement data, i.e. the short-period delay data.
a) One-step teleoperation. Within the current earth-moon distance delay interval T_trans, the ground operators operate the handle through the man-machine interaction subsystem and perform one-step teleoperation based on the three-dimensionally reconstructed virtual lunar task scene, where 'one step' means one teleoperation period Δt. After the handle operation instruction is converted into a wheel speed instruction, the virtual mobile robot is driven to move along the planned trajectory in the virtual lunar task scene. At the same time, the ground control center parses the current teleoperation instruction and uploads it to the real mobile robot at the lunar surface task operation site.
b) One-step prediction. Within the current earth-moon distance delay interval T_trans, in each teleoperation period Δt, the prediction and correction subsystem of the ground control center predicts the motion state of the virtual mobile robot (body position, velocity, attitude, etc.) one step ahead; 'one step' again means one teleoperation period Δt.
c) One-step scene update. Within the current earth-moon distance delay interval T_trans, the mixed reality simulation subsystem of the ground control center dynamically updates the lunar terrain and the robot position and attitude in the virtual lunar task scene according to the predicted motion state of the virtual mobile robot.
d) Delayed downloading of the short-period delay data (IMU data). Because of the earth-moon distance delay T_trans, the first frame of teleoperation instructions can only be uploaded to the lunar surface for execution at t = T_trans, and the first frame of short-period delay data measured by the inertial measurement unit (IMU) reaches the ground control center after another earth-moon distance delay T_trans, i.e. at t = 2T_trans. Thereafter, delay-downloaded short-period delay data (i.e. IMU measurement data) are available in every teleoperation period Δt.
e) Short-period prediction correction. A frame of teleoperation instructions issued by the ground control center is executed on the lunar surface after one short-period delay, i.e. the earth-moon transmission distance delay T_trans, and its execution effect (the IMU measurement data) is fed back to the ground control center after another short-period delay T_trans. During this time, the real-time motion of the mobile robot on the ground side is represented by the predicted motion state. Prediction over such a long delay interval has errors, which must be reduced or even eliminated by closed-loop correction. After the short-period delay data (IMU data) is downloaded to the ground, the prediction and correction subsystem of the ground control center takes the downloaded IMU data as the initial value and, combining the mobile robot prediction model and the teleoperation instructions, corrects the prediction results over the interval from 2T_trans before the current moment up to the current moment. Thereafter, in every teleoperation period Δt, the prediction and correction subsystem performs short-period prediction correction according to the delay-downloaded IMU data, until the first frame of long-period delay image data is downloaded to the ground.
(3) Operations under long-period time delay.
Long-period delay refers to the image downloading delay T, which comprises the earth-moon transmission distance delay T_trans and the image data transmission time T_pic. This interval is the downloading interval of the long-period delay data, i.e. the image data.
a) Cyclic execution of the operations under short-period delay. Within the delay interval T during which the long-period delay data has not yet been downloaded to the ground, the operations under short-period delay, namely teleoperation, instruction uploading, prediction, updating and correction, are executed repeatedly.
b) Delayed downloading of the long-period delay data (image data). Because of the large volume of image data and the limited bandwidth of the space-ground transmission link, a certain time interval T_pic is needed to transmit the images completely, in addition to the delay caused by the earth-moon distance. The interval for telemetry downloading of image data is therefore somewhat, or even much, larger than that of the IMU data, and is mainly determined by the bandwidth of the space-ground transmission link. Image data acquired by the binocular camera and lidar on the lunar rover are downloaded to the ground control center after an image downloading delay T, where T is the downloading period of the long-period delay data.
c) Long-period positioning and attitude determination. After the long-period delay data, i.e. the image data, has been delay-downloaded to the ground, the prediction and correction subsystem of the ground control center accurately computes the position and attitude of the mobile robot at the moment corresponding to the images (an interval T before the current moment) from the downloaded image data combined with the short-period-delay IMU data, providing reference pose information for the long-period prediction correction (a minimal alignment sketch is given after this list).
d) Long-period prediction correction. The prediction and correction subsystem takes the accurate position and attitude at the moment one long-period delay interval T earlier, determined by the long-period positioning and attitude determination, as the initial value and, combining the mobile robot prediction model, the teleoperation instructions and the short-period-delay IMU data, applies a long-period accurate correction to the short-period prediction correction results from that moment up to the current moment.
e) Scene update and information display. The mixed reality simulation subsystem of the ground control center takes the long-period-corrected pose data of the mobile robot and the delay-downloaded image data as inputs, updates the lunar terrain and the motion state of the virtual mobile robot in the virtual lunar task scene, updates the planned path and enhancement information such as lunar terrain feature sizes and hazard levels, and refreshes the display on the ground control center screen, thereby providing the ground operators with a more accurate task scene.
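For step c), a minimal stand-in for the SLAM-based pose computation is a single rigid alignment (Kabsch/SVD) between lidar points expressed in the robot frame and the same points in the reconstructed site map, recovering the planar pose at the imaging moment. Point correspondences are assumed known here, which a real SLAM pipeline would establish itself; the function name and the 2-D simplification are assumptions of this sketch.

    import numpy as np

    def align_scan_to_map(scan_xy: np.ndarray, map_xy: np.ndarray):
        # Rigid 2-D alignment (Kabsch/SVD) between matched point sets.
        # scan_xy: lidar points in the robot frame (N x 2, metres).
        # map_xy:  the same physical points in the site/map frame (N x 2).
        # Returns the robot position (translation) and heading (yaw) in the map frame.
        c_scan = scan_xy.mean(axis=0)
        c_map = map_xy.mean(axis=0)
        H = (scan_xy - c_scan).T @ (map_xy - c_map)   # cross-covariance
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:                      # guard against a reflection
            Vt[-1, :] *= -1
            R = Vt.T @ U.T
        t = c_map - R @ c_scan
        yaw = float(np.arctan2(R[1, 0], R[0, 0]))
        return t, yaw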
The operation steps under short-period and long-period delay are executed repeatedly, forming the cycle of ground teleoperation, instruction uploading, pose prediction, scene update, data downloading, prediction correction, scene update and display, and further ground teleoperation, thereby realizing the quasi-real-time teleoperation process for lunar surface detection under earth-moon time delay, until the task is completed.

Claims (5)

1. A quasi-real-time teleoperation system for lunar surface detection under earth-moon time delay, comprising a lunar surface task operation site, a ground control center and a space-ground transmission link, characterized in that:
lunar surface task operation site: comprises the lunar mobile robot and the lunar environment; the lunar mobile robot is provided with a plurality of measuring sensors, including a panoramic camera for wide-area lunar imaging, a binocular camera and a laser radar (lidar) for local lunar imaging, and an inertial measurement unit (IMU) for measuring the position and attitude of the robot; the lunar environment comprises the lunar topography and illumination conditions; the lunar task operation site receives teleoperation instructions uploaded in real time by the ground control center through the space-ground transmission link, drives the lunar mobile robot to move according to the instructions, and downloads the sensor measurement data on the lunar mobile robot to the ground control center via telemetry through the space-ground transmission link;
space-ground transmission link: responsible for the information interaction between the ground control center and the lunar surface task operation site, namely the remote-control uploading of ground control instructions and the telemetry downloading of measurement data from the lunar surface task operation site;
ground control center: the method is responsible for the prediction and correction of the motion state of the lunar mobile robot under the lunar time delay, the ground teleoperation and the instruction analysis and uploading, and the generated teleoperation instruction reaches the lunar task operation site through the heaven-earth transmission link to drive the lunar mobile robot to carry out the movement detection in real time;
the ground control center comprises a teleoperation prediction correction subsystem, a mixed reality simulation subsystem and a man-machine interaction subsystem; the teleoperation prediction correction subsystem is responsible for predicting the future motion state of the lunar rover during the earth-moon time delay and for correcting the state prediction result according to the delay-downloaded IMU measurement data and image data, and the prediction correction result serves as an input condition of the mixed reality simulation subsystem; the mixed reality simulation subsystem is responsible for constructing a virtual task scene of the lunar surface task operation site and for updating and displaying the virtual scene according to the prediction correction result of the robot motion state and the telemetry-downloaded image data, and the constructed virtual lunar task scene serves as an input condition of the man-machine interaction subsystem; the man-machine interaction subsystem is responsible for ground teleoperation and instruction generation: ground operators continuously operate the handle according to the virtual lunar task scene to generate a sequence of teleoperation instructions, which are sent to the teleoperation prediction correction subsystem and, after instruction parsing, remote-control uploaded to the lunar surface task operation site.
2. The quasi-real-time teleoperation system for lunar surface detection under earth-moon time delay as claimed in claim 1, characterized in that: the teleoperation prediction correction subsystem comprises a state prediction module, a positioning and attitude determination module, a prediction correction module and an instruction uploading module; wherein:
a state prediction module: establishes a motion state prediction model of the lunar mobile robot, comprising a kinematic model, a dynamic model and a mathematical model of lunar terrain undulation; in each teleoperation period Δt, the state prediction module takes the position and attitude of the lunar mobile robot at the current moment and the teleoperation instruction as inputs of the motion state prediction model, and obtains position and attitude prediction data of the lunar mobile robot by mathematical computation during the interval in which the lunar telemetry data has not yet been downloaded to the ground;
a positioning and attitude determination module: in each image downloading delay period T, takes the delay-downloaded binocular camera and lidar image data from the lunar surface as input, combines the sequence of IMU pose measurements telemetered within the interval T, and uses simultaneous localization and mapping (SLAM) to obtain the accurate position and attitude of the lunar mobile robot at the sensor imaging moment, which serves as an accurate initial value for prediction correction;
a prediction correction module: comprises a short-period prediction correction sub-module and a long-period prediction correction sub-module; the short-period prediction correction sub-module executes one operation in each teleoperation period Δt, and the long-period prediction correction sub-module executes one operation in each image downloading delay period T; wherein:
an instruction uploading module: converts the handle displacement and angle operation instructions generated by the ground operator's real-time operation of the handle into wheel speed driving instructions that the mobile robot can recognize, and remote-control uploads them to the lunar mobile robot through the space-ground transmission link.
3. The quasi-real-time teleoperation system for lunar surface detection under earth-moon time delay as claimed in claim 2, characterized in that: the short-period prediction correction sub-module: in each teleoperation period Δt, the prediction correction module takes the delay-downloaded IMU data as the initial value and, using the prediction model of the state prediction module together with the teleoperation instructions, recomputes the state from the IMU measurement moment up to the current moment; the position and attitude data obtained in each teleoperation period Δt during this computation are the short-period prediction-corrected pose data;
the long-period prediction correction sub-module: the prediction correction module takes the accurate pose data of the mobile robot at the imaging moment, obtained by the positioning and attitude determination module, as the initial value and, using the prediction model of the state prediction module together with the teleoperation instructions, recomputes the state from the moment corresponding to the image data up to the current moment; the position and attitude data obtained in each teleoperation period Δt during this computation are the long-period prediction-corrected pose data.
4. The quasi-real-time teleoperation system for lunar surface detection under earth-moon time delay as claimed in claim 2, characterized in that: the mixed reality simulation subsystem comprises a three-dimensional reconstruction module, a scene update module, an information generation module and an enhanced display module; wherein:
a three-dimensional reconstruction module: performs three-dimensional reconstruction of the lunar surface task operation site over a large area around the mobile robot, using the telemetry-downloaded lunar terrain image data and the known geometric structure data of the lunar mobile robot, to construct a virtual lunar task scene comprising the virtual lunar terrain and a virtual mobile robot; the large-area lunar terrain image data are obtained from a high-resolution camera on the lunar orbiter and from the panoramic camera, binocular camera and lidar mounted on the mobile robot;
a scene update module: updates and displays the virtual lunar task scene at the current moment in real time, according to the predicted and corrected pose data of the mobile robot obtained by the teleoperation prediction correction subsystem and the mounting orientations of the binocular camera and lidar on the robot body; meanwhile, it refines and updates the virtual lunar task scene of the corresponding area according to the delay-downloaded local fine lunar image data around the mobile robot;
an information generation module: while the virtual lunar task scene is obtained by three-dimensional reconstruction, performs three-dimensional measurement and recording of obstacle features including protrusions and pits in the virtual scene, and analyzes and marks the hazard levels of the different obstacle features according to the overall dimensions and obstacle-crossing capability of the mobile robot; at the same time, a reference travel path is planned using the ground operators' knowledge of the virtual lunar task scene; the three-dimensional data and hazard levels of the obstacle features are called enhancement information and, together with the reference travel path, are collectively called guidance information;
an enhanced display module: accurately displays the enhancement information and the travel path information generated by the information generation module in the virtual lunar task scene.
5. The quasi-real-time teleoperation system for lunar surface detection under earth-moon time delay as claimed in claim 2, characterized in that: the man-machine interaction subsystem comprises ground control personnel and an operating handle; the man-machine interaction subsystem establishes the interaction relation between the ground operators and the lunar surface task operation site through the three-dimensionally reconstructed virtual lunar task scene and the operating handle, and, according to the virtual lunar task scene updated in real time, the ground operators manually control the operating handle to realize continuous teleoperation of the virtual mobile robot and thereby of the real lunar mobile robot.
CN202010399002.8A 2020-05-12 2020-05-12 Quasi-real-time teleoperation system for lunar surface detection under earth-moon time delay Active CN111580519B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010399002.8A CN111580519B (en) 2020-05-12 2020-05-12 Quasi-real-time teleoperation system for lunar surface detection under earth-moon time delay

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010399002.8A CN111580519B (en) 2020-05-12 2020-05-12 Quasi-real-time teleoperation system for lunar surface detection under earth-moon time delay

Publications (2)

Publication Number Publication Date
CN111580519A CN111580519A (en) 2020-08-25
CN111580519B true CN111580519B (en) 2023-06-30

Family

ID=72112173

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010399002.8A Active CN111580519B (en) 2020-05-12 2020-05-12 Quasi-real-time teleoperation system for lunar surface detection under earth-moon time delay

Country Status (1)

Country Link
CN (1) CN111580519B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112577494B (en) * 2020-11-13 2022-11-18 上海航天控制技术研究所 SLAM method, electronic device and storage medium suitable for lunar vehicle
CN112722331B (en) * 2021-01-27 2022-07-22 东南大学 Interaction device and interaction control method of lunar manned mobile vehicle system
CN115565430B (en) * 2022-09-15 2024-03-29 北京科技大学 System for simulating remote teleoperation of lunar rover
CN116382476B (en) * 2023-03-30 2023-10-13 哈尔滨工业大学 Wearable interaction system for moon surface man-machine collaborative operation

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101434066A (en) * 2008-10-20 2009-05-20 北京理工大学 Method and platform for predicating teleoperation of robot
CN102880063A (en) * 2012-09-13 2013-01-16 中国人民解放军63921部队 System and method for synchronously controlling teleoperation
CN104015190A (en) * 2014-05-13 2014-09-03 中国科学院力学研究所 Robot remote control method and system under indeterminate bidirectional time delay condition
CN109933097A (en) * 2016-11-21 2019-06-25 清华大学深圳研究生院 A kind of robot for space remote control system based on three-dimension gesture

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9519286B2 (en) * 2013-03-19 2016-12-13 Robotic Research, Llc Delayed telop aid
US9880551B2 (en) * 2015-03-06 2018-01-30 Robotic Research, Llc Point-and-click control of unmanned, autonomous vehicle using omni-directional visors
US11281208B2 (en) * 2018-03-02 2022-03-22 Carnegie Mellon University Efficient teleoperation of mobile robots via online adaptation

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101434066A (en) * 2008-10-20 2009-05-20 北京理工大学 Method and platform for predicating teleoperation of robot
CN102880063A (en) * 2012-09-13 2013-01-16 中国人民解放军63921部队 System and method for synchronously controlling teleoperation
CN104015190A (en) * 2014-05-13 2014-09-03 中国科学院力学研究所 Robot remote control method and system under indeterminate bidirectional time delay condition
CN109933097A (en) * 2016-11-21 2019-06-25 清华大学深圳研究生院 A kind of robot for space remote control system based on three-dimension gesture

Also Published As

Publication number Publication date
CN111580519A (en) 2020-08-25

Similar Documents

Publication Publication Date Title
CN111580519B (en) Quasi-real-time teleoperation system for lunar surface detection under earth-moon time delay
Volpe Rover functional autonomy development for the mars mobile science laboratory
Nguyen et al. Virtual reality interfaces for visualization and control of remote vehicles
Biesiadecki et al. Tradeoffs between directed and autonomous driving on the mars exploration rovers
JP2022512440A (en) Trajectory tracking controller test method, equipment, media and equipment
Benninghoff et al. Development and hardware-in-the-loop test of a guidance, navigation and control system for on-orbit servicing
CN110989605A (en) Three-body intelligent system architecture and detection robot
CN115200588B (en) SLAM autonomous navigation method and device for mobile robot
Theil et al. ATON (Autonomous Terrain-based Optical Navigation) for exploration missions: recent flight test results
KR20210043522A (en) Method, apparatus, electronic device and storage medium for generating offline map
Castano et al. Current results from a rover science data analysis system
Tunstel et al. Mars exploration rover mobility and robotic arm operational performance
Wedler et al. Preliminary results for the multi-robot, multi-partner, multi-mission, planetary exploration analogue campaign on mount etna
Wettergreen et al. Operating nomad during the Atacama Desert trek
Volpe Rover technology development and mission infusion beyond MER
Hayati et al. Long Range Science Rover (Rocky7) Mojave Desert Field Tests
Weclewski et al. Sample Fetch Rover guidance, navigation and control subsystem-an overview
Marc et al. Capabilities of long range autonomous multi-mode rover navigation system-ERGO field trials and planned evolution
Bresina et al. K9 operation in May’00 dual-rover field experiment
Bornschlegl et al. Space robotics in Europe, a compendium
Cheng et al. Research on intuitive controlling of unmanned Lunar Rover
Correal et al. Autonomy for ground-level robotic space exploration: framework, simulation, architecture, algorithms and experiments
CN116679587A (en) Mixed reality teleoperation simulation system and method for tour inspection of extraterrestrial star meter
Koch et al. Helicopter and Rover Operations on Mars Using the Robot Sequencing and Visualization Program (RSVP)
MORITA et al. Hayabusa descent navigation based on accurate landmark tracking scheme

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant