CN116382284A - Nuclear tunnel cleaning robot automatic driving method and system and electronic equipment

Publication number: CN116382284A
Authority: CN (China)
Prior art keywords: robot, tunnel, information, judging whether, central line
Legal status: Pending (the legal status is an assumption and is not a legal conclusion)
Application number: CN202310375211.2A
Original language: Chinese (zh)
Inventors: 刘帅, 王国河, 王华刚, 袁野, 张美玲, 周国丰, 吴凤岐, 毛冰滟, 吴玉, 王超, 殷勇
Current Assignee (the listed assignees may be inaccurate): China General Nuclear Power Corp; China Nuclear Power Technology Research Institute Co Ltd; CGN Power Co Ltd
Original Assignee: China General Nuclear Power Corp; China Nuclear Power Technology Research Institute Co Ltd; CGN Power Co Ltd
Application filed by China General Nuclear Power Corp, China Nuclear Power Technology Research Institute Co Ltd, and CGN Power Co Ltd
Priority to CN202310375211.2A; publication of CN116382284A

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0238 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors
    • G05D1/024 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors in combination with a laser
    • G05D1/0212 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D1/0214 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory in accordance with safety or protection criteria, e.g. avoiding hazardous areas
    • G05D1/0276 Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle

Abstract

The invention relates to an automatic driving method and system for a nuclear tunnel cleaning robot, and to electronic equipment. The method comprises the following steps: S1, acquiring relative pose information of the robot in the tunnel; S2, obtaining displacement deviation information according to the relative pose information; and S3, controlling the robot according to the displacement deviation information so that the robot travels along the central line of the tunnel. By acquiring the relative pose information of the robot in the tunnel, deriving the displacement deviation information from it, and then controlling the robot to travel along the tunnel central line according to that deviation, the invention enables the robot to drive automatically in the tunnel and improves its travel efficiency.

Description

Nuclear tunnel cleaning robot automatic driving method and system and electronic equipment
Technical Field
The invention relates to the field of automatic driving, and in particular to an automatic driving method and system for a nuclear tunnel cleaning robot, and to related electronic equipment.
Background
Most coastal nuclear power stations in China draw seawater through diversion tunnels built offshore, and the seawater serves as the ultimate cooling source. When a diversion tunnel has been in service for a long time, accumulated marine organisms, sludge and other debris easily block the water intake and can trigger a cold-source intake blockage event at the nuclear power station, so the marine organisms, sludge and other debris in the tunnel need to be cleaned regularly. The cleaning can be performed with large cleaning and collecting equipment. At present, such equipment is controlled by a worker who follows it closely through the tunnel with a handheld controller so as not to damage the tunnel wall. This mode of control is cumbersome to operate and slow to respond, the travel track of the equipment cannot be judged in time, working efficiency suffers, and since toxic gas may be present in the tunnel the risk to workers is high.
Disclosure of Invention
The invention aims to solve the technical problem of providing an automatic driving method, an automatic driving system and electronic equipment of a nuclear tunnel cleaning robot.
The technical solution adopted by the invention to solve this problem is as follows. An automatic driving method for a nuclear tunnel cleaning robot is provided, comprising the following steps:
s1, acquiring relative pose information of a robot in a tunnel;
s2, obtaining displacement deviation information according to the relative pose information;
and S3, controlling the robot according to the displacement deviation information so that the robot runs along the central line of the tunnel.
Preferably, the relative pose information includes position and direction information of the robot in the tunnel,
in the step S1, it includes:
s11, acquiring point cloud data of the surrounding environment of the robot;
s12, obtaining position and direction information of the robot in the tunnel according to the point cloud data.
Preferably, in the step S12, it includes:
filtering the point cloud data;
performing Hough transformation on the processed point cloud data;
obtaining the longest two straight line segments according to the transformed point cloud data;
judging whether the number of data points in the point cloud data through which the two straight line segments pass is greater than a threshold value; if not, returning to step S11;
if yes, the coordinate information of the left side wall and the right side wall of the tunnel is obtained according to the two straight line segments, and the position and the direction information of the robot in the tunnel are obtained according to the coordinate information of the left side wall and the right side wall of the tunnel.
Preferably, in the step S1, it includes:
and acquiring the relative pose information through detection equipment arranged on the robot.
Preferably, in step S3, it includes:
judging, according to the displacement deviation information, whether the robot is not deflected and is on the tunnel central line; if yes, controlling the robot to go straight;
judging whether the robot is deflected to the left and is on the tunnel central line; if yes, controlling the robot to turn right;
judging whether the robot is deflected to the right and is on the tunnel central line; if yes, controlling the robot to turn left;
judging whether the robot is deflected to the right and is on the left side of the tunnel central line; if yes, controlling the robot to go straight;
judging whether the robot is deflected to the left and is on the left side of the tunnel central line; if yes, controlling the robot to turn right and advance;
judging whether the robot is deflected to the left and is on the right side of the tunnel central line; if yes, controlling the robot to go straight;
judging whether the robot is deflected to the right and is on the right side of the tunnel central line; if yes, controlling the robot to turn left and advance.
Preferably, in step S3, the controlling the robot according to the displacement deviation information includes:
and adjusting the robot by adopting a PID method according to the displacement deviation information.
Preferably, the method further comprises:
and acquiring the running state information of the robot, judging whether the robot fails according to the running state information, and if so, not allowing the robot to advance.
Preferably, the method further comprises:
acquiring pressure values of the robot and the left and right side walls of the tunnel;
judging whether the pressure values of the robot and the left side wall of the tunnel are larger than a first upper limit threshold value and the pressure values of the robot and the right side wall of the tunnel are smaller than a first lower limit threshold value, and if yes, controlling the robot to turn right;
judging whether the pressure values of the robot and the left side wall of the tunnel are smaller than a first lower limit threshold value and the pressure values of the robot and the right side wall of the tunnel are larger than a first upper limit threshold value, and if yes, controlling the robot to turn left.
The invention also provides an automatic driving system of the nuclear tunnel cleaning robot, which comprises the following components:
the driving information acquisition module is used for acquiring relative pose information of the robot in the tunnel;
and the running information processing module is used for obtaining displacement deviation information according to the relative pose information and controlling the robot according to the displacement deviation information so as to enable the robot to run along the central line of the tunnel.
The invention also provides an electronic device comprising a processor and a memory, wherein the memory is used for storing a computer program, and the processor is used for executing the computer program of the memory to realize the nuclear tunnel cleaning robot automatic driving method.
The implementation of the invention has the following beneficial effects: the displacement deviation information is further obtained by acquiring the relative pose information of the robot in the tunnel, and the robot is controlled to travel along the central line of the tunnel according to the displacement deviation information, so that the robot can automatically drive in the tunnel, and the travel efficiency of the robot is improved.
Drawings
The invention will be further described with reference to the accompanying drawings and examples, in which:
FIG. 1 is a schematic flow chart of an embodiment of an autonomous driving method of a nuclear tunnel cleaning robot according to the present invention;
FIG. 2 is a schematic diagram of the robot offset in some embodiments of the invention;
FIG. 3 is a schematic diagram of robot offset cases in some embodiments of the invention.
Detailed Description
For a clearer understanding of technical features, objects and effects of the present invention, a detailed description of embodiments of the present invention will be made with reference to the accompanying drawings.
As shown in fig. 1, in an embodiment of the nuclear tunnel cleaning robot autopilot method of the present invention, the method includes the steps of:
s1, acquiring relative pose information of the robot in the tunnel. Specifically, the tunnel is built by concrete, and laser can be reflected normally by the two side walls of the tunnel, so that a two-dimensional laser radar or a three-dimensional laser radar can be adopted as detection equipment, and relative pose information can be obtained through the detection equipment arranged on the robot. The front end and the rear end of the top of the robot are respectively provided with a detection device, and the detection devices can be arranged at other positions as required, so long as the related functions can be realized. The relative posture and displacement between the detection equipment and the robot are fixed because the installation position of the detection equipment on the robot is known, so that the relative posture information of the robot in the tunnel can be obtained according to the information such as the distance between the front detection equipment and the rear detection equipment and the left side wall and the right side wall of the tunnel.
In an alternative embodiment, the relative pose information includes position and orientation information of the robot in the tunnel. In step S1, it includes:
s11, acquiring point cloud data of the surrounding environment of the robot. The laser radars at the front end and the rear end of the top of the robot are used for scanning the surrounding environment of the robot, then point cloud data are acquired, and the point cloud data are fed back to an industrial personal computer in the robot.
S12, obtaining the position and direction information of the robot in the tunnel according to the point cloud data.
Further, in step S12, the method includes:
since marine organisms or seawater may adhere to the tunnel wall, noise points or outliers may be mixed in the point cloud data, and a certain error influence is generated on the measurement result, so that filtering processing is required to be performed on the point cloud data. In addition, the filtered point cloud data can be displayed on a display interface in real time, so that a worker can check the real-time driving condition of the robot, and when the robot fails, the robot can stop automatic driving through manual intervention.
Hough transformation is then carried out on the processed point cloud data. The Hough transformation maps each data point into a parameter space, and the curves obtained from points lying on the same straight line intersect at one location, so straight lines in the point cloud data can be detected through the Hough transformation.
The longest two straight line segments are obtained from the transformed point cloud data. Specifically, the two longest segments are found by counting votes: the more transformed curves intersect at one position, the more data points lie on the corresponding straight line in the point cloud, and the longer that straight line segment is. In addition, points on the same straight line do not necessarily intersect at exactly the same position after the Hough transformation, so the two longest segments are counted and searched for after a blurring (smoothing) step, which improves the accuracy of the line detection.
It is then judged whether the number of point cloud data points through which the two straight line segments pass is greater than a threshold value; if not, the method returns to step S11. Specifically, it is judged whether more than 200 data points in the point cloud lie on the two straight line segments; if not, a line that does not satisfy the condition exists, and the point cloud data of the robot's surroundings is acquired again before detection is repeated.
If yes, coordinate information of the left and right side walls of the tunnel is obtained from the two straight line segments, and the position and direction information of the robot in the tunnel is obtained from that coordinate information. That is, once more than 200 data points lie on the two straight line segments, the coordinates of the left and right tunnel walls are derived from the two segments, and the position and direction of the robot in the tunnel are calculated from those coordinates. The position information refers to the distances from the left side of the robot body to the left tunnel wall and from the right side of the body to the right tunnel wall; the direction information refers to the angle of the robot's heading relative to the direction of travel along the tunnel central line. Along the robot's direction of advance in the tunnel, the wall on the left of the robot body is the left side wall and the wall on the right is the right side wall.
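The wall-detection steps above (Hough transformation, selection of the two strongest lines, and the 200-point support check) could be sketched as follows. This is an illustrative reconstruction, not the patent's implementation; the accumulator resolutions, the peak-suppression width, and the support tolerance `tol` are assumptions:

```python
import numpy as np

def detect_walls(points, rho_res=0.05, theta_res=np.deg2rad(1.0),
                 min_points=200, tol=0.1):
    """Find the two dominant straight lines (the tunnel side walls) in a
    2-D scan via a Hough transform; accept them only when each line is
    supported by more than min_points scan points."""
    thetas = np.arange(0.0, np.pi, theta_res)
    # rho for every (point, theta) pair: x*cos(theta) + y*sin(theta)
    rho = points[:, :1] * np.cos(thetas) + points[:, 1:] * np.sin(thetas)
    offset = int(np.ceil(np.abs(rho).max() / rho_res)) + 1
    bins = np.round(rho / rho_res).astype(int) + offset
    acc = np.zeros((len(thetas), 2 * offset + 2), dtype=int)
    for j in range(len(thetas)):           # vote into the accumulator
        np.add.at(acc[j], bins[:, j], 1)
    walls = []
    for _ in range(2):
        j, b = np.unravel_index(np.argmax(acc), acc.shape)
        theta, r = thetas[j], (b - offset) * rho_res
        # count the scan points lying within tol of this line
        dist = np.abs(points[:, 0] * np.cos(theta)
                      + points[:, 1] * np.sin(theta) - r)
        walls.append((theta, r, int((dist < tol).sum())))
        acc[:, max(0, b - 4):b + 5] = 0    # suppress before the 2nd peak
    ok = all(w[2] > min_points for w in walls)
    return walls, ok
```

For a robot centred between walls two metres to each side, the function should report lines near rho = +2 and rho = -2 at theta close to 90 degrees, each supported by all of that wall's points.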
S2, obtaining displacement deviation information according to the relative pose information. Specifically, the displacement deviation information includes a distance deviation value and an angle deviation value of the robot body relative to the tunnel central line. Referring to FIG. 2, when the robot body sits on the central line without offset, the left side of the body is at distance d from the left tunnel wall and the right side is at distance d from the right tunnel wall. During driving, deviation from the central line is inevitable because of factors such as road surface slip, mechanical error in the vehicle body, and tunnel construction error. When the body deviates, the distance between the left side of the body and the left wall is d1 and the distance between the right side and the right wall is d2; the angle of the robot's heading relative to the direction of travel along the central line is β, which is the angle deviation value. The distance deviation value of the body relative to the central line is then calculated from the differences d1 - d and d2 - d. In other embodiments, to improve accuracy, the relative pose information is calculated from several point cloud acquisitions and the average of the results is used as the relative pose information from which the displacement deviation information is obtained.
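Since the robot carries detection devices at both the front and rear of its body, one plausible way to obtain both deviation values is from the wall distances seen at each end. The function below is a hypothetical sketch, not the patent's formula; its name, parameters, and sign conventions are all assumptions:

```python
import math

def pose_from_wall_distances(front_left, rear_left,
                             front_right, rear_right, baseline):
    """Lateral offset and heading from the perpendicular wall distances
    seen by the front and rear detection devices, mounted `baseline`
    apart along the body.  Conventions (hypothetical): beta > 0 means
    the head points left of the tunnel axis; offset > 0 means the body
    sits right of the central line."""
    # head closer to the left wall than the tail -> head points left
    beta = math.atan2(rear_left - front_left, baseline)
    d1 = (front_left + rear_left) / 2.0    # mean left-wall distance
    d2 = (front_right + rear_right) / 2.0  # mean right-wall distance
    offset = (d1 - d2) / 2.0               # distance deviation value
    return offset, beta
```

A body rotated on the central line (front-left 1.8 m, rear-left 2.2 m, and the mirror readings on the right) yields zero lateral offset and a positive heading deviation.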
S3, controlling the robot according to the displacement deviation information so that the robot travels along the central line of the tunnel. When the robot body deviates, the robot is adjusted according to the obtained distance deviation value and angle deviation value relative to the tunnel central line, and is controlled to drive back onto the central line.
Specifically, in step S3, the method includes:
and judging whether the robot does not deflect and is on the central line of the tunnel according to the displacement deviation information, and if so, controlling the robot to move straight. Referring to the robot offset schematic diagram of fig. 3, the robot body No. 1 is not deflected and on the tunnel center line, and is controlled to go straight.
And judging whether the robot deflects leftwards and is on the central line of the tunnel according to the displacement deviation information, and if so, controlling the robot to turn rightwards. And the head of the No. 2 robot body deflects leftwards, and the body is arranged on the central line of the tunnel, so that the right turning direction of the robot body is controlled to ensure that the head is not deflected.
And judging whether the robot deflects to the right or not according to the displacement deviation information and is on the central line of the tunnel, and if so, controlling the robot to turn to the left. And the head of the No. 3 robot body deflects to the right and the body is on the central line of the tunnel, so that the left turning direction of the No. 3 robot body is controlled to ensure that the head is not deflected.
Judging whether the robot deflects rightwards or not and is positioned on the left side of the central line of the tunnel according to the displacement deviation information; if yes, the robot is controlled to move straight. And the head of the No. 4 robot body deflects rightwards and the body is at the left side of the central line of the tunnel, so that the robot body is controlled to run straight and return to the central line.
It is judged according to the displacement deviation information whether the robot is deflected to the left and is on the left side of the tunnel central line; if yes, the robot is controlled to turn right and advance. The head of robot body No. 5 is deflected to the left and the body is on the left of the central line, so body No. 5 is steered to the right and driven back onto the central line. The head of body No. 6 is not deflected but the body is on the left of the central line, so body No. 6 is also steered to the right and driven back onto the central line.
Judging whether the robot deflects leftwards and is positioned on the right side of the central line of the tunnel according to the displacement deviation information; if yes, the robot is controlled to move straight. And the head of the No. 8 robot body deflects leftwards and the body is arranged on the right side of the central line of the tunnel, so that the robot body is controlled to run straight and return to the central line.
It is judged according to the displacement deviation information whether the robot is deflected to the right and is on the right side of the tunnel central line; if yes, the robot is controlled to turn left and advance. The head of robot body No. 9 is deflected to the right and the body is on the right of the central line, so body No. 9 is steered to the left and driven back onto the central line. The head of body No. 7 is not deflected but the body is on the right of the central line, so body No. 7 is also steered to the left and driven back onto the central line.
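The nine offset cases of FIG. 3 can be collected into a single decision function. The dead-band tolerances below are illustrative assumptions; the case logic itself follows the description above (offset > 0 meaning right of the central line, beta > 0 meaning the head is deflected left):

```python
def steering_command(offset, beta, pos_tol=0.05, ang_tol=0.02):
    """Map (lateral offset, heading deviation) to a steering command,
    following the nine cases of FIG. 3."""
    on_line = abs(offset) <= pos_tol
    if on_line:
        if abs(beta) <= ang_tol:            # case 1: centred, no deflection
            return "straight"
        return "turn_right" if beta > 0 else "turn_left"  # cases 2, 3
    if offset < 0:                          # left of the central line
        if beta < -ang_tol:                 # case 4: already heading back
            return "straight"
        return "turn_right"                 # cases 5 and 6
    if beta > ang_tol:                      # right of the line, case 8
        return "straight"
    return "turn_left"                      # cases 7 and 9
```

For example, a body left of the central line with its head deflected right (case 4) is allowed to continue straight, since its current heading already carries it back toward the line.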
Further, in step S3, controlling the robot according to the displacement deviation information includes adjusting the robot with a PID method according to the displacement deviation information. Specifically, the robot may use a crawler-type or wheel-type running gear. Speed corrections for the left and right running gear are calculated from the angle deviation value and the distance deviation value, giving target speeds for the left and right running gear. The robot is then controlled in closed loop with the PID method: a PID control signal is computed from the target speeds and the detected actual speeds of the left and right running gear, the signal is transmitted to the control valve, and the motor speed is adjusted so that the running speeds on the two sides steer the robot back onto the central line.
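A minimal sketch of this differential speed scheme follows, assuming a textbook PID loop and a correction term built linearly from the two deviation values; the gains, sign conventions, and function names are placeholders, not values from the patent:

```python
class PID:
    """Textbook PID loop; the gains are placeholders, not patent values."""
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def update(self, error, dt):
        self.integral += error * dt
        deriv = 0.0 if self.prev_error is None else \
            (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * deriv

def target_speeds(v_base, offset, beta, k_off=0.5, k_ang=1.0):
    """Split one correction across the two running gears.  With
    offset > 0 (body right of the central line) the right side speeds
    up and the left slows down, steering the robot left, back toward
    the line; beta > 0 (head deflected left) does the opposite."""
    corr = k_off * offset - k_ang * beta
    return v_base - corr, v_base + corr   # (v_left, v_right)
```

Each side's target speed would then feed its own PID loop against the measured wheel or track speed, producing the control-valve signal.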
In an alternative embodiment, the method further comprises: and acquiring the running state information of the robot, judging whether the robot fails according to the running state information, and if so, not allowing the robot to advance. Specifically, in the running process of the robot, the overall running condition of the robot needs to be monitored in real time, when the robot fails, the robot is not allowed to advance, and when the robot runs normally, the robot is controlled to run along the central line of the tunnel according to the control methods from step S1 to step S3.
In an alternative embodiment, the method further comprises: acquiring the pressure values between the robot and the left and right side walls of the tunnel; judging whether the pressure against the left wall is greater than a first upper limit threshold while the pressure against the right wall is less than a first lower limit threshold, and if yes, controlling the robot to turn right; and judging whether the pressure against the left wall is less than the first lower limit threshold while the pressure against the right wall is greater than the first upper limit threshold, and if yes, controlling the robot to turn left. Specifically, when the robot cleans marine organisms from the tunnel side walls, or gathers and collects them, through its left and right side components, a certain pressure exists between those components and the side walls. To prevent damage to the tunnel wall, this pressure is kept within the range between the first lower limit threshold and the first upper limit threshold, and the robot is then controlled to travel along the tunnel central line according to the control method of steps S1 to S3.
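The pressure-based override above reduces to a small guard function; the threshold values in the usage example are illustrative only, since the patent does not disclose numeric limits:

```python
def pressure_correction(p_left, p_right, lower, upper):
    """Side-pressure override: if one side presses its wall above the
    first upper limit threshold while the other side has dropped below
    the first lower limit threshold, steer away from the loaded wall.
    Returns None when both pressures are within range."""
    if p_left > upper and p_right < lower:
        return "turn_right"   # pressed against the left wall
    if p_left < lower and p_right > upper:
        return "turn_left"    # pressed against the right wall
    return None
```

When the override returns None, normal center-line tracking per steps S1 to S3 continues.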
It will be appreciated that in other embodiments, the overall operation of the robot may be monitored in real time during the running of the robot, and the robot is not allowed to advance after a fault occurs in the robot; when the robot normally operates, the pressures of the left and right side parts and the side wall of the tunnel are controlled to be within a range larger than a first lower limit threshold and smaller than a first upper limit threshold, and then the robot is controlled to run along the central line of the tunnel according to the control methods of the steps S1 to S3.
The working process of controlling the robot to travel along the central line with this automatic driving method is as follows. When the robot first enters the tunnel portal, its body position is adjusted by manual or remote operation and the automatic driving function is started. The robot is powered on; after the connection between the laser radar and the industrial personal computer is confirmed, the point cloud data and images acquired by the radar are checked, and if they display normally the automatic navigation mode is started. During navigation the robot travels along the tunnel central line while the collected point cloud images and data are displayed in real time, so navigation can be stopped at any time by manual intervention to avoid accidents. The method is robust to interference and can identify the position accurately even when the wall is uneven; it is unaffected by temperature and humidity and works normally in the tunnel environment. It improves the robot's travel efficiency and removes the need for workers to observe and control the equipment inside the tunnel, greatly reducing occupational risk.
In addition, the method of the invention can be used for linear tunnels and curved tunnels, and when the method is used for the curved tunnels, the construction size data of the tunnels are also required to be acquired to control the automatic driving of the robot.
The nuclear tunnel cleaning robot autopilot system of the present invention may be used to perform the nuclear tunnel cleaning robot autopilot method of the above-described embodiments.
The robot autopilot system of the present embodiment includes:
and the driving information acquisition module is used for acquiring the relative pose information of the robot in the tunnel. Specifically, a two-dimensional laser radar or a three-dimensional laser radar may be used as the detection device, and the relative pose information may be obtained by the detection device provided on the robot. The front end and the rear end of the top of the robot are respectively provided with a detection device, and the detection devices can be arranged at other positions as required, so long as the related functions can be realized. The relative posture and displacement between the detection equipment and the robot are fixed because the installation position of the detection equipment on the robot is known, so that the relative posture information of the robot in the tunnel can be obtained according to the information such as the distance between the front detection equipment and the rear detection equipment and the left side wall and the right side wall of the tunnel.
Optionally, the driving information obtaining module is further configured to obtain point cloud data of an environment around the robot, and obtain position and direction information of the robot in the tunnel according to the point cloud data. The laser radars at the front end and the rear end of the top of the robot are used for scanning the surrounding environment of the robot, then point cloud data are acquired, and the point cloud data are fed back to the driving information processing module.
Specifically, the driving information acquisition module is used for performing filtering processing on the point cloud data, performing Hough transformation on the processed point cloud data, obtaining the longest two straight line segments according to the transformed point cloud data, and continuously acquiring the point cloud data of the surrounding environment of the robot when the number of the point cloud data points passing through the two straight line segments is not greater than a threshold value; when the number of the point cloud data points passed by the two straight-line segments is larger than a threshold value, coordinate information of the left side wall and the right side wall of the tunnel is obtained according to the two straight-line segments, and position and direction information of the robot in the tunnel are obtained according to the coordinate information of the left side wall and the right side wall of the tunnel.
Optionally, the driving information acquisition module is further configured to acquire the operation state information of the robot and send it to the driving information processing module.
Optionally, the driving information acquisition module is further configured to acquire the pressure values between the robot and the left and right side walls of the tunnel and send them to the driving information processing module.
The driving information processing module is used for obtaining displacement deviation information from the relative pose information and for controlling the robot according to the displacement deviation information so that the robot drives along the central line of the tunnel.
Specifically, the displacement deviation information includes a distance deviation value and an angle deviation value of the robot body relative to the tunnel central line. Referring to fig. 2, when the robot body sits on the central line without offset, the left side of the body is at a distance d from the left tunnel wall and the right side of the body is at the same distance d from the right tunnel wall. During driving, deviation from the central line is inevitable owing to factors such as road-surface slip, mechanical errors of the vehicle body and tunnel construction errors. When the robot body deviates, the distance between the left side of the body and the left tunnel wall is d1 and the distance between the right side of the body and the right tunnel wall is d2; the angle β between the head direction of the robot and the direction of advance along the tunnel central line is the angle deviation value. The distance deviation value of the vehicle body relative to the central line is calculated from the differences between d1 and d and between d2 and d. In other embodiments, to improve accuracy, the driving information processing module collects several frames of point cloud data, calculates the relative pose information from each frame, and uses the average of the results as the relative pose information from which the displacement deviation information is obtained.
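The distance and angle deviation described above can be sketched as follows. The patent does not give the exact geometry, so the averaging of the front/rear wall distances, the sensor baseline parameter, and the sign conventions are assumptions for illustration:

```python
import math

def displacement_deviation(d_front_left, d_rear_left,
                           d_front_right, d_rear_right,
                           d_nominal, baseline):
    """Distance and angle deviation of the body relative to the tunnel
    central line, from the wall distances seen by the front and rear
    detection devices.  Assumed conventions: offset > 0 means the body is
    right of the central line, beta > 0 means the head is deflected left."""
    d1 = (d_front_left + d_rear_left) / 2.0    # mean left-wall distance
    d2 = (d_front_right + d_rear_right) / 2.0  # mean right-wall distance
    # average the two differences (d1 - d) and (d - d2) into one offset
    offset = ((d1 - d_nominal) + (d_nominal - d2)) / 2.0
    # head deflected left -> the front sensor is closer to the left wall
    beta = math.atan2(d_rear_left - d_front_left, baseline)
    return offset, beta
```

For example, equal left distances of 2.3 m and right distances of 1.7 m against a nominal d of 2.0 m give a 0.3 m offset to the right with zero angle deviation.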
Optionally, the driving information processing module is further configured to judge, according to the displacement deviation information, whether the robot is undeflected and on the tunnel central line, and if so, to control the robot to go straight. Referring to the robot offset schematic diagram of fig. 3, robot body No. 1 is undeflected and on the tunnel central line, so it is controlled to go straight.
It also judges, according to the displacement deviation information, whether the robot deflects leftwards while on the tunnel central line, and if so, controls the robot to turn right. The head of robot body No. 2 deflects leftwards while the body is on the central line, so body No. 2 is steered to the right until the head is no longer deflected.
It judges, according to the displacement deviation information, whether the robot deflects rightwards while on the tunnel central line, and if so, controls the robot to turn left. The head of robot body No. 3 deflects rightwards while the body is on the central line, so body No. 3 is steered to the left until the head is no longer deflected.
It judges, according to the displacement deviation information, whether the robot deflects rightwards while positioned to the left of the tunnel central line; if so, the robot is controlled to go straight. The head of robot body No. 4 deflects rightwards while the body is to the left of the central line, so the body is driven straight ahead back to the central line.
It judges, according to the displacement deviation information, whether the robot deflects leftwards while positioned to the left of the tunnel central line; if so, the robot is controlled to turn right and advance. The head of robot body No. 5 deflects leftwards while the body is to the left of the central line, so body No. 5 is steered to the right and then driven back to the central line. The head of robot body No. 6 is not deflected while the body is to the left of the central line, so body No. 6 is likewise steered to the right and then driven back to the central line.
It judges, according to the displacement deviation information, whether the robot deflects leftwards while positioned to the right of the tunnel central line; if so, the robot is controlled to go straight. The head of robot body No. 8 deflects leftwards while the body is to the right of the central line, so the body is driven straight ahead back to the central line.
It judges, according to the displacement deviation information, whether the robot deflects rightwards while positioned to the right of the tunnel central line; if so, the robot is controlled to turn left and advance. The head of robot body No. 9 deflects rightwards while the body is to the right of the central line, so body No. 9 is steered to the left and then driven back to the central line. The head of robot body No. 7 is not deflected while the body is to the right of the central line, so body No. 7 is likewise steered to the left and then driven back to the central line.
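The nine cases above (bodies No. 1 to 9 of fig. 3) reduce to a small decision table. A minimal sketch, with assumed sign conventions and illustrative tolerance values:

```python
def steering_action(offset, beta, tol_d=0.05, tol_a=0.02):
    """Nine-case steering rule of fig. 3.  Assumed conventions:
    offset > 0 means the body is right of the central line,
    beta > 0 means the head is deflected to the left."""
    on_line = abs(offset) <= tol_d
    head_straight = abs(beta) <= tol_a
    if on_line:
        if head_straight:
            return "straight"                              # body No. 1
        return "turn_right" if beta > 0 else "turn_left"   # bodies No. 2 / 3
    if offset < 0:                     # body left of the central line
        if beta < -tol_a:
            return "straight"          # body No. 4: already heading right
        return "turn_right"            # bodies No. 5 and 6
    if beta > tol_a:                   # body right of the central line
        return "straight"              # body No. 8: already heading left
    return "turn_left"                 # bodies No. 7 and 9
```

Note that the "go straight" cases are exactly those where the head already points back toward the central line, so no extra steering is needed.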
Optionally, the driving information processing module is further configured to adjust the robot by a PID method according to the displacement deviation information. Specifically, the robot may employ crawler-type or wheel-type running gear. The equivalent superimposed speeds of the running gear on the left and right sides are calculated from the angle deviation value and the distance deviation value, giving the target speeds of the left and right running gear. The robot is then controlled in closed loop by the PID method: the driving information processing module performs the PID calculation from the target speeds of the left and right running gear and the detected actual speeds to obtain PID control signals, and transmits them to the control valves to adjust the motor speeds, thereby controlling the running speeds of the gear on both sides so that the robot returns to the central line.
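A minimal sketch of the closed loop described above, under assumed gains and sign conventions; the patent does not specify the PID form or how the equivalent superimposed speed is computed, so both are illustrative:

```python
class PID:
    """Plain positional PID on one track's speed; gains are illustrative."""
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_err = None

    def step(self, target, actual, dt):
        err = target - actual
        self.integral += err * dt
        deriv = 0.0 if self.prev_err is None else (err - self.prev_err) / dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

def track_targets(v_base, offset, beta, k_d=0.5, k_a=1.0):
    """Superimpose an equivalent differential speed, derived from the
    distance and angle deviation, onto the base speed to obtain the target
    speeds of the left and right running gear.  Assumed conventions:
    offset > 0 means right of the line, beta > 0 means head left."""
    diff = k_d * offset - k_a * beta   # positive -> speed up the right track
    return v_base - diff / 2.0, v_base + diff / 2.0
```

Each side then runs its own loop, `u = pid.step(target, measured_speed, dt)`, and `u` is sent to the control valve to adjust that motor's speed.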
Optionally, the driving information processing module is further configured to judge, according to the operation state information, whether the robot has a fault, and if so, not to allow the robot to advance. Specifically, the overall operating condition of the robot needs to be monitored in real time while it runs; when a fault occurs the robot is not allowed to advance, and when it operates normally the robot is controlled to drive along the tunnel central line according to the pose information.
Optionally, the driving information processing module is further configured to judge whether the pressure between the robot and the left tunnel wall is greater than a first upper limit threshold while the pressure between the robot and the right tunnel wall is less than a first lower limit threshold, and if so, to control the robot to turn right; and to judge whether the pressure between the robot and the left tunnel wall is less than the first lower limit threshold while the pressure between the robot and the right tunnel wall is greater than the first upper limit threshold, and if so, to control the robot to turn left. Specifically, when the robot cleans marine organisms from the tunnel side walls, or gathers and collects them, through its left and right side components, a certain pressure exists between those components and the side walls. To prevent damage to the tunnel walls, this pressure is kept within the range between the first lower limit threshold and the first upper limit threshold, and the robot is then controlled to drive along the tunnel central line according to the relative pose information.
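The pressure supervision above reduces to a three-way rule; the threshold values and the return labels below are illustrative assumptions:

```python
def pressure_action(p_left, p_right, lower, upper):
    """Wall-contact pressure supervision: steer away from the side that is
    pressed too hard, otherwise keep following the central line."""
    if p_left > upper and p_right < lower:
        return "turn_right"          # pressed against the left wall
    if p_left < lower and p_right > upper:
        return "turn_left"           # pressed against the right wall
    return "follow_center_line"      # both pressures within (lower, upper)
```

For example, with thresholds (5, 10), readings of 12 on the left and 3 on the right would command a right turn to relieve the left-side components.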
Optionally, the driving information processing module is further configured to monitor the overall operating condition of the robot in real time while it runs, and not to allow the robot to advance after a fault occurs; when the robot operates normally, the pressures between the left and right side components and the tunnel side walls are kept within the range between the first lower limit threshold and the first upper limit threshold, and the robot is then controlled to drive along the tunnel central line according to the relative pose information.
The invention also provides an electronic device comprising a processor and a memory, the memory storing a computer program and the processor executing the computer program in the memory to implement the nuclear tunnel cleaning robot automatic driving method of any of the above embodiments. In particular, according to embodiments of the invention, the processes described above may be implemented as computer software programs. For example, embodiments of the invention include a computer program product comprising a computer program carried on a computer-readable medium, the computer program containing program code for performing the method shown in the flowcharts. In such an embodiment, when the computer program is downloaded, installed and executed by the electronic device, it can perform the functions defined in the methods of the embodiments of the invention. The electronic device in the invention may be a terminal such as a notebook, desktop, tablet computer or smartphone, or may be a server.
It is to be understood that the above examples represent only preferred embodiments of the invention, which are described in relatively specific detail but are not to be construed as limiting the scope of the invention. It should be noted that a person skilled in the art can freely combine the above technical features and make several variations and improvements without departing from the concept of the invention; therefore, all changes and modifications that come within the meaning and range of equivalency of the claims are to be embraced within their scope.

Claims (10)

1. An automatic driving method of a nuclear tunnel cleaning robot is characterized by comprising the following steps:
s1, acquiring relative pose information of a robot in a tunnel;
s2, obtaining displacement deviation information according to the relative pose information;
and S3, controlling the robot according to the displacement deviation information so that the robot runs along the central line of the tunnel.
2. The nuclear tunnel cleaning robot autopilot method of claim 1 wherein the relative pose information includes position and orientation information of the robot in the tunnel,
in the step S1, it includes:
s11, acquiring point cloud data of the surrounding environment of the robot;
s12, obtaining position and direction information of the robot in the tunnel according to the point cloud data.
3. The nuclear tunnel cleaning robot autopilot method of claim 2 wherein, in said step S12, comprising:
filtering the point cloud data;
performing Hough transformation on the processed point cloud data;
obtaining the longest two straight line segments according to the transformed point cloud data;
judging whether the number of data points in the point cloud data through which the two straight line segments pass is larger than a threshold value or not; if not, go to step S11;
if yes, the coordinate information of the left side wall and the right side wall of the tunnel is obtained according to the two straight line segments, and the position and the direction information of the robot in the tunnel are obtained according to the coordinate information of the left side wall and the right side wall of the tunnel.
4. The nuclear tunnel cleaning robot autopilot method of claim 1 wherein, in step S1, comprising:
and acquiring the relative pose information through detection equipment arranged on the robot.
5. The nuclear tunnel cleaning robot autopilot method of claim 1 comprising, in step S3:
judging whether the robot does not deflect and is on the tunnel central line according to the displacement deviation information, and if so, controlling the robot to move straight;
judging whether the robot deflects leftwards and is on the tunnel central line according to the displacement deviation information, and if so, controlling the robot to turn rightwards;
judging whether the robot deflects to the right or not and is on the tunnel central line according to the displacement deviation information, and if so, controlling the robot to turn to the left;
judging whether the robot deflects rightwards and is positioned on the left side of the central line of the tunnel according to the displacement deviation information; if yes, controlling the robot to move straight;
judging whether the robot deflects leftwards and is positioned on the left side of the central line of the tunnel according to the displacement deviation information; if yes, controlling the robot to turn right and advance;
judging whether the robot deflects leftwards and is positioned on the right side of the central line of the tunnel according to the displacement deviation information; if yes, controlling the robot to move straight;
judging whether the robot deflects rightwards and is positioned on the right side of the central line of the tunnel according to the displacement deviation information; if yes, controlling the robot to turn left and advance.
6. The nuclear tunnel cleaning robot autopilot method of claim 1 wherein in step S3, the controlling the robot step according to the displacement deviation information comprises:
and adjusting the robot by adopting a PID method according to the displacement deviation information.
7. The nuclear tunnel cleaning robot autopilot method of claim 1 further comprising:
and acquiring the running state information of the robot, judging whether the robot fails according to the running state information, and if so, not allowing the robot to advance.
8. The nuclear tunnel cleaning robot autopilot method of claim 1 further comprising:
acquiring pressure values between the robot and the left and right side walls of the tunnel;
judging whether the pressure value between the robot and the left side wall of the tunnel is greater than a first upper limit threshold and the pressure value between the robot and the right side wall of the tunnel is less than a first lower limit threshold; if yes, controlling the robot to turn right;
judging whether the pressure value between the robot and the left side wall of the tunnel is less than the first lower limit threshold and the pressure value between the robot and the right side wall of the tunnel is greater than the first upper limit threshold; if yes, controlling the robot to turn left.
9. A nuclear tunnel cleaning robot autopilot system comprising:
the driving information acquisition module is used for acquiring relative pose information of the robot in the tunnel;
and the running information processing module is used for obtaining displacement deviation information according to the relative pose information and controlling the robot according to the displacement deviation information so as to enable the robot to run along the central line of the tunnel.
10. An electronic device comprising a processor and a memory, the memory for storing a computer program, the processor for executing the computer program of the memory to implement the nuclear tunnel cleaning robot autopilot method of any one of claims 1-8.
CN202310375211.2A 2023-04-10 2023-04-10 Nuclear tunnel cleaning robot automatic driving method and system and electronic equipment Pending CN116382284A (en)

Publications (1)

Publication Number Publication Date
CN116382284A true CN116382284A (en) 2023-07-04

Family

ID=86965316



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination