CN113885504A - Autonomous inspection method and system for train inspection robot and storage medium - Google Patents

Autonomous inspection method and system for train inspection robot and storage medium

Info

Publication number
CN113885504A
Authority
CN
China
Prior art keywords
inspection
train
robot
inspected
chassis
Prior art date
Legal status
Pending
Application number
CN202111186472.7A
Other languages
Chinese (zh)
Inventor
章海兵
郑斌
刘明亮
李林
汪中原
Current Assignee
Hefei Technological University Intelligent Robot Technology Co ltd
Original Assignee
Hefei Technological University Intelligent Robot Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Hefei Technological University Intelligent Robot Technology Co ltd filed Critical Hefei Technological University Intelligent Robot Technology Co ltd
Priority to CN202111186472.7A
Publication of CN113885504A
Legal status: Pending

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0238Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors
    • G05D1/024Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors in combination with a laser
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01D MEASURING NOT SPECIALLY ADAPTED FOR A SPECIFIC VARIABLE; ARRANGEMENTS FOR MEASURING TWO OR MORE VARIABLES NOT COVERED IN A SINGLE OTHER SUBCLASS; TARIFF METERING APPARATUS; MEASURING OR TESTING NOT OTHERWISE PROVIDED FOR
    • G01D21/00Measuring or testing not otherwise provided for
    • G01D21/02Measuring two or more variables by means not covered by a single other subclass
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • G05D1/0251Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means extracting 3D information from a plurality of images taken from different locations, e.g. stereo vision
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0276Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Electromagnetism (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Optics & Photonics (AREA)
  • Multimedia (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention discloses an autonomous inspection method, system and storage medium for a train inspection robot, belonging to the technical field of inspection robots and comprising the following steps: respectively receiving a first inspection instruction, a second inspection instruction and inspection information sent by an upper computer, wherein the inspection information at least comprises parking lane information of the train to be inspected, a first inspection list, a second inspection list and train marshalling information; based on the first inspection instruction, loading a 3D map of the area outside the corresponding parking lane and the first inspection list according to the parking lane information, and inspecting both sides of the train to be inspected; and based on the second inspection instruction, loading a chassis map of the train to be inspected and the second inspection list, and inspecting the chassis of the train to be inspected. Based on multi-sensor perception, the invention realizes autonomous navigation and positioning both in the narrow, visually repetitive environment inside the train parking lane and in the open, visually repetitive environment outside it, and realizes defect detection.

Description

Autonomous inspection method and system for train inspection robot and storage medium
Technical Field
The invention relates to the technical field of inspection robots, in particular to an autonomous inspection method, an autonomous inspection system and a storage medium for a train inspection robot.
Background
Urban rail transit is a mode of transport with large capacity, high speed, low energy consumption, low pollution and high reliability, and is a public transport system that countries around the world develop as a priority. With the large-scale commissioning of subways, daily overhaul and maintenance have become an indispensable link.
Subway train inspection is routine, indispensable work in subway operation; the chassis in particular must be inspected in detail so that potential safety hazards are found in time. The working environment of subway inspection is harsh, and the work is currently performed manually. Replacing this work with machines is an urgent requirement of the rail sector, but the technical difficulty is high, the detection tasks are highly specialized, and no mature product exists on the market.
In addition, the quality of overhaul is uneven because maintenance personnel differ in ability and overall competence. To improve detection quality and efficiency, modern rail maintenance equipment and systems have been used in recent years to assist the maintenance of urban rail transit trains. However, the degree of automation, informatization and intelligence of the maintenance equipment and systems in urban rail transit depots is low, the whole train inspection process is not well integrated, maintenance personnel still have to take part in most maintenance procedures, and truly intelligent maintenance has not been achieved.
The train inspection robot can gradually replace manual work thanks to its fast inspection mode, standardized inspection and the ability to adjust inspection items at any time; the core difficulties of autonomous robot inspection are the navigation and positioning technology and the AI defect detection technology.
The current technical schemes for navigation inside the tunnel mainly fall into the following two types:
1) Laying tracks or adding auxiliary markers in the tunnel. The drawback is that the tunnel has to be modified by construction work, which interferes with normal maintenance.
2) Building a map of the tunnel. The drawback is that the train stops at a slightly different position (with centimeter-level error) each time, which degrades the defect detection recognition rate.
The environment outside the tunnel is visually repetitive and the route is long, and the environment also changes as trains enter and leave. The current navigation schemes outside the tunnel mainly fall into the following two types:
1) Laying auxiliary markers such as magnetic stripes, magnetic nails and reflectors on the ground outside the tunnel. The drawback is that construction work is required, and during daily maintenance the markers can be knocked off or covered, which disturbs the robot's normal inspection.
2) Laser SLAM navigation. The drawback is that the scene is longer than the whole train, the environment changes frequently, and the sensing range of the laser is short, so mapping and navigation cannot be performed stably.
Defect identification falls into nine major categories: crack detection, anti-loosening mark detection, brake wear detection, tread detection, temperature detection, relay temperature-measurement label reading, oil level reading, surface abnormality and defect detection, and surface oil leakage detection.
Disclosure of Invention
The invention aims to provide an autonomous inspection method and system for a train inspection robot that realize autonomous navigation and positioning both in the narrow, visually repetitive environment inside the train parking lane and in the open, visually repetitive environment outside it.
In order to achieve the above purpose, the invention provides an autonomous inspection method for a train inspection robot, which comprises the following steps:
respectively receiving a first inspection instruction, a second inspection instruction and inspection information sent by an upper computer, wherein the inspection information at least comprises parking lane information of the train to be inspected, a first inspection list, a second inspection list and train marshalling information;
loading a 3D map and a first inspection list outside a corresponding parking lane according to the parking lane information based on the first inspection instruction, and inspecting two sides of the train to be inspected;
and loading the chassis map of the train to be inspected and the second inspection list based on the second inspection instruction, and inspecting the chassis of the train to be inspected.
Further, based on the first inspection instruction, loading a 3D map and a first inspection list outside a corresponding parking lane according to the parking lane information, and inspecting the two sides of the train to be inspected, the method includes:
respectively acquiring laser data and perception data based on a laser sensor and a first depth camera carried by the train inspection robot;
fusing the laser data and the perception data, and positioning and navigating on the 3D map according to a fusion result;
in the navigation process, executing a task scheduling algorithm, and controlling an industrial camera carried by the train inspection robot to shoot a detection area of the train to be inspected to obtain a picture to be detected;
and analyzing the picture to be detected by adopting a visual detection algorithm based on deep learning, and uploading an analysis result to the upper computer.
Further, based on the second inspection instruction, loading the chassis map of the train to be inspected and the second inspection list, and inspecting the chassis of the train to be inspected, including:
controlling the elevation angle of a laser sensor carried by the train inspection robot to be 45 degrees, and realizing, on the chassis map, positioning based on the train chassis and center line patrol navigation of the parking lane according to data collected by the laser sensor and a depth camera carried by the train inspection robot;
in the navigation process, controlling the laser sensor and an industrial camera carried by the train inspection robot to shoot a to-be-detected area of the train to be inspected, and respectively obtaining 3D laser data and 2D data;
and analyzing the 3D laser data and the 2D data, and uploading an analysis result to the upper computer.
Further, controlling the elevation angle of the laser sensor carried by the train inspection robot to be 45 degrees, and realizing, on the chassis map, positioning based on the train chassis and center line patrol navigation of the parking lane according to the data collected by the laser sensor and the depth camera carried by the train inspection robot, includes:
controlling the elevation angle of a laser sensor and a first depth camera carried by the train inspection robot to be 45 degrees, and acquiring chassis data of a train to be inspected through the laser sensor and the first depth camera;
fusing the acquired chassis data to realize the positioning of the train chassis;
and extracting the wall surfaces on two sides of the parking lane by adopting a second depth camera and a third depth camera carried by the train inspection robot to obtain the position of the center line of the parking lane and realize the center line patrol navigation of the parking lane, wherein the second depth camera and the third depth camera are respectively a front horizontal depth camera and a rear horizontal depth camera.
Further, in the navigation process, the process of controlling the camera carried by the train inspection robot to shoot the to-be-inspected area of the train to be inspected comprises the following steps:
acquiring the mechanical arm control pose (x1, y1, z1, r1, o1, p1) used to shoot the image at a given inspection point, wherein x1, y1, z1 are the three-dimensional coordinates of the arm end relative to the base, and r1, o1, p1 are respectively the rotation, pitch and roll angles of the arm end relative to the base;
outputting the three-dimensional localization (x2, y2, z2, r2, o2, p2) with the positioning algorithm, taking the coordinate taught at the inspection point as (x3, y3, z3, r3, o3, p3), and calculating the shooting pose for the mechanical arm as (x1 + x2 - x3, y1 + y2 - y3, z1 + z2 - z3, r1 + r2 - r3, o1 + o2 - o3, p1 + p2 - p3), wherein x2, y2, z2 are the three-dimensional coordinates of the robot in the train chassis map during autonomous inspection, r2, o2, p2 are respectively its rotation, pitch and roll angles during autonomous inspection, x3, y3, z3 are the three-dimensional coordinates of the robot in the train chassis map during teaching, and r3, o3, p3 are respectively its rotation, pitch and roll angles during teaching;
and controlling the industrial camera to shoot according to the calculated shooting pose of the mechanical arm.
Further, in the navigation process, the process of controlling the laser sensor to shoot the to-be-detected area of the train to be detected comprises the following steps:
controlling a mechanical arm of the train inspection robot to move to a detection preparation point according to the position of the area to be detected, wherein the detection preparation points are four points taught in advance, corresponding to the four quadrants at the top of the train inspection robot;
calculating a detection starting point and a detection ending point according to the position of the area to be detected;
and controlling the laser sensor to start linear scanning laser at the detection starting point, and after the laser sensor horizontally moves to the detection end point, ending the linear scanning.
Further, the 3D laser data and the 2D data are analyzed, and the analysis comprises at least one of crack detection, anti-loosening mark detection, brake wear detection, tread detection, temperature detection, relay temperature-measurement label reading, oil level identification, surface abnormality and defect detection, and surface oil leakage detection.
Further, after the inspection of the train to be inspected is finished, the method further includes:
and the train inspection robot automatically returns to a charging room or executes the inspection task of the next train.
In addition, in order to achieve the above object, the present invention further provides an autonomous inspection system for a train inspection robot, including a train inspection robot and an upper computer, wherein the train inspection robot carries a laser sensor, depth cameras, an industrial camera and control equipment, and the control equipment includes:
an information receiving module, configured to respectively receive a first inspection instruction, a second inspection instruction and inspection information sent by the upper computer, wherein the inspection information at least comprises the parking lane information of the train to be inspected, a first inspection list, a second inspection list and train marshalling information;
the first inspection module is used for loading a 3D map and a first inspection list corresponding to the outside of the parking lane according to the parking lane information based on the first inspection instruction and inspecting two sides of the train to be inspected;
and the second inspection module is used for loading the chassis map of the train to be inspected and the second inspection list based on the second inspection instruction, and inspecting the chassis of the train to be inspected.
In addition, to achieve the above object, the present invention also provides a computer readable storage medium having stored thereon computer readable instructions executable by a processor to implement the train inspection robot autonomous inspection method as described above.
Compared with the prior art, the invention has the following technical effects: after a train drives into the overhaul lane, the upper computer sends inspection instructions and inspection information to two train inspection robots; after the robot group loads the map and the preset path according to the lane information, it drives to the detection starting points on both sides of the lanes beside the train and starts the daily overhaul of the target train. For the exterior of the train, the robot group inspects both sides without stopping: positioning and navigation are performed with the horizontal forward-facing 3D laser and depth camera, and the angle of the pan-tilt unit is controlled so that the high-definition industrial camera photographs each area to be detected and analyzes it in real time. After the exterior is finished, the robots drive down the ramp into the tunnel one after the other and tilt the 3D laser 45 degrees upward from the horizontal; the train chassis map is used, and positioning and navigation are performed with the 45-degree 3D laser, the 45-degree depth camera and the horizontal front and rear depth cameras. The robot group divides the work along the center line of the train chassis and operates cooperatively, and at each inspection point the mechanical arm is controlled so that the end line-scan laser and the 2D high-definition camera jointly acquire and analyze data of the area to be detected, thereby completing the train inspection task. By building an autonomous navigation system, an AI defect detection system and an autonomous inspection system, the invention fuses the various detection and inspection data, completes the aggregation of sensing resources and addresses the application requirements of practical scenes such as the subway train chassis; it overcomes the technical difficulty of having a train inspection robot complete the related tasks in the complex environment of the subway train chassis, improves the comprehensiveness and controllability of rail transit inspection, brings automation, intelligence and convenience to the daily inspection of rail transit trains, and alleviates problems such as the high labor intensity and low accuracy of rail transit train inspection.
Drawings
The following detailed description of embodiments of the invention refers to the accompanying drawings in which:
FIG. 1 is a flow chart of an autonomous inspection method of a train inspection robot according to the invention;
FIG. 2 is a schematic line scan of a laser sensor of the present invention;
fig. 3 is a schematic diagram of the inside and outside cooperation of the parking lane of the robot group of the present invention.
Detailed Description
To further illustrate the features of the present invention, refer to the following detailed description of the invention and the accompanying drawings. The drawings are for reference and illustration purposes only and are not intended to limit the scope of the present disclosure.
As shown in fig. 1, the present embodiment discloses an autonomous inspection method for a train inspection robot, including the following steps S1 to S3:
s1, receiving a first inspection instruction, a second inspection instruction and inspection information sent by an upper computer respectively, wherein the inspection information at least comprises stop road information of a train to be inspected, a first inspection list, a second inspection list and train marshalling information;
s2, loading a 3D map and a first inspection list outside a corresponding parking lane according to the parking lane information based on the first inspection instruction, and inspecting two sides of the train to be inspected;
and S3, loading the chassis map of the train to be inspected and the second inspection list based on the second inspection instruction, and inspecting the chassis of the train to be inspected.
In this embodiment, after the train to be inspected enters the access road, the upper computer main control sends a first inspection instruction, a second inspection instruction and inspection information to the two idle robots, and after the robot group loads a map and a preset path according to the lane information, the robot group drives to the detection starting points on the two sides of the train side road to start a routine maintenance process for the target train, including the inspection of the outside and the inspection of the bottom of the train, and uploads the inspection result to the upper computer.
The robot is equipped with a laser sensor, a first depth camera, a second depth camera, a third depth camera and a high-definition industrial camera; the elevation angles of the laser sensor and the first depth camera are adjustable, and the second and third depth cameras are mounted horizontally.
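For illustration only, the inspection instructions and inspection information of steps S1 to S3 can be pictured as a simple data structure. In the following Python sketch every field name, the "first"/"second" instruction encoding and the example values are assumptions made for the illustration and are not part of the disclosed method.

    from dataclasses import dataclass
    from typing import List

    # Illustrative sketch only: the field and function names below are assumptions
    # chosen to mirror the inspection information of step S1, not identifiers from the patent.
    @dataclass
    class InspectionInfo:
        parking_lane_id: str               # parking lane on which the train is stabled
        first_inspection_list: List[str]   # exterior inspection items (step S2)
        second_inspection_list: List[str]  # chassis inspection items (step S3)
        train_marshalling: List[str]       # carriage formation of the train

    def handle_instruction(kind: str, info: InspectionInfo) -> None:
        """Dispatch a received instruction to the exterior or chassis inspection flow."""
        if kind == "first":
            print(f"load 3D map outside lane {info.parking_lane_id}, "
                  f"inspect both sides: {len(info.first_inspection_list)} items")
        elif kind == "second":
            print(f"load chassis map, inspect chassis: "
                  f"{len(info.second_inspection_list)} items")

    # Example instruction as it might arrive from the upper computer (values invented).
    info = InspectionInfo("lane_07", ["door seals", "side skirts"], ["brake pads"],
                          ["Mc1", "M1", "M2", "Mc2"])
    handle_instruction("first", info)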
As a more preferable technical solution, in step S1: loading a 3D map and a first inspection list outside a corresponding parking lane according to the parking lane information based on the first inspection instruction, and inspecting two sides of the train to be inspected, wherein the method comprises the following steps of S11 to S14:
s11, respectively acquiring laser data and perception data based on a laser sensor and a first depth camera carried by the train inspection robot;
s12, fusing the laser data and the perception data, and positioning and navigating on the 3D map according to the fusion result;
s13, in the navigation process, executing a task scheduling algorithm, and controlling an industrial camera carried by the train inspection robot to shoot detection areas on two sides of the train to be inspected to obtain a picture to be detected;
it should be noted that, the task scheduling algorithm is executed to perform task allocation on two robots, for example, after one polling train left side and one polling train right side are polled, the robots queue up at the access point, and sequentially enter the access point to perform polling.
And S14, analyzing the picture to be detected by adopting a visual detection algorithm based on deep learning, such as YOLOv5, and uploading an analysis result to the upper computer.
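As an illustration of step S14 only, the Python sketch below runs an off-the-shelf YOLOv5 model fetched through torch.hub on one captured picture and packages the detections for upload. The pretrained "yolov5s" weights, the example file name and the JSON payload format are assumptions made for the illustration; this embodiment does not prescribe a particular network, weights or upload interface.

    import json
    import torch  # requires PyTorch; the model is fetched from the ultralytics/yolov5 hub repo

    # Minimal sketch of step S14. The 'yolov5s' weights are a placeholder: a deployed
    # system would use a model trained on the defect categories described herein.
    model = torch.hub.load("ultralytics/yolov5", "yolov5s", pretrained=True)

    def analyse_picture(image_path: str) -> str:
        results = model(image_path)                     # forward pass on the captured image
        detections = results.pandas().xyxy[0]           # boxes, confidences, class names
        records = detections.to_dict(orient="records")  # JSON-serialisable analysis result
        return json.dumps({"image": image_path, "detections": records})

    report = analyse_picture("side_view_0001.jpg")      # hypothetical image from the industrial camera
    # 'report' would then be uploaded to the upper computer over the deployment's own link.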
As a more preferable technical solution, in step S2: based on the second inspection instruction, loading the chassis map and the second inspection list of the train to be inspected, and inspecting the chassis of the train to be inspected, wherein the method comprises the following steps of S21 to S23:
s21, controlling the elevation angle of a laser sensor carried by the train inspection robot to be 45 degrees, and realizing positioning based on a train chassis and center line patrol navigation of a parking road on the chassis map according to data collected by the laser sensor and a depth camera carried by the train inspection robot;
s22, in the navigation process, controlling the laser sensor and an industrial camera carried by the train inspection robot to shoot the to-be-inspected area of the train to be inspected, and respectively obtaining 3D laser data and 2D data;
and S23, analyzing the 3D laser data and the 2D data, and uploading the analysis result to the upper computer.
As a more preferable technical solution, in step S21: controlling the elevation angle of the laser sensor carried by the train inspection robot to be 45 degrees, and realizing, on the chassis map, positioning based on the train chassis and center line patrol navigation of the parking lane according to the data collected by the laser sensor and the depth camera carried by the train inspection robot, specifically includes:
controlling the elevation angle of a laser sensor and a first depth camera carried by the train inspection robot to be 45 degrees, and acquiring chassis data of a train to be inspected through the laser sensor and the first depth camera;
fusing the acquired chassis data to realize the positioning of the train chassis;
and extracting the wall surfaces on two sides of the parking lane by adopting a second depth camera and a third depth camera carried by the train inspection robot to obtain the position of the center line of the parking lane and realize the center line patrol navigation of the parking lane, wherein the second depth camera and the third depth camera are respectively a front horizontal depth camera and a rear horizontal depth camera.
It should be noted that the positioning algorithm in the parking lane of this embodiment uses the 3D laser and the depth camera at a 45-degree elevation angle; by fusing the homologous heterogeneous sensing data of the chassis, it gains a stronger capability to describe the similarity between the chassis of different trains of the same model, so that stable positioning is achieved. The navigation algorithm in the parking lane uses the front and rear horizontal depth cameras to extract the wall surfaces on both sides of the parking lane and obtain the position of the center line.
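The following Python sketch only illustrates the center-line idea described above and is not the algorithm of this embodiment: the wall distances are assumed to have already been estimated from the depth-camera data, and the sign convention and gain value are arbitrary assumptions.

    # Illustrative sketch only; the patent does not give this formula.
    def center_line_offset(dist_left_wall: float, dist_right_wall: float) -> float:
        """Signed offset from the parking-lane center line, in metres.

        dist_left_wall / dist_right_wall: perpendicular distances to the two walls,
        e.g. estimated from the wall planes extracted by the horizontal depth cameras.
        Positive means the robot sits closer to the left wall.
        """
        return (dist_right_wall - dist_left_wall) / 2.0

    def steer_correction(dist_left_wall: float, dist_right_wall: float, gain: float = 0.8) -> float:
        """Proportional correction steering the robot back toward the center line
        (positive output = steer toward the right wall); the gain is arbitrary."""
        return gain * center_line_offset(dist_left_wall, dist_right_wall)

    # Example: 0.9 m to the left wall and 1.1 m to the right wall, so the robot is
    # 0.1 m left of center and should steer slightly to the right.
    print(steer_correction(0.9, 1.1))  # ~0.08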
As a further preferred technical solution, in the navigation process, the process of controlling the camera carried by the train inspection robot to shoot the to-be-inspected area of the train to be inspected includes:
acquiring the mechanical arm control pose (x1, y1, z1, r1, o1, p1) used to shoot the image at a given inspection point, wherein x1, y1, z1 are the three-dimensional coordinates of the arm end relative to the base, and r1, o1, p1 are respectively the rotation, pitch and roll angles of the arm end relative to the base;
outputting the three-dimensional localization (x2, y2, z2, r2, o2, p2) with the positioning algorithm, taking the coordinate taught at the inspection point as (x3, y3, z3, r3, o3, p3), and calculating the shooting pose for the mechanical arm as (x1 + x2 - x3, y1 + y2 - y3, z1 + z2 - z3, r1 + r2 - r3, o1 + o2 - o3, p1 + p2 - p3), wherein x2, y2, z2 are the three-dimensional coordinates of the robot in the train chassis map during autonomous inspection, r2, o2, p2 are respectively its rotation, pitch and roll angles during autonomous inspection, x3, y3, z3 are the three-dimensional coordinates of the robot in the train chassis map during teaching, and r3, o3, p3 are respectively its rotation, pitch and roll angles during teaching (a numerical sketch of this calculation follows these steps);
and controlling the industrial camera to shoot according to the calculated shooting pose of the mechanical arm.
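The above calculation is purely component-wise; as an illustration, the Python sketch below applies the formula (x1 + x2 - x3, ..., p1 + p2 - p3) to toy numbers. The Pose type and the example values are assumptions made for the illustration.

    from typing import NamedTuple

    class Pose(NamedTuple):
        x: float
        y: float
        z: float
        r: float  # rotation
        o: float  # pitch
        p: float  # roll

    def shooting_pose(arm_taught: Pose, robot_now: Pose, robot_taught: Pose) -> Pose:
        """Component-wise (x1 + x2 - x3, ...): the taught arm pose shifted by the
        difference between the robot's current and taught poses in the chassis map."""
        return Pose(*(a + b - c for a, b, c in zip(arm_taught, robot_now, robot_taught)))

    # Toy numbers: the robot localises 0.02 m further along x than it did during
    # teaching, so the commanded arm pose shifts by the same 0.02 m.
    p1 = Pose(0.40, 0.10, 0.55, 0.0, 30.0, 0.0)  # taught arm pose (x1 ... p1)
    p2 = Pose(5.02, 0.00, 0.00, 0.0, 0.0, 0.0)   # current robot pose (x2 ... p2)
    p3 = Pose(5.00, 0.00, 0.00, 0.0, 0.0, 0.0)   # robot pose during teaching (x3 ... p3)
    print(shooting_pose(p1, p2, p3))             # x ~ 0.42, other components unchanged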
As a further preferable technical solution, as shown in fig. 2, in the navigation process, the process of controlling the laser sensor to shoot the to-be-detected area of the train to be detected includes:
controlling a mechanical arm of the train inspection robot to move to a detection preparation point according to the position of the area to be detected, wherein the detection preparation points are four points taught in advance, corresponding to the four quadrants at the top of the train inspection robot;
calculating a detection starting point and a detection end point according to the position of the area to be detected (one possible derivation is sketched after these steps);
and controlling the laser sensor to start linear scanning laser at the detection starting point, and after the laser sensor horizontally moves to the detection end point, ending the linear scanning.
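One possible way to derive the detection starting point and detection end point from the extent of the area to be detected is sketched below in Python for illustration; the margin value and the arm/laser driver interfaces are assumptions and are not specified by this embodiment.

    # Illustrative sketch only: 'arm' and 'laser' stand for whatever drivers the robot
    # actually uses; their method names here are assumptions, as is the margin value.
    def line_scan_endpoints(area_x_min: float, area_x_max: float, margin: float = 0.05):
        """Detection start/end x-coordinates for a horizontal line scan over the
        area to be detected, extended by a small margin on each side (metres)."""
        return area_x_min - margin, area_x_max + margin

    def run_line_scan(arm, laser, area_x_min: float, area_x_max: float) -> None:
        start_x, end_x = line_scan_endpoints(area_x_min, area_x_max)
        arm.move_to(x=start_x)    # move the arm end (already at the preparation point) to the start
        laser.start_line_scan()   # begin emitting the scan line at the detection starting point
        arm.move_to(x=end_x)      # translate horizontally across the area to be detected
        laser.stop_line_scan()    # end the line scan at the detection end point

    print(line_scan_endpoints(1.20, 1.80))  # ~ (1.15, 1.85)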
As a further preferable technical solution, the 3D laser data and the 2D data are analyzed, including at least one of crack detection, anti-loosening mark detection, brake wear detection, tread detection, temperature detection, relay temperature-measurement label reading, oil level identification, surface abnormality and defect detection, and surface oil leakage detection.
It should be noted that, in this embodiment, the defect detection of 9 categories is realized by analyzing based on the 2D visual data and the 3D laser data, specifically:
1) for crack detection, fusing 2D high-definition camera data and 3D laser data, adopting a deep learning algorithm for identification, and simultaneously detecting the depth of the crack;
2) for anti-loose mark detection, identifying red lines in the 2D high-definition image and adopting 3D laser data registration to calculate horizontal and deflection errors;
3) for brake wear detection, only 3D laser data are used: the brake pad and the brake pad holder are identified, and the height difference between them is calculated (a worked numerical sketch of this calculation is given after this list);
4) for tread detection, only 3D laser data is adopted, scratches are detected through a depth threshold value, and abrasion is detected according to the height difference between the scratches and the thickness of a wheel rim;
5) for temperature detection, only 2D infrared data are used: the area to be detected is photographed with a thermal infrared imager, the equipment whose temperature is to be measured is extracted with a contour detection algorithm, the temperature inside the contour is measured, and the highest, average and lowest temperatures are output;
6) for relay temperature-measurement label reading, only 2D high-definition data are used to detect the position of the measuring strip, and the result is output as a percentage according to the calibration data;
7) for oil level identification, only 2D high-definition data are used to detect the oil level line and calculate its position, and the result is output as a percentage according to the calibration data;
8) for surface anomaly and defect detection, fusing 2D high-definition and 3D laser data, and identifying by adopting a target detection algorithm based on deep learning;
9) for surface oil leakage detection, 2D high-definition data are only used, and a target detection algorithm based on deep learning is adopted for identification.
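As a worked illustration of category 3) above, the Python sketch below computes the height difference between an already-segmented brake pad point cloud and pad holder point cloud; the segmentation step, the toy values and any wear threshold are assumptions and are not part of this embodiment.

    import numpy as np

    def brake_wear_mm(pad_points: np.ndarray, holder_points: np.ndarray) -> float:
        """Height difference between brake pad and pad holder, from 3D laser data only.

        pad_points / holder_points: (N, 3) point arrays in metres, assumed to be
        already segmented from the line-scan point cloud (segmentation not shown).
        Returns the median height difference in millimetres.
        """
        pad_z = float(np.median(pad_points[:, 2]))
        holder_z = float(np.median(holder_points[:, 2]))
        return (holder_z - pad_z) * 1000.0

    # Toy data: the holder surface lies about 3 mm above the pad surface.
    rng = np.random.default_rng(0)
    pad = np.column_stack([rng.random(100), rng.random(100), np.full(100, 0.120)])
    holder = np.column_stack([rng.random(100), rng.random(100), np.full(100, 0.123)])
    print(f"height difference: {brake_wear_mm(pad, holder):.1f} mm")  # 3.0 mm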
As a further preferable technical solution, after the inspection of the train to be inspected is finished, the method further includes step S4:
and S4, the train inspection robot automatically returns to a charging room or performs the inspection task of the next train.
In summary, after the train enters the access road, the upper computer master control sends the detection instructions and related information to two idle robots; after the robot group loads the map and the preset path according to the lane information, it drives to the detection starting points on both sides of the lanes beside the train and starts the daily maintenance of the target train. For the exterior of the train, the robot group inspects both sides without stopping: positioning and navigation are based on the horizontal forward-facing 3D laser and depth camera, and the pan-tilt angle is controlled so that the high-definition camera photographs each area to be detected and analyzes it in real time. After the exterior is finished, the robots drive down the ramp into the tunnel one after the other and tilt the 3D laser 45 degrees upward from the horizontal; the train chassis map is used, positioning and navigation are based on the 45-degree 3D laser, the 45-degree depth camera and the horizontal front and rear depth cameras, the robot group divides the work along the center line of the train chassis and operates cooperatively, and at each inspection point the mechanical arm is controlled so that the end line-scan laser and the 2D high-definition camera jointly acquire and analyze data of the area to be detected. After the detection is finished, the robot group automatically returns to the charging room or executes the inspection task of the next train.
Based on multi-sensor perception, the invention realizes autonomous navigation and positioning both in the narrow, visually repetitive environment inside the train parking lane and in the open, visually repetitive environment outside it, and realizes nine categories of defect detection.
As shown in fig. 3, the autonomous inspection steps of the train inspection robot in this embodiment are as follows:
the robot mainly carries 1 infrared pair of taking a photograph cloud platform of 2D high definition, 1 3D laser module that can pitch, 3 depth camera (preceding, back, elevation angle 45 degrees each 1), 1 6 degree of freedom machinery, 1 line sweep laser, equipment such as 1 2D high definition industry camera.
(1) Control the No. 1 robot to drive from the charging room to the outside of the parking lane and travel one full circle around the train to build a real-time 3D map.
(2) After the map outside the parking lane is built, set the inspection path and the pan-tilt angles against the maintenance rules so that the 2D high-definition camera on the pan-tilt unit can photograph each area to be detected.
(3) After the setup according to the maintenance rules is finished, issue the inspection task to the No. 1 robot for a preliminary inspection test and optimization. Once the requirements are met, import the data of the No. 1 robot into the No. 2 robot for inspection verification and optimization until the inspection requirements are met.
(4) Control the two robots to enter the maintenance lane respectively, dividing the work with the center line of the train chassis as the boundary. Control the mechanical arm, the end 2D high-definition camera and the end line-scan laser of each robot over the chassis areas to be detected according to the maintenance rules, and store the mechanical arm control data, high-definition images, 3D point cloud data, detection types and the like.
(5) After the whole chassis has been covered, issue a chassis inspection task to the two robots, and test and optimize until the inspection requirements are met.
(6) Set a map switching point on the map outside the parking lane so that the robot switches to the train chassis map before entering the parking lane and can drive autonomously down the ramp into the parking lane (a minimal sketch of such a switching check is given after this list).
(7) Issue an overall inspection task so that the robot group starts from the charging pile, works cooperatively, and completes the autonomous inspection outside and then inside the parking lane in sequence.
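For step (6), the Python sketch below illustrates one way a map switching point could trigger the change from the map outside the parking lane to the train chassis map; the switching point coordinates, radius and map names are assumptions made for the illustration.

    import math

    # Illustrative only: the switching point, radius and map names are assumptions.
    SWITCH_POINT = (12.0, 3.5)   # (x, y) of the map switching point on the outside map
    SWITCH_RADIUS = 0.5          # switch once the robot is within 0.5 m of that point

    def maybe_switch_map(robot_xy, current_map: str) -> str:
        """Return the map the robot should use: switch to the train chassis map when
        the robot reaches the switching point before entering the parking lane."""
        dist = math.hypot(robot_xy[0] - SWITCH_POINT[0], robot_xy[1] - SWITCH_POINT[1])
        if current_map == "outside_lane_3d_map" and dist < SWITCH_RADIUS:
            return "train_chassis_map"
        return current_map

    print(maybe_switch_map((12.2, 3.4), "outside_lane_3d_map"))  # -> train_chassis_map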
This embodiment also provides an autonomous inspection system for a train inspection robot, including a train inspection robot and an upper computer, wherein the train inspection robot carries a laser sensor, depth cameras, an industrial camera and control equipment, and the control equipment includes the following modules (an illustrative skeleton of these modules is sketched after the list):
an information receiving module, configured to respectively receive a first inspection instruction, a second inspection instruction and inspection information sent by the upper computer, wherein the inspection information at least comprises the parking lane information of the train to be inspected, a first inspection list, a second inspection list and train marshalling information;
the first inspection module is used for loading a 3D map and a first inspection list corresponding to the outside of the parking lane according to the parking lane information based on the first inspection instruction and inspecting two sides of the train to be inspected;
and the second inspection module is used for loading the chassis map of the train to be inspected and the second inspection list based on the second inspection instruction, and inspecting the chassis of the train to be inspected.
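For illustration only, the Python skeleton below mirrors the three modules of the control equipment described above; the class and method names are assumptions and do not limit the system.

    # Skeleton mirroring the control equipment's modules; names are illustrative assumptions.
    class InformationReceivingModule:
        """Receives inspection instructions and inspection information from the upper computer."""
        def receive(self, message: dict) -> dict:
            return message  # a real implementation would read from the link to the upper computer

    class FirstInspectionModule:
        """Loads the 3D map outside the parking lane and inspects both sides of the train."""
        def run(self, parking_lane_id: str, first_inspection_list: list) -> None:
            print(f"exterior inspection on lane {parking_lane_id}: {len(first_inspection_list)} items")

    class SecondInspectionModule:
        """Loads the train chassis map and inspects the chassis of the train."""
        def run(self, train_id: str, second_inspection_list: list) -> None:
            print(f"chassis inspection of {train_id}: {len(second_inspection_list)} items")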
It should be noted that the above-described work flows are only exemplary, and do not limit the scope of the present invention, and in practical applications, a person skilled in the art may select some or all of them to achieve the purpose of the solution of the embodiment according to actual needs, and the present invention is not limited herein.
In addition, for technical details not described in this embodiment, reference may be made to the method provided in any embodiment of the present invention, which is not repeated here.
Furthermore, an embodiment of the present invention further provides a computer-readable storage medium, on which computer-readable instructions are stored, where the computer-readable instructions are executable by a processor to implement the train inspection robot autonomous inspection method described above.
Since the storage medium adopts all technical solutions of all the embodiments, at least all the beneficial effects brought by the technical solutions of the embodiments are achieved, and no further description is given here.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solution of the present invention or portions thereof that contribute to the prior art may be embodied in the form of a software product, where the computer software product is stored in a storage medium (e.g. Read Only Memory (ROM)/RAM, magnetic disk, optical disk), and includes several instructions for enabling a terminal device (e.g. a mobile phone, a computer, a writing device, or a network device) to execute the method according to the embodiments of the present invention.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like that fall within the spirit and principle of the present invention are intended to be included therein.

Claims (10)

1. The train inspection robot autonomous inspection method is characterized by comprising the following steps:
respectively receiving a first inspection instruction, a second inspection instruction and inspection information sent by an upper computer, wherein the inspection information at least comprises parking lane information of the train to be inspected, a first inspection list, a second inspection list and train marshalling information;
loading a 3D map and a first inspection list outside a corresponding parking lane according to the parking lane information based on the first inspection instruction, and inspecting two sides of the train to be inspected;
and loading the chassis map of the train to be inspected and the second inspection list based on the second inspection instruction, and inspecting the chassis of the train to be inspected.
2. The train inspection robot autonomous inspection method according to claim 1, wherein the loading of the 3D map and the first inspection list outside the corresponding parking lane according to the parking lane information based on the first inspection instruction to inspect both sides of the train to be inspected comprises:
respectively acquiring laser data and perception data based on a laser sensor and a first depth camera carried by the train inspection robot;
fusing the laser data and the perception data, and positioning and navigating on the 3D map according to a fusion result;
in the navigation process, executing a task scheduling algorithm, and controlling an industrial camera carried by the train inspection robot to shoot a detection area of the train to be inspected to obtain a picture to be detected;
and analyzing the picture to be detected by adopting a visual detection algorithm based on deep learning, and uploading an analysis result to the upper computer.
3. The train inspection robot autonomous inspection method according to claim 1, wherein the loading the chassis map of the train to be inspected and the second inspection list based on the second inspection instruction to inspect the chassis of the train to be inspected comprises:
controlling the elevation angle of a laser sensor carried by the train inspection robot to be 45 degrees, and realizing positioning based on the train chassis and center line patrol navigation of the parking lane on the chassis map according to data collected by the laser sensor and a depth camera carried by the train inspection robot;
in the navigation process, controlling the laser sensor and an industrial camera carried by the train inspection robot to shoot a to-be-detected area of the train to be inspected, and respectively obtaining 3D laser data and 2D data;
and analyzing the 3D laser data and the 2D data, and uploading an analysis result to the upper computer.
4. The autonomous inspection method of a train inspection robot according to claim 3, wherein the elevation angle of the laser sensor carried by the train inspection robot is controlled to be 45 degrees, and positioning based on a train chassis and the center line patrol navigation of a parking lane are realized on the chassis map according to data collected by the laser sensor carried by the train inspection robot and a depth camera, and the method comprises the following steps:
controlling the elevation angle of a laser sensor and a first depth camera carried by the train inspection robot to be 45 degrees, and acquiring chassis data of a train to be inspected through the laser sensor and the first depth camera;
fusing the acquired chassis data to realize the positioning of the train chassis;
and extracting the wall surfaces on two sides of the parking lane by adopting a second depth camera and a third depth camera carried by the train inspection robot to obtain the position of the center line of the parking lane and realize the center line patrol navigation of the parking lane, wherein the second depth camera and the third depth camera are respectively a front horizontal depth camera and a rear horizontal depth camera.
5. The train inspection robot autonomous inspection method according to claim 3, wherein the process of controlling a camera carried by the train inspection robot to shoot the area to be inspected of the train to be inspected in the navigation process comprises:
acquiring the mechanical arm control pose (x1, y1, z1, r1, o1, p1) used to shoot the image at a given inspection point, wherein x1, y1, z1 are the three-dimensional coordinates of the arm end relative to the base, and r1, o1, p1 are respectively the rotation, pitch and roll angles of the arm end relative to the base;
outputting the three-dimensional localization (x2, y2, z2, r2, o2, p2) with the positioning algorithm, taking the coordinate taught at the inspection point as (x3, y3, z3, r3, o3, p3), and calculating the shooting pose for the mechanical arm as (x1 + x2 - x3, y1 + y2 - y3, z1 + z2 - z3, r1 + r2 - r3, o1 + o2 - o3, p1 + p2 - p3), wherein x2, y2, z2 are the three-dimensional coordinates of the robot in the train chassis map during autonomous inspection, r2, o2, p2 are respectively its rotation, pitch and roll angles during autonomous inspection, x3, y3, z3 are the three-dimensional coordinates of the robot in the train chassis map during teaching, and r3, o3, p3 are respectively its rotation, pitch and roll angles during teaching;
and controlling the industrial camera to shoot according to the calculated shooting pose of the mechanical arm.
6. The train inspection robot autonomous inspection method according to claim 3, wherein in the navigation process, the process of controlling the laser sensor to photograph the area to be inspected of the train to be inspected comprises:
controlling a mechanical arm of the train inspection robot to move to a detection preparation point according to the position of the area to be detected, wherein the detection preparation points are four points taught in advance, corresponding to the four quadrants at the top of the train inspection robot;
calculating a detection starting point and a detection ending point according to the position of the area to be detected;
and controlling the laser sensor to start linear scanning laser at the detection starting point, and after the laser sensor horizontally moves to the detection end point, ending the linear scanning.
7. The autonomous inspection method according to claim 3, wherein the 3D laser data and the 2D data are analyzed, and the analysis includes at least one of crack detection, anti-loosening mark detection, brake wear detection, tread detection, temperature detection, relay temperature-measurement label reading, oil level identification, surface abnormality and defect detection, and surface oil leakage detection.
8. The train inspection robot autonomous inspection method according to claim 1, further comprising, after the inspection of the train to be inspected is finished:
and the train inspection robot automatically returns to a charging room or executes the inspection task of the next train.
9. An autonomous inspection system for a train inspection robot, characterized by comprising a train inspection robot and an upper computer, wherein the train inspection robot carries a laser sensor, depth cameras, an industrial camera and control equipment, and the control equipment comprises:
an information receiving module, configured to respectively receive a first inspection instruction, a second inspection instruction and inspection information sent by the upper computer, wherein the inspection information at least comprises the parking lane information of the train to be inspected, a first inspection list, a second inspection list and train marshalling information;
the first inspection module is used for loading a 3D map and a first inspection list corresponding to the outside of the parking lane according to the parking lane information based on the first inspection instruction and inspecting two sides of the train to be inspected;
and the second inspection module is used for loading the chassis map of the train to be inspected and the second inspection list based on the second inspection instruction, and inspecting the chassis of the train to be inspected.
10. A computer-readable storage medium having computer-readable instructions stored thereon which are executable by a processor to implement the train inspection robot autonomous patrol method according to any one of claims 1-8.
CN202111186472.7A 2021-10-12 2021-10-12 Autonomous inspection method and system for train inspection robot and storage medium Pending CN113885504A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111186472.7A CN113885504A (en) 2021-10-12 2021-10-12 Autonomous inspection method and system for train inspection robot and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111186472.7A CN113885504A (en) 2021-10-12 2021-10-12 Autonomous inspection method and system for train inspection robot and storage medium

Publications (1)

Publication Number Publication Date
CN113885504A true CN113885504A (en) 2022-01-04

Family

ID=79006138

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111186472.7A Pending CN113885504A (en) 2021-10-12 2021-10-12 Autonomous inspection method and system for train inspection robot and storage medium

Country Status (1)

Country Link
CN (1) CN113885504A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113334406A (en) * 2021-06-25 2021-09-03 北京铁道工程机电技术研究所股份有限公司 Track traffic vehicle side inspection robot system and detection method
CN116341880A (en) * 2023-05-26 2023-06-27 成都盛锴科技有限公司 Distributed scheduling method for column inspection robot based on finite state machine

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1600351A1 (en) * 2004-04-01 2005-11-30 Heuristics GmbH Method and system for detecting defects and hazardous conditions in passing rail vehicles
CN113334406A (en) * 2021-06-25 2021-09-03 北京铁道工程机电技术研究所股份有限公司 Track traffic vehicle side inspection robot system and detection method
CN113436366A (en) * 2021-06-25 2021-09-24 北京铁道工程机电技术研究所股份有限公司 Synchronous and cooperative inspection method for bottom and side edges of rail transit vehicle

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1600351A1 (en) * 2004-04-01 2005-11-30 Heuristics GmbH Method and system for detecting defects and hazardous conditions in passing rail vehicles
CN113334406A (en) * 2021-06-25 2021-09-03 北京铁道工程机电技术研究所股份有限公司 Track traffic vehicle side inspection robot system and detection method
CN113436366A (en) * 2021-06-25 2021-09-24 北京铁道工程机电技术研究所股份有限公司 Synchronous and cooperative inspection method for bottom and side edges of rail transit vehicle

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Liu Wanling: "Urban Rail Transit Transportation Equipment", vol. 1, 31 July 2010, Southwest Jiaotong University Press, page 45 *
Cheng Chunyang: "Research on Maintenance Strategy of Metro Vehicles Based on an Intelligent Train Inspection Robot", Railway Survey and Design, no. 02, 15 May 2020 (2020-05-15), pages 89-93 *
Luo Lei: "Embedded Real-Time Operating Systems and Application Development", vol. 1, 31 January 2005, Beihang University Press, pages 22-25 *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113334406A (en) * 2021-06-25 2021-09-03 北京铁道工程机电技术研究所股份有限公司 Track traffic vehicle side inspection robot system and detection method
CN113334406B (en) * 2021-06-25 2024-06-04 北京铁道工程机电技术研究所股份有限公司 Rail transit vehicle side inspection robot system and detection method
CN116341880A (en) * 2023-05-26 2023-06-27 成都盛锴科技有限公司 Distributed scheduling method for column inspection robot based on finite state machine
CN116341880B (en) * 2023-05-26 2023-08-11 成都盛锴科技有限公司 Distributed scheduling method for column inspection robot based on finite state machine

Similar Documents

Publication Publication Date Title
Liu et al. A review of applications of visual inspection technology based on image processing in the railway industry
CN106680290B (en) Multifunctional detection vehicle in narrow space
CN103778681B (en) A kind of vehicle-mounted highway cruising inspection system and data acquisition and disposal route
CN205898699U (en) Single track box roof beam inspection device of suspension type
CN110991466A (en) Highway road surface condition detecting system based on novel vision sensing equipment
CN113885504A (en) Autonomous inspection method and system for train inspection robot and storage medium
KR102017870B1 (en) Real-time line defect detection system
CN109489584B (en) Tunnel clearance detection system and tunnel clearance identification method based on 3D technology
CN103837087B (en) Pantograph automatic testing method based on active shape model
CN114140439B (en) Laser welding seam characteristic point identification method and device based on deep learning
CN114037703B (en) Subway valve state detection method based on two-dimensional positioning and three-dimensional attitude calculation
CN106494611B (en) A kind of dual-purpose patrol unmanned machine of sky rail
CN112819766A (en) Bridge defect overhauling method, device, system and storage medium
CN115562284A (en) Method for realizing automatic inspection by high-speed rail box girder inspection robot
CN112508911A (en) Rail joint touch net suspension support component crack detection system based on inspection robot and detection method thereof
CN110806411A (en) Unmanned aerial vehicle rail detecting system based on line structure light
CN111855667A (en) Novel intelligent train inspection system and detection method suitable for metro vehicle
CN116591005A (en) Intelligent detection and repair device for apparent diseases of asphalt pavement
Rolfsen et al. The use of the BIM-model and scanning in quality assurance of bridge constructions
CN115857040A (en) Dynamic visual detection device and method for foreign matters on locomotive roof
CN113295094B (en) Pantograph dynamic envelope intrusion detection method, device and system
CN114434036A (en) Three-dimensional vision system for gantry robot welding of large ship structural member and operation method
CN116922448B (en) Environment sensing method, device and system for high-speed railway body-in-white transfer robot
CN117115249A (en) Container lock hole automatic identification and positioning system and method
CN116380935A (en) High-speed railway box girder damage detection robot car and damage detection method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination