CN105262989A - Automatic inspection and real-time image acquisition and transmission method for a railway-line unmanned aerial vehicle


Info

Publication number: CN105262989A (granted publication: CN105262989B)
Application number: CN201510644021.1A
Authority: CN (China)
Language: Chinese (zh)
Legal status: Granted; active
Inventors: 彭彦平, 张万宁
Assignee: CHENGDU TIMES TECH Co Ltd
Application filed by CHENGDU TIMES TECH Co Ltd; priority to CN201510644021.1A


Abstract

The invention discloses an automatic inspection and real-time image acquisition and transmission method for a railway-line unmanned aerial vehicle. The method automatically plans the inspection track and supports visual navigation, image recognition and obstacle avoidance. Because parameter changes are not affected by transmission delay, parameters can be adjusted appropriately, which improves the image recognition rate. The method fuses a satellite communication network with traditional communication modes, achieves high-speed exchange of high-capacity image data, and offers comparatively high security.

Description

Automatic patrolling and real-time image acquisition and transmission method for railway line unmanned aerial vehicle
Technical Field
The invention relates to the field of unmanned aerial vehicle inspection, and in particular to an automatic inspection and real-time image acquisition and transmission method for a railway-line unmanned aerial vehicle.
Background
Railway line inspection is basic routine-maintenance work, and inspection modes can be divided into manual inspection, unmanned helicopter inspection and unmanned aerial vehicle inspection. Manual inspection remains the most common mode, but it suffers from low efficiency and from climatic and geographical restrictions, so machine inspection has been widely researched and applied. The safety and efficiency of unmanned aerial vehicle inspection in particular give it ever greater application value on railway lines.
At present, task planning for railway-line unmanned aerial vehicle inspection mostly relies on manual planning. Manual planning ensures flight safety, but its efficiency is low, it cannot meet the demands of large-scale unmanned aerial vehicle inspection, and optimal planning over a large area is difficult to achieve. Intelligent line-patrol task planning first requires constructing a data structure for all important nodes in the whole railway-line region, on which intelligent planning algorithms can then operate.
Most existing systems return survey video as analog signals, so the images are unclear, and a device called an IOSD is needed. The IOSD superimposes flight parameters on the analog video from the high-definition camera and returns the composite image to the ground. Thus, although high-definition footage is stored on the aircraft, only an analog image overlaid with flight-state parameters reaches the ground, whereas operators often need to see the on-board high-definition digital images in real time.
The unmanned aerial vehicle system comprises the airframe platform, the task load and the wireless data transmission part. The key to applying unmanned aerial vehicle video transmission lies in the wireless link. Current wireless transmission technologies mainly include: 3G networks (CDMA2000, WCDMA, TD-SCDMA), 4G networks (TD-LTE and FDD-LTE), wireless local area networks (WiFi), satellite links and microwave links.
Satellite and microwave links are traditional means of wireless video transmission. The greatest advantages of satellite communication are wide service coverage, strong functionality, flexible use, and immunity to the geographical environment and to the external electromagnetic environment. However, both technologies are costly: the high initial construction and communication costs are often prohibitive, so they cannot be deployed over large areas.
Disclosure of Invention
The invention provides an automatic inspection and real-time image acquisition and transmission method for a railway-line unmanned aerial vehicle. The method automatically plans the inspection track; supports visual navigation, image recognition and obstacle avoidance; allows parameters to be changed appropriately without being affected by transmission delay, thereby improving the image recognition rate; combines a satellite communication network with traditional communication modes to achieve high-speed exchange of high-capacity image data; and offers comparatively high security.
To this end, the invention provides an automatic inspection and real-time image acquisition and transmission method for a railway-line unmanned aerial vehicle, comprising the following steps:
S1. The patrol route planning module plans a patrol route.
S2. The central processing module starts a monitoring program and reads and executes the intelligently planned route, while the satellite navigation module starts a GPS navigation program.
S3. The high-definition high-power zoom motion camera acquires video images along the track under the monitoring program, and the on-board image processing module processes the images.
S4. The video image wireless transmitting module and the video image receiving module cooperate to transmit and receive the image signals wirelessly.
S5. The central-station image processing module processes the received image signals and shows them on the display terminal.
Preferably, step S1 comprises the following sub-steps:
S11. Predict the node distribution of the railway line with a unary nonlinear regression prediction method, generating a plurality of node lines, each covering a plurality of nodes.
In step S11, the three-dimensional space model is first simplified to a two-dimensional model, and the unary nonlinear regression method predicts the distribution of the railway-line nodes. Prediction uses confidence intervals: new input data (the explanatory variables) are predicted and judged against historical node data (the response variables).
S12. Connect the node lines into a line connection graph using the critical conditions of the node distribution, where the critical conditions include: the crossing condition, the close-distance parallel distribution condition, the node turning condition and the node branching condition.
In step S12, if a special circumstance causes the node distribution to reach a critical condition, the critical type is determined and handled. The critical categories of node distribution are:
Crossing: more points than the set number of nodes appear in the prediction interval, and crossing points exist.
Close-distance parallel lines: more points than the set number of nodes appear in the prediction interval, and no crossing points exist.
Node turning: compared with the prediction equation, a unique inflection point appears in the prediction interval.
Node branching: compared with the prediction equation, multiple inflection points appear in the prediction interval.
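The four-way classification above can be sketched as a small decision routine. The inputs (point count, crossing flag, inflection-point count) and the ordering of the tests are illustrative assumptions, not values from the patent:

```python
def classify_critical(num_points, max_nodes, has_crossing, num_inflections):
    """Classify a prediction interval into one of the four critical
    categories (or "normal"). A sketch; inputs are assumed to be
    precomputed from the regression prediction."""
    if num_points > max_nodes:
        # Too many points in the interval: crossing vs. parallel lines
        # is distinguished by whether any crossing point exists.
        return "crossing" if has_crossing else "parallel"
    if num_inflections == 1:
        return "turning"       # unique inflection point vs. the equation
    if num_inflections > 1:
        return "branching"     # multiple inflection points
    return "normal"
```

For example, an interval with 12 predicted points against a limit of 10, with a crossing point present, classifies as "crossing".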
S13. Construct and store the railway line inspection plan.
Step S11 specifically includes the following steps:
S111. Dimension reduction: reduce the three-dimensional node geographic coordinates to two dimensions. Let the original coordinates of node A be $(x_t, y_t, z_t)$, where $x_t$ is the three-dimensional longitude coordinate of the node, $y_t$ the latitude coordinate and $z_t$ the altitude; after dimension reduction the coordinates of node A are $(x_t, y_t)$.
S112. Establish the regression equation. The unary linear regression prediction model applied to the railway-line mission planning is:

$$\hat{Y}_t = a + b x_t \qquad (1)$$

where $x_t$ is the longitude coordinate of the node at step $t$ and $\hat{Y}_t$ is the estimated latitude coordinate at step $t$.
S113. With regression prediction step length N, the parameters a and b of the regression equation are solved as:

$$a = \frac{\sum Y_i}{N} - b\,\frac{\sum X_i}{N}, \qquad b = \frac{N \sum Y_i X_i - \sum Y_i \sum X_i}{N \sum X_i^2 - \left(\sum X_i\right)^2} \qquad (2)$$

where N is the prediction moving step length. Because the distance between two adjacent base nodes varies from tens of meters to hundreds of meters, several consecutive base nodes usually form an approximately straight line segment.
Preferably, in step S113, the step length N is 5 m to 10 m.
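Equation (2) is ordinary least squares over a window of N points. A minimal sketch (function and variable names are illustrative):

```python
def fit_linear(xs, ys):
    """Solve a and b of Y^ = a + b*x by least squares over a window
    of N points, following equation (2)."""
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxy = sum(x * y for x, y in zip(xs, ys))
    sxx = sum(x * x for x in xs)
    b = (n * sxy - sy * sx) / (n * sxx - sx * sx)
    a = sy / n - b * sx / n
    return a, b
```

For the collinear points (0,1), (1,3), (2,5), (3,7) this recovers a = 1 and b = 2 exactly.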
Preferably, step S11 further comprises step S114: construct a prediction interval. Because the curve determined by the actual nodes deviates from the prediction equation, interval prediction is applied to the Y value: a prediction interval for the mean is constructed, a significance level a is set according to the node distribution, and the prediction interval in which the Y mean lies with confidence 1 - a is calculated.
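A rough sketch of step S114's interval construction. The patent does not specify the distribution used, so a normal approximation stands in here; names and the residual-based standard error are illustrative assumptions:

```python
from statistics import NormalDist

def prediction_interval(residuals, y_hat, alpha=0.05):
    """Approximate (1 - alpha) prediction interval around a predicted
    mean y_hat, using a normal approximation in place of the
    t-distribution (a sketch, not the patent's exact construction)."""
    n = len(residuals)
    # Residual standard error with n - 2 degrees of freedom (a and b fitted).
    s = (sum(r * r for r in residuals) / (n - 2)) ** 0.5
    z = NormalDist().inv_cdf(1 - alpha / 2)
    return y_hat - z * s, y_hat + z * s
```

A wider interval (larger residuals or smaller confidence level a) admits more candidate node positions before a critical condition is declared.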
Preferably, step S13 comprises the following sub-steps:
S131. Establish a node matrix whose row and column coordinates represent node numbers and whose entries are node geographical position information. According to the prediction result, nodes are divided into two types: important nodes (line start/stop nodes and crossing nodes) and non-important nodes (internal nodes belonging to a single line).
S132. For the important nodes, construct an important-node adjacency list.
S133. For the non-important nodes, store a matrix structure in which the row coordinate represents the line a node belongs to and the column coordinate represents the node number.
In step S132, because the amount of node data is large, a chained storage structure (an adjacency list) is adopted for the important nodes in consideration of storage space and algorithm efficiency.
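The two storage structures of steps S131 to S133 can be sketched as follows; the node names and the dict-of-lists adjacency representation are illustrative assumptions:

```python
# Important nodes (start/stop and crossing nodes): chained adjacency list,
# here a dict mapping each node to its neighbours.
def add_edge(adj, u, v):
    """Add an undirected edge between important nodes u and v."""
    adj.setdefault(u, []).append(v)
    adj.setdefault(v, []).append(u)

# Non-important nodes: a "matrix" whose row index is the line a node
# belongs to and whose column index is the node number within that line.
lines = [
    ["n0", "n1", "n2"],   # internal nodes of line 0
    ["m0", "m1"],         # internal nodes of line 1
]
```

An adjacency list costs space proportional to the number of edges rather than the square of the node count, which is why it suits the sparse important-node graph.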
Preferably, in step S2, the monitoring program comprises application-level programs, a real-time task scheduler and external interrupt handler, a hardware initialization program, hardware drivers, a CAN communication protocol program and a LAN (TCP/IP) communication protocol program. The application-level programs are connected to the real-time task scheduler and external interrupt handler, which are connected to the hardware initialization program, which in turn is connected to the hardware drivers.
Preferably, the application-level program includes an application layer interface program, a power management and power monitoring program, a flight indicator light control program, a safety control program, a visual control program, a track control program, a stability augmentation control program, a remote controller decoding program, and a communication processing program.
Preferably, in step S3, the video image may be processed using one or more of the following steps:
S31. A data receiving unit receives an image coded stream comprising image coded data and parameters.
S32. Change a parameter based on an index indicating image recognition accuracy.
S33. Change a parameter based on environmental information of the image receiving apparatus.
S34. Change a parameter according to the operating condition.
S35. Change the parameters of the deblocking filter.
S36. Change the quantization parameter.
S37. Change the orthogonal transform coefficients.
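Steps S32 and S35 to S37 amount to retuning header parameters before decoding, based on a recognition-accuracy index. A minimal sketch; the container fields, the threshold and the adjustment amounts are invented for illustration and are not values from the patent:

```python
from dataclasses import dataclass

@dataclass
class EncodedStream:
    """Hypothetical container for the header parameters of a received
    image coded stream (S31)."""
    qp: int            # quantization parameter
    deblock: bool      # whether the deblocking filter is applied
    coeff_scale: float # stand-in for the orthogonal transform coefficients
    data: bytes = b""

def tune_for_recognition(stream, accuracy_index, threshold=0.8):
    """If the recognition-accuracy index falls below the target, bias the
    decoding parameters toward detail preservation (S32, S35-S37).
    Threshold and step sizes are illustrative assumptions."""
    if accuracy_index < threshold:
        stream.qp = max(stream.qp - 2, 0)  # finer quantization (S36)
        stream.deblock = False             # keep edges sharp (S35)
        stream.coeff_scale *= 1.1          # boost transform coefficients (S37)
    return stream
```

Because only the received header copy is changed, the adjustment takes effect at the receiver immediately, without a round trip to the encoder; this is the delay-independence the patent claims.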
Preferably, in step S4, the multi-channel distribution system detects the channels and selects the optimal channel, with the following priority order: short-range wireless transmission, mobile communication transmission, satellite communication transmission.
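The channel selection of step S4 is a simple priority scan over the channels detected as usable; the channel names are illustrative:

```python
# Priority order from step S4: short-range wireless first, then mobile,
# then satellite as the fallback.
PRIORITY = ["short_range_wireless", "mobile", "satellite"]

def select_channel(available):
    """Return the highest-priority channel among those detected."""
    for ch in PRIORITY:
        if ch in available:
            return ch
    raise RuntimeError("no transmission channel available")
```

So with only mobile and satellite links detected, the mobile link is chosen; satellite is used only when nothing cheaper is available, which matches the cost argument in the background section.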
Preferably, step S5 comprises the following sub-steps:
S51. A video file segmenter segments the video file.
S52. A video compression encoder compresses the segmented files.
S53. An encryption device encrypts the compressed video files.
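Steps S51 to S53 form a segment-compress-encrypt pipeline. A sketch using zlib for the compression stage and a toy XOR cipher standing in for the hardware encryption device; both are stand-ins, not the patent's actual encoder or cipher, and the tiny segment size is for illustration only:

```python
import zlib
from itertools import cycle

SEGMENT = 4  # bytes per segment; tiny illustrative value (S51)

def segment(data, size=SEGMENT):
    """Split the file into fixed-size segments (S51)."""
    return [data[i:i + size] for i in range(0, len(data), size)]

def compress(chunk):
    """zlib stands in for the video compression encoder (S52)."""
    return zlib.compress(chunk)

def encrypt(chunk, key=b"k3y"):
    """Placeholder XOR stream cipher marking where the hardware
    encryption device sits (S53); not a real cipher."""
    return bytes(b ^ k for b, k in zip(chunk, cycle(key)))

def pipeline(data):
    """Run each segment through compress then encrypt."""
    return [encrypt(compress(c)) for c in segment(data)]
```

Because XOR with the same key is its own inverse, applying `encrypt` again followed by `zlib.decompress` recovers each original segment, mirroring the decrypt-then-decode order at the central site.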
Preferably, in step S5, after the decryption device of the central-site image processing module decrypts the video file, the decoding device decodes it and the display device displays the video in real time.
The invention has the following advantages and beneficial effects: (1) railway lines can be planned intelligently, improving planning efficiency; an intelligent traversal algorithm plans the optimal patrol route within the railway network, traversing all important nodes in the area over the shortest patrol distance; (2) real-time return of high-definition digital images to the ground is supported, meeting the demand for high-definition digital transmission, and visual navigation, obstacle avoidance and image target recognition and tracking are supported, meeting the demands of new technology development; (3) parameters can be changed appropriately without being affected by transmission delay, improving the image recognition rate; (4) a satellite communication network is combined with traditional communication modes, and with only one video image acquisition system and one multi-channel distribution device, two communication links can be bound to transmit audio and video signals, reducing the cost of broadband emergency command communication and widening the application range.
Drawings
Fig. 1 shows a block diagram of the automatic inspection and real-time image acquisition and transmission system for a railway-line unmanned aerial vehicle.
Fig. 2 shows a flow chart of the automatic inspection and real-time image acquisition and transmission method for a railway-line unmanned aerial vehicle.
Detailed Description
Fig. 1 shows the automatic inspection and real-time image acquisition and transmission system for a railway-line unmanned aerial vehicle. The system comprises: a monitoring device 1 installed on the unmanned aerial vehicle and a video transmission device 2 installed at the ground central station.
The monitoring device 1 comprises: a central processing module 11, a satellite navigation module 13, a high-definition high-power zoom motion camera 12, an on-board image processing module 14, a video image wireless transmitting module 15 and a patrol route planning module 16, all installed on the unmanned aerial vehicle.
The patrol route planning module 16 intelligently plans the patrol route of the unmanned aerial vehicle and comprises: a unary nonlinear regression prediction unit, a line-graph connection unit, and a route construction and storage unit. The unary nonlinear regression prediction unit predicts the node distribution of the railway line and generates a plurality of node lines, each covering a plurality of nodes. The line-graph connection unit connects the node lines into a line connection graph using the critical conditions of node distribution. The route construction and storage unit constructs and stores the final structure of the patrol route, from which the central processing module reads and executes the route. The nodes may include stations, substations, crossings, line intersections, and the like.
The unary nonlinear regression prediction unit simplifies the three-dimensional space model into a two-dimensional model and predicts the distribution of the important railway-line nodes. Prediction uses confidence intervals, judging new input data (explanatory variables) against historical important-node data (response variables).
The critical conditions include: the crossing condition, the close-distance parallel distribution condition, the important-node turning condition and the important-node branching condition.
The patrol route construction and storage unit establishes a node matrix whose row and column coordinates represent node numbers and whose entries are node geographical position information. According to the prediction result, nodes are divided into important nodes and non-important nodes. For important nodes an adjacency list is constructed; for non-important nodes a matrix structure is stored, in which the row coordinate represents the line a node belongs to and the column coordinate represents the node number.
The central processing module 11 is further embedded with an Ethernet switch chip (LAN switch), which is connected to the central processing module 11 (ARM) over a local area network (LAN).
the machine-side image processing module 14 is connected with the central processing module through a hundred-mega Ethernet port, receives pictures returned by a high-definition motion camera through an Ethernet switched bus extended by an Ethernet switched chip (LANSWITCH) of the central processing module, analyzes and solves the pictures, fuses the pictures with data of an optical flow sensor, an ultrasonic sensor and an inertia measurement unit, and performs visual navigation, obstacle avoidance and image target identification tracking.
A data receiving unit receives packet data and extracts an image coded stream from it. The image coded stream is coded image data called an elementary stream. For example, elementary streams conforming to coding standards such as MPEG-2 (Moving Picture Experts Group), H.264 and HEVC (High Efficiency Video Coding) have a layered structure comprising at least a sequence level and a picture level, each level including a header portion and a data portion. The header portion contains the various parameters used for encoding; a typical decoder decodes the data portion using these parameters as decoding parameters. The parameter changing unit changes a parameter in the image coded stream and supplies the stream containing the changed parameter to the decoder. The decoder decodes the data portion using the changed parameter as a decoding parameter, generating a decoded image. The image recognition unit then detects, recognizes and tracks objects in the decoded image.
The image recognition unit calculates an index indicating an image recognition accuracy in the image recognition process, and the parameter changing unit changes the parameter received by the data receiving unit based on the index indicating the image recognition accuracy calculated in the image recognition process.
The parameter changing method of the parameter changing unit is described in detail later. A parameter in a header contained in the image coded stream is changed from the value generated and added by the encoder unit to another value. Because the encoder assumes the decoded image is for human viewing, the parameters it generates are optimized to suppress visible image degradation, and such values are not always appropriate for recognition by the image recognition unit. The parameter changing unit therefore changes the parameter in the header of the image coded stream received over the network to a value appropriate for recognition, which improves the image recognition rate in the image recognition unit. Unlike changing the values generated by the encoder itself, the parameters can be changed quickly and appropriately without being affected by transmission delay.
At this time, the image recognition unit preferably calculates an index indicating the accuracy of image recognition during the recognition process and supplies it to the parameter changing unit, which changes the parameter according to that index; this allows the parameter values to be changed more appropriately for the recognition performed by the image recognition unit.
For example, the index indicating the accuracy of image recognition is an index of the accuracy of the detection, recognition and tracking results in the image recognition unit, such as information on the recognition area. The accuracy of the recognition and detection results can be determined from a threshold on the degree of similarity in each process, or from the number of discriminator stages passed; various methods based on the recognition and detection algorithms and applications can be used.
The decoding unit includes a deblocking filter, and the parameter changing unit changes at least one of a parameter indicating whether or not the deblocking filter is utilized for the image encoding data and a filter coefficient of the deblocking filter as the parameter received by the data receiving unit.
The decoding unit includes an inverse quantization unit, and the parameter includes a quantization parameter included in encoding for generating encoded image data. The parameter changing unit changes a quantization parameter contained in the parameters received by the data receiving unit and then supplies the quantization parameter to the inverse quantizing unit.
The decoding unit includes an orthogonal inverse transform unit. The parameters contain orthogonal transform coefficients for orthogonal transform included in encoding performed to generate image encoded data. The parameter changing unit changes the orthogonal transform coefficient contained in the parameter received by the data receiving unit and then supplies the coefficient to the orthogonal inverse transform unit.
The central processing module 11 is provided with an image coding unit that codes the images acquired by the high-definition high-power zoom motion camera. The on-board image processing module then receives the generated image coded stream through its data receiving unit, and the parameter changing unit changes the received parameters according to the operating condition of the unmanned aerial vehicle.
The high-definition high-power zoom motion camera 12 is directly connected with an ethernet switching bus extended by the central processing module 11 through an ethernet port, supports forwarding of a plurality of video streams, and transmits high-definition video data to a terminal image processing module (DSP + ARM) through an ethernet switching chip (LANswitch) to perform image calculation.
The video image wireless transmitting module 15 can be compatible with a plurality of signal transmitting modes, including short-distance wireless transmission, satellite signal transmitting mode, 3G/4G mobile signal transmitting mode, etc.
The satellite navigation module 13 comprises a GPS/Beidou receiving chip, a magnetic compass and a single-chip microcomputer, connected to the central processing module (ARM) over a CAN bus. It supports GPS and Beidou navigation and positioning, uses the magnetic compass to help resolve the aircraft attitude, and fuses its data with the inertial measurement unit (IMU); the central processing module 11 finally resolves the aircraft attitude and position.
The video transmission device 2 comprises: a video image receiving module 21, a multi-channel distribution module 22, a central-site image processing module 23 and a display terminal 24. The video image receiving module 21 receives, via a satellite network or a mobile communication network, the image signal sent by the video image wireless transmitting module 15. The multi-channel distribution module 22 consists of a video compression encoder, a multi-channel communication distribution device, communication devices and a gateway device; the communication devices include wired transmission, short-range wireless, mobile communication and satellite communication devices. The central image processing system consists of a decoding device and an image display device.
The multi-channel distribution system finds the optimal channel by probing the available channels. The video compression encoder compression-codes the video and images acquired by the video image acquisition system, reducing file size and channel load, and the video files are transmitted over the optimal channel to a network server. The central image processing system, connected to the public Internet, decodes the video files in real time and displays them on the image display equipment.
The multi-channel distribution equipment is provided with an encryption device and the central-site image processing system with a decryption device. With this design the data are encrypted, ensuring safety during transmission. Because hardware encryption and decryption devices are used, software cracking is very difficult: even if the relevant files are intercepted, they are hard to decrypt without the corresponding hardware, so the security of the transmitted files is ensured to the greatest extent.
The mobile communication equipment adopts devices of several network standards and is compatible with both 3G and 4G networks. Nationwide 3G coverage is basically stable while 4G is developing rapidly, and at the present stage the two coexist; both can carry audio and video files. Because their coverage and signal strength differ, compatibility with both is the best choice: 4G offers larger data throughput but poorer coverage and suits high-quality video transmission where a 4G signal is available, while 3G offers wider coverage but lower throughput and suits video transmission where there is no 4G signal.
The satellite communication equipment comprises a satellite antenna, a satellite power amplifier, an LNB and a satellite modem. With this design, video data can also be transmitted over satellite signals, widening the application range of the equipment.
Fig. 2 shows an automatic inspection and real-time image acquisition and transmission method for an aerial unmanned aerial vehicle on a railway line. The method specifically comprises the following steps:
s1, planning a patrol route by a patrol route planning module;
s2, starting a monitoring program by the central processing module, reading and executing the intelligent planning line, and starting a GPS navigation program by the satellite navigation module;
s3, acquiring a video image by the high-definition high-power zooming motion camera according to the track of the monitoring program, and processing the image by a machine-end image processing module;
s4, a video image wireless transmitting module and a video image receiving module are matched to complete wireless transmission and reception of image signals;
and S5, the central station image processing module processes the received image signal and displays the image signal on the display terminal.
Preferably, in step S1, the method specifically includes the following steps:
s11, predicting the node distribution of the railway line by adopting a unitary nonlinear regression prediction method to generate a plurality of node lines, wherein each node line covers a plurality of nodes;
in step S11, the three-dimensional space model is first simplified to a two-dimensional space model, and a unitary nonlinear regression prediction method is used to predict the distribution of railway line nodes; the prediction mode adopts a confidence interval mode, and new input data (explanatory variables) are predicted and judged according to historical node data (response variables). Due to the particularity of node distribution, the special case types are summarized, and the existence condition of each special case is determined so as to be classified in the algorithm processing process.
S12, connecting the node lines into a line connection graph using the critical conditions of node distribution, wherein the critical conditions include: the crossing condition, the parallel distribution of several closely spaced lines, the node turning condition, and the node branching condition;
in step S12, if a special condition brings the node distribution to a critical condition, critical-type determination and processing are performed; the critical types of node distribution are classified as follows:
crossing condition: more points appear in the prediction interval than the set number of nodes, and crossing points exist;
parallel distribution of several closely spaced lines: more points appear in the prediction interval than the set number of nodes, and no crossing points exist;
node turning condition: compared with the prediction equation, a unique inflection point appears in the prediction interval;
node branching condition: compared with the prediction equation, a plurality of inflection points appear in the prediction interval.
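The four critical types above can be distinguished mechanically once the prediction-interval statistics are available. A minimal Python sketch (the function name, arguments and thresholds are illustrative assumptions, not from the patent):

```python
def classify_critical_type(points_in_interval, max_nodes, has_cross_point, inflection_count):
    """Classify a node-distribution critical condition.

    points_in_interval: nodes that fell inside the prediction interval
    max_nodes:          the set number of nodes the interval should contain
    has_cross_point:    whether any of those points intersect
    inflection_count:   inflection points found versus the prediction equation
    """
    if len(points_in_interval) > max_nodes:
        # Too many points in the interval: crossing vs. close parallel lines
        return "crossing" if has_cross_point else "parallel"
    if inflection_count == 1:
        return "turning"      # a unique inflection point in the interval
    if inflection_count > 1:
        return "branching"    # several inflection points in the interval
    return "normal"           # no critical condition reached
```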
S13, constructing and storing the railway line inspection plan.
In this way the node connection graph is constructed intelligently, from node geographic coordinates alone, by the linear regression prediction algorithm, so that intelligent task planning can be carried out.
Step S11 specifically includes the following steps:
S111, dimension reduction: the three-dimensional node geographic coordinates are reduced to two-dimensional coordinates. Let the original three-dimensional coordinate of node A be (x_t, y_t, z_t), where x_t is the three-dimensional spatial longitude coordinate of the node, y_t is the latitude coordinate of the node, and z_t is the altitude of the node; after dimension reduction the coordinate of node A is (x_t, y_t);
S112, establishing the regression equation: the univariate linear regression prediction model applied to the power transmission line mission planning is:

Ŷ_t = a + b·x_t    (1)

where x_t is the longitude coordinate of the node at time t, and Ŷ_t is the estimated latitude coordinate at time t;
S113, taking the regression prediction step length as N, the parameters a and b of the regression equation are solved by:

a = ΣY_i/N - b·ΣX_i/N
b = (N·ΣX_iY_i - ΣY_i·ΣX_i) / (N·ΣX_i² - (ΣX_i)²)    (2)

where N is the prediction moving step length. Because the spacing between two base nodes varies from tens of meters to hundreds of meters, in most cases several consecutive base nodes form an approximately straight line segment.
According to another embodiment of the present invention, in step S113, the step length N is 5 meters to 10 meters.
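Equation (2) is the standard least-squares solution for the intercept a and slope b. A minimal Python sketch of fitting one moving window of N nodes (function and variable names are illustrative):

```python
def fit_linear_regression(xs, ys):
    """Least-squares fit of Y = a + b*x over one moving window, per eq. (2).

    xs: the last N node longitude coordinates (X_i)
    ys: the last N node latitude coordinates (Y_i)
    """
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxy = sum(x * y for x, y in zip(xs, ys))
    sxx = sum(x * x for x in xs)
    b = (n * sxy - sy * sx) / (n * sxx - sx * sx)  # slope, eq. (2)
    a = sy / n - b * sx / n                        # intercept, eq. (2)
    return a, b

# Predict the latitude of the next node from its longitude:
a, b = fit_linear_regression([0.0, 1.0, 2.0, 3.0, 4.0], [1.0, 3.0, 5.0, 7.0, 9.0])
predicted = a + b * 5.0   # eq. (1): estimated latitude at longitude 5.0
```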
According to another embodiment of the present invention, step S11 further includes step S114: constructing a prediction interval. Because the curve equation determined by the actual nodes deviates somewhat from the prediction equation, interval prediction is performed on the Y value, i.e. a prediction interval for the mean is constructed. According to the node distribution, a significance level a is set, and the prediction interval in which the Y mean has confidence 1-a is calculated.
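A hedged sketch of such a mean-value prediction interval. The patent does not give its exact interval formula, so this uses the textbook mean-response interval with a normal quantile standing in for the t quantile; all names are illustrative:

```python
from statistics import NormalDist

def mean_prediction_interval(xs, ys, x0, alpha=0.05):
    """(1 - alpha) confidence interval for the mean response at x0.

    Normal approximation of the classical mean-response interval for
    simple linear regression; requires len(xs) > 2.
    """
    n = len(xs)
    xbar = sum(xs) / n
    sxx = sum((x - xbar) ** 2 for x in xs)
    # Least-squares fit of Y = a + b*x
    b = sum((x - xbar) * y for x, y in zip(xs, ys)) / sxx
    a = sum(ys) / n - b * xbar
    resid = [y - (a + b * x) for x, y in zip(xs, ys)]
    s = (sum(r * r for r in resid) / (n - 2)) ** 0.5    # residual std. error
    se = s * (1 / n + (x0 - xbar) ** 2 / sxx) ** 0.5    # s.e. of mean response
    z = NormalDist().inv_cdf(1 - alpha / 2)             # normal quantile
    yhat = a + b * x0
    return yhat - z * se, yhat + z * se
```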
According to another embodiment of the present invention, step S13 specifically includes the following steps:
S131, establishing a node matrix whose row and column coordinates represent the node number and whose entries are node geographical position information. According to the prediction result the nodes are divided into two types, important nodes and non-important nodes: the important nodes comprise line start/end nodes and crossing nodes; the non-important nodes are internal nodes belonging to a single line only;
S132, for the important nodes, constructing an important-node adjacency list;
S133, for the non-important nodes, storing them in a matrix structure whose row coordinate represents the line a node belongs to and whose column coordinate represents the node number.
In step S132, because the amount of node data is large, a chained storage structure (an adjacency list) is adopted for the important nodes in consideration of storage space and algorithm efficiency.
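The two storage structures of S131-S133 might be sketched as follows in Python; the class and method names are hypothetical, since the patent specifies only "adjacency list" and "matrix":

```python
from collections import defaultdict

class InspectionPlan:
    """Hypothetical storage for S131-S133: an adjacency list for the
    important nodes (line endpoints and crossings) and a row-per-line
    matrix for the non-important internal nodes."""

    def __init__(self):
        self.important = defaultdict(list)  # node id -> adjacent node ids
        self.lines = []                     # row = line, columns = node ids

    def add_important_edge(self, u, v):
        """Record adjacency between two important nodes (undirected)."""
        self.important[u].append(v)
        self.important[v].append(u)

    def add_line(self, internal_nodes):
        """Store one line's internal nodes; returns the row coordinate."""
        self.lines.append(list(internal_nodes))
        return len(self.lines) - 1

plan = InspectionPlan()
plan.add_important_edge("start", "cross1")
row = plan.add_line(["n1", "n2", "n3"])   # row coordinate 0, columns 0..2
```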
Preferably, step S2 further includes the following navigation positioning steps:
the central processing module 11 checks the positioning data transmitted by the satellite navigation module 13:
if the positioning data is within the normal range, the central processing module 11 stores the received positioning data in the memory;
positioning data is within the normal range when, comparing the longitude, latitude and altitude values of two adjacent sampling points pairwise, the longitude difference does not exceed 0.0002 degrees, the latitude difference does not exceed 0.00018 degrees, and the altitude difference does not exceed 20 meters;
if the positioning data is abnormal, the central processing module 11 reads the positioning data stored in the memory and returns to the starting position along the historical track;
positioning data is abnormal when, comparing the longitude, latitude and altitude values of two adjacent sampling points pairwise, the longitude difference exceeds 0.0002 degrees, or the latitude difference exceeds 0.00018 degrees, or the altitude difference exceeds 20 meters.
Preferably, the positioning data is the set of longitude x, latitude y and altitude z of the unmanned aerial vehicle at each time point, recorded as {(x_t, y_t, z_t)}, wherein:
(x_1, y_1, z_1) is the longitude, latitude and altitude of the unmanned aerial vehicle at the 1st time point;
(x_2, y_2, z_2) is the longitude, latitude and altitude of the unmanned aerial vehicle at the 2nd time point;
by analogy, (x_{t-1}, y_{t-1}, z_{t-1}) is the longitude, latitude and altitude of the unmanned aerial vehicle at the (t-1)-th time point, and (x_t, y_t, z_t) at the t-th time point;
the interval between two adjacent time points is 0.5 to 5.0 seconds, and each piece of historical positioning data is stored in the memory of the central processing module 11;
the positioning data of the t-th time point is compared with that of the (t-1)-th time point:
if |x_t - x_{t-1}| < 0.0002 degrees, |y_t - y_{t-1}| < 0.00018 degrees and |z_t - z_{t-1}| < 20 meters, that is, the longitude difference does not exceed 0.0002 degrees, the latitude difference does not exceed 0.00018 degrees and the altitude difference does not exceed 20 meters, the positioning data of the t-th time point is judged to be in the normal range and is stored in the memory of the central processing module 11;
if |x_t - x_{t-1}| ≥ 0.0002 degrees, or |y_t - y_{t-1}| ≥ 0.00018 degrees, or |z_t - z_{t-1}| ≥ 20 meters, that is, any one of the longitude, latitude or altitude differences exceeds the normal range, the positioning data of the t-th time point is judged abnormal, i.e. the flight of the unmanned aerial vehicle is considered abnormal;
the central processing module 11 then reads from the memory, in turn, the positioning data of the (t-1)-th, (t-2)-th, ..., 2nd and 1st time points, and controls the unmanned aerial vehicle to return to its starting place along the original track.
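The normal/abnormal check and the return-along-history behaviour described above can be sketched as follows; the thresholds come from the text, while the function names and the (lon, lat, alt) tuple representation are illustrative:

```python
LON_MAX, LAT_MAX, ALT_MAX = 0.0002, 0.00018, 20.0  # thresholds from the text

def is_normal(prev, curr):
    """prev/curr: (lon, lat, alt) fixes of adjacent sampling points."""
    dlon, dlat, dalt = (abs(c - p) for c, p in zip(curr, prev))
    return dlon < LON_MAX and dlat < LAT_MAX and dalt < ALT_MAX

def monitor(fixes):
    """Store normal fixes; on the first abnormal fix, return the stored
    track reversed, i.e. the route back to the start along the history
    (time points t-1, t-2, ..., 2, 1)."""
    memory = []
    for fix in fixes:
        if memory and not is_normal(memory[-1], fix):
            return list(reversed(memory))  # fly back along original track
        memory.append(fix)
    return None                            # no anomaly detected
```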
Preferably, in step S2, the monitoring program includes an application-level program, a real-time task scheduler and external interrupt handler, a hardware initialization program, a hardware driver, a CAN communication protocol program, and a LAN (TCP/IP) communication protocol program; the application-level program is connected to the real-time task scheduler and the external interrupt handler, which are connected to the hardware initialization program, and the hardware initialization program is connected to the hardware driver.
Preferably, the application-level program includes an application layer interface program, a power management and power monitoring program, a flight indicator light control program, a safety control program, a visual control program, a track control program, a stability augmentation control program, a remote controller decoding program, and a communication processing program.
Preferably, in step S3, the video image may be processed using one or more of the following steps:
S31: the data receiving unit receives an image encoding stream including image encoding data and parameters.
The parameter changing unit can change the parameters received by the data receiving unit. The decoding unit generates image decoded data by decoding the image encoding stream, that is, the image encoding data received by the data receiving unit together with the parameters changed by the parameter changing unit. The image recognition unit then performs image recognition on the image decoded data.
In this way the parameters can be changed quickly and appropriately without being affected by transmission delay, thereby improving the image recognition rate: the parameters contained in the image encoding stream are produced by the encoder in the image transmitting apparatus, and the image receiving apparatus can subsequently change them to values suited to image recognition.
S32: the parameter is changed based on an index indicating the image recognition accuracy.
The image recognition unit calculates an index indicating an image recognition accuracy in an image recognition process. The parameter changing unit changes the parameter received by the data receiving unit based on an index indicating the image recognition accuracy calculated in the image recognition process.
This enables more appropriate change of parameters for image recognition.
S33: the parameter is changed based on the environment information of the image receiving apparatus.
The parameter changing unit changes the parameter received by the data receiving unit based on the environment information of the image receiving apparatus.
S34: the parameters are changed according to the operating conditions.
The parameter changing unit changes the parameters received by the data receiving unit according to the operation condition of the unmanned aerial vehicle.
S35: changing parameters of the deblocking filter.
The decoding unit includes a deblocking filter. The parameter changing unit changes, among the parameters received by the data receiving unit, at least one of a parameter indicating whether the deblocking filter is applied to the image encoding data and the filter coefficients of the deblocking filter.
Thus, when the image recognition rate is not sufficiently high, the deblocking filter is disabled or its strength is reduced, so that high-frequency components of the image are not suppressed, thereby increasing the recognition rate.
S36: varying quantization parameters
The decoding unit includes an inverse quantization unit. The parameters include quantization parameters for quantization included in encoding for generating encoded image data. The parameter changing unit changes a quantization parameter contained in the parameters received by the data receiving unit and then supplies the quantization parameter to the inverse quantizing unit.
Thus, in the case where the image recognition rate is not sufficiently high, the quantization parameter is increased to enlarge and emphasize the prediction error component, thereby improving the recognition rate.
S37: changing orthogonal transform coefficients
The decoding unit includes an orthogonal inverse transform unit. The parameters contain orthogonal transform coefficients for orthogonal transform included in encoding performed to generate image encoded data. The parameter changing unit changes the orthogonal transform coefficient contained in the parameter received by the data receiving unit and then supplies the coefficient to the orthogonal inverse transform unit.
Therefore, in the case where the image recognition rate is not sufficiently high, the orthogonal transform coefficient can be changed to improve the recognition rate. For example, the high frequency range of the orthogonal transform coefficient is deleted, thereby allowing the frequency components of the decoded image input to the image recognition unit to match the frequency components required for image recognition.
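As a toy illustration of S35-S37, a decoder-side parameter changer might weaken the deblocking filter and raise the quantization parameter when the recognition-rate index is low. All field names here are hypothetical assumptions; real H.264/H.265 parameter handling is far more involved:

```python
def adjust_parameters(params, recognition_rate, target=0.9):
    """Toy sketch of S35-S36: when the recognition-rate index falls below
    the target, disable the deblocking filter (so high-frequency image
    components are not smoothed away) and raise the quantization parameter
    (to emphasize the prediction error component) before decoding.
    S37 (pruning high-frequency orthogonal transform coefficients before
    the inverse transform) is not modelled here."""
    params = dict(params)  # leave the received parameters untouched
    if recognition_rate < target:
        params["deblocking_enabled"] = False
        params["quantization_parameter"] += 2
    return params
```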
Preferably, in step S4, the multi-channel distribution system detects the channels and selects the optimal channel in the following priority order: short-range wireless transmission, mobile communication transmission, satellite communication transmission.
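A sketch of that priority selection; the channel names and the detection interface are illustrative assumptions:

```python
PRIORITY = ["short_range", "mobile", "satellite"]  # highest priority first

def select_channel(available):
    """Pick the highest-priority usable channel for step S4.

    available: the set of channels the multi-channel distribution
    system detected as usable."""
    for channel in PRIORITY:
        if channel in available:
            return channel
    raise RuntimeError("no transmission channel available")
```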
Preferably, step S5 includes the following sub-steps:
S51, segmenting the video file by the video file segmenter;
S52, compressing the segmented files by the video compression encoder;
S53, encrypting the compressed video file by the encryption device.
Preferably, in step S5, after the decryption device of the central site image processing module decrypts the video file, the decoding device decodes the file, and the display device displays the video in real time.
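Sub-steps S51-S53 and their central-station inverse can be illustrated with a toy pipeline. Here zlib stands in for the unspecified video compression encoder, and the XOR keystream is a placeholder for the encryption device, not real cryptography:

```python
import zlib

def send_pipeline(video, key, segment=4):
    """S51-S53 sketch: split the byte stream, compress each segment,
    then 'encrypt' it with an XOR keystream (placeholder only)."""
    out = []
    for i in range(0, len(video), segment):
        chunk = zlib.compress(video[i:i + segment])              # S52
        out.append(bytes(b ^ key[j % len(key)]                   # S53
                         for j, b in enumerate(chunk)))
    return out

def receive_pipeline(segments, key):
    """Central-station inverse: decrypt, then decode each segment."""
    data = b""
    for seg in segments:
        chunk = bytes(b ^ key[j % len(key)] for j, b in enumerate(seg))
        data += zlib.decompress(chunk)
    return data
```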
As described above, although embodiments have been described with reference to the drawings, it is apparent to those skilled in the art that various modifications and variations can be made from the above description. For example, the described methods may be carried out in a different order, and the described constituent elements, such as systems, structures, devices and circuits, may be combined differently or replaced by other constituent elements or their equivalents. Equivalent substitutions or obvious modifications made by those skilled in the art, identical in performance or use and not departing from the spirit of the invention, should be considered to fall within the scope of the present invention.

Claims (10)

1. A method for automatic inspection and real-time image acquisition and transmission of an aerial unmanned aerial vehicle over a railway line, specifically comprising the following steps:
S1, planning a patrol route by the patrol route planning module;
S2, starting the monitoring program by the central processing module, reading and executing the intelligently planned route, and starting the GPS navigation program by the satellite navigation module;
S3, acquiring a video image by the high-definition high-power zooming motion camera according to the track of the monitoring program, and processing the image by the machine-end image processing module;
S4, completing wireless transmission and reception of image signals by the cooperating video image wireless transmitting module and video image receiving module;
and S5, processing the received image signal by the central station image processing module and displaying it on the display terminal.
2. The method according to claim 1, wherein in step S1, the method specifically comprises the following steps:
S11, predicting the node distribution of the railway line by a univariate linear regression prediction method to generate a plurality of node lines, each node line covering a plurality of nodes;
in step S11, the three-dimensional space model is first simplified to a two-dimensional space model, and the univariate linear regression prediction method is used to predict the distribution of railway line nodes; the prediction uses confidence intervals, and new input data (explanatory variables) are predicted and judged against historical node data (response variables).
S12, connecting the node lines into a line connection graph using the critical conditions of node distribution, wherein the critical conditions include: the crossing condition, the parallel distribution of several closely spaced lines, the node turning condition, and the node branching condition;
in step S12, if a special condition brings the node distribution to a critical condition, critical-type determination and processing are performed; the critical types of node distribution are classified as follows:
crossing condition: more points appear in the prediction interval than the set number of nodes, and crossing points exist;
parallel distribution of several closely spaced lines: more points appear in the prediction interval than the set number of nodes, and no crossing points exist;
node turning condition: compared with the prediction equation, a unique inflection point appears in the prediction interval;
node branching condition: compared with the prediction equation, a plurality of inflection points appear in the prediction interval;
and S13, constructing and storing the railway line inspection plan.
3. The method according to claim 2, wherein step S11 specifically comprises the steps of:
S111, dimension reduction: the three-dimensional node geographic coordinates are reduced to two-dimensional coordinates. Let the original three-dimensional coordinate of node A be (x_t, y_t, z_t), where x_t is the three-dimensional spatial longitude coordinate of the node, y_t is the latitude coordinate of the node, and z_t is the altitude of the node; after dimension reduction the coordinate of node A is (x_t, y_t);
S112, establishing the regression equation: the univariate linear regression prediction model applied to the power transmission line mission planning is:

Ŷ_t = a + b·x_t    (1)

where x_t is the longitude coordinate of the node at time t, and Ŷ_t is the estimated latitude coordinate at time t;
S113, taking the regression prediction step length as N, the parameters a and b of the regression equation are solved by:

a = ΣY_i/N - b·ΣX_i/N
b = (N·ΣX_iY_i - ΣY_i·ΣX_i) / (N·ΣX_i² - (ΣX_i)²)    (2)

where N is the prediction moving step length; because the spacing between two base nodes varies from tens of meters to hundreds of meters, in most cases several consecutive base nodes form an approximately straight line segment.
4. The method of claim 3, wherein the step length N is 5 m to 10 m in step S113.
5. The method of claim 2, wherein step S11 further comprises step S114: constructing a prediction interval. Because the curve equation determined by the actual nodes deviates somewhat from the prediction equation, interval prediction is performed on the Y value, i.e. a prediction interval for the mean is constructed; a significance level a is set according to the node distribution, and the prediction interval in which the Y mean has confidence 1-a is calculated.
6. The method according to claim 2, wherein step S13 specifically comprises the steps of:
S131, establishing a node matrix whose row and column coordinates represent the node number and whose entries are node geographical position information; according to the prediction result the nodes are divided into two types, important nodes and non-important nodes: the important nodes comprise line start/end nodes and crossing nodes, and the non-important nodes are internal nodes belonging to a single line only;
S132, for the important nodes, constructing an important-node adjacency list;
S133, for the non-important nodes, storing them in a matrix structure whose row coordinate represents the line a node belongs to and whose column coordinate represents the node number.
7. The method as claimed in claim 1, wherein in step S2 the monitoring program includes an application-level program, a real-time task scheduler and external interrupt handler, a hardware initialization program, a hardware driver, a CAN communication protocol program, and a LAN (TCP/IP) communication protocol program; the application-level program is connected with the real-time task scheduler and the external interrupt handler, which are connected with the hardware initialization program, and the hardware initialization program is connected with the hardware driver.
8. The method of claim 1, wherein the application level programs include an application layer interface program, a power management and power monitoring program, a flight indicator light control program, a safety control program, a vision control program, a track control program, a stability augmentation control program, a remote control decoding program, a communication processing program.
9. The method of claim 1, wherein in step S3, the video image is processed using one or more of the following steps:
S31: a data receiving unit receives an image encoding stream including image encoding data and parameters;
S32: changing a parameter based on an index indicating the image recognition accuracy;
S33: changing a parameter based on environment information of the image receiving apparatus;
S34: changing parameters according to the operation condition;
S35: changing parameters of the deblocking filter;
S36: changing the quantization parameter;
S37: changing the orthogonal transform coefficients.
10. The method as claimed in claim 1, wherein step S5 includes the following sub-steps:
S51, segmenting the video file by the video file segmenter;
S52, compressing the segmented files by the video compression encoder;
S53, encrypting the compressed video file by the encryption device.
CN201510644021.1A 2015-10-08 2015-10-08 The aerial unmanned plane of railway line is patrolled and real time image collection transmission method automatically Active CN105262989B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510644021.1A CN105262989B (en) 2015-10-08 2015-10-08 The aerial unmanned plane of railway line is patrolled and real time image collection transmission method automatically


Publications (2)

Publication Number Publication Date
CN105262989A true CN105262989A (en) 2016-01-20
CN105262989B CN105262989B (en) 2018-08-14

Family

ID=55102472

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510644021.1A Active CN105262989B (en) 2015-10-08 2015-10-08 The aerial unmanned plane of railway line is patrolled and real time image collection transmission method automatically

Country Status (1)

Country Link
CN (1) CN105262989B (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110311099A1 (en) * 2010-06-22 2011-12-22 Parrot Method of evaluating the horizontal speed of a drone, in particular a drone capable of performing hovering flight under autopilot
CN102183955A (en) * 2011-03-09 2011-09-14 南京航空航天大学 Transmission line inspection system based on multi-rotor unmanned aircraft
CN203773717U (en) * 2013-11-12 2014-08-13 武汉大学 Remote visual touch screen control system for unmanned plane
CN103606261A (en) * 2013-11-29 2014-02-26 文杰 Dynamic cell patrolling system based on aerial photography

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
于陈平, 徐毓, 高婷: "Research on track association algorithm based on regression analysis", 《舰船电子对抗》 (Shipboard Electronic Countermeasure) *
郭志军: "Analysis of the univariate linear regression model using Excel", 《宁波职业技术学院学报》 (Journal of Ningbo Polytechnic) *

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105501248A (en) * 2016-02-16 2016-04-20 株洲时代电子技术有限公司 Railway line inspection system
CN106406343A (en) * 2016-09-23 2017-02-15 北京小米移动软件有限公司 Control method, device and system of unmanned aerial vehicle
CN108284855A (en) * 2017-12-27 2018-07-17 河南辉煌信通软件有限公司 A kind of rail polling system based on unmanned plane
CN109213196A (en) * 2018-09-05 2019-01-15 福州日兆信息科技有限公司 A kind of communication iron tower intelligent patrol detection unmanned plane device
CN109120900B (en) * 2018-09-17 2019-05-24 武汉卓尔无人机制造有限公司 Unmanned vehicle images processing system and its processing method
CN109120900A (en) * 2018-09-17 2019-01-01 武汉卓尔无人机制造有限公司 Unmanned vehicle images processing system and its processing method
CN109510700A (en) * 2018-12-20 2019-03-22 滨州学院 A kind of data transmission system based on chaos encryption
CN111864618A (en) * 2019-04-24 2020-10-30 广州煜煊信息科技有限公司 Unmanned aerial vehicle inspection method and system for power system
CN110708830A (en) * 2019-10-25 2020-01-17 湖南汇纳景观亮化工程有限公司 Intelligent lamp inspection system
CN110708830B (en) * 2019-10-25 2021-06-08 湖南汇纳景观亮化工程有限公司 Intelligent lamp inspection system
CN110658850A (en) * 2019-11-12 2020-01-07 重庆大学 Greedy strategy-based flight path planning method for unmanned aerial vehicle
CN110658850B (en) * 2019-11-12 2022-07-12 重庆大学 Greedy strategy-based flight path planning method for unmanned aerial vehicle
CN111086452A (en) * 2019-12-27 2020-05-01 深圳疆程技术有限公司 Method, device and server for compensating lane line delay

Also Published As

Publication number Publication date
CN105262989B (en) 2018-08-14

Similar Documents

Publication Publication Date Title
CN105262989B (en) The aerial unmanned plane of railway line is patrolled and real time image collection transmission method automatically
CN105208347B (en) The aerial unmanned plane of railway line is patrolled and real time image collection transmitting, monitoring device automatically
CN105208348B (en) The aerial unmanned plane of railway line is patrolled and real time image collection Transmission system automatically
CN105120240B (en) The aerial high definition multidimensional of high power zoom unmanned plane investigates transmitting, monitoring device in real time
CN105120230B (en) Unmanned plane picture control and Transmission system
CN105120232B (en) Unmanned plane picture control and transmission method
CN105208335B (en) The aerial high definition multidimensional of high power zoom unmanned plane investigates Transmission system in real time
US8115812B2 (en) Monitoring system, camera, and video encoding method
US12028788B2 (en) Communication system and base station
US20240271955A1 (en) Information transmission method and client device
Nishio et al. When wireless communications meet computer vision in beyond 5G
KR102374670B1 (en) UAV data transmission system, method, apparatus and computer equipment
CN107533792A (en) System for transmitting order and video flowing between remote control of machine and ground station in such as unmanned plane etc.
CN104950906A (en) Unmanned aerial vehicle remote measuring and control system and method based on mobile communication network
US20220147042A1 (en) Near Real-Time Data and Video Streaming System for a Vehicle, Robot or Drone
CN110636255A (en) Unmanned aerial vehicle image and video transmission and distribution system and method based on 4G network
US20220108489A1 (en) Three-dimensional data encoding method, three-dimensional data decoding method, three-dimensional data encoding device, and three-dimensional data decoding device
US20220036595A1 (en) Three-dimensional data encoding method, three-dimensional data decoding method, three-dimensional data encoding device, and three-dimensional data decoding device
CN112822447A (en) Robot remote monitoring video transmission method and system based on 5G network
CN105208336B (en) The aerial high definition multidimensional of high power zoom unmanned plane investigates transmission method in real time
US20220207782A1 (en) Three-dimensional data encoding method, three-dimensional data decoding method, three-dimensional data encoding device, and three-dimensional data decoding device
US10694534B1 (en) Transferring data through a bonded communication link
US20240005564A1 (en) Three-dimensional data encoding method, three-dimensional data decoding method, three-dimensional data encoding device, and three-dimensional data decoding device
Angelino et al. Sensor aided h. 264 video encoder for uav applications
Jin et al. Design of UAV video and control signal real-time transmission system based on 5G network

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant