Disclosure of Invention
The invention provides a method for generating a simulated laser point cloud image from an aerial image: a target image is obtained by an unmanned aerial vehicle, the target image is processed, and a laser point cloud image is simulated, so that trainees are spared from directly operating a laser point cloud system.
The invention provides a method for generating a simulated laser point cloud image from an aerial image, which comprises the following steps:
S1: the unmanned aerial vehicle captures an aerial photograph to obtain an RGB image, and the flying height of the unmanned aerial vehicle is calculated;
S2: performing simulation processing on the RGB image, including:
converting the R, G, B channels of the RGB image into a Y channel to obtain a Y-channel image, wherein R, G and B denote the values of the red, green and blue channels respectively, and Y denotes the brightness value;
detecting closed regions in the Y-channel image, and increasing Y inside the closed regions to simulate the reflection intensity of a laser radar;
projecting the processed Y-channel image onto the RGB image to obtain a pseudo-color image;
S3: adjusting the resolution of the pseudo-color image according to the resolution of the laser radar to be simulated, and generating the simulated laser point cloud image.
Preferably, at the moment when the unmanned aerial vehicle acquires the RGB image, the flying height of the unmanned aerial vehicle is measured several times and the average value is taken.
Preferably, in S2, the R, G, B channels of the RGB image are converted into the Y channel using the following formula:
Y = 0.299R + 0.587G + 0.114B.
Preferably, in S2, the closed regions in the Y-channel image are detected by binarization, with the average brightness mean of the Y-channel image used as the binarization threshold, giving a binary image BW as follows:
BW(i, j) = 1 if Y(i, j) > mean, and BW(i, j) = 0 otherwise.
Preferably, the process of increasing Y in the closed regions in S2 is as follows:
performing erosion on the binary image BW;
dilating the eroded binary image to different degrees, obtaining, in order of increasing dilation, the binary images pengzhang1, pengzhang2, pengzhang3 and pengzhang4;
converting each dilated binary image into a uint8-type image, namely after_pengzhang1, after_pengzhang2, after_pengzhang3 and after_pengzhang4;
averaging the four uint8-type images to obtain the final dilation image (denoted final here):
final = (after_pengzhang1 + after_pengzhang2 + after_pengzhang3 + after_pengzhang4) / 4;
superimposing the Y-channel image and the final dilation image to form a concave-convex effect image aotu that simulates the reflection intensity of a laser point cloud image:
aotu = Y + final;
if a superimposed value is greater than 255, it is assigned 255.
Preferably, the image aotu is projected onto the RGB image to obtain a pseudo-color image weicaise, each of the R, G, B channels of the pseudo-color image weicaise being obtained by a projection formula.
Compared with the prior art, the invention has the following beneficial effects:
the unmanned aerial vehicle carries a low-cost visible-light payload and captures aerial images, the aerial images are processed by the method provided by the invention, and the laser point cloud image that a laser radar would produce is simulated. The simulated laser point cloud image can be generated in real time for trainees to watch, beginners need not operate a real laser point cloud system, and the cost of training operators on laser point cloud radar is significantly reduced. In addition, the method is computationally light, can run on an onboard computer, and has good prospects for wider adoption.
Drawings
FIG. 1 is a flow chart of a method for generating a simulated laser point cloud image from an aerial image provided in accordance with an embodiment of the present invention;
FIG. 2 is an aerially acquired RGB image provided in accordance with an embodiment of the present invention;
FIG. 3 is the Y-channel image obtained by the RGB conversion in S2 provided in accordance with an embodiment of the present invention;
FIG. 4 is the binary image BW obtained by the binarization detection in S2 provided in accordance with an embodiment of the present invention;
FIG. 5 is the binary image after erosion in S2 provided in accordance with an embodiment of the present invention;
FIG. 6 is the pengzhang1 image after dilation in S2 provided in accordance with an embodiment of the present invention;
FIG. 7 is the pengzhang2 image after dilation in S2 provided in accordance with an embodiment of the present invention;
FIG. 8 is the pengzhang3 image after dilation in S2 provided in accordance with an embodiment of the present invention;
FIG. 9 is the pengzhang4 image after dilation in S2 provided in accordance with an embodiment of the present invention;
FIG. 10 is the after_pengzhang1 image of the uint8 type regenerated in S2 provided in accordance with an embodiment of the present invention;
FIG. 11 is the after_pengzhang2 image of the uint8 type regenerated in S2 provided in accordance with an embodiment of the present invention;
FIG. 12 is the after_pengzhang3 image of the uint8 type regenerated in S2 provided in accordance with an embodiment of the present invention;
FIG. 13 is the after_pengzhang4 image of the uint8 type regenerated in S2 provided in accordance with an embodiment of the present invention;
FIG. 14 is the final dilation image obtained after averaging in S2 provided in accordance with an embodiment of the present invention;
FIG. 15 is the concave-convex effect image reflecting the reflection intensity in S2 provided in accordance with an embodiment of the present invention;
FIG. 16 is the pseudo-color image in S2 provided in accordance with an embodiment of the present invention;
FIG. 17 is the simulated laser point cloud image after the final resolution adjustment provided in accordance with an embodiment of the present invention.
Detailed Description
Hereinafter, embodiments of the present invention will be described with reference to the accompanying drawings. In the following description, like modules are denoted by like reference numerals. In the case of the same reference numerals, their names and functions are also the same. Therefore, a detailed description thereof will not be repeated.
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention will be further described in detail with reference to the accompanying drawings and specific embodiments. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not to be construed as limiting the invention.
According to the method for generating a simulated laser point cloud image from an aerial image provided by the embodiment of the invention, the simulated laser point cloud image imitating the output of a radar is computed and processed on the onboard computer and transmitted to the ground station through a radio link, so that trainees at the ground station can watch and study the laser point cloud image, strengthening their knowledge and understanding of laser point cloud imagery. As shown in fig. 1, the specific simulation process is as follows:
S1: the unmanned aerial vehicle photographs the target area or target object from the air to obtain the RGB image shown in fig. 2. At the moment the RGB image is captured, the flying height of the unmanned aerial vehicle, i.e. the distance between the unmanned aerial vehicle and the ground, is measured several times by techniques such as photoelectric ranging, and the average of the measurements is taken as the final flying height. The number of measurements is usually 5, and both the number and the frequency of measurements can be adjusted.
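The height-averaging in S1 can be sketched as follows; the function and variable names are illustrative, not from the patent, and the five sample values are made-up readings:

```python
def average_flight_height(samples):
    """Return the mean of repeated rangefinder height readings (metres),
    as described in S1 (the text suggests 5 measurements by default)."""
    if not samples:
        raise ValueError("at least one height sample is required")
    return sum(samples) / len(samples)

# Five hypothetical photoelectric-ranging samples taken at capture time.
readings = [498.2, 500.1, 499.7, 500.4, 499.6]
height = average_flight_height(readings)  # mean of the five readings
```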
S2: on the unmanned aerial vehicle, the onboard computer performs simulation processing on the RGB image using preset tools or algorithms; the simulation parameters can be preset or adjusted according to the parameters of the laser radar to be simulated. The specific process is as follows:
the R, G, B three primary color channels of the RGB image shown in fig. 2 were converted into Y channels, and a Y channel image shown in fig. 3 was obtained, in which R, G, B sequentially represents the values of the three primary color channels of red, green, and blue, and Y represents the luminance value. The R, G, B channel of the RGB image is converted to the Y channel as calculated:
。
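The conversion can be sketched with the standard BT.601 luma weighting, which is an assumption here, since the patent's exact coefficients are not reproduced in this text:

```python
import numpy as np

def rgb_to_y(rgb):
    """Convert an H x W x 3 uint8 RGB image to a float luminance (Y)
    channel using BT.601 weights (an assumed formula)."""
    rgb = rgb.astype(np.float64)
    return 0.299 * rgb[..., 0] + 0.587 * rgb[..., 1] + 0.114 * rgb[..., 2]

white = np.full((2, 2, 3), 255, dtype=np.uint8)
y = rgb_to_y(white)  # pure white maps to full brightness
```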
An image processing algorithm detects closed regions in the Y-channel image and raises them, i.e. gradually increases the value of Y inside the closed regions, so as to simulate the reflection intensity of a laser radar.
The closed-region detection may be performed by a connected-component algorithm, by binarization, by morphological processing, or the like; in this embodiment, the closed regions in the Y-channel image are detected by binarization. In the binarization process, the brightness average mean of the Y-channel image is used as the binarization threshold, each brightness value Y in the Y-channel image is compared with it and assigned, and the resulting binary image BW is given by:
BW(i, j) = 1 if Y(i, j) > mean, and BW(i, j) = 0 otherwise;
that is, all points whose brightness value Y is greater than the brightness average mean are assigned 1, the other points are assigned 0, and the binary image BW shown in fig. 4 is obtained after the assignment.
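The mean-threshold binarization just described can be sketched as:

```python
import numpy as np

def binarize_by_mean(y):
    """Return BW: 1 where Y exceeds the image's own mean brightness,
    else 0, mirroring the thresholding rule of S2."""
    mean = y.mean()
    return (y > mean).astype(np.uint8)

y = np.array([[10.0, 200.0], [30.0, 220.0]])  # mean brightness = 115
bw = binarize_by_mean(y)  # bright pixels become 1, dark pixels 0
```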
The binary image BW shown in fig. 4 is eroded; this smoothing operation achieves noise reduction, detail removal, shrinkage of the target regions and similar effects. The eroded image shown in fig. 5 is denoted fushi.
The eroded binary image shown in fig. 5 is dilated to different degrees to expand its area. The embodiment of the invention applies four dilations of different degrees to ensure the dilation effect, obtaining, in order of increasing dilation, the binary images:
pengzhang1, as shown in fig. 6;
pengzhang2, as shown in fig. 7;
pengzhang3, as shown in fig. 8;
pengzhang4, as shown in fig. 9.
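A minimal numpy sketch of the erosion and the four graded dilations; the 3x3 square structuring element is an assumption (the patent does not specify one), and in practice OpenCV's erode/dilate would typically be used:

```python
import numpy as np

def erode(bw):
    """3x3 binary erosion: a pixel stays 1 only if its whole
    3x3 neighbourhood is 1 (border padded with 0)."""
    p = np.pad(bw, 1, constant_values=0)
    out = np.ones_like(bw)
    h, w = bw.shape
    for di in (-1, 0, 1):
        for dj in (-1, 0, 1):
            out &= p[1 + di:1 + di + h, 1 + dj:1 + dj + w]
    return out

def dilate(bw, times=1):
    """Repeated 3x3 binary dilation; more repetitions give a
    larger expansion, matching the four graded dilations."""
    h, w = bw.shape
    for _ in range(times):
        p = np.pad(bw, 1, constant_values=0)
        out = np.zeros_like(bw)
        for di in (-1, 0, 1):
            for dj in (-1, 0, 1):
                out |= p[1 + di:1 + di + h, 1 + dj:1 + dj + w]
        bw = out
    return bw

bw = np.zeros((9, 9), dtype=np.uint8)
bw[3:6, 3:6] = 1                     # a small 3x3 closed region
fushi = erode(bw)                    # erosion shrinks it to one pixel
pengzhang = [dilate(fushi, k) for k in (1, 2, 3, 4)]  # graded dilations
```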
Each dilated binary image is converted according to the uint8 rule, regenerating a uint8-type image:
after_pengzhang1, as shown in fig. 10;
after_pengzhang2, as shown in fig. 11;
after_pengzhang3, as shown in fig. 12;
after_pengzhang4, as shown in fig. 13.
To prevent numerical overflow, the four uint8-type images shown in figs. 10 to 13 are averaged to obtain the final dilation image (denoted final here), calculated as:
final = (after_pengzhang1 + after_pengzhang2 + after_pengzhang3 + after_pengzhang4) / 4;
the final dilation image obtained after the averaging is shown in fig. 14.
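A minimal sketch of the overflow-safe averaging: the four uint8 masks (assumed here to hold 0/255 after the uint8 conversion, which the text does not spell out) are accumulated in a wider dtype before dividing by four, so the intermediate sum cannot overflow uint8. Names are illustrative:

```python
import numpy as np

def average_masks(masks):
    """Average several uint8 images in uint16 to avoid overflow,
    then cast the result back to uint8."""
    acc = np.zeros(masks[0].shape, dtype=np.uint16)
    for m in masks:
        acc += m                      # safe: uint8 values fit in uint16
    return (acc // len(masks)).astype(np.uint8)

m = np.full((2, 2), 255, dtype=np.uint8)
z = np.zeros((2, 2), dtype=np.uint8)
final = average_masks([m, m, z, z])   # (255 + 255 + 0 + 0) // 4 per pixel
```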
The final dilation image shown in fig. 14 is superimposed on the Y-channel image shown in fig. 3 so that the image values add together. After the superposition, the concave-convex effect image aotu shown in fig. 15, which simulates the reflection intensity of a laser point cloud image, is obtained; the image aotu realizes the raised effect of the closed regions and simulates the concave-convex appearance caused by differing reflection intensities in a real laser point cloud image. The superposition that yields aotu is expressed as:
aotu = Y + final;
since values are summed during the superposition, numerical overflow may occur, so an additional rule is specified for overflowing values: if a superimposed value is greater than 255, it is assigned 255.
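The superposition with its overflow rule can be sketched by accumulating in uint16 and clamping values above 255 back into the valid uint8 range:

```python
import numpy as np

def superimpose(y, final):
    """Add the dilation image to the Y channel in uint16 (so sums
    above 255 are representable), then clamp to the uint8 range."""
    s = y.astype(np.uint16) + final.astype(np.uint16)
    return np.clip(s, 0, 255).astype(np.uint8)

y = np.array([[100, 200]], dtype=np.uint8)
final = np.array([[100, 100]], dtype=np.uint8)
aotu = superimpose(y, final)  # 100+100=200 kept, 200+100=300 clamped
```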
The concave-convex effect image aotu shown in fig. 15 is projected onto the RGB image shown in fig. 2 to obtain the pseudo-color image weicaise shown in fig. 16, which is convenient for simulated display to trainees. Each of the R, G, B channels of the pseudo-color image weicaise is obtained by a projection formula.
S3: according to the resolution of the laser radar to be simulated, the resolution of the pseudo-color image weicaise is adjusted by reducing it to different degrees, finally yielding the simulated laser point cloud image Lidar shown in fig. 17. The unmanned aerial vehicle transmits the image to the ground station through the radio link so that trainees at the ground station can watch and study it, and the output resolution is switched automatically according to the flying height of the unmanned aerial vehicle. The specific switching process is as follows:
Let the pixel value in the ith row and jth column of the simulated laser point cloud image Lidar be Lidar(i, j).
When the flight height of the unmanned aerial vehicle is 1000 m, the resolution is 0.8 m.
When the flight height of the unmanned aerial vehicle is 500 m, the resolution is 0.4 m.
When the flight height of the unmanned aerial vehicle is 200 m, the resolution is 0.2 m.
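The height-to-resolution switching above can be sketched as follows; the comparison thresholds between the three listed heights are an assumption, since the text only gives the three exact height/resolution pairs, and the function name is illustrative:

```python
def resolution_for_height(height_m):
    """Pick the simulated-lidar ground resolution (metres/pixel)
    from the flight height, following the three listed pairs."""
    if height_m >= 1000:
        return 0.8
    if height_m >= 500:
        return 0.4
    return 0.2

r = resolution_for_height(1000)  # highest altitude -> coarsest resolution
```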
When the resolution of the simulated laser point cloud image is 0.2 m, every two adjacent pixels are merged into one pixel, reducing the resolution to 1/2 of the original image, specifically:
Lidar'(i, j) = ( Lidar(i, 2j-1) + Lidar(i, 2j) ) / 2;
when the resolution of the simulated laser point cloud image is 0.4 m, every 4 pixels are merged into one pixel, reducing the resolution to 1/4 of the original image, processed according to the following formula:
Lidar'(i, j) = ( Lidar(2i-1, 2j-1) + Lidar(2i-1, 2j) + Lidar(2i, 2j-1) + Lidar(2i, 2j) ) / 4;
when the resolution of the simulated laser point cloud image is 0.8 m, every 8 pixels are merged into one pixel, reducing the resolution to 1/8 of the original image, with the pixel merging carried out according to the following formula:
Lidar'(i, j) = ( Lidar(2i-1, 4j-3) + Lidar(2i-1, 4j-2) + Lidar(2i-1, 4j-1) + Lidar(2i-1, 4j) + Lidar(2i, 4j-3) + Lidar(2i, 4j-2) + Lidar(2i, 4j-1) + Lidar(2i, 4j) ) / 8.
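The pixel-merging step can be sketched as block averaging; a square factor x factor block is one plausible realization (the patent's exact merge pattern is not reproduced in this text, so this grouping is an assumption):

```python
import numpy as np

def downsample(lidar, factor):
    """Merge factor x factor pixel blocks by averaging, reducing the
    image to 1/factor of its original linear resolution."""
    h, w = lidar.shape
    h, w = h - h % factor, w - w % factor  # crop to a whole number of blocks
    blocks = lidar[:h, :w].astype(np.float64)
    blocks = blocks.reshape(h // factor, factor, w // factor, factor)
    return blocks.mean(axis=(1, 3)).astype(np.uint8)

img = np.array([[10, 20], [30, 40]], dtype=np.uint8)
small = downsample(img, 2)  # one output pixel: (10+20+30+40)/4
```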
While embodiments of the present invention have been illustrated and described above, it will be appreciated that the above-described embodiments are illustrative and should not be construed as limiting the invention. Changes, modifications, substitutions and variations may be made to the above-described embodiments by those of ordinary skill in the art within the scope of the present invention.
The above embodiments of the present invention do not limit the scope of the present invention. Any other corresponding changes and modifications made in accordance with the technical idea of the present invention shall be included in the scope of the claims of the present invention.