CN117456092A - Three-dimensional live-action modeling system and method based on unmanned aerial vehicle aerial survey - Google Patents

Three-dimensional live-action modeling system and method based on unmanned aerial vehicle aerial survey

Info

Publication number
CN117456092A
CN117456092A (application CN202311377081.2A)
Authority
CN
China
Prior art keywords
aerial vehicle
unmanned aerial
image
image control
control point
Prior art date
Legal status
Pending
Application number
CN202311377081.2A
Other languages
Chinese (zh)
Inventor
于祥波
朱毅
Current Assignee
Xuzhou Shuoxiang Information Technology Co ltd
Original Assignee
Xuzhou Shuoxiang Information Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Xuzhou Shuoxiang Information Technology Co ltd
Priority to CN202311377081.2A
Publication of CN117456092A
Legal status: Pending


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00 Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20 Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/29 Geographical information databases
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/0464 Convolutional networks [CNN, ConvNet]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/08 Learning methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N7/00 Computing arrangements based on specific mathematical models
    • G06N7/02 Computing arrangements based on specific mathematical models using fuzzy logic
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00 Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T17/05 Geographic models
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/20 Image preprocessing
    • G06V10/26 Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/764 Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
    • G06V10/765 Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects using rules for classification or partitioning the feature space
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/82 Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20081 Training; Learning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20084 Artificial neural networks [ANN]

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • Evolutionary Computation (AREA)
  • Health & Medical Sciences (AREA)
  • Computing Systems (AREA)
  • Artificial Intelligence (AREA)
  • General Health & Medical Sciences (AREA)
  • Databases & Information Systems (AREA)
  • Data Mining & Analysis (AREA)
  • General Engineering & Computer Science (AREA)
  • Geometry (AREA)
  • Molecular Biology (AREA)
  • Biomedical Technology (AREA)
  • Multimedia (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Mathematical Physics (AREA)
  • Computer Graphics (AREA)
  • Computational Linguistics (AREA)
  • Biophysics (AREA)
  • Remote Sensing (AREA)
  • Medical Informatics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Algebra (AREA)
  • Pure & Applied Mathematics (AREA)
  • Mathematical Optimization (AREA)
  • Mathematical Analysis (AREA)
  • Computational Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Fuzzy Systems (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a three-dimensional live-action modeling method based on unmanned aerial vehicle aerial survey, which comprises the following steps: setting a processing station and a base point on the ground, reasonably arranging several groups of image control points, generating a strip-shaped aerial survey area, generating a coordinate graph from the image control points, and acquiring a navigation route; controlling the unmanned aerial vehicle to fly to each image control point for oblique photography to obtain image data containing the image control point, preprocessing the obtained image data with an image control point identification classification model, and screening out defective frames; and inputting the image data obtained each time into an image quality model for precision evaluation, re-acquiring the image if the precision does not meet a preset value, until the live-action views of all image control points are updated, and forming a live-action model diagram after modeling. The invention belongs to the technical field of unmanned aerial vehicle aerial survey, and provides a three-dimensional live-action modeling system and method based on unmanned aerial vehicle aerial survey that solve the problems of insufficient aerial survey quality, poor measurement precision and heavy workload in the prior art.

Description

Three-dimensional live-action modeling system and method based on unmanned aerial vehicle aerial survey
Technical Field
The invention belongs to the technical field of unmanned aerial vehicle aerial survey, and particularly relates to a three-dimensional live-action modeling system and method based on unmanned aerial vehicle aerial survey.
Background
In the field of building surveying, with the development of unmanned aerial vehicle technology, the advantages of unmanned aerial vehicle aerial survey have steadily grown, and it now dominates the market. In the existing mapping and modeling workflow, a surveyor must first remotely control the unmanned aerial vehicle to photograph the live action, bring the data back after shooting is completed, then obtain a survey model diagram, and finally perform modeling. When planning a flight, a rectangular or irregular polygonal area is usually generated and the route is then planned by adjusting control points on that area. In conventional operation, however, the control points must be adjusted many times to generate a strip-shaped aerial photography area, which is very inconvenient, and the drawn area may cover regions outside the survey zone, adding redundant data and data-processing workload.
Chinese patent application No. 2019110557828 discloses a three-dimensional live-action modeling method based on unmanned aerial vehicle aerial survey, which comprises the following steps: acquiring aerial survey data by unmanned aerial vehicle aerial survey, performing image analysis by aerial triangulation to convert a series of two-dimensional aerial images into a three-dimensional dense point cloud of the building project under survey, and then post-processing the data to obtain a digital line graph and a digital surface model of the project, yielding a live-action three-dimensional model; performing a live-action inspection of the project based on the live-action three-dimensional model and the real ground-surface point cloud to obtain construction execution data; and formulating and issuing construction scheduling instructions based on a comparison of the three-dimensional planning design with the construction execution data, then checking and correcting the execution effect of those instructions. This three-dimensional live-action modeling method based on unmanned aerial vehicle aerial survey is efficient and low in cost.
In the above scheme, aerial triangulation can resolve the precision problems of manual measurement, but the flight quality and scanning quality it depends on remain problematic: if the film base itself carries a certain systematic deformation, or dynamic geometric deformation is introduced by strain during aerial photography, photographic processing and scanning, measurement precision suffers and the workload increases.
Disclosure of Invention
In view of this situation, and to overcome the defects of the prior art, the invention provides a three-dimensional live-action modeling system and method based on unmanned aerial vehicle aerial survey, solving the problems of insufficient aerial survey quality, poor measurement precision and heavy workload in the prior art.
The technical scheme adopted by the invention is as follows:
the scheme discloses a three-dimensional live-action modeling method based on unmanned aerial vehicle aerial survey, which comprises the following steps:
s1: setting a processing station and a base point on the ground, reasonably arranging a plurality of groups of image control points, presetting aerial photographing parameters of an unmanned aerial vehicle by the processing station, generating a banded aerial survey area, inputting position information of the image control points to the processing station to form a coordinate graph, and acquiring a navigation route from the current position to the image control points;
s2: the processing station controls the unmanned aerial vehicle to go to the corresponding image control point from the reference coordinate control point, performs oblique photography through the unmanned aerial vehicle to obtain image data with the image control point, performs preprocessing on the obtained image data, and screens out waste chips;
s3: in the photographing process, inputting the image data obtained each time into an image quality model for precision evaluation; if the precision accords with the preset value, continuing to acquire the image data of the next image control point; if the precision does not accord with the preset value, re-acquiring the image;
s4: and covering the original blurred vision by using the re-acquired image conforming to the precision, updating the live-action, and modeling according to the acquired image to form a final live-action model diagram.
In a further scheme, in step S1, the image control points are laid out using the map-root point survey method in an axisymmetric arrangement, and the layout density of the image control points is increased at complex boundaries of the aerial survey area.
Further, in step S1, setting the aerial photographing parameters comprises: setting the course overlap, side overlap, aerial survey height and datum point height, and generating the strip-shaped aerial survey area from the aerial photographing parameters and the unmanned aerial vehicle lens parameters.
In a further aspect, the image quality model flow comprises the following steps:
Step one, selecting a target point of the captured image for instance analysis;
Step two, selecting different layouts of course overlap rate, side overlap, aerial survey height and datum point height, and counting the elevation error and horizontal error of each instance;
Step three, calculating, through a fuzzy comprehensive evaluation model, the precision of each configuration together with the elevation error and horizontal error of each instance, and performing comparative analysis.
Further, in step S2, an image control point identification classification model is used to identify the image control point markers in the image data and to preprocess the images.
Further, the construction and training process of the image control point identification classification model comprises the following steps:
Step 1: constructing a plurality of image control point markers, shooting original images of the markers, and preprocessing the images, including cropping, resizing, grayscale conversion and normalization, to ensure the consistency and usability of the image data;
Step 2: automatically learning features with a convolutional neural network in a deep learning model, converting the images into feature representations that a machine learning algorithm can process; an encoder-decoder neural network based on the VGG11 backbone is adopted and trained;
Step 3: labeling the preprocessed image data set to form annotation files, assigning the correct classification label to each sample, selecting a suitable classification model according to the characteristics of the task, and forming the sample set from the annotation files and their corresponding original images;
Step 4: during model training, first performing image classification on the original image to judge whether an image control point is present; if so, segmenting the image so that each pixel is classified accurately;
Step 5: dividing the sample set into a training set holding 70% of the samples and a test set holding the remaining 30%; evaluating the trained model on the test set, calculating its accuracy and precision, and optimizing the model according to the evaluation results;
Step 6: finally, deploying the trained model into the actual application environment to identify and classify the image control points in the images.
The scheme also discloses a three-dimensional live-action modeling system based on unmanned aerial vehicle aerial survey, which comprises an unmanned aerial vehicle, a processing station and a communication base station, wherein the unmanned aerial vehicle is connected with the communication base station to realize communication with a ground processing station, and the processing station is used for controlling the unmanned aerial vehicle;
the unmanned aerial vehicle comprises an unmanned aerial vehicle control unit, a data acquisition unit and a processing station control unit;
the unmanned aerial vehicle control unit includes: the system comprises an unmanned aerial vehicle processor, an inertial measurement system, a positioning system, a power supply system, a storage system and a wireless communication system, wherein the systems are connected with the unmanned aerial vehicle processor, and the unmanned aerial vehicle processor is used for receiving and processing signals;
the inertial measurement system consists of an accelerometer and a gyroscope and is used for sensing the acceleration of the unmanned aerial vehicle and obtaining related data such as the speed and the gesture of the unmanned aerial vehicle through integral operation. The accelerometer is used for measuring acceleration relative to an inertia space in the movement process of the unmanned aerial vehicle and indicating the direction of a local vertical line; the gyroscope is used for measuring the angular displacement of the unmanned aerial vehicle relative to the rotating motion direction of the carrying platform, indicating the direction of the earth rotation shaft, and acquiring the attitude of the unmanned aerial vehicle through the arrangement of the inertial measurement system, so that the control of angle shooting is facilitated.
The positioning system uses GPS or Beidou navigation positioning to locate the unmanned aerial vehicle. The power supply system uses a lithium-ion power battery, and the storage system stores data on a memory card combined with an external hard disk. The wireless communication system uses a long-range WiFi module.
The data acquisition unit comprises a camera, an infrared range finder and a battery information sensor. The camera shoots pictures and videos; the infrared range finder measures the height and distance between the unmanned aerial vehicle body and the base point and between the body and the image control points; the battery information sensor collects battery charge information. During the aerial survey, when the real-time charge of the unmanned aerial vehicle battery falls below a set threshold, the unmanned aerial vehicle processor sends a signal to the processing station through the wireless communication system to remind the staff to command the unmanned aerial vehicle to return, avoiding loss. In use, the power supply system powers the unmanned aerial vehicle, the battery information sensor monitors the battery charge so the aircraft can return to recharge in good time, and the speed, attitude and captured pictures of the unmanned aerial vehicle are acquired in real time during flight.
The processing station control unit is communicatively connected with the processing station; the processing station comprises a data receiving unit and a monitoring screen, both connected to the unmanned aerial vehicle processor and the data acquisition unit through the wireless communication system, providing monitoring, control and information-processing functions.
The scheme discloses a three-dimensional live-action modeling method for aerial survey of an unmanned aerial vehicle, and the beneficial effects obtained by adopting the scheme are as follows:
1. before aerial survey, preparation before survey is carried out according to the characteristics of the region to be measured, the layout of the image control points adopts a measuring method of a drawing root point and an axisymmetric layout mode, the layout density of the image control points is increased at the complex boundary of the aerial survey region, and the problems of redundant workload contained in the banded aerial survey region and overlapping of non-survey regions can be reduced.
2. The unmanned aerial vehicle flies to the corresponding image control point and performs oblique photography, obtaining image data containing the image control point. Through the neural network trained during the construction of the image control point identification classification model, the images are preprocessed and classified, the presence of an image control point in the image is judged, and, if present, the image is segmented, realizing classification and identification of the image control points. Defective frames are screened out, reducing later workload and facilitating image classification and subsequent live-action modeling.
3. The image data acquired at each image control point is input into the image quality model for precision evaluation. Using different course overlap rates, side overlaps, aerial survey heights and datum point heights, the image quality model calculates, through the fuzzy comprehensive evaluation model, the precision of each configuration together with the elevation error and horizontal error of each instance for comparative analysis. It thus judges whether the image quality meets the required precision and deletes redundant images, solving the problem of heavy workload in the prior art as well as the problems of insufficient aerial survey quality and poor measurement precision.
4. In addition, the system provided by the scheme includes real-time monitoring: the processing station is equipped with a monitoring screen for real-time monitoring, providing monitoring, control and information-processing functions, and the battery charge of the unmanned aerial vehicle is also monitored, ensuring its normal use.
Drawings
The accompanying drawings are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate the invention and together with the embodiments of the invention, serve to explain the invention.
FIG. 1 is a method flow diagram of the present modeling method;
FIG. 2 is a diagram of the components of the present modeling system;
FIG. 3 is a flowchart of a process for constructing and training an image control point identification classification model according to an embodiment;
fig. 4 is a flowchart of image quality model evaluation to acquire image accuracy in the embodiment.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and fully with reference to the accompanying drawings, in which it is evident that the embodiments described are only some, but not all embodiments of the invention; all other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
The embodiment of the invention provides a three-dimensional live-action modeling system based on unmanned aerial vehicle aerial survey, which is shown in fig. 2, and comprises an unmanned aerial vehicle, a processing station and a communication base station, wherein the unmanned aerial vehicle is connected with the communication base station so as to realize communication with a ground processing station, and the processing station is used for controlling the unmanned aerial vehicle.
Referring to fig. 2, the unmanned aerial vehicle includes an unmanned aerial vehicle control unit, a data acquisition unit and a processing station control unit.
The unmanned aerial vehicle control unit is used for monitoring and acquiring the various states of the unmanned aerial vehicle and ensuring its normal aerial survey operation. The unmanned aerial vehicle control unit comprises: an unmanned aerial vehicle processor, an inertial measurement system, a positioning system, a power supply system, a storage system and a wireless communication system, all connected to the unmanned aerial vehicle processor, which receives and processes signals. In a preferred embodiment, the inertial measurement system consists of an accelerometer and a gyroscope, and is used to sense the acceleration of the unmanned aerial vehicle and obtain its speed, attitude and related data through integral operations. The accelerometer measures acceleration relative to inertial space during movement and indicates the direction of the local vertical; the gyroscope measures the angular displacement of the unmanned aerial vehicle relative to the rotation of the carrying platform and indicates the direction of the Earth's rotation axis. Through the inertial measurement system, the attitude of the unmanned aerial vehicle is obtained, which facilitates control of the shooting angle. The positioning system uses GPS or Beidou navigation positioning to locate the unmanned aerial vehicle. The power supply system uses a lithium-ion power battery, and the storage system stores data on a memory card combined with an external hard disk. The wireless communication system uses a long-range WiFi module.
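The integral operation described for the inertial measurement system can be illustrated in one dimension with trapezoidal integration. This is a single-axis sketch with invented names, not the patent's implementation:

```python
def integrate_imu(accel, gyro, dt, v0=0.0, angle0=0.0):
    """Trapezoidal integration of accelerometer readings into velocity and
    of gyroscope angular rates into an attitude angle, along one axis.
    `accel` and `gyro` are equally spaced samples taken dt seconds apart."""
    v, angle = v0, angle0
    velocities, angles = [], []
    for i in range(1, len(accel)):
        v += 0.5 * (accel[i - 1] + accel[i]) * dt       # m/s from m/s^2
        angle += 0.5 * (gyro[i - 1] + gyro[i]) * dt     # rad from rad/s
        velocities.append(v)
        angles.append(angle)
    return velocities, angles
```

A real system would integrate three axes and fuse the two sensors (e.g. with a complementary or Kalman filter) to suppress drift; the sketch only shows the integration step the text names.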
The data acquisition unit comprises a camera, an infrared range finder and a battery information sensor. The camera shoots pictures and videos; the infrared range finder measures the height and distance between the unmanned aerial vehicle body and the base point and between the body and the image control points; the battery information sensor collects battery charge information. During the aerial survey, when the real-time charge of the unmanned aerial vehicle battery falls below a set threshold, the unmanned aerial vehicle processor sends a signal to the processing station through the wireless communication system to remind the staff to command the unmanned aerial vehicle to return, avoiding loss. In use, the power supply system powers the unmanned aerial vehicle, the battery information sensor monitors the battery charge so the aircraft can return to recharge in good time, and the speed, attitude and captured pictures of the unmanned aerial vehicle are acquired in real time during flight.
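The battery threshold behaviour described above reduces to a simple comparison each monitoring cycle. The callback-style API below is hypothetical; the patent describes the behaviour (signal the processing station over the wireless link and prompt a return flight), not an interface:

```python
def battery_monitor(read_charge, threshold, notify):
    """One monitoring cycle: compare the real-time battery charge with the
    preset threshold and, when it is lower, signal the processing station
    via the `notify` callback so staff can command a return flight."""
    charge = read_charge()
    if charge < threshold:
        notify({"event": "low_battery", "charge": charge,
                "action": "return_to_home"})
        return True
    return False
```

In the real system this check would run periodically on the unmanned aerial vehicle processor, with `notify` implemented over the wireless communication system.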
The processing station control unit: the processing station comprises a data receiving unit and a monitoring screen; the monitoring screen and the data receiving unit are connected to the unmanned aerial vehicle processor and the data acquisition unit through the wireless communication system, providing monitoring, control and information-processing functions.
Referring to fig. 1, 3 and 4, on the basis of the system, the present solution also discloses a three-dimensional live-action modeling method based on unmanned aerial vehicle aerial survey, which comprises the following steps:
s1: setting a processing station and a base point on the ground, reasonably arranging a plurality of groups of image control points, wherein the image control points are arranged in a measuring method of a picture root point and an axisymmetric arrangement mode, increasing the arrangement density of the image control points at a complex boundary of an aerial survey area, presetting aerial photographing parameters of an unmanned aerial vehicle by the processing station, generating a banded aerial survey area, inputting the position information of the image control points to the processing station to form a coordinate graph, and acquiring a navigation route from a current position to the image control points;
The aerial photographing parameter settings include the course overlap, side overlap, aerial survey height and datum point height; the strip-shaped aerial survey area is generated from these parameters and the unmanned aerial vehicle lens parameters.
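The planning quantities of step S1 admit a compact sketch. The formulas below are not taken from the patent: they are the standard pinhole-camera footprint geometry plus a greedy nearest-neighbour visiting order, with all function and parameter names invented for illustration:

```python
import math

def strip_survey_plan(height_m, focal_mm, sensor_w_mm, sensor_h_mm,
                      course_overlap, side_overlap):
    """Ground footprint of one frame and the exposure/line spacing that
    realises the requested course (forward) and side overlaps."""
    scale = height_m / (focal_mm / 1000.0)              # ground m per sensor m
    footprint_w = sensor_w_mm / 1000.0 * scale          # across-track width, m
    footprint_h = sensor_h_mm / 1000.0 * scale          # along-track length, m
    photo_base = footprint_h * (1.0 - course_overlap)   # distance between exposures
    line_spacing = footprint_w * (1.0 - side_overlap)   # distance between strips
    return footprint_w, footprint_h, photo_base, line_spacing

def plan_route(start, control_points):
    """Greedy nearest-neighbour visiting order over the image control points
    of the coordinate graph (a routing heuristic, not the patent's method)."""
    remaining = dict(control_points)                    # name -> (x, y) in metres
    route, pos = [], start
    while remaining:
        nearest = min(remaining, key=lambda n: math.dist(pos, remaining[n]))
        route.append(nearest)
        pos = remaining.pop(nearest)
    return route
```

For example, a 36 x 24 mm sensor behind a 35 mm lens at 100 m flying height with 80% course overlap gives an exposure roughly every 13.7 m along the strip.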
S2: the processing station controls the unmanned aerial vehicle to fly from the reference coordinate control point to the corresponding image control point and performs oblique photography through the unmanned aerial vehicle to obtain image data containing the image control point; an image control point identification classification model identifies the image control point markers in the image data and preprocesses the images, screening out defective frames;
the construction and training process of the image control point identification classification model comprises the following steps:
s2.1: constructing a plurality of image control point markers, shooting original images of the markers, and preprocessing the images, including cropping, resizing, grayscale conversion and normalization, to ensure the consistency and usability of the image data;
s2.2: automatically learning features with a convolutional neural network in a deep learning model, converting the images into feature representations that a machine learning algorithm can process; an encoder-decoder neural network based on the VGG11 backbone is adopted and trained;
s2.3: annotating the preprocessed image data set to form annotation files, assigning the correct classification label to each sample, selecting a suitable classification model according to the characteristics of the task, and pairing each annotation file with its original image to form the sample set;
s2.4: during training, the original image is first classified to judge whether an image control point is present; if so, the image is segmented so that every pixel is classified accurately;
s2.5: dividing the sample set into a training set (70% of the samples) and a test set (30%); evaluating the trained model on the test set, calculating its accuracy and precision, and optimizing the model according to the evaluation results;
s2.6: finally, deploying the trained model into the production environment to identify and classify the image control points in the images.
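The preprocessing and data-split steps described in s2.1 and s2.5 can be sketched as follows. This is a minimal illustration, not the patented implementation: the function names (`preprocess`, `split_samples`), the 224-pixel output size and the nearest-neighbour resize are assumptions.

```python
import numpy as np

def preprocess(image: np.ndarray, out_size: int = 224) -> np.ndarray:
    """Crop to a centred square, resize (nearest-neighbour), convert to
    grayscale, and normalise to [0, 1] -- the four steps listed in s2.1."""
    h, w = image.shape[:2]
    side = min(h, w)
    top, left = (h - side) // 2, (w - side) // 2
    square = image[top:top + side, left:left + side]      # crop
    idx = np.arange(out_size) * side // out_size
    resized = square[idx][:, idx]                         # resize
    if resized.ndim == 3:                                 # grayscale
        resized = resized.mean(axis=2)
    return resized.astype(np.float32) / 255.0             # normalise

def split_samples(samples, seed=0):
    """70/30 split of the labelled sample set, as in s2.5."""
    rng = np.random.default_rng(seed)
    order = rng.permutation(len(samples))
    cut = int(len(samples) * 0.7)
    return ([samples[i] for i in order[:cut]],
            [samples[i] for i in order[cut:]])
```

In practice the resize would use a proper interpolation library; the index-based version above only keeps the sketch dependency-light.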
By flying to the corresponding image control point and performing oblique photography, image data containing the image control point is obtained. During the construction and training of the image control point identification classification model, the neural network preprocesses and classifies each image and judges whether an image control point is present; if so, the image is segmented, realizing classification and identification of the image control points. Reject frames are screened out, which reduces the later workload and facilitates both image classification and subsequent live-action modeling.
S3: in the photographing process, inputting the image data obtained each time into an image quality model for precision evaluation; if the precision accords with the preset value, continuing to acquire the image data of the next image control point until all the image data of the image control point are acquired; if the precision does not accord with the preset value, re-acquiring the image;
the evaluation flow of the image quality model comprises the following steps:
s3.1: selecting a target point in the captured image for example analysis;
s3.2: selecting different layouts of heading overlap rate, side overlap degree, aerial survey height and datum point height, and counting the elevation error and horizontal error of each example;
s3.3: calculating, through the fuzzy comprehensive evaluation model, the precision of each condition together with the elevation and horizontal errors of each example for comparative analysis.
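A fuzzy comprehensive evaluation of the kind named in s3.3 composes a factor weight vector with a membership matrix to obtain a grade vector. The sketch below is illustrative only: the weights and membership values are invented for demonstration and are not taken from the patent.

```python
import numpy as np

def fuzzy_evaluate(weights, membership):
    """weights: (n_factors,); membership: (n_factors, n_grades).
    Returns a normalised grade vector B = W . R."""
    w = np.asarray(weights, dtype=float)
    r = np.asarray(membership, dtype=float)
    b = w @ r                      # weighted composition
    return b / b.sum()             # normalise so grades sum to 1

# Factors: heading overlap, side overlap, aerial survey height, datum height.
W = [0.3, 0.3, 0.2, 0.2]
# Membership of each factor in the grades (excellent, good, poor) -- invented.
R = [[0.7, 0.2, 0.1],
     [0.6, 0.3, 0.1],
     [0.5, 0.4, 0.1],
     [0.4, 0.4, 0.2]]
grade = fuzzy_evaluate(W, R)       # the largest component gives the grade
```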
S4: and covering the original blurred vision by using the re-acquired image conforming to the precision, updating the live-action, and modeling according to the acquired image to form a final live-action model diagram.
The image data acquired at each image control point is input into the image quality model for precision evaluation. By arranging different heading overlap rates, side overlap degrees, aerial survey heights and datum point heights, and computing with the fuzzy comprehensive evaluation model, the image quality model obtains the precision of each condition together with the elevation and horizontal errors of the examples for comparative analysis. It thereby judges whether the image quality meets the standard and whether the image must be re-acquired, solving the heavy-workload problem of the prior art as well as its insufficient aerial survey precision and poor measurement accuracy.
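The accept/re-acquire loop of step S3 can be sketched as below. This is hypothetical: `capture_image`, `evaluate_precision`, the threshold value and the retry limit are placeholders for the processing station's real interfaces.

```python
def survey_control_points(control_points, capture_image, evaluate_precision,
                          threshold=0.9, max_retries=3):
    """For each image control point, capture an image and re-acquire until
    the evaluated precision meets the preset threshold (S3)."""
    accepted = {}
    for cp in control_points:
        for _attempt in range(max_retries):
            image = capture_image(cp)
            if evaluate_precision(image) >= threshold:  # meets preset value
                accepted[cp] = image
                break                                   # next control point
        else:
            accepted[cp] = None                         # flag for manual review
    return accepted
```

The `for ... else` runs only when every retry failed, which makes the manual-review flag explicit without extra state.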
It is noted that relational terms such as first and second, and the like are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus.

Claims (7)

1. The three-dimensional live-action modeling method based on unmanned aerial vehicle aerial survey is characterized by comprising the following steps of:
s1: setting a processing station and a base point on the ground, reasonably arranging a plurality of groups of image control points, presetting aerial photographing parameters of an unmanned aerial vehicle by the processing station, generating a banded aerial survey area, inputting position information of the image control points to the processing station to form a coordinate graph, and acquiring a navigation route from the current position to the image control points;
s2: the processing station controls the unmanned aerial vehicle to fly from the reference coordinate control point to the corresponding image control point, performs oblique photography through the unmanned aerial vehicle to obtain image data containing the image control point, preprocesses the obtained image data, and screens out reject frames;
s3: in the photographing process, inputting the image data obtained each time into an image quality model for precision evaluation; if the precision accords with the preset value, continuing to acquire the image data of the next image control point until all the image data of the image control point are acquired; if the precision does not accord with the preset value, re-acquiring the image;
s4: replacing the original blurred view with the re-acquired image that meets the precision requirement, updating the live-action data, and modeling from the acquired images to form the final live-action model diagram.
2. The three-dimensional live-action modeling method based on unmanned aerial vehicle aerial survey of claim 1, wherein the method comprises the following steps: the setting of aerial photographing parameters in step S1 comprises: setting the heading overlap degree, side overlap degree, aerial survey height and datum point height, and generating the strip aerial survey area according to these aerial photographing parameters and the unmanned aerial vehicle lens parameters.
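The relation in claim 2 between flight height, lens parameters and overlap degrees can be illustrated with the standard photogrammetric footprint calculation below. The sensor and focal-length figures are invented examples, not values from the patent.

```python
def strip_plan(height_m, focal_mm, sensor_w_mm, sensor_h_mm,
               forward_overlap, side_overlap):
    """Ground footprint from flight height and lens geometry, then the
    photo interval and flight-line spacing from the overlap degrees."""
    ground_w = height_m * sensor_w_mm / focal_mm     # cross-track coverage (m)
    ground_h = height_m * sensor_h_mm / focal_mm     # along-track coverage (m)
    photo_interval = ground_h * (1 - forward_overlap)  # along-track spacing (m)
    line_spacing = ground_w * (1 - side_overlap)       # cross-track spacing (m)
    return photo_interval, line_spacing

# Illustrative: 100 m height, 8.8 mm lens, 13.2 x 8.8 mm sensor,
# 80% heading overlap and 70% side overlap.
interval, spacing = strip_plan(100, 8.8, 13.2, 8.8, 0.8, 0.7)
```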
3. The three-dimensional live-action modeling method based on unmanned aerial vehicle aerial survey according to claim 2, wherein the method comprises the following steps: in step S2, an image control point identification classification model is adopted to identify the image control point markers in the image data and preprocess the images.
4. A three-dimensional live-action modeling method based on unmanned aerial vehicle aerial survey according to claim 3, wherein: the construction and training process of the image control point identification classification model comprises the following steps:
step 1: constructing a plurality of image control point markers, shooting original images of the markers, and preprocessing the images, including cropping, resizing, grayscale conversion and normalization, to ensure the consistency and usability of the image data;
step 2: automatically learning features with a convolutional neural network in a deep learning model, converting the images into feature representations that a machine learning algorithm can process; an encoder-decoder neural network based on the VGG11 backbone is adopted and trained;
step 3: annotating the preprocessed image data set to form annotation files, assigning the correct classification label to each sample, selecting a suitable classification model according to the characteristics of the task, and pairing each annotation file with its original image to form the sample set;
step 4: during training, the original image is first classified to judge whether an image control point is present; if so, the image is segmented so that every pixel is classified accurately;
step 5: dividing the sample set into a training set (70% of the samples) and a test set (30%); evaluating the trained model on the test set, calculating its accuracy and precision, and optimizing the model according to the evaluation results;
step 6: finally, deploying the trained model into the production environment to identify and classify the image control points in the images.
5. The three-dimensional live-action modeling method based on unmanned aerial vehicle aerial survey of claim 4, wherein the method comprises the following steps: the evaluation flow of the image quality model in the step S3 includes the following steps:
step one, selecting a target point in the captured image for example analysis;
step two, selecting different layouts of heading overlap rate, side overlap degree, aerial survey height and datum point height, and counting the elevation error and horizontal error of each example;
step three, calculating, through the fuzzy comprehensive evaluation model, the precision of each condition together with the elevation and horizontal errors of each example for comparative analysis.
6. The three-dimensional live-action modeling method based on unmanned aerial vehicle aerial survey of claim 1, wherein the method comprises the following steps: in step S1, the image control points are laid out by a root-point measuring method in an axisymmetric pattern, and the layout density of the image control points is increased at complex boundaries of the aerial survey area.
7. The modeling system of the three-dimensional live-action modeling method based on unmanned aerial vehicle aerial survey according to any one of claims 1 to 6, wherein: the system comprises an unmanned aerial vehicle, a processing station and a communication base station, the unmanned aerial vehicle being connected with the communication base station to communicate with the ground processing station, and the processing station being used for controlling the unmanned aerial vehicle;
the system comprises an unmanned aerial vehicle control unit, a data acquisition unit and a processing station control unit;
the unmanned aerial vehicle control unit includes: the system comprises an unmanned aerial vehicle processor, an inertial measurement system, a positioning system, a power supply system, a storage system and a wireless communication system, wherein the systems are connected with the unmanned aerial vehicle processor;
the unmanned aerial vehicle processor is used for receiving and processing signals;
the inertial measurement system consists of an accelerometer and a gyroscope and is used for sensing the acceleration of the unmanned aerial vehicle, with velocity and attitude data obtained through integral operation; the accelerometer measures the acceleration of the unmanned aerial vehicle relative to inertial space during motion and indicates the direction of the local vertical; the gyroscope measures the angular displacement of the unmanned aerial vehicle relative to the rotating direction of the carrying platform and indicates the direction of the earth's rotation axis; through this inertial measurement system the attitude of the unmanned aerial vehicle is acquired, facilitating control of the shooting angle;
the positioning system adopts GPS positioning or Beidou navigation positioning and is used for positioning the position of the unmanned aerial vehicle;
the power supply system adopts a lithium ion power battery;
the storage system stores data in a mode of combining a storage card with an external hard disk;
the wireless communication system adopts a remote WiFi module to realize communication;
the data acquisition unit comprises a camera, an infrared range finder and a battery information sensor; the camera shoots pictures and videos, the infrared range finder measures the height distances between the unmanned aerial vehicle body and the base point and between the body and the image control points, and the battery information sensor collects battery charge information; when the real-time charge of the unmanned aerial vehicle battery falls below a set threshold during aerial survey, the unmanned aerial vehicle processor sends a signal to the processing station through the wireless communication system to alert the workers;
the processing station control unit is connected with the processing station in a communication way, the processing station comprises a data receiving unit and a monitoring screen, and the monitoring screen and the data receiving unit are connected with the unmanned aerial vehicle processor and the data acquisition unit through a wireless communication system to play roles in monitoring, controlling and information processing.
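The integral operation described for the inertial measurement system above can be illustrated with a one-dimensional Euler-integration sketch; the sample format, rates and time step are invented for demonstration and are far simpler than a real strapdown attitude solution.

```python
def integrate_imu(samples, dt):
    """samples: list of (angular_rate_dps, accel_mps2) pairs sampled at
    interval dt. Returns (heading_deg, velocity_mps) after simple Euler
    integration -- gyro rate -> attitude, accelerometer -> velocity."""
    heading, velocity = 0.0, 0.0
    for rate, accel in samples:
        heading += rate * dt       # attitude from the gyroscope
        velocity += accel * dt     # speed from the accelerometer
    return heading, velocity
```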
CN202311377081.2A 2023-10-24 2023-10-24 Three-dimensional live-action modeling system and method based on unmanned aerial vehicle aerial survey Pending CN117456092A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311377081.2A CN117456092A (en) 2023-10-24 2023-10-24 Three-dimensional live-action modeling system and method based on unmanned aerial vehicle aerial survey


Publications (1)

Publication Number Publication Date
CN117456092A true CN117456092A (en) 2024-01-26

Family

ID=89588451

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311377081.2A Pending CN117456092A (en) 2023-10-24 2023-10-24 Three-dimensional live-action modeling system and method based on unmanned aerial vehicle aerial survey

Country Status (1)

Country Link
CN (1) CN117456092A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117950355A (en) * 2024-03-27 2024-04-30 西安爱生无人机技术有限公司 Reconnaissance unmanned aerial vehicle supervision control system and reconnaissance unmanned aerial vehicle supervision control method
CN117950355B (en) * 2024-03-27 2024-06-11 西安爱生无人机技术有限公司 Reconnaissance unmanned aerial vehicle supervision control system and reconnaissance unmanned aerial vehicle supervision control method


Legal Events

Date Code Title Description
PB01 Publication