CN114841956A - Damage assessment method, system, equipment and storage medium based on image analysis - Google Patents
- Publication number
- CN114841956A (application number CN202210465292.0A)
- Authority
- CN
- China
- Prior art keywords
- target
- damage
- image
- damaged
- images
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G06T7/0002—Inspection of images, e.g. flaw detection; G06T7/0004—Industrial image inspection
- G06T3/04—Context-preserving transformations; G06T3/40—Scaling of whole images or parts thereof; G06T3/60—Rotation of whole images or parts thereof
- G06T5/40—Image enhancement or restoration using histogram techniques; G06T5/80—Geometric correction; G06T5/94—Dynamic range modification based on local image properties
- G06T7/11—Region-based segmentation
- G06V10/764—Recognition using classification, e.g. of video objects; G06V10/774—Generating sets of training patterns, e.g. bagging or boosting
- G06T2207/30168—Image quality inspection; G06T2207/30212—Military
Abstract
The invention discloses a damage assessment method, system, equipment, and storage medium based on image analysis. Image analysis is used to extract target damage features that are robust to image rotation and image scaling, so the method is applicable to target images captured at different heights and angles; from these damage features, a trained Gaussian mixture model accurately estimates the damage degree of the target.
Description
Technical Field
The present invention relates to the field of damage assessment technologies, and in particular, to a method, a system, a device, and a storage medium for damage assessment based on image analysis.
Background
Image-based damage assessment works as follows: after a target or its surrounding area is struck, images of the damaged target are acquired by aerial, space-based, or human reconnaissance; the changes between the pre-strike and post-strike target images are analyzed by manual interpretation, image processing, mathematical modeling, and similar means; and the damage degree or damage grade of the struck target is assessed, providing important support for a commander's subsequent operational planning.
Foreign research on target damage assessment started earlier; initially, the damage degree was assessed mainly by manual interpretation. Manual interpretation is inefficient and cannot meet the real-time and accuracy requirements that modern warfare places on damage assessment, so current foreign research focuses on automatic image-based discrimination. Many image-based target damage assessment methods have also been developed in China. Common methods include those based on the burst-point area and those based on the gray-level co-occurrence matrix. Burst-point-area methods judge the damage degree by detecting the area of the burst point in the image; they are intuitive and computationally light, but their reliability is low, and they adapt poorly to damage evaluation of targets with different shapes. Gray-level co-occurrence matrix methods usually compute the rate of change of features such as entropy and contrast from the image's gray-level co-occurrence matrix, and then judge the damage grade from the distribution range of these rates of change; the co-occurrence matrix requires a large amount of computation and is affected by image rotation. With the development of deep learning, convolutional neural network (CNN) based methods have also been applied to target damage assessment. Deep learning models such as CNNs require a large number of training images, and in the military field it is very difficult to acquire a large number of target damage images. In addition, CNNs require a large amount of computation, which places high demands on the hardware of a target damage assessment system.
These factors make the convolutional neural network approach difficult to apply in missile-borne devices, small drones, and other scenarios.
Disclosure of Invention
The invention aims to provide a damage assessment method, system, equipment, and storage medium based on image analysis that require only a small number of training images, and to provide target damage features that require little computation, are robust to image rotation and image scaling, and allow the damage degree of a target to be assessed accurately.
The purpose of the invention is realized by the following technical scheme:
a damage assessment method based on image analysis, comprising:
correcting the image after the target is damaged by using the image before the target is damaged;
intercepting target area images from the pre-damage image and the corrected post-damage image, respectively;
calculating a gray difference map and a correlation coefficient map between the pre-damage target area image and the post-damage target area image;
combining the gray difference map and the correlation coefficient map to obtain a target damage feature;
and evaluating the target damage degree using the target damage feature.
A damage assessment system based on image analysis, comprising:
the image correction unit is used for correcting the image after the target is damaged by utilizing the image before the target is damaged;
the target area image intercepting unit is used for respectively intercepting target area images from the image before the target is damaged and the corrected image after the target is damaged;
the gray difference map and correlation coefficient map calculating unit is used for calculating a gray difference map and a correlation coefficient map between a target area image before the target is damaged and a target area image after the target is damaged;
the target damage characteristic acquisition unit is used for combining the gray difference image and the correlation coefficient image to acquire target damage characteristics;
and the target damage evaluation unit is used for evaluating the target damage degree by utilizing the target damage characteristics.
A processing device, comprising: one or more processors; a memory for storing one or more programs;
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the aforementioned methods.
A readable storage medium, storing a computer program which, when executed by a processor, implements the aforementioned method.
With the technical scheme provided by the invention, image analysis is used to extract target damage features that are robust to image rotation and image scaling, so the method is applicable to target images captured at different heights and angles; from these damage features, the trained Gaussian mixture model accurately estimates the damage degree of the target.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed to be used in the description of the embodiments are briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art to obtain other drawings based on the drawings without creative efforts.
Fig. 1 is a flowchart of a damage assessment method based on image analysis according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of an uncorrected image before and after a target damage according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of an image before and after a corrected target damage according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of an intercepted target area image provided by an embodiment of the invention;
FIG. 5 is a schematic diagram of a gray scale difference between a pre-damage image and a post-damage image according to an embodiment of the present invention;
FIG. 6 is a schematic diagram of correlation coefficient images between pre-damage and post-damage images according to an embodiment of the present invention;
FIG. 7 is a schematic diagram of a damage assessment system based on image analysis according to an embodiment of the present invention;
fig. 8 is a schematic diagram of a processing apparatus according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention are clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments of the present invention without making any creative effort, shall fall within the protection scope of the present invention.
The terms that may be used herein are first described as follows:
the terms "comprising," "including," "containing," "having," or other similar terms of meaning should be construed as non-exclusive inclusions. For example: including a feature (e.g., material, component, ingredient, carrier, formulation, material, dimension, part, component, mechanism, device, process, procedure, method, reaction condition, processing condition, parameter, algorithm, signal, data, product, or article of manufacture), is to be construed as including not only the particular feature explicitly listed but also other features not explicitly listed as such which are known in the art.
The scheme provided by the present invention is described in detail below. Details not described in the embodiments of the invention belong to prior art known to those skilled in the art. Steps whose conditions are not specifically mentioned in the examples were carried out under conditions conventional in the art or suggested by the manufacturer; reagents or instruments whose manufacturers are not specified are conventional, commercially available products.
Example one
As shown in fig. 1, a flowchart of a damage assessment method based on image analysis according to an embodiment of the present invention mainly includes the following steps:
Step 1: correct the image after the target is damaged by using the image before the target is damaged. The preferred embodiment of this step is as follows:
1) Images before and after the target is damaged are obtained, and the SURF (Speeded-Up Robust Features) descriptors of the two images are computed. For the pre-damage image, the l-th SURF feature is denoted r_l; for the post-damage image, the h-th SURF feature is denoted s_h.
2) The SURF features of the two images are matched with the nearest-neighbor ratio method, according to the Manhattan distance between SURF features, to obtain an initial pairing of the SURF features.
The specific process is as follows:
For r_l, compute the Manhattan distance d(r_l, s_h):

d(r_l, s_h) = ||r_l - s_h||_1

By comparing the Manhattan distances between SURF features, the nearest-neighbor SURF feature s_1 and the second-nearest-neighbor SURF feature s_2 of r_l are found in the post-damage image. If the ratio of d(r_l, s_1) to d(r_l, s_2) is less than a given threshold, then r_l and s_1 match successfully, and the pair (r_l, s_1) is obtained. The other SURF feature pairs are found in the same way. These pairs form the initial pairing of SURF features, denoted (r_k, s_k), k = 1, 2, ..., K, where K is the number of pairs. The pixel coordinates of the feature points corresponding to the SURF features r_k and s_k in their images are denoted q(r_k) and q(s_k), respectively.
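The nearest-neighbor ratio matching described above can be sketched as follows. This is a minimal numpy illustration that assumes SURF descriptors are already available as arrays; the function name and the ratio threshold are illustrative choices, not values given in the patent.

```python
import numpy as np

def match_surf_ratio(desc_pre, desc_post, ratio_thresh=0.7):
    """Nearest-neighbour ratio matching under the Manhattan (L1) distance.

    desc_pre  : (P, D) SURF descriptors of the pre-damage image
    desc_post : (Q, D) SURF descriptors of the post-damage image
    Returns a list of index pairs (l, h) forming the initial pairing.
    """
    pairs = []
    for l, r in enumerate(desc_pre):
        # L1 distance from r_l to every descriptor s_h of the post-damage image
        d = np.abs(desc_post - r).sum(axis=1)
        h1, h2 = np.argsort(d)[:2]        # nearest and second-nearest neighbours
        if d[h1] < ratio_thresh * d[h2]:  # ratio test: keep only distinctive matches
            pairs.append((l, h1))
    return pairs
```

The comparison d(r_l, s_1) < ratio · d(r_l, s_2) is the nearest-neighbor ratio test: a match is kept only when the nearest neighbor is clearly closer than the second-nearest.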
3) The initial pairing of SURF features is refined with the random sample consensus (RANSAC) method, and the perspective transformation matrix of the post-damage image is computed with the pre-damage image as reference.
The specific process is as follows:
K_1 pairs (K_1 < K) are selected from the initial feature pairs, and the perspective transformation matrix H is computed by minimizing the projection error of the paired feature points on the image plane. In homogeneous coordinate form, the projection error is the distance between q(r_k) and the projection of q(s_k) under H, summed over the selected pairs.

H is a 3 × 3 perspective transformation matrix. After H is obtained by the least-squares method, the projection errors of the remaining K - K_1 pairs are computed with H. Matched points whose projection error is within the threshold are called inliers, and the others are called outliers. The above steps are repeated; the matrix H with the largest number of inliers is the solved perspective transformation matrix.
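Step 3) can be sketched as a RANSAC loop around a direct-linear-transform (DLT) homography fit. This is a generic illustration of the technique rather than the patent's implementation: the minimal sample size of 4, the iteration count, and the inlier threshold are our assumptions, and the final least-squares refit over all inliers mirrors the least-squares step described above.

```python
import numpy as np

def homography_dlt(q_r, q_s):
    """Least-squares DLT estimate of H such that q_r ~ H * q_s (homogeneous)."""
    A = []
    for (x, y), (u, v) in zip(q_r, q_s):
        A.append([-u, -v, -1, 0, 0, 0, x * u, x * v, x])
        A.append([0, 0, 0, -u, -v, -1, y * u, y * v, y])
    _, _, Vt = np.linalg.svd(np.asarray(A, float))
    H = Vt[-1].reshape(3, 3)     # null vector of A gives the 9 entries of H
    return H / H[2, 2]

def project(H, pts):
    """Apply a 3x3 perspective transformation to (N, 2) points."""
    p = np.hstack([pts, np.ones((len(pts), 1))]) @ H.T
    return p[:, :2] / p[:, 2:3]

def ransac_homography(q_r, q_s, n_iter=200, thresh=2.0, seed=0):
    rng = np.random.default_rng(seed)
    best_H, best_inl = None, -1
    for _ in range(n_iter):
        idx = rng.choice(len(q_r), 4, replace=False)  # minimal sample (K_1 = 4)
        H = homography_dlt(q_r[idx], q_s[idx])
        err = np.linalg.norm(project(H, q_s) - q_r, axis=1)
        inl = (err < thresh).sum()
        if inl > best_inl:                # keep the model with the most inliers
            best_H, best_inl = H, inl
    # least-squares refit on the inliers of the best model
    mask = np.linalg.norm(project(best_H, q_s) - q_r, axis=1) < thresh
    return homography_dlt(q_r[mask], q_s[mask])
```

In practice a library routine such as OpenCV's homography estimation would be used; the explicit loop above only shows the inlier/outlier mechanism the text describes.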
4) Perspective transformation is applied to the post-damage image with the perspective transformation matrix, yielding the corrected post-damage image.
Fig. 2 shows the acquired uncorrected target images; the left and right images are before and after the target is damaged, respectively. Fig. 3 shows the corrected images; the left and right images are the pre-damage target image and the corrected post-damage target image.
Step 2: intercept target area images from the pre-damage image and from the corrected post-damage image, respectively.
In the embodiment of the invention, the original pre-damage image is cropped according to the circumscribed rectangle of the target area, yielding the normalized pre-damage target area image I_1; the same circumscribed rectangle is cropped from the corrected post-damage image, yielding the normalized post-damage target area image I_2.

As shown in Fig. 4, the left and right parts of the intercepted target area images are the pre-damage target area image I_1 and the post-damage target area image I_2.
Step 3: calculate the gray difference map and the correlation coefficient map between the pre-damage target area image and the post-damage target area image.
1) Compute the gray difference map I_diff. Let (x, y) denote the horizontal and vertical pixel coordinates; the gray difference I_diff(x, y) of the pixel at position (x, y) is computed as a Gaussian-weighted sum, over a square neighborhood, of the gray-level differences between the two target area images, where I_1 and I_2 respectively denote the pre-damage and post-damage target area images, (u, v) are coordinates within the square neighborhood of the pixel at position (x, y), L is the side length of the square neighborhood, and g(u, v) is the Gaussian smoothing window.
Fig. 5 shows an example of a gray scale difference map between the pre-and post-damage images.
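Since the formula image for I_diff is not reproduced in the text, the sketch below assumes one plausible reading of the description: a normalized Gaussian window g(u, v) over an L × L neighborhood weighting the absolute gray-level differences between I_1 and I_2. All names and the window parameters are illustrative.

```python
import numpy as np

def gaussian_window(L, sigma=None):
    """L x L Gaussian smoothing window g(u, v), normalised to sum to 1."""
    sigma = sigma or L / 4.0
    ax = np.arange(L) - (L - 1) / 2.0
    g = np.exp(-(ax[:, None] ** 2 + ax[None, :] ** 2) / (2 * sigma ** 2))
    return g / g.sum()

def gray_difference_map(I1, I2, L=5):
    """Gaussian-weighted local gray-level difference between I1 and I2.

    Assumed form of the omitted formula:
        I_diff(x, y) = sum_{u,v} g(u, v) * |I1 - I2|
    over the L x L neighbourhood centred at (x, y).
    """
    g = gaussian_window(L)
    diff = np.abs(I1.astype(float) - I2.astype(float))
    pad = L // 2
    d = np.pad(diff, pad, mode='edge')   # replicate edges at the border
    out = np.zeros_like(diff)
    for x in range(diff.shape[0]):
        for y in range(diff.shape[1]):
            out[x, y] = (g * d[x:x + L, y:y + L]).sum()
    return out
```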
2) Compute the correlation coefficient map.

First, the correlation coefficient matrix A(x, y) is computed as the local correlation coefficient between I_1 and I_2 in the neighborhood of each pixel.

Since the value of the correlation coefficient matrix A(x, y) lies between -1 and 1, the smaller the difference between the pre- and post-damage images, the larger the correlation coefficient. Therefore, scaling A(x, y) to the range 0 to 255 and taking the inverse yields the correlation coefficient map I_coef; the correlation coefficient value I_coef(x, y) of the pixel at position (x, y) is 255 minus the scaled value of A(x, y).
as shown in fig. 6, is an example of a correlation coefficient image between pre-and post-lesional images.
Step 4: combine the gray difference map and the correlation coefficient map to obtain the target damage feature.
In the embodiment of the invention, the pixel values of the gray difference map and the correlation coefficient map are first quantized and compressed; the histogram distributions of the two compressed maps are then computed, and the two histogram vectors are concatenated to obtain the target damage feature.

Illustratively, after pixel-value quantization and compression, the pixel values of the gray difference map and the correlation coefficient map are scaled from 0-255 to 0-63; computing the histogram distributions then gives a 64-dimensional histogram vector for each map, and concatenating the two vectors yields a 128-dimensional feature vector, which is used as the target damage feature. The target damage feature is robust to rotation and scaling of the target.
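The feature construction of step 4 is straightforward to sketch, assuming a simple 4:1 quantization from 256 levels down to 64 and histogram normalization (the patent does not state whether the histograms are normalized; normalizing them is our choice).

```python
import numpy as np

def damage_feature(I_diff, I_coef):
    """Quantise both maps from 0-255 to 0-63, compute 64-bin histogram
    distributions, and concatenate them into a 128-D damage feature."""
    parts = []
    for img in (I_diff, I_coef):
        q = np.clip(img, 0, 255).astype(int) // 4      # 0-255 -> 0-63
        hist = np.bincount(q.ravel(), minlength=64).astype(float)
        parts.append(hist / hist.sum())                # normalised histogram
    return np.concatenate(parts)                       # 128-D feature vector
```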
Step 5: evaluate the damage degree of the target using the target damage feature.
The preferred embodiment of this step is as follows:
1) The probability distribution of the target damage feature is modeled offline with a Gaussian mixture model, expressed as:

P(f | λ) = Σ_{m=1}^{M} w_m N(f; μ_m, Σ_m)

where P(f | λ) is the Gaussian mixture model; λ is the parameter set of the model, comprising the mixture coefficients w_m, the means μ_m, and the covariances Σ_m; M is the number of mixture components; N(f; μ_m, Σ_m) is a Gaussian density function; and f denotes the target damage feature.
The damage grades are divided into C levels, and each level corresponds to one Gaussian mixture model; that is, the damage feature probability distribution of the c-th damage level is modeled by the Gaussian mixture model P(f | λ_c), where λ_c is the parameter set of the c-th model and c ∈ {0, 1, ..., C-1}. For example, the damage grades may be divided into four levels (no damage, mild damage, moderate damage, and severe damage), i.e., C = 4. The parameters λ_c of the c-th Gaussian mixture model P(f | λ_c) are obtained by offline training with training samples; training is needed only once. The damage grades of the training samples can be obtained by manual expert interpretation.
2) For a target damage feature f, the probability p(λ_c | f) that the feature belongs to the c-th Gaussian mixture model, i.e., the posterior probability that the damage feature f belongs to the c-th damage level, is computed as:

p(λ_c | f) = P(f | λ_c) p(λ_c) / p(f)

where P(f | λ_c) is the c-th Gaussian mixture model, p(λ_c) is the prior distribution of the c-th model, and p(f) is the prior distribution of the damage feature f. The discrimination result of the damage category is finally given by the maximum a posteriori criterion:

ĉ = argmax_c p(λ_c | f)

where ĉ denotes the damage level assessed from the image. Since p(f) is the same for all damage levels, maximizing the posterior p(λ_c | f) is equivalent to maximizing P(f | λ_c) p(λ_c). If, in addition, the prior distributions of the C Gaussian mixture models are taken to be essentially the same, the damage level can finally be decided by:

ĉ = argmax_c P(f | λ_c)
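The maximum a posteriori decision with equal priors can be sketched as follows. Diagonal covariances and the model/parameter layout are our simplifications; the patent does not specify the covariance structure.

```python
import numpy as np

def log_gaussian(f, mu, var):
    """Log density of a diagonal-covariance Gaussian N(f; mu, var)."""
    return -0.5 * np.sum(np.log(2 * np.pi * var) + (f - mu) ** 2 / var)

def gmm_loglik(f, w, mu, var):
    """log P(f | lambda) for a diagonal-covariance Gaussian mixture."""
    logs = [np.log(w[m]) + log_gaussian(f, mu[m], var[m]) for m in range(len(w))]
    top = max(logs)
    return top + np.log(sum(np.exp(l - top) for l in logs))  # log-sum-exp

def classify_damage(f, models):
    """MAP decision with equal priors: pick c maximising P(f | lambda_c).

    models : list of (w, mu, var) tuples, one per damage level.
    """
    scores = [gmm_loglik(f, *lam) for lam in models]
    return int(np.argmax(scores))
```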
in the embodiment of the invention, a training set is constructed in advance, and the Gaussian mixture models with different damage levels are trained, namely the parameter sets of the Gaussian mixture models with different damage levels are obtained. The preferred embodiment is as follows:
1) According to expert knowledge, the damage grades of existing pre- and post-damage image pairs are determined and used as training samples of the corresponding damage levels, and the training samples are expanded by data augmentation.

Because training images of damaged targets are usually scarce, the invention expands the training samples by image rotation, scaling, and the addition of Gaussian noise. The training samples of the different damage levels form the training set; in the subsequent training, the Gaussian mixture model of each damage level draws the training samples of the corresponding level.
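The augmentation step might look like the following numpy sketch. The specific rotation angles, scale factors, and noise level are illustrative assumptions; a real pipeline would use finer rotation angles via an image library.

```python
import numpy as np

def augment(img, rng):
    """Expand one training image by rotation, rescaling and Gaussian noise."""
    out = []
    for k in (1, 2, 3):                    # 90/180/270-degree rotations
        out.append(np.rot90(img, k))
    for s in (0.5, 2.0):                   # nearest-neighbour rescaling
        h, w = img.shape
        ys = np.clip((np.arange(int(h * s)) / s).astype(int), 0, h - 1)
        xs = np.clip((np.arange(int(w * s)) / s).astype(int), 0, w - 1)
        out.append(img[np.ix_(ys, xs)])
    # additive Gaussian noise, clipped back to the 0-255 gray range
    out.append(np.clip(img + rng.normal(0, 5, img.shape), 0, 255))
    return out
```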
2) For the Gaussian mixture model of each damage level, the corresponding parameter set is obtained from the corresponding training samples using the expectation-maximization (EM) method.
Suppose a damage level has N training samples, i.e., N pairs of pre- and post-damage target images, each corresponding to one target damage feature (obtained through steps 1 to 4). Let f_i denote the target damage feature of the i-th training sample. The probability that f_i belongs to the m-th hidden state of the Gaussian mixture model is p(m | f_i, λ), expressed as:

p(m | f_i, λ) = w_m N(f_i; μ_m, Σ_m) / Σ_{j=1}^{M} w_j N(f_i; μ_j, Σ_j)

Then the parameters of the Gaussian mixture model are updated with the following formulas:

w_m = (1/N) Σ_{i=1}^{N} p(m | f_i, λ)
μ_m = Σ_{i=1}^{N} p(m | f_i, λ) f_i / Σ_{i=1}^{N} p(m | f_i, λ)
Σ_m = Σ_{i=1}^{N} p(m | f_i, λ) (f_i - μ_m)(f_i - μ_m)^T / Σ_{i=1}^{N} p(m | f_i, λ)
obtaining parameter sets of Gaussian mixture models with different damage levels in the mode of the 2) above.
Example two
Another embodiment of the present invention further provides a damage evaluation system based on image analysis, which is mainly used for implementing the method provided in the foregoing embodiment, as shown in fig. 7, the system mainly includes:
the image correction unit is used for correcting the image after the target is damaged by utilizing the image before the target is damaged;
the target area image intercepting unit is used for respectively intercepting target area images from the image before the target is damaged and the corrected image after the target is damaged;
the gray difference map and correlation coefficient map calculating unit is used for calculating a gray difference map and a correlation coefficient map between a target area image before the target is damaged and a target area image after the target is damaged;
the target damage characteristic acquisition unit is used for combining the gray difference image and the correlation coefficient image to acquire target damage characteristics;
and the target damage evaluation unit is used for evaluating the target damage degree by utilizing the target damage characteristics.
It will be clear to those skilled in the art that, for convenience and simplicity of description, the foregoing division of the functional modules is merely used as an example, and in practical applications, the above function distribution may be performed by different functional modules according to needs, that is, the internal structure of the system is divided into different functional modules to perform all or part of the above described functions.
It should be noted that the main principles of the units of the system have been described in detail in the previous method embodiments, and therefore, the detailed description is omitted.
EXAMPLE III
Another embodiment of the present invention further provides a processing apparatus, as shown in fig. 8, which mainly includes: one or more processors; a memory for storing one or more programs; wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the methods provided by the foregoing embodiments.
Further, the processing device further comprises at least one input device and at least one output device; in the processing device, a processor, a memory, an input device and an output device are connected through a bus.
In the embodiment of the present invention, the specific types of the memory, the input device, and the output device are not limited; for example:
the input device can be a touch screen, an image acquisition device, a physical button or a mouse and the like;
the output device may be a display terminal;
the Memory may be a Random Access Memory (RAM) or a non-volatile Memory (non-volatile Memory), such as a disk Memory.
Example four
Another embodiment of the present invention further provides a readable storage medium, which stores a computer program, and when the computer program is executed by a processor, the computer program implements the method provided by the foregoing embodiment.
The readable storage medium in the embodiment of the present invention may be provided in the foregoing processing device as a computer readable storage medium, for example, as a memory in the processing device. The readable storage medium may be various media that can store program codes, such as a usb disk, a removable hard disk, a Read-Only Memory (ROM), a magnetic disk, or an optical disk.
The above description is only a preferred embodiment of the present invention, but the protection scope of the present invention is not limited thereto. Any changes or substitutions that can be easily conceived by those skilled in the art within the technical scope disclosed by the present invention shall fall within the protection scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.
Claims (10)
1. A damage assessment method based on image analysis, comprising:
correcting the post-damage image of the target by using the pre-damage image of the target;
cropping target area images from the pre-damage image and the corrected post-damage image of the target, respectively;
calculating a gray difference map and a correlation coefficient map between the pre-damage target area image and the post-damage target area image;
combining the gray difference map and the correlation coefficient map to obtain a target damage characteristic;
and evaluating the damage degree of the target by using the target damage characteristic.
2. The method of claim 1, wherein the correcting of the post-damage image of the target using the pre-damage image of the target comprises:
acquiring images of the target before and after damage, and calculating SURF features of the two images respectively;
matching the SURF features on the two images using the nearest-neighbor ratio method according to the Manhattan distance between the SURF features, to obtain an initial pairing of SURF features;
optimizing the initial pairing of SURF features using the random sample consensus (RANSAC) method, and calculating a perspective transformation matrix of the post-damage image with the pre-damage image as the reference;
and performing perspective transformation on the post-damage image using the perspective transformation matrix to obtain the corrected post-damage image.
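By way of illustration only (this sketch is not part of the patent), the estimation step of claim 2 can be approximated in numpy. SURF extraction and nearest-neighbor matching are omitted; the matched keypoint coordinates are assumed to be given, and all function names are the editor's own:

```python
import numpy as np

def fit_homography(src, dst):
    """Estimate a 3x3 perspective (homography) matrix from >= 4 point
    correspondences using the direct linear transform (DLT)."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]

def apply_homography(H, pts):
    """Apply a perspective transformation to an (N, 2) array of points."""
    pts = np.asarray(pts, dtype=float)
    proj = (H @ np.hstack([pts, np.ones((len(pts), 1))]).T).T
    return proj[:, :2] / proj[:, 2:3]

def ransac_homography(src, dst, n_iter=200, thresh=2.0, seed=0):
    """RANSAC: repeatedly fit H on 4 random pairs, keep the model with
    the most inliers, then refit on all inliers."""
    rng = np.random.default_rng(seed)
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    best_inliers = None
    for _ in range(n_iter):
        idx = rng.choice(len(src), 4, replace=False)
        H = fit_homography(src[idx], dst[idx])
        err = np.linalg.norm(apply_homography(H, src) - dst, axis=1)
        inliers = err < thresh
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    return fit_homography(src[best_inliers], dst[best_inliers]), best_inliers
```

In practice a library routine such as OpenCV's `cv2.findHomography(src, dst, cv2.RANSAC)` performs the same estimation in one call.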
3. The method of claim 1, wherein the cropping of target area images from the pre-damage image and the corrected post-damage image respectively comprises:
in the original pre-damage image, cropping according to the circumscribed rectangle of the target area to obtain a normalized pre-damage target area image I_1; and, according to the same circumscribed rectangle, cropping a rectangular area from the corrected post-damage image to obtain a normalized post-damage target area image I_2.
4. The method of claim 1, wherein the calculating of the gray difference map and the correlation coefficient map between the pre-damage target area image and the post-damage target area image comprises:
calculating the gray difference map I_diff: let (x, y) denote the coordinates of an image pixel; the gray difference I_diff(x, y) of the pixel at position (x, y) is calculated as:
I_diff(x, y) = Σ_(u,v) g(u, v) · |I_1(u, v) − I_2(u, v)|
wherein I_1 and I_2 respectively denote the pre-damage target area image and the post-damage target area image; (u, v) are coordinates within the square neighborhood centered at (x, y), L is the side length of the square neighborhood, and g(u, v) is a Gaussian smoothing window;
calculating the correlation coefficient map:
a correlation coefficient matrix A(x, y) is calculated, expressed as:
A(x, y) = Σ_(u,v) g(u, v) · (I_1(u, v) − μ_1(x, y)) · (I_2(u, v) − μ_2(x, y)) / (σ_1(x, y) · σ_2(x, y))
wherein μ_1(x, y), σ_1(x, y) and μ_2(x, y), σ_2(x, y) are the Gaussian-weighted mean and standard deviation of I_1 and I_2 over the same square neighborhood;
scaling the correlation coefficient matrix A(x, y) to the range 0 to 255 and inverting it to obtain the correlation coefficient map I_coef, the correlation coefficient value I_coef(x, y) of the pixel at position (x, y) being calculated as:
I_coef(x, y) = 255 − round(255 · (A(x, y) − min A) / (max A − min A))
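Assuming the gray difference map is a Gaussian-weighted absolute difference over an L×L neighborhood, and the correlation coefficient map is a scaled, inverted local normalized correlation (one plausible reading of claim 4; the patent's exact formulas are given as images), both maps could be computed as in this illustrative numpy sketch:

```python
import numpy as np

def gaussian_window(L, sigma=None):
    """L x L Gaussian weighting window, normalized to sum to 1."""
    sigma = sigma or L / 4.0
    ax = np.arange(L) - (L - 1) / 2.0
    g = np.exp(-(ax[:, None] ** 2 + ax[None, :] ** 2) / (2 * sigma ** 2))
    return g / g.sum()

def diff_and_corr_maps(I1, I2, L=7):
    """Gray difference map and correlation coefficient map between the
    pre-damage image I1 and the corrected post-damage image I2."""
    g = gaussian_window(L)
    r = L // 2
    H, W = I1.shape
    I1, I2 = I1.astype(float), I2.astype(float)
    A = np.zeros((H, W))
    Idiff = np.zeros((H, W))
    for y in range(r, H - r):
        for x in range(r, W - r):
            p1 = I1[y - r:y + r + 1, x - r:x + r + 1]
            p2 = I2[y - r:y + r + 1, x - r:x + r + 1]
            # Gaussian-weighted absolute gray difference
            Idiff[y, x] = np.sum(g * np.abs(p1 - p2))
            # Gaussian-weighted local normalized correlation
            m1, m2 = np.sum(g * p1), np.sum(g * p2)
            s1 = np.sqrt(np.sum(g * (p1 - m1) ** 2))
            s2 = np.sqrt(np.sum(g * (p2 - m2) ** 2))
            A[y, x] = np.sum(g * (p1 - m1) * (p2 - m2)) / (s1 * s2 + 1e-12)
    # scale A to [0, 255] and invert, so high values mark decorrelated (damaged) areas
    scaled = 255 * (A - A.min()) / (A.max() - A.min() + 1e-12)
    Icoef = 255 - np.round(scaled)
    return Idiff, Icoef
```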
5. The method of claim 1, wherein the combining of the gray difference map and the correlation coefficient map to obtain the target damage characteristic comprises:
quantizing and compressing the pixel values of the gray difference map and the correlation coefficient map respectively, calculating the histogram distributions of the quantized and compressed gray difference map and correlation coefficient map, and concatenating the two histogram vectors to obtain the target damage characteristic.
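A minimal sketch of this feature construction, with an assumed compression from 256 gray levels to n_bins histogram bins (the patent does not specify the quantization step, so the parameters here are illustrative):

```python
import numpy as np

def damage_feature(Idiff, Icoef, n_bins=32):
    """Quantize both maps into n_bins levels, take normalized histograms,
    and concatenate them into a single damage characteristic vector."""
    feats = []
    for img in (Idiff, Icoef):
        # compress 256 gray levels to n_bins quantization levels
        q = np.clip(img, 0, 255).astype(int) * n_bins // 256
        hist = np.bincount(q.ravel(), minlength=n_bins).astype(float)
        feats.append(hist / hist.sum())  # normalized histogram
    return np.concatenate(feats)         # length 2 * n_bins
```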
6. The method of claim 1, wherein the evaluating of the target damage degree using the target damage characteristic comprises:
modeling the probability distribution of the target damage characteristic offline by using a Gaussian mixture model, the probability distribution being expressed as:
P(f | λ) = Σ_{m=1}^{M} w_m · N(f; μ_m, Σ_m)
wherein P(f | λ) is the Gaussian mixture model and λ is its parameter set, comprising the mixture coefficients w_m, the means μ_m, and the covariances Σ_m; M is the number of mixture components, N(f; μ_m, Σ_m) is a Gaussian density function, and f denotes the target damage characteristic;
dividing the damage degree into C levels and, according to expert knowledge and training images, estimating the distributions P_c(f | λ_c) of the target damage characteristics of images at each of the C levels with C Gaussian mixture models, wherein c denotes the c-th damage level, each damage level corresponds to one Gaussian mixture model, and c = 0, 1, 2, …, C − 1;
for a target damage characteristic f, calculating the probability p(λ_c | f) that it belongs to the c-th Gaussian mixture model, namely the posterior probability that the damage characteristic f belongs to the c-th damage level, by the formula:
p(λ_c | f) = P(f | λ_c) · p(λ_c) / p(f)
wherein P(f | λ_c) denotes the c-th Gaussian mixture model, p(λ_c) denotes the prior probability of the c-th Gaussian mixture model, and p(f) denotes the prior distribution of the damage characteristic f;
the discrimination result of the damage level is finally given by the maximum a posteriori criterion, expressed as:
c* = argmax_c p(λ_c | f).
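The decision rule of claim 6 amounts to evaluating each level's mixture likelihood, weighting by the prior, and taking the maximum a posteriori class. An illustrative numpy sketch with diagonal covariances (a simplifying assumption not stated in the patent; function names are the editor's own):

```python
import numpy as np

def gauss_pdf(f, mu, var):
    """Diagonal-covariance multivariate Gaussian density N(f; mu, diag(var))."""
    d = f - mu
    return np.exp(-0.5 * np.sum(d * d / var)) / np.sqrt(np.prod(2 * np.pi * var))

def gmm_likelihood(f, weights, means, variances):
    """P(f | lambda) = sum_m w_m N(f; mu_m, Sigma_m)."""
    return sum(w * gauss_pdf(f, mu, var)
               for w, mu, var in zip(weights, means, variances))

def classify_damage(f, models, priors):
    """MAP decision over C per-level Gaussian mixture models:
    c* = argmax_c P(f | lambda_c) p(lambda_c)  (p(f) cancels out)."""
    scores = [gmm_likelihood(f, *m) * p for m, p in zip(models, priors)]
    return int(np.argmax(scores))
```

Note that the evidence p(f) is the same for every level, so it can be dropped from the argmax.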
7. The method of claim 6, further comprising: judging the damage levels of existing pairs of pre-damage and post-damage images of the target according to expert knowledge, using them as training samples of the corresponding damage levels, and expanding the training samples by means of data augmentation;
and, for the Gaussian mixture model of any damage level, obtaining the corresponding parameter set by the expectation-maximization method in combination with the corresponding training samples:
assuming that any damage level contains N training samples, namely N pairs of pre-damage and post-damage images of the target, each training sample corresponding to one target damage characteristic, the probability that the target damage characteristic f_i corresponding to the i-th training sample belongs to the m-th hidden state of the Gaussian mixture model is denoted p(m | f_i, λ) and expressed as:
p(m | f_i, λ) = w_m · N(f_i; μ_m, Σ_m) / Σ_{k=1}^{M} w_k · N(f_i; μ_k, Σ_k)
updating the parameters of the Gaussian mixture model by the following formulas:
w_m ← (1/N) · Σ_{i=1}^{N} p(m | f_i, λ)
μ_m ← Σ_{i=1}^{N} p(m | f_i, λ) · f_i / Σ_{i=1}^{N} p(m | f_i, λ)
Σ_m ← Σ_{i=1}^{N} p(m | f_i, λ) · (f_i − μ_m)(f_i − μ_m)^T / Σ_{i=1}^{N} p(m | f_i, λ)
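The E-step responsibility p(m | f_i, λ) and the standard M-step weight/mean/variance updates of claim 7 can be sketched for the one-dimensional case (a simplification; the patent's damage characteristics are vectors, and the initialization here is the editor's choice):

```python
import numpy as np

def em_gmm_1d(samples, M=2, n_iter=50):
    """One-dimensional EM for a Gaussian mixture model. The E-step computes
    the responsibilities p(m | f_i, lambda); the M-step re-estimates the
    mixture weights, means, and variances."""
    f = np.asarray(samples, float)
    N = len(f)
    w = np.full(M, 1.0 / M)
    mu = np.quantile(f, (np.arange(M) + 0.5) / M)  # spread initial means over the data
    var = np.full(M, f.var() + 1e-6)
    for _ in range(n_iter):
        # E-step: responsibility of each component for each sample
        dens = w * np.exp(-0.5 * (f[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
        resp = dens / dens.sum(axis=1, keepdims=True)   # p(m | f_i, lambda)
        # M-step: weighted re-estimation of the parameters
        Nm = resp.sum(axis=0)
        w = Nm / N
        mu = (resp * f[:, None]).sum(axis=0) / Nm
        var = (resp * (f[:, None] - mu) ** 2).sum(axis=0) / Nm + 1e-9
    return w, mu, var
```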
8. a damage assessment system based on image analysis, for implementing the method of any one of claims 1 to 7, the system comprising:
an image correction unit, configured to correct the post-damage image of the target using the pre-damage image of the target;
a target area image cropping unit, configured to crop target area images from the pre-damage image and the corrected post-damage image, respectively;
a gray difference map and correlation coefficient map calculating unit, configured to calculate a gray difference map and a correlation coefficient map between the pre-damage target area image and the post-damage target area image;
a target damage characteristic acquisition unit, configured to combine the gray difference map and the correlation coefficient map to obtain the target damage characteristic;
and a target damage evaluation unit, configured to evaluate the target damage degree using the target damage characteristic.
9. A processing device, comprising: one or more processors; a memory for storing one or more programs;
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method of any of claims 1-7.
10. A readable storage medium storing a computer program, characterized in that the computer program, when executed by a processor, implements the method according to any one of claims 1 to 7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210465292.0A CN114841956A (en) | 2022-04-29 | 2022-04-29 | Damage assessment method, system, equipment and storage medium based on image analysis |
Publications (1)
Publication Number | Publication Date |
---|---|
CN114841956A true CN114841956A (en) | 2022-08-02 |
Family
ID=82568754
Cited By (4)

Publication number | Priority date | Publication date | Assignee | Title
---|---|---|---|---
CN115984590A (en) * | 2022-12-27 | 2023-04-18 | 中船重工奥蓝托无锡软件技术有限公司 | Target vulnerability assessment method and device based on image recognition and electronic equipment
CN115984590B (en) * | 2022-12-27 | 2023-07-14 | 中船奥蓝托无锡软件技术有限公司 | Target vulnerability assessment method and device based on image recognition and electronic equipment
CN116958607A (en) * | 2023-09-20 | 2023-10-27 | 中国人民解放军火箭军工程大学 | Data processing method and device for target damage prediction
CN116958607B (en) * | 2023-09-20 | 2023-12-22 | 中国人民解放军火箭军工程大学 | Data processing method and device for target damage prediction
Legal Events

Date | Code | Title | Description
---|---|---|---
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |