CN111798539B - Adaptive camouflage online design method and system - Google Patents

Adaptive camouflage online design method and system Download PDF

Info

Publication number
CN111798539B
CN111798539B CN202010025181.9A CN202010025181A CN111798539B CN 111798539 B CN111798539 B CN 111798539B CN 202010025181 A CN202010025181 A CN 202010025181A CN 111798539 B CN111798539 B CN 111798539B
Authority
CN
China
Prior art keywords
color
camouflage
image
background
dominant
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010025181.9A
Other languages
Chinese (zh)
Other versions
CN111798539A (en
Inventor
刘尊洋
孙晓泉
丁锋
豆贤安
叶庆
王自荣
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
National University of Defense Technology
Original Assignee
National University of Defense Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by National University of Defense Technology filed Critical National University of Defense Technology
Priority to CN202010025181.9A priority Critical patent/CN111798539B/en
Publication of CN111798539A publication Critical patent/CN111798539A/en
Application granted granted Critical
Publication of CN111798539B publication Critical patent/CN111798539B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/002D [Two Dimensional] image generation
    • G06T11/40Filling a planar surface by adding surface attributes, e.g. colour or texture

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

The invention provides a self-adaptive camouflage online design method and a system, wherein the system implementation method comprises the following steps: acquiring a background area image in real time, determining a camouflage design candidate area, and extracting dominant color information of the background image and a sample candidate area; calculating the similarity between each candidate area and the dominant color information of the background image and determining the background area of the camouflage design; determining colors implemented by the camouflage from the candidate colors of the camouflage according to a color approaching principle; calculating the color distance between each pixel color in the candidate region and each dominant color of the background, and assigning the color value of the dominant color closest to the pixel point to obtain an original image of the image camouflage design; processing the original image by a mathematical morphology method to obtain a main color spot block image; and replacing the background main color in the main color patch image with the color which is closest to the main color patch image to obtain a camouflage design pattern, evaluating in real time after camouflage is carried out, improving the camouflage effect, and further obtaining the final camouflage design pattern.

Description

Adaptive camouflage online design method and system
Technical Field
The invention relates to a data identification technology, in particular to a self-adaptive camouflage painting online design method and a self-adaptive camouflage painting online design system.
Background
The self-adaptive camouflage technology requires equipment to change the camouflage color and the pattern of the surface of the equipment in a short time after the equipment enters a certain environment according to the color characteristic of the background of the environment, so that the fusion of the color and the pattern characteristic of the environment is realized. Therefore, high requirements are put on the rapid and accurate design of the camouflage painting. For the self-adaptive camouflage design of the maneuvering target, the design of camouflage is required to be completed in time after the environmental characteristics of the target are changed, and the design mainly relates to the work of fast acquisition of a background image, fast extraction of dominant colors of the background image, determination of camouflage colors, fast analysis of background color patch information, fast design of camouflage patterns and the like.
Patent publication No. CN104318570A describes a background-based adaptive camouflage design method, which uses a clustering algorithm to obtain dominant colors, but uses an RGB color space for calculating color differences. The RGB color space is suitable for hardware display, but the color distance calculation result of the RGB color space has larger color perception difference with human eyes, and in addition, the method clusters all pixel points, the efficiency is too low, so the effect and efficiency of background dominant color extraction of the method are not ideal, and the requirement of self-adaptive camouflage is difficult to meet; in addition, the method does not relate to a method for acquiring images required by the camouflage design, but directly uses the existing images, reasonably acquires real-time background images and develops the camouflage design based on the real-time background images, and is the premise of self-adaptive camouflage design.
The patent with publication number CN104463925A takes the existing spraying technology as a starting point, and proposes a digital camouflage design method based on automatic combination of spot templates, which is not suitable for adaptive camouflage, and does not relate to a method for determining the dominant color of camouflage and a method for acquiring real-time background images required by adaptive camouflage.
Patent publication No. CN106534816A proposes a method of providing a camera at the front end of a vehicle and projecting a processed image on the roof of the vehicle by a projector to implement camouflage. However, the camera is positioned at the front end of the vehicle, the acquisition range is small, the top projection can only cope with the overlooking reconnaissance from the top, the side view reconnaissance above the side cannot be coped, and the side view reconnaissance is also a reconnaissance mode for the self-adaptive camouflage needing to be coped with.
Disclosure of Invention
The invention aims to provide an adaptive camouflage online design method and an adaptive camouflage online design system.
The technical scheme for realizing the purpose of the invention is as follows: an adaptive camouflage online design method comprises the following steps:
step 1, collecting background area images and determining camouflage design candidate areas;
step 2, extracting dominant color information of the background image and the camouflage design candidate area, and counting respective dominant color proportion;
step 3, calculating the similarity of each camouflage design candidate area and the dominant color information of the background image, and arranging the camouflage design candidate areas from high to low according to the proximity;
step 4, selecting the candidate area with the highest proximity degree as a background area of the camouflage design;
step 5, sequentially calculating the color distance between the camouflage candidate color and the main color of the background image in a Lab color space, and determining the color implemented by the camouflage from the camouflage candidate colors according to a color approaching principle;
step 6, calculating the color distance between each pixel color in the camouflage design candidate area and each dominant color of the background in the Lab color space, assigning the color value of the dominant color closest to the pixel point, traversing all pixels of the image, and obtaining an original image of the image camouflage design;
step 7, processing the original image by using a mathematical morphology method to obtain a main color spot block image;
and 8, replacing the background main color in the main color patch image with the color which is closest to the main color patch image and is implemented by the camouflage obtained in the step 5 to obtain the final camouflage design pattern.
Further, the camouflage design candidate areas determined in the step 2 are divided into two types:
(1) when the background color is single, the length of the camouflage design candidate area is not less than the length of the external rectangle of the target to be camouflaged and is +2 times the height of the target to be camouflaged, and the width of the camouflage design candidate area is +2 times the width of the external rectangle of the target to be camouflaged and is equal to the height of the target to be camouflaged;
(2) when the background color is complex, for the background area with the size of A multiplied by B, the background area is divided into candidate areas according to the candidate area size a multiplied by B rectangle, and then the number of the candidate areas is
Figure BDA0002362202130000021
Wherein a is the length which is equal to the length of a circumscribed rectangle of the target to be disguised plus 2 times the height of the target to be disguised; and b is the width which is equal to the width of a circumscribed rectangle of the target to be disguised and is +2 times the height of the target to be disguised.
Further, the specific process of step 2 comprises:
representing the color image by using a CIELAB color space, quantizing the color space to obtain a plurality of levels of quantized colors, and counting the frequency of each level of quantized colors appearing in the color image;
mapping the clustering samples to a color space from a pixel space to obtain a color sample space;
clustering all levels of colors in the mapped color sample space by using a pedigree clustering algorithm and acquiring a result class center of pedigree clustering;
and taking the result of the pedigree clustering as an initial class center, carrying out fast FCM clustering on the clustering sample, and determining the main color of the background.
Further, the specific process of clustering the colors of each level in the mapped color sample space by using the pedigree clustering algorithm and obtaining the result core of the pedigree clustering comprises the following steps:
step 2301, setting each color sample in the color sample space as a class, setting
Figure BDA0002362202130000031
Figure BDA0002362202130000032
I ═ j ═ 1,2, …, N }, where y isjFor the jth sample to be classified, ΓjIs the jth cluster set, N is the number of samples;
step 2302, using the set { ΓjFinding a pair of conditions satisfying | j ∈ I }
Figure BDA0002362202130000033
Cluster set of (a) riAnd Γk,Δ(Γik) Is gammaiAnd ΓkI.k ∈ I;
step 2303, the gamma is processediIs incorporated into rkMiddle, and remove gammaiRemoving I from index set I, and recalculating gammakHeart-like color of
Figure BDA0002362202130000034
Wherein, VkIs the class center color of the kth class, NkFor the number of samples in the kth class, col (i) represents the color value of the ith sample, and TC _ r (i) represents the frequency of the color of the ith sample in the image;
step 2304, if the number of merged classes is equal to the number of expected classes, terminating the calculation; otherwise go to step 2302.
Further, taking the result of the pedigree clustering as an initial class center, performing a rapid clustering algorithm on the clustering samples, and determining the main color of the background image or the candidate area specifically comprises the following steps:
step S2401, setting an iteration stop threshold epsilon and a maximum iteration number B, initializing a class center, and enabling the iteration number B to be 0;
step S2402, calculating a partition matrix according to the formula
Figure BDA0002362202130000035
Fuzzy membership of kth color sample to ith class
Figure BDA0002362202130000036
Figure BDA0002362202130000037
Wherein the content of the first and second substances,
Figure BDA0002362202130000038
in the b-th iteration, the fuzzy membership degree of the kth color sample to the ith class, m is a clustering control parameter,
Figure BDA0002362202130000041
color values Col (k) and ith centroid color for the kth sample
Figure BDA0002362202130000042
The color difference between them, define
Figure BDA0002362202130000043
c is the number of categories;
step S2403, updating the clustering center matrix according to the following formula
Figure BDA0002362202130000044
Figure BDA0002362202130000045
Wherein the content of the first and second substances,
Figure BDA0002362202130000046
representing the class center of the ith category obtained by the b-th iterative computation, wherein TC _ R (k) is the frequency of appearance of the kth color, Col (k) is the color value of the kth sample, and To is the number of the clustering samples;
step S2404, if
Figure BDA0002362202130000047
Or B is more than or equal to B, stopping and outputting
Figure BDA0002362202130000048
And
Figure BDA0002362202130000049
otherwise, go to step S2401 with b + 1.
Further, the specific process of representing the color image by the CIELAB color space and performing color space quantization to obtain several levels of quantized colors and counting the frequency of each level of quantized colors appearing in the color image is as follows:
step 101, for each of all quantized rectangular solid regions in the CIELAB color space, the length, width and height of each quantized rectangular solid region respectively correspond to the quantization interval of three coordinates, and the coordinate range is [ L [m,LM]、[am,aM]And [ b)m,bM];
Step 102, if the pixel color (L, a, b) satisfies Lm≤l≤LM,am≤a≤aMAnd b ism≤b≤bMThen the pixel color appears once.
Further, the specific process of calculating the similarity between the candidate region and the dominant color information of the background image in step 3 is as follows:
step S301, matching the dominant colors of the candidate regions with the dominant colors of the background images one by one according to a color distance principle;
step S302, calculating the color difference Delta C (C) of each dominant color to two dominant colors in turn1,c2) Wherein c is1And c2C-th respectively representing candidate regions1C-th of the seed color and the full background image2A seed dominant color;
step S303, calculating the dominant color similarity of the candidate region and the background image using the following formula,
Figure BDA00023622021300000410
in the formula (f)n[P(c1),P(c2)]Is P (c)1) And P (c)2) Average value of (a), P (c)1) Denotes c1Statistical information of color in the candidate area, i.e. percentage of covered pixels to total pixels, P (c)2) Denotes c2Color is in the statistical information of the full background image;
step S304, calculating the similarity coefficient S between the candidate region and the color information of the full background imagec
Sc=1-Dc
Wherein the similarity coefficient ScTo indicate the degree of similarity.
Further, the camouflage candidate color corresponding to the minimum distance in the step 5 is used as the color of the determined camouflage implementation.
The technical scheme for realizing the system of the invention is as follows: an adaptive camouflage online design system can realize the method.
Compared with the prior art, the invention has the following advantages: (1) the unmanned aerial vehicle is used for collecting the camouflage design image, so that the background image meeting the camouflage design requirement can be collected on line in real time in the maneuvering process of a target such as a vehicle and the like needing camouflage design, and the method has the advantages of good effect, high speed, convenience in use and the like; (2) the unmanned matched central machine can control the actions of maneuvering, image acquisition and the like of the unmanned aerial vehicle, can receive the images downloaded by the unmanned aerial vehicle in real time, and can directly process the images downloaded by the unmanned aerial vehicle, thereby realizing the online design of camouflage painting; (3) the fast extraction algorithm of the dominant color of the color image can quickly extract the dominant color of the image, and effectively supports the timeliness requirement of the self-adaptive camouflage on camouflage design; (4) the camouflage effect is evaluated by using an unmanned aerial vehicle and a central machine to acquire images on line in real time, and the camouflage design method is perfected based on the evaluation result of the camouflage effect, so that the camouflage effect of the designed camouflage color is better ensured; (5) the camouflage real-time design center machine can be a tablet personal computer or integrated in a vehicle, can be used for self-adaptive camouflage design of a maneuvering target, and can also be used for fixing the design of imitating camouflage and protecting the camouflage of the target.
The invention is further described below with reference to the accompanying drawings.
Drawings
FIG. 1 is a block diagram of the system of the present invention.
FIG. 2 is a schematic flow chart of the method of the present invention.
Detailed Description
Example one
With reference to fig. 1, an adaptive camouflage online design system includes a central machine and an image acquisition device. The image acquisition device can be an unmanned aerial vehicle platform and mainly comprises a data transmission module, an image acquisition module, a flight control module, a platform power module and the like. The information communication module mainly completes information transmission of flight control instructions, image acquisition module control instructions, acquired images and the like through wireless transmission technologies such as Bluetooth or wifi and the like. The image acquisition module consists of an optical system, a pointing device, a detector and the like, is responsible for acquiring images of corresponding areas according to an image acquisition control instruction received by the information communication module and provides image materials for camouflage design or camouflage effect evaluation; the flight control module is used for forming an aircraft power control instruction by analyzing the remote control instruction received by the information communication module and transmitting the aircraft power control instruction to the platform power module; the platform power module mainly comprises a motor, wings and the like and is responsible for providing the flight power of the unmanned aerial vehicle platform. The central machine (which can be a tablet computer, a notebook computer or an embedded device) is mainly responsible for controlling the unmanned aerial vehicle platform to collect images meeting conditions, completes camouflage design work by processing the images collected by the unmanned aerial vehicle, and mainly comprises an information communication module, a device control module, a camouflage design module and an effect evaluation module. 
The information communication module is responsible for communicating with the unmanned aerial vehicle platform and mainly responsible for transmitting the unmanned aerial vehicle flight and image acquisition control instructions generated by the equipment control module to the unmanned aerial vehicle platform information communication module. The equipment control module mainly provides a friendly human-computer interaction interface, and assists an operator in completing the control of unmanned aerial vehicle maneuvering and image acquisition by converting the operation of the operator into a control instruction and transmitting the control instruction to the unmanned aerial vehicle platform through the information communication module; the camouflage design module performs the works of dominant color extraction, patch analysis and the like on the background image returned by the information communication module on the unmanned aerial vehicle platform to complete the quick design work of camouflage. And the effect evaluation module is used for carrying out disguise effect evaluation on the background image which is transmitted back by the unmanned aerial vehicle platform through the information communication module and contains the disguised target.
The unmanned aerial vehicle platform gathers the background region according to the maneuver route background condition, gathers the image promptly:
(1) the background of the maneuvering route is single, such as a desert background, only one color is provided, and when a road maneuvers, for example, a black road surface is the main, a small number of white (yellow) lane lines are provided, the acquired image is a direct background image of the target maneuvering route, the size of the acquired image does not need to be larger, and the size of the area is not smaller than a camouflage design candidate area (the length is not smaller than the length of a circumscribed rectangle of a target to be camouflaged by +2 times the height of the target to be camouflaged, and the width is the width of the circumscribed rectangle of the target to be camouflaged by +2 times the height of the target to be camouflaged);
(2) the background of the maneuvering route is complex, if maneuvering is performed in a grassland, a desert and other areas, the image acquisition size is large enough (can represent the overall characteristic of an activity area), in order to ensure the overall optimal camouflage effect (the overall camouflage effect is optimal, namely the dominant colors of the target and most of the background areas are the closest, and the probability camouflage effect of the target during maneuvering in the area is good), an area which is consistent with the overall area color characteristic and is suitable in size is selected as a camouflage design area, specifically, in the range of possible maneuvering of the target, the size of the background area is AxB, the candidate area is divided into candidate areas according to the size of the candidate areas, namely, the number of the candidate areas is
Figure BDA0002362202130000061
Wherein a is the length which is equal to the length of a circumscribed rectangle of the target to be disguised plus 2 times the height of the target to be disguised; and b is the width which is equal to the width of a circumscribed rectangle of the target to be disguised and is +2 times the height of the target to be disguised. The image size should satisfy
Figure BDA0002362202130000071
And is
Figure BDA0002362202130000072
The pattern painting design module of the central machine completes pattern painting design through the following steps:
step 1, collecting background area images and determining camouflage design candidate areas
Step 2, extracting dominant color information of the background image and the camouflage design candidate area, and counting respective dominant color proportion;
step 3, calculating the similarity of each camouflage design candidate area and the dominant color information of the background image, and arranging the camouflage design candidate areas from high to low according to the proximity;
step 4, selecting the candidate area with the highest proximity degree as a background area of the camouflage design;
step 5, sequentially calculating the color distance between the camouflage candidate color and the main color of the background image in a Lab color space, and determining the color implemented by the camouflage from the camouflage candidate colors according to a color approaching principle;
step 6, calculating the color distance between each pixel color in the camouflage design candidate area and each dominant color of the background in the Lab color space, assigning the color value of the dominant color closest to the pixel point, traversing all pixels of the image, and obtaining an original image of the image camouflage design;
step 7, processing the original image by using a mathematical morphology method to obtain a main color spot block image;
and 8, replacing the background main color in the main color patch image with the color which is closest to the main color patch image and is implemented by the camouflage obtained in the step 5 to obtain the final camouflage design pattern.
In the step 1, the camouflage design candidate areas are determined to be divided into two types:
(1) when the background color is single, the length of the camouflage design candidate area is not less than the length of the external rectangle of the target to be camouflaged and is +2 times the height of the target to be camouflaged, and the width of the camouflage design candidate area is +2 times the width of the external rectangle of the target to be camouflaged and is equal to the height of the target to be camouflaged;
(2) when the background color is complex, for the background area with the size of A multiplied by B, the background area is divided into candidate areas according to the candidate area size a multiplied by B rectangle, and then the number of the candidate areas is
Figure BDA0002362202130000073
Wherein a is the length which is equal to the length of a circumscribed rectangle of the target to be disguised plus 2 times the height of the target to be disguised; and b is the width which is equal to the width of a circumscribed rectangle of the target to be disguised and is +2 times the height of the target to be disguised. The image size should satisfy
Figure BDA0002362202130000074
And is
Figure BDA0002362202130000075
The specific process of the step 2 comprises:
representing the color image by using a CIELAB color space, quantizing the color space to obtain a plurality of levels of quantized colors, and counting the frequency of each level of quantized colors appearing in the color image;
mapping the clustering samples to a color space from a pixel space to obtain a color sample space;
clustering all levels of colors in the mapped color sample space by using a pedigree clustering algorithm and acquiring a result class center of pedigree clustering;
and taking the result of the pedigree clustering as an initial class center, carrying out fast FCM clustering on the clustering sample, and determining the main color of the background.
The specific process of representing the color image by CIELAB color space, quantizing the color space to obtain a plurality of levels of quantized colors and counting the frequency of each level of quantized colors appearing in the color image is as follows:
step 101, for each of all quantized rectangular-cube regions of the CIELAB color spaceThe length, width and height of the field respectively correspond to the quantization intervals of three coordinates, and the coordinate range is [ Lm,LM]、[am,aM]And [ b)m,bM];
Step 102, if the pixel color (L, a, b) satisfies Lm≤l≤LM,am≤a≤aMAnd b ism≤b≤bMThen the pixel color appears once.
The specific process of clustering the colors of all levels in the mapped color sample space by using the pedigree clustering algorithm and acquiring the result core of the pedigree clustering comprises the following steps:
step 2301, setting each color sample in the color sample space as a class, setting
Figure BDA0002362202130000081
Figure BDA0002362202130000082
I ═ j ═ 1,2, …, N }, where y isjFor the jth sample to be classified, ΓjIs the jth cluster set, N is the number of samples;
step 2302, using the set { ΓjFinding a pair of conditions satisfying | j ∈ I }
Figure BDA0002362202130000083
Cluster set of (a) riAnd Γk,Δ(Γik) Is gammaiAnd ΓkI.k ∈ I;
step 2303, the gamma is processediIs incorporated into rkMiddle, and remove gammaiRemoving I from index set I, and recalculating gammakHeart-like color of
Figure BDA0002362202130000084
Wherein, VkIs the class center color of the kth class, NkCol (i) indicates the color value of the ith sample, and TC _ R (i) indicates the color of the ith sample in the number of samples in the kth classFrequency of occurrence in the image;
step 2304, if the number of merged classes is equal to the number of expected classes, terminating the calculation; otherwise go to step 2302.
Taking the result of the pedigree clustering as an initial class center, carrying out a rapid clustering algorithm on the clustering samples, and determining the main colors of the background image or the candidate area comprises the following specific processes:
step S2401, setting an iteration stop threshold epsilon and a maximum iteration number B, initializing a class center, and enabling the iteration number B to be 0;
step S2402, calculating a partition matrix according to the formula (2)
Figure BDA0002362202130000091
Fuzzy membership of kth color sample to ith class
Figure BDA0002362202130000092
Figure BDA0002362202130000093
Wherein the content of the first and second substances,
Figure BDA0002362202130000094
in the b-th iteration, the fuzzy membership degree of the kth color sample to the ith class, m is a clustering control parameter,
Figure BDA0002362202130000095
color values Col (k) and ith centroid color for the kth sample
Figure BDA0002362202130000096
The color difference between them, define
Figure BDA0002362202130000097
c is the number of categories;
step S2403, updating the clustering center matrix according to the formula (3)
Figure BDA0002362202130000098
Figure BDA0002362202130000099
Wherein the content of the first and second substances,
Figure BDA00023622021300000910
representing the class center of the ith category obtained by the b-th iterative computation, wherein TC _ R (k) is the frequency of appearance of the kth color, Col (k) is the color value of the kth sample, and To is the number of the clustering samples;
step S2404, if
Figure BDA00023622021300000911
Or B is more than or equal to B, stopping and outputting
Figure BDA00023622021300000912
And
Figure BDA00023622021300000913
otherwise, go to step S2401 with b + 1.
The specific process of calculating the similarity between the candidate region and the dominant color information of the background image in the step 3 is as follows:
step S301, matching the dominant colors of the candidate regions with the dominant colors of the background images one by one according to a color distance principle;
step S302, calculating the color difference Delta C (C) of each dominant color to two dominant colors in turn1,c2) Wherein c is1And c2C-th respectively representing candidate regions1C-th of the seed color and the full background image2A seed dominant color;
step S303, calculating the dominant color similarity between the candidate region and the background image by using the formula (4),
Figure BDA00023622021300000914
in the formula (f)n[P(c1),P(c2)]Is P (c)1) AndP(c2) Average value of (a), P (c)1) Denotes c1Statistical information of color in the candidate area, i.e. percentage of covered pixels to total pixels, P (c)2) Denotes c2Color is in the statistical information of the full background image;
step S304, calculating the similarity coefficient S between the candidate region and the color information of the full background imagec
Sc=1-Dc (5)
Wherein the similarity coefficient ScTo indicate the degree of similarity.
And 5, taking the camouflage candidate color corresponding to the minimum distance in the step 5 as the color of the determined camouflage implementation.
Example two
With reference to fig. 2, an adaptive camouflage online design method includes the following steps:
step 1, collecting background area images and determining camouflage design candidate areas;
step 2, extracting the dominant color information of the panoramic background image: taking the acquired overall background as the object and the number of colors to be painted for camouflage as the number of dominant colors, extract the dominant colors of the background panoramic image with the fast dominant color extraction method for color images, and count the proportion of each dominant color;
step 3, determining a camouflage design candidate background area;
step 4, extracting the dominant color information of the camouflage design candidate area: extracting the dominant colors and the proportion information of the sample candidate regions by using a color image dominant color fast extraction algorithm in sequence;
step 5, calculating the similarity between the candidate areas and the dominant color information of the panoramic background image: according to the dominant color differences and proportion information of the candidate areas and the panoramic image, calculate the similarity of each candidate area to the panoramic image, i.e. how close it is to the panoramic image in dominant colors and their proportions, and sort the candidate areas by this closeness from high to low as N1, N2, N3, …;
step 6, selecting a camouflage design background area: selecting the background N1 with the highest proximity as a camouflage design background area;
step 7, determining the camouflage colors: based on the dominant colors extracted for the background area selected in step 6, sequentially calculate the color distances between the camouflage candidate colors and the background dominant colors in the Lab color space using the CIE94 color formula according to the color proximity principle; the camouflage candidate color corresponding to the minimum distance is taken as the color for implementing the camouflage;
step 8, acquiring the dominant color distribution image of the camouflage design background area: in the Lab color space, calculate with the CIE94 color formula the color distance between each pixel color in the image and each dominant color of the background, assign to each pixel the color value of the dominant color closest to it, and traverse all pixels of the image to obtain the original image of the image camouflage design (the dominant color image);
step 9, generating a dominant color patch image: the dominant color image contains much unnecessary detail, which increases the difficulty of implementing the camouflage colors, so it is further processed. The dominant color image is processed with a mathematical morphology method to obtain a dominant color patch image. The mathematical-morphology-based camouflage algorithm (Zhang Yong, New Technology and New Process, 2018.9) introduces the mathematical morphology method in detail;
step 10, designing the camouflage pattern: replace each background dominant color in the dominant color patch image of step 9 with the closest camouflage color determined in step 7 to obtain the final camouflage design pattern;
step 11, realizing the adaptive camouflage: implement the camouflage pattern and colors designed in step 10 by means of temperature and illumination adjustment or thermochromic materials, etc., thereby realizing the camouflage. The patent with publication number CN2016989819U discloses a military adaptive camouflage device and describes a technology for realizing camouflage patterns and colors by temperature and illumination adjustment. The patent with publication number CN108587369A discloses a thermochromic emissivity coating, its preparation method and applications, and describes a technology for realizing camouflage patterns and colors by means of thermochromic materials.
Further, in step 1, the background area is collected according to the background situation of the maneuvering route:

(1) If the background of the maneuvering route is simple, for example a desert background with essentially one color, or road maneuvering where a black road surface dominates with a small number of white (or yellow) lane lines, the acquired image is a direct background image of the target's maneuvering route. The acquired image need not be large, but the area must not be smaller than a camouflage design candidate area (length not smaller than the length of the circumscribed rectangle of the target to be camouflaged plus 2 times the target height; width not smaller than the width of the circumscribed rectangle plus 2 times the target height);

(2) If the background of the maneuvering route is complex, for example maneuvering in grassland, desert and similar areas, the acquired image must be large enough to represent the overall characteristics of the activity area. To ensure the overall optimal camouflage effect (i.e. the dominant colors of the target and of most of the background area are the closest, so that the expected camouflage effect while maneuvering in the area is good), an area consistent with the color characteristics of the whole area and of suitable size is selected as the camouflage design area. Specifically, within the range of possible target maneuvering, with the background area of size A × B divided into candidate areas of size a × b, the number of candidate areas is

Num = ⌊A/a⌋ × ⌊B/b⌋

wherein a is the length, equal to the length of the circumscribed rectangle of the target to be camouflaged plus 2 times the target height, and b is the width, equal to the width of the circumscribed rectangle of the target to be camouflaged plus 2 times the target height. The image size should satisfy ⌊A/a⌋ ≥ 1 and ⌊B/b⌋ ≥ 1.
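A sketch of the tiling described above, assuming the candidate regions are taken on a regular grid aligned with the image origin (the patent does not fix the grid layout); all names are hypothetical:

```python
def candidate_regions(A, B, target_rect_l, target_rect_w, target_h):
    """Tile the A x B background into camouflage-design candidate
    regions of size a x b (sketch; variable names follow the text).
    Returns the region count Num and (x, y, a, b) boxes."""
    a = target_rect_l + 2 * target_h    # candidate length
    b = target_rect_w + 2 * target_h    # candidate width
    num = (A // a) * (B // b)           # Num = floor(A/a) * floor(B/b)
    regions = [(i * a, j * b, a, b)
               for i in range(A // a) for j in range(B // b)]
    return num, regions
```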
Further, the specific process of step 2 is:

step 201, color space quantization of the color image: divide the colors present in the image into n levels along each of the L, a and b coordinates of the CIELAB color space, giving n³ colors in total;

step 202, mapping the clustering samples from the pixel space to the color space: map the clustering samples from the pixel space to the color space by counting the occurrence frequency of each color;

step 203, deleting redundant colors: traverse the color space and delete from the quantized colors those with a frequency of 0;

step 204, obtaining the initial class centers: cluster the color sample space with a pedigree clustering algorithm and take the mean value of each class as its initial class center;

step 205, clustering to determine the dominant colors: starting from the initial class centers, cluster the samples with the color image fast FCM (CQFCM) clustering algorithm to determine the dominant colors of the background.
In step 201, a grayscale image generally has 256 colors, i.e. the gray values of all pixels are the 256 integer values from 0 to 255. A color image has far more color values: the three coordinates of the Lab space used here have value ranges of 0 to 100 and −128 to +127, so if integer color values were used directly as the clustering sample set, the total number of possible samples would be 100 × 256 × 256 = 6,553,600. Simply mapping the clustering of pixels to a clustering of this color set therefore cannot achieve the purpose of reducing the number of samples. The human eye has a threshold for resolving color differences: when the color difference is below the threshold the eye cannot distinguish it, so the number of color types the human eye can resolve is limited. The number of image colors (such as 6,553,600) is much larger than the number of colors the human eye can distinguish, and the number of dominant colors is only 3 to 5, so moderately compressing the number of color types does not noticeably affect the dominant color extraction. There are many methods for compressing colors; experiments show that quantizing each color coordinate into an equal number of levels achieves a fairly ideal effect. That is, if the L, a and b coordinates are each divided into n levels, there are n³ colors in total. Clearly, the smaller n is, the fewer the colors and the more efficient the clustering algorithm, but the color distortion also increases, degrading the clustering effect, which is likewise unacceptable. Therefore the number of levels n must be chosen by weighing the efficiency of the algorithm against the degree of color distortion.

In the present embodiment n = 20 is used, i.e. the L, a and b coordinates each have 20 levels and the total number of colors is 20 × 20 × 20 = 8000, which is accurate enough for the resolution of the human eye.
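A minimal sketch of the equal-level quantization of step 201 with n = 20, assuming L ∈ [0, 100] and a, b ∈ [−128, 127] as stated above; the function name is hypothetical:

```python
def quantize_lab(l, a, b, n=20):
    """Map a Lab color to one of the n^3 quantized bins (step 201 sketch).
    L is assumed in [0, 100], a and b in [-128, 127]."""
    def level(v, lo, hi):
        idx = int((v - lo) * n / (hi - lo))
        return min(idx, n - 1)   # the upper boundary falls into the last level
    return (level(l, 0.0, 100.0), level(a, -128.0, 127.0), level(b, -128.0, 127.0))
```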
In step 202, mapping the clustering samples from the pixel space to the color space by counting color occurrence frequencies is specifically:

step 2021, for each of the n³ quantized cuboid regions of the Lab three-dimensional space, the length, width and height correspond to the quantization intervals of the three coordinates, with coordinate ranges [L_m, L_M], [a_m, a_M] and [b_m, b_M];

step 2022, when the color of a pixel falls within such a range (inside the cuboid), i.e. when the pixel color (l, a, b) satisfies L_m ≤ l ≤ L_M, a_m ≤ a ≤ a_M and b_m ≤ b ≤ b_M, that color is counted as appearing once.

Traversing the whole image yields the counts of all colors, i.e. the color occurrence frequencies, denoted TC; the Lab coordinate values of all colors together with their occurrence frequencies form the color space of the clustering samples.
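Steps 2021–2022 amount to a frequency count over quantized bins; a sketch with a caller-supplied `quantize` function (hypothetical names):

```python
from collections import Counter

def color_frequencies(pixels, quantize):
    """Step 202 sketch: count how often each quantized color occurs,
    producing the (color bin -> frequency TC) clustering-sample space.
    Bins with frequency 0 simply never appear, which also realizes the
    redundant-color deletion of step 203."""
    return dict(Counter(quantize(*p) for p in pixels))
```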
The redundant clustering samples in step 203 are the redundant colors with a statistical frequency of 0, i.e. colors that do not appear in the image. Such redundant samples increase the number of samples without any influence on the clustering result, and therefore only increase the complexity of the clustering algorithm to no purpose. For this reason, the redundant colors can be deleted from the quantized colors to further reduce the number of samples and improve the efficiency of the algorithm.
The principle of the pedigree clustering algorithm in step 204 is to regard each subset formed by a small number of samples as a category and then merge similar categories step by step, forming a nested sequence with a progressively decreasing number of categories. The specific process is as follows:

step 2041, set each color sample in the color sample space as one class: Γ_j = {y_j}, I = {j | j = 1, 2, …, N}, where y_j is the j-th sample to be classified, Γ_j is the j-th cluster set and N is the number of samples;

step 2042, in the set {Γ_j | j ∈ I}, find a pair of cluster sets Γ_i and Γ_k satisfying Δ(Γ_i, Γ_k) = min_{i≠k} Δ(Γ_i, Γ_k), where Δ(Γ_i, Γ_k) is the similarity measure between Γ_i and Γ_k, i, k ∈ I;

step 2043, merge the two classes with the minimum distance into one class, i.e. merge Γ_i into Γ_k and remove Γ_i, then recalculate the class center of Γ_k:

V_k = Σ_{i=1}^{N_k} TC_R(i) · Col(i) / Σ_{i=1}^{N_k} TC_R(i)    (1)

wherein V_k is the class center color of the k-th class, N_k is the number of samples in the k-th class, Col(i) represents the color value of the i-th sample, and TC_R(i) represents the frequency of the color of the i-th sample in the image;

step 2044, remove i from the index set I; if the cardinality of I equals c, i.e. the number of classes after merging equals the expected number of classes, terminate the calculation; otherwise go to step 2042.
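Steps 2041–2044 can be sketched as a frequency-weighted agglomerative clustering. This naive illustration uses scalar color values for brevity (real Lab colors would be 3-vectors) and a caller-supplied distance such as CIE94; it is a sketch, not the patented implementation, and the names are hypothetical:

```python
def pedigree_cluster(samples, freqs, c, dist):
    """Steps 2041-2044 sketch: merge the closest pair of classes until
    c classes remain. `samples` are color values, `freqs` their
    occurrence frequencies TC_R, `dist` a color-difference function."""
    clusters = [[i] for i in range(len(samples))]   # step 2041: one class per sample

    def center(cl):                                  # frequency-weighted class center
        w = sum(freqs[i] for i in cl)
        return sum(freqs[i] * samples[i] for i in cl) / w

    while len(clusters) > c:                         # steps 2042-2044
        best = None
        for x in range(len(clusters)):
            for y in range(x + 1, len(clusters)):
                d = dist(center(clusters[x]), center(clusters[y]))
                if best is None or d < best[0]:
                    best = (d, x, y)
        _, x, y = best
        clusters[x] += clusters.pop(y)               # step 2043: merge closest pair
    return [center(cl) for cl in clusters]
```

The returned class centers serve as the initial centers for the CQFCM clustering of step 205.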
The distance in step 2042 refers to the color difference: a large distance means a large color difference and a small distance a small one, and the color difference is calculated with the CIE94 color formula. The CIE94 formula is given explicitly in "Study of color difference formulas and their influence on the extraction of dominant colors of images" (Zhang Yufa, Application of Photoelectric Technology, 2010.6). To make the clustering sample space fully represent the image so that no information is lost, the frequency of each color's appearance in the image must be counted.

In step 2044, the number of desired classes in this embodiment is set to 3 to 5. For camouflage, the closer the camouflage colors are to the background colors, the better the camouflage effect; however, there are many colors in the background, often thousands, while the camouflage generally uses only 3 to 5 colors.
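For reference, the CIE94 color difference used throughout can be sketched as follows (graphic-arts parameters k_L = 1, K1 = 0.045, K2 = 0.015; this is the standard published formula, not code from the patent):

```python
import math

def cie94(lab1, lab2, k_l=1.0, k1=0.045, k2=0.015):
    """CIE94 color difference between two Lab colors
    (graphic-arts weighting: S_L = 1, S_C = 1 + K1*C1, S_H = 1 + K2*C1)."""
    l1, a1, b1 = lab1
    l2, a2, b2 = lab2
    dl = l1 - l2
    c1 = math.hypot(a1, b1)
    c2 = math.hypot(a2, b2)
    dc = c1 - c2
    da, db = a1 - a2, b1 - b2
    dh_sq = max(da * da + db * db - dc * dc, 0.0)   # guard tiny negatives
    s_c = 1.0 + k1 * c1
    s_h = 1.0 + k2 * c1
    return math.sqrt((dl / k_l) ** 2 + (dc / s_c) ** 2 + dh_sq / s_h ** 2)
```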
The specific process of step 205 is:

step 2051, acquire the color matrix Col and count the color occurrence frequency array TC;

step 2052, set the iteration stop threshold ε and the maximum number of iterations B, initialize the class centers, and let the iteration number b = 0;

step 2053, calculate the fuzzy membership u_ik^(b) of the k-th sample to the i-th class in the partition matrix U^(b) according to equation (2),

u_ik^(b) = 1 / Σ_{j=1}^{c} (d_ik^(b) / d_jk^(b))^(2/(m−1))    (2)

where k denotes the k-th sample, i denotes the i-th class, b is the number of iterations, u_ik^(b) is the fuzzy membership of the k-th sample to the i-th class at the b-th iteration, m is the clustering control parameter, d_ik^(b) is the distance from the k-th sample to the i-th cluster center, i.e. the color difference between the color value Col(k) of the k-th sample and the i-th cluster center color V_i^(b), and d_jk^(b), defined likewise, is the distance from the k-th sample to the j-th cluster center;

step 2054, update the cluster center matrix V^(b) according to formula (3),

V_i^(b) = Σ_{k=1}^{To} (u_ik^(b))^m · TC_R(k) · Col(k) / Σ_{k=1}^{To} (u_ik^(b))^m · TC_R(k)    (3)

wherein V_i^(b) represents the class center of the i-th category obtained by the b-th iterative computation, TC_R(k) is the frequency of appearance of the k-th color, Col(k) is the color value of the k-th sample, and To is the number of clustering samples;

step 2055, if ||V^(b) − V^(b−1)|| ≤ ε or b ≥ B, stop and output V^(b) and U^(b); otherwise let b = b + 1 and go to step 2053.
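Steps 2051–2055 can be sketched as a frequency-weighted fuzzy C-means. Scalar color values are used for brevity (real Lab colors would be 3-vectors), `dist` is a color difference such as CIE94, and all names are hypothetical; this is a sketch under those assumptions, not the patented implementation:

```python
def cqfcm(colors, freqs, centers, dist, m=2.0, eps=1e-4, max_iter=100):
    """Steps 2051-2055 sketch: frequency-weighted fuzzy C-means on the
    quantized color samples. `colors` are sample color values, `freqs`
    the frequencies TC_R, `centers` the initial class centers from the
    pedigree clustering."""
    c = len(centers)
    for _ in range(max_iter):                       # step 2052: b = 0 .. B
        # step 2053: partition matrix u[i][k] per equation (2)
        u = [[0.0] * len(colors) for _ in range(c)]
        for k, col in enumerate(colors):
            d = [max(dist(col, v), 1e-12) for v in centers]
            for i in range(c):
                u[i][k] = 1.0 / sum((d[i] / dj) ** (2.0 / (m - 1.0)) for dj in d)
        # step 2054: frequency-weighted class-center update per equation (3)
        new_centers = []
        for i in range(c):
            w = [(u[i][k] ** m) * freqs[k] for k in range(len(colors))]
            new_centers.append(sum(wk * colors[k] for k, wk in enumerate(w)) / sum(w))
        # step 2055: stop when the centers move less than eps
        shift = max(abs(nv - v) for nv, v in zip(new_centers, centers))
        centers = new_centers
        if shift < eps:
            break
    return centers
```

Weighting by TC_R lets the algorithm cluster the (much smaller) color set instead of every pixel, which is the point of the CQFCM variant.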
Further, step 4 obtains the dominant color information of the selected area by using the same calculation method as step 2.
Further, in step 2, the number of pixels covered by each dominant color is counted, and the ratio of each dominant color's covered pixels is calculated as its proportion. The larger the proportion, the more important the dominant color is in the background.
Further, step 4 acquires the dominant colors and dominant color ratio information of the candidate regions in the same manner as step 2.
Further, the specific calculation process of the closeness of the dominant colors and proportions between a candidate region and the panoramic background image in step 5 is:

step 501, for each image, calculate color distances according to the CIE94 formula, and pair the dominant colors of the candidate area image with the dominant colors of the panoramic background image one by one according to the minimum color distance principle;

step 502, calculate in turn the color difference ΔC(c1, c2) of each dominant color pair, wherein c1 and c2 respectively denote the c1-th dominant color of the candidate region and the c2-th dominant color of the full background image;

step 503, calculate the dominant color similarity distance between the candidate region and the background image using the following formula,

D_c = Σ f_n[P(c1), P(c2)] · ΔC(c1, c2)    (4)

where the sum runs over all paired dominant colors, f_n[P(c1), P(c2)] is the average of P(c1) and P(c2), P(c1) denotes the statistical information of color c1 in the candidate region, i.e. the percentage of its covered pixels to the total pixels, and P(c2) denotes the statistical information of color c2 in the full background image;

step 504, calculate the similarity coefficient S_c between the candidate region and the color information of the full background image:

S_c = 1 − D_c    (5)

wherein the similarity coefficient S_c indicates the degree of similarity;

step 505, rank the similarity coefficients between each candidate region and the panoramic background image from high to low as N1, N2, N3, ….
Only the dominant color similarity is used for selecting the candidate region, which makes the selection more efficient.
Example three
The difference between this embodiment and the second embodiment is that the method further includes:
step 12, collecting a camouflage effect evaluation image: and acquiring a background image containing a target by using an unmanned aerial vehicle according to a certain specification.
Step 13, evaluating the camouflage effect: and the central machine receives the camouflage effect evaluation image and evaluates the camouflage effect. If the camouflage effect does not meet the requirement, turning to step 5, selecting the next candidate area to redesign camouflage; otherwise, the camouflage effect meets the requirements, the unmanned aerial vehicle is recycled, and the camouflage design is finished. And if the camouflage effect designed by all the candidate areas does not meet the requirement, selecting the scheme with the best camouflage effect as the final camouflage design scheme.
In step 12, the unmanned aerial vehicle platform collects at least five images in the following manner:

(1) with the advancing direction of the target taken as the 0° direction and clockwise as positive, images of the area where the target is located are acquired in turn from 0°, 90°, 180° and 270° with the pitch angle of the image acquisition device at −30° (the horizontal pitch angle is 0°, upward is positive and downward is negative);

(2) one image is acquired from directly above the target with the pitch angle of the image acquisition device at −90°.
The specific process of step 13 is:
Step 131, judging whether the collected images meet the requirements: after receiving the images, the central machine makes a preliminary judgment and then, based on information such as the target position in the image, the target size in the image, the image size and the image acquisition angle, further judges whether the image acquisition meets the requirements; if so, continue, and if not, return to step 12 to acquire the images again.
Step 132, begin to evaluate the target camouflage effect in each image in turn: initialize the loop counter variable, letting i = 0;
step 133, determining the camouflage effect evaluation image: let i = i + 1 and take the i-th image;
step 134, image preprocessing: on the central machine, select the region where the target is located, framing the minimum circumscribed rectangular region of the target tightly around it; then determine the 8 equal-size rectangular regions directly adjacent to the target region as the effect evaluation background regions.
Step 135, extracting the dominant colors of the target image and the background image: and (4) taking the number of the camouflage colors as the number of the clustering centers, and respectively extracting the dominant colors of the target and 8 equal-size background images by utilizing a color image dominant color fast extraction algorithm.
Step 136, calculating the color statistical information of the target and background images: and traversing the target and background images, calculating the color distance between each pixel color and the dominant color based on a color distance formula (in Lab color space, by using the CIE94 color formula), classifying the dominant color with the minimum color distance, and counting the number of pixels covered by each dominant color to obtain the dominant color statistical information of the image.
Step 137, calculating color distribution information of the target and background images: and sequentially calculating the spatial distribution information entropy of each dominant color in the target image and the background image.
Step 138, calculating the similarity between the target and the background: and (4) integrating the dominant color statistical information and the distribution information, and sequentially calculating the similarity between the target and 8 neighborhood background images.
Step 139, evaluating the target camouflage effect in a single image: since the target can be well concealed as long as it is similar to part of its surrounding background, after the similarity coefficients between the target and the 8 neighborhood backgrounds have been calculated, the average of the 4 largest similarity coefficients is taken as the comprehensive similarity coefficient between the target and the background.
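The aggregation rule of step 139 is small enough to state directly; a minimal sketch (hypothetical name):

```python
def composite_similarity(neigh_coeffs):
    """Step 139 sketch: the target-background comprehensive similarity
    is the mean of the 4 largest of the 8 neighborhood coefficients."""
    top4 = sorted(neigh_coeffs, reverse=True)[:4]
    return sum(top4) / len(top4)
```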
Step 1310, judging whether the images captured in the five orientations have all been processed: if yes, continue with step 1311; otherwise go to step 133.
Step 1311, synthesizing the camouflage effect over the images acquired in the five directions and evaluating the comprehensive camouflage effect of the target: calculate the average of the similarity coefficients of the 5 images as the target's comprehensive camouflage effect.
And finally outputting the comprehensive camouflage effect and the camouflage effect in the 5 images. The comprehensive camouflage effect can provide a reference for comprehensive evaluation of the camouflage effect and improvement of the camouflage, and the single image camouflage effect can provide a reference for whether one or two surfaces of the target need to be improved in camouflage.
Further, the method for extracting the dominant colors of the target and background images in step 135 is:

step 1351, color space quantization of the color image: divide the colors present in the images into n levels along each of the L, a and b coordinates of the CIELAB color space, giving n³ colors in total;

step 1352, mapping the clustering samples from the pixel space to the color space: map the clustering samples from the pixel space to the color space by counting the occurrence frequency of each color;

step 1353, deleting redundant colors: traverse the color space and delete from the quantized colors those with a frequency of 0;

step 1354, obtaining the initial class centers: cluster the color sample space with a pedigree clustering algorithm and take the mean value of each class as its initial class center;

step 1355, clustering to determine the dominant colors: starting from the initial class centers, cluster the samples with the color image fast clustering algorithm (CQFCM) to determine the dominant colors of the background.
Specifically, the method comprises the following steps:
In step 1351, a grayscale image generally has 256 colors, i.e. the gray values of all pixels are the 256 integer values from 0 to 255. A color image has far more color values: the three coordinates of the Lab space used here have value ranges of 0 to 100 and −128 to +127, so if integer color values were used directly as the clustering sample set, the total number of possible samples would be 100 × 256 × 256 = 6,553,600. Simply mapping the clustering of pixels to a clustering of this color set therefore cannot achieve the purpose of reducing the number of samples. The human eye has a threshold for resolving color differences: when the color difference is below the threshold the eye cannot distinguish it, so the number of color types the human eye can resolve is limited. The number of image colors (such as 6,553,600) is much larger than the number of colors the human eye can distinguish, and the number of dominant colors is only 3 to 5, so moderately compressing the number of color types does not noticeably affect the dominant color extraction. There are many methods for compressing colors; experiments show that quantizing each color coordinate into an equal number of levels achieves a fairly ideal effect. That is, if the L, a and b coordinates are each divided into n levels, there are n³ colors in total. Clearly, the smaller n is, the fewer the colors and the more efficient the clustering algorithm, but the color distortion also increases, degrading the clustering effect, which is likewise unacceptable. Therefore the number of levels n must be chosen by weighing the efficiency of the algorithm against the degree of color distortion.

In this embodiment, after extensive experiments, n = 20 is selected for both the target image and the background image, i.e. the L, a and b coordinates each have 20 levels and the total number of colors is 20 × 20 × 20 = 8000, which is accurate enough for the resolution of the human eye.
In step 1352, mapping the clustering samples from the pixel space to the color space by counting color occurrence frequencies is specifically:

step 13521, for each of the n³ quantized cuboid regions of the Lab three-dimensional space, the length, width and height correspond to the quantization intervals of the three coordinates, with coordinate ranges [L_m, L_M], [a_m, a_M] and [b_m, b_M];

step 13522, when the color of a pixel falls within such a range (inside the cuboid), i.e. when the pixel color (l, a, b) satisfies L_m ≤ l ≤ L_M, a_m ≤ a ≤ a_M and b_m ≤ b ≤ b_M, that color is counted as appearing once.

Traversing the target image and the background image yields the counts of all colors, i.e. the color occurrence frequencies, denoted TC; the Lab coordinate values of all colors together with their occurrence frequencies form the color space of the clustering samples.
The redundant clustering samples in step 1353 are the redundant colors with a statistical frequency of 0, i.e. colors that do not appear in the image. Such redundant samples increase the number of samples without any influence on the clustering result, and therefore only increase the complexity of the clustering algorithm to no purpose. For this reason, the redundant colors can be deleted from the quantized colors to further reduce the number of samples and improve the efficiency of the algorithm.
The principle of the pedigree clustering algorithm in step 1354 is to regard each subset formed by a small number of samples as a category and then merge similar categories step by step, forming a nested sequence with a progressively decreasing number of categories. The specific process is as follows:

step 13541, set each color sample in the color sample space as one class: Γ_{j_t}^t = {y_{j_t}^t}, I_t = {j_t | j_t = 1, 2, …, N_t}, where y_{j_t}^t is the j_t-th sample to be classified, Γ_{j_t}^t is the j_t-th cluster set, N_t is the number of samples, t = 0 denotes the target image and t = 1 denotes the background image;

step 13542, in the set {Γ_{j_t}^t | j_t ∈ I_t}, find a pair of cluster sets Γ_{i_t}^t and Γ_{k_t}^t satisfying Δ(Γ_{i_t}^t, Γ_{k_t}^t) = min, i.e. select the pair of classes with the smallest distance, where Δ(Γ_{i_t}^t, Γ_{k_t}^t) is the similarity measure between Γ_{i_t}^t and Γ_{k_t}^t;

step 13543, merge the two classes with the minimum distance into one class, i.e. merge Γ_{i_t}^t into Γ_{k_t}^t and remove Γ_{i_t}^t, then recalculate the class center of Γ_{k_t}^t:

V_k = Σ_{i=1}^{N_k} TC_R(i) · Col(i) / Σ_{i=1}^{N_k} TC_R(i)    (6)

wherein V_k is the class center color of the k-th class, N_k is the number of samples in the k-th class, Col(i) represents the color value of the i-th sample, and TC_R(i) represents the frequency of the color of the i-th sample in the image;

step 13544, remove i_t from the index set I_t; if the cardinality of I_t equals c, i.e. the number of classes after merging equals the expected number of classes, terminate the calculation; otherwise go to step 13542.
The distance in step 13542 refers to the color difference: a large distance means a large color difference and a small distance a small one, and the color difference is calculated with the CIE94 color formula. The CIE94 formula is given explicitly in "Study of color difference formulas and their influence on the extraction of dominant colors of images".

In step 13544, the number of desired classes in this embodiment is set to 3 or 4. For camouflage, the closer the camouflage colors are to the background colors, the better the camouflage effect; however, there are many colors in the background, often thousands, while the camouflage generally uses only 3 or 4 colors.
The specific process of step 1355 is:

Step 13551, obtain the color matrix Col_t and the statistical color appearance frequency array TC_t.

Step 13552, set the iteration stop threshold ε_t and the maximum number of iterations B_t, initialize the class centers, and let the iteration count b_t = 0.

Step 13553, calculate the partition matrix U^{(b_t)} according to equation (7), in which u_{i_t k_t}^{(b_t)} is the fuzzy membership of the k_t-th sample to the i_t-th class:

$$u_{i_t k_t}^{(b_t)} = \left[\sum_{j_t=1}^{c}\left(\frac{D_{i_t k_t}}{D_{j_t k_t}}\right)^{\frac{2}{m-1}}\right]^{-1} \tag{7}$$

where k_t denotes the k_t-th sample, i_t denotes the i_t-th class, b_t is the iteration count, u_{i_t k_t}^{(b_t)} is the fuzzy membership of the k_t-th sample to the i_t-th class at the b_t-th iteration, m is a clustering control parameter, and D_{i_t k_t} is the distance from the k_t-th sample to the i_t-th cluster center, i.e. the color difference between the color value Col(k_t) of the k_t-th sample and the color V_{i_t}^{(b_t)} of the i_t-th class center, defined as D_{i_t k_t} = ΔE(Col(k_t), V_{i_t}^{(b_t)}).

Step 13554, update the cluster center matrix V^{(b_t+1)} according to equation (8):

$$V_{i_t}^{(b_t+1)} = \frac{\sum_{k_t=1}^{To} TC\_R(k_t)\left(u_{i_t k_t}^{(b_t)}\right)^m Col(k_t)}{\sum_{k_t=1}^{To} TC\_R(k_t)\left(u_{i_t k_t}^{(b_t)}\right)^m} \tag{8}$$

where V_{i_t}^{(b_t)} denotes the class center of the i_t-th class obtained at the b_t-th iteration, TC_R(k_t) is the appearance frequency of the k_t-th color, Col(k_t) is the color value of the k_t-th sample, and To is the number of clustered samples.

Step 13555, if ||V^{(b_t+1)} − V^{(b_t)}|| ≤ ε_t or b_t ≥ B_t, stop and output U^{(b_t)} and V^{(b_t+1)}; otherwise let b_t = b_t + 1 and go to step 13553.
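The frequency-weighted fast FCM iteration of step 1355 can be sketched as below. This is an illustrative NumPy sketch, not the patent's implementation: plain Euclidean Lab distance stands in for the CIE94 difference, and the function names are hypothetical.

```python
import numpy as np

def weighted_fcm(colors, freqs, centers, m=2.0, eps=1e-4, max_iter=100):
    """Fuzzy C-means over color samples weighted by appearance frequency.

    colors : (N, 3) Lab samples; freqs : (N,) appearance frequencies;
    centers: (c, 3) initial class centers (e.g. from hierarchical clustering).
    Euclidean Lab distance stands in for the CIE94 difference of the text.
    """
    colors = np.asarray(colors, float)
    freqs = np.asarray(freqs, float)
    centers = np.asarray(centers, float)
    for _ in range(max_iter):
        # D[i, k]: distance from sample k to center i (tiny floor avoids /0)
        D = np.linalg.norm(centers[:, None, :] - colors[None, :, :], axis=2)
        D = np.maximum(D, 1e-12)
        # Partition matrix, eq. (7): u_ik = 1 / sum_j (D_ik / D_jk)^(2/(m-1))
        ratio = (D[:, None, :] / D[None, :, :]) ** (2.0 / (m - 1.0))
        U = 1.0 / ratio.sum(axis=1)
        # Center update, eq. (8): frequency-weighted mean with weights u^m
        w = freqs * U ** m                                  # shape (c, N)
        new_centers = (w @ colors) / w.sum(axis=1, keepdims=True)
        if np.linalg.norm(new_centers - centers) <= eps:    # stop rule, step 13555
            centers = new_centers
            break
        centers = new_centers
    return centers, U
```

Seeding `centers` with the hierarchical-clustering result, as the method prescribes, is what makes the FCM stage "fast": it converges in a few iterations instead of drifting from a random start.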
Further, the specific process of step 137 is: divide the target image or background image into L_t regions, and let p_{l_t}(c_t) be the ratio of the number of pixels covered by color c_t in the l_t-th region of the image to the number of pixels covered by that color in the whole image. The spatial distribution information entropy e_t(c_t) of the color is defined as shown in equation (9):

$$e_t(c_t) = -\sum_{l_t=1}^{L_t} p_{l_t}(c_t)\,\ln p_{l_t}(c_t) \tag{9}$$

In the formula, p_{l_t}(c_t) can be calculated by equation (10):

$$p_{l_t}(c_t) = \frac{n_{l_t}(c_t)}{\sum_{l_t=1}^{L_t} n_{l_t}(c_t)} \tag{10}$$

where n_{l_t}(c_t) is the number of pixels of color c_t in region l_t, and c_t ∈ C, the set of all dominant colors.
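The spatial distribution entropy of equations (9)-(10) can be sketched as follows. This is an assumption-laden illustration: the patent does not say how the L_t regions are shaped, so equal-width vertical strips are used here, and `spatial_entropy` is a hypothetical name.

```python
import numpy as np

def spatial_entropy(label_img, regions):
    """Spatial distribution entropy e_t(c_t) per dominant color, eq. (9)-(10).

    label_img : 2-D array of dominant-color indices, one per pixel.
    regions   : number of equal vertical strips L_t partitioning the image
                (the region shape is an assumption, not from the patent).
    """
    h, w = label_img.shape
    bounds = np.linspace(0, w, regions + 1).astype(int)
    ent = {}
    for c in np.unique(label_img):
        total = np.count_nonzero(label_img == c)   # pixels of color c in whole image
        e = 0.0
        for l in range(regions):
            strip = label_img[:, bounds[l]:bounds[l + 1]]
            p = np.count_nonzero(strip == c) / total   # eq. (10)
            if p > 0:
                e -= p * np.log(p)                     # eq. (9)
        ent[int(c)] = e
    return ent
```

A color spread evenly over all L_t regions reaches the maximum entropy ln L_t; a color confined to one region has entropy 0, which is what lets the entropy difference Δe distinguish a clustered color from a dispersed one.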
Further, the specific process of step 138 is:

Step 1381, first match the dominant colors of the target image with the dominant colors of the background image one by one according to the color-distance principle: compute the distances between each dominant color of the target image and all dominant colors of the background image with the CIE94 color-distance formula, and pair them by the minimum-distance principle.

Step 1382, for each matched pair of dominant colors compute in turn the color difference ΔC(c1, c2) and the corresponding distribution entropy difference Δe(c1, c2), where c1 and c2 denote the c1-th dominant color of the target image and the background dominant color paired with it (the c2-th dominant color).

Step 1383, calculate the similarity between the object and the background using equation (11):

$$dis = \sum_{c_1=1}^{c\_n} f_n\left(P(c_1), P(c_2)\right)\left[w_c\,\Delta C(c_1, c_2) + w_e\,\Delta e(c_1, c_2)\right] \tag{11}$$

The formula computes, in sequence, the distance between each dominant color of the target image and the background dominant color paired with it, and the weighted sum gives the distance dis between the two images.

In the formula, w_c and w_e are the weighting coefficients of the color difference and the entropy difference, and control the relative importance of the image color feature and the spatial distribution feature in evaluating image similarity. w_c and w_e satisfy equation (12); empirically, w_c = 0.7 and w_e = 0.3 can be chosen.

$$w_c + w_e = 1,\quad w_c, w_e \in [0, 1] \tag{12}$$

In the formula, f_n(P(c1), P(c2)) is a quantity weighting function, the average of P(c1) and P(c2), where P(c1) denotes the statistical information of color c1 in the target image, i.e. the percentage of pixels it covers out of the total, and P(c2) denotes the statistical information of color c2 in the background image, as shown in equation (13). The amount of a color present in an image should also be one of the factors determining the image characteristics, and a color covering more pixels should contribute more to them.

$$f_n\left(P(c_1), P(c_2)\right) = \frac{P(c_1) + P(c_2)}{2} \tag{13}$$

Here c_n is the number of camouflage colors; since dominant-color extraction uses the number of camouflage colors as the number of class centers, it is also the number of dominant colors.

After the similarity distance between the object and the background is computed, the similarity coefficient S between them is obtained from equation (14):

$$S = 1 - dis \tag{14}$$

The larger the similarity coefficient, the better the camouflage effect; the intervals 0.8-1.0, 0.6-0.8, 0.4-0.6 and 0-0.4 represent excellent, good, medium and poor camouflage effect respectively.
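The similarity score of equations (11)-(14) can be sketched as below. This is a hedged illustration: it assumes the pairs are already matched and that ΔC and Δe have been normalized to [0, 1] (the patent does not state the normalization), and `similarity` is a hypothetical name.

```python
def similarity(pairs, wc=0.7, we=0.3):
    """Object/background similarity coefficient S, eq. (11)-(14).

    pairs: one tuple (P1, P2, dC, de) per matched dominant-color pair, where
    P1/P2 are the pixel fractions of the color in target/background and
    dC/de are the (assumed normalized) color and entropy differences.
    """
    assert abs(wc + we - 1.0) < 1e-9            # constraint of eq. (12)
    dis = 0.0
    for P1, P2, dC, de in pairs:
        fn = 0.5 * (P1 + P2)                    # quantity weighting, eq. (13)
        dis += fn * (wc * dC + we * de)         # weighted sum, eq. (11)
    return 1.0 - dis                            # similarity coefficient, eq. (14)
```

Two images with identical dominant colors and identical spatial entropies give dis = 0 and hence S = 1, i.e. an "excellent" rating in the 0.8-1.0 band.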
In step 1311, the comprehensive similarity coefficients of the images collected at multiple angles are weighted and averaged to obtain the comprehensive camouflage effect of the adaptively camouflaged target. Each weight defaults to 0.2 and can be adjusted to the actual situation during implementation; for example, if the threat comes from the left, the weight of the left direction should be increased and that of the right direction decreased.

Claims (3)

1. An adaptive camouflage online design method is characterized by comprising the following steps:
step 1, acquiring background area images in real time, and determining camouflage design candidate areas;
step 2, extracting dominant color information of the background image and the camouflage design candidate area, and counting respective dominant color proportion;
step 3, calculating the similarity of each camouflage design candidate area and the dominant color information of the background image, and arranging the camouflage design candidate areas from high to low according to the proximity;
step 4, selecting the candidate area with the highest proximity degree as a background area of the camouflage design;
step 5, sequentially calculating the color distance between the camouflage candidate color and the main color of the background image in a Lab color space, and determining the color of the camouflage implementation closest to the main color of each background image from the camouflage candidate colors according to a color approaching principle;
step 6, calculating the color distance between each pixel color in the camouflage design candidate area and each dominant color of the background in the Lab color space, assigning the color value of the dominant color closest to the pixel, and traversing all pixels of the image to obtain an original image of the image camouflage design;
step 7, processing the original image by using a mathematical morphology method to obtain a main color spot block image;
step 8, replacing the dominant color of the background image in the main color patch image with the color of the closest camouflage implementation obtained in the step 5 to obtain a final camouflage design pattern;
in the step 1, the determination of camouflage design candidate areas falls into two cases:
(1) when the background color is simple, the length of the camouflage design candidate area is not less than the length of the circumscribed rectangle of the target to be camouflaged plus 2 times the height of the target, and its width is not less than the width of the circumscribed rectangle plus 2 times the height of the target;
(2) when the background color is complex, a background area of size A × B is divided into candidate areas by rectangles of the candidate-area size a × b, so that the number of candidate areas is

$$n = \left\lfloor \frac{A}{a} \right\rfloor \times \left\lfloor \frac{B}{b} \right\rfloor$$

wherein a is the length, equal to the length of the circumscribed rectangle of the target to be camouflaged plus 2 times the height of the target; b is the width, equal to the width of the circumscribed rectangle plus 2 times the height of the target; ⌊·⌋ denotes rounding down, with a ≤ A and b ≤ B;
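The candidate-area sizing and count of case (2) can be sketched as follows. The floor-based count is one plausible reading of the garbled claim formula, and the function name and argument names are hypothetical.

```python
def candidate_regions(A, B, target_len, target_wid, target_h):
    """Tile an A x B background into camouflage-design candidate rectangles.

    Candidate size per the claim: a = target bounding-box length + 2 x height,
    b = bounding-box width + 2 x height. The count n = floor(A/a) * floor(B/b)
    is a reconstruction of the claim's garbled formula, not a verbatim quote.
    """
    a = target_len + 2 * target_h
    b = target_wid + 2 * target_h
    if a > A or b > B:               # claim requires a <= A and b <= B
        return a, b, 0
    return a, b, (A // a) * (B // b)
```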
The specific process of the step 2 comprises:
representing the color image by using a CIELAB color space, quantizing the color space to obtain a plurality of levels of quantized colors, and counting the frequency of each level of quantized colors appearing in the color image;
mapping the clustering samples to a color space from a pixel space to obtain a color sample space;
clustering all levels of colors in the mapped color sample space by using a pedigree clustering algorithm and acquiring a result class center of pedigree clustering;
taking the result of the pedigree clustering as an initial class center, carrying out fast FCM clustering on the clustering sample, and determining the main color of the background image or the candidate area;
the specific process of clustering all levels of colors in the mapped color sample space with the pedigree clustering algorithm and obtaining the resulting class centers of the pedigree clustering comprises the following steps:
step 2301, take each color sample in the color sample space as one class, setting Γ_j = {y_j}, j ∈ I = {1, 2, ..., N}, where y_j is the j-th sample to be classified, Γ_j is the j-th cluster set, and N is the number of samples;
step 2302, in the set {Γ_j | j ∈ I}, find the pair of cluster sets Γ_i and Γ_k satisfying

$$\Delta(\Gamma_i, \Gamma_k) = \min_{p, q \in I,\; p \neq q} \Delta(\Gamma_p, \Gamma_q)$$

where Δ(Γ_i, Γ_k) is the distance between Γ_i and Γ_k, i, k ∈ I;
step 2303, merge Γ_i into Γ_k, remove Γ_i, delete i from the index set I, and recompute the class-center color of Γ_k:

$$V_k = \frac{\sum_{i \in \Gamma_k} TC\_R(i)\,Col(i)}{\sum_{i \in \Gamma_k} TC\_R(i)}$$

where V_k is the class-center color of the k-th class, N_k is the number of samples in the k-th class, Col(i) denotes the color value of the i-th sample, and TC_R(i) denotes the frequency of the i-th sample's color in the image;
step 2304, if the number of classes after merging equals the number of expected classes, terminate the calculation; otherwise go to step 2302;
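Steps 2301-2304 can be sketched as a frequency-weighted agglomerative (pedigree) clustering. This is an illustrative sketch, not the patent's code: Euclidean Lab distance between class-center colors stands in for the inter-class distance Δ, and `pedigree_cluster` is a hypothetical name.

```python
import numpy as np

def pedigree_cluster(colors, freqs, n_classes):
    """Agglomerative clustering to seed FCM (steps 2301-2304, illustrative).

    Starts with every color sample as its own class, repeatedly merges the
    two closest class centers, and recomputes the merged class center as the
    frequency-weighted mean of its member colors.
    """
    colors = np.asarray(colors, float)
    freqs = np.asarray(freqs, float)
    members = [[j] for j in range(len(colors))]        # step 2301: Gamma_j = {y_j}
    centers = [colors[j].copy() for j in range(len(colors))]
    while len(members) > n_classes:                    # step 2304: stop at target count
        # step 2302: find the closest pair of class centers
        best, bi, bk = np.inf, 0, 1
        for i in range(len(centers)):
            for k in range(i + 1, len(centers)):
                d = np.linalg.norm(centers[i] - centers[k])
                if d < best:
                    best, bi, bk = d, i, k
        # step 2303: merge Gamma_i into Gamma_k, recompute weighted center V_k
        members[bk] += members[bi]
        w = freqs[members[bk]]
        centers[bk] = (w[:, None] * colors[members[bk]]).sum(axis=0) / w.sum()
        del members[bi], centers[bi]                   # remove Gamma_i from I
    return np.array(centers), members
```

Because the samples are quantized colors rather than raw pixels, N stays small enough for the O(N^2) pair search to be practical.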
the specific process of taking the pedigree clustering result as the initial class centers, performing the fast FCM clustering algorithm on the clustering samples, and determining the dominant colors of the background image or candidate area is:
step S2401, set an iteration stop threshold ε and a maximum number of iterations B, initialize the class centers, and let the iteration count b = 0;
step S2402, calculate the partition matrix U^{(b)}, in which u_{ik}^{(b)} is the fuzzy membership of the k-th color sample to the i-th class, according to the following formula:

$$u_{ik}^{(b)} = \left[\sum_{j=1}^{c}\left(\frac{D_{ik}}{D_{jk}}\right)^{\frac{2}{m-1}}\right]^{-1}, \quad \text{if } I_k = \{\,i \mid 1 \le i \le c,\; D_{ik} = 0\,\} = \phi$$

where k denotes the k-th color sample, i denotes the i-th class, u_{ik}^{(b)} is the fuzzy membership of the k-th color sample to the i-th class at the b-th iteration, m is a clustering control parameter, D_{ik} is the color difference between the color value Col(k) of the k-th sample and the i-th class-center color V_i^{(b)}, c is the number of classes, and φ denotes the empty set; if I_k ≠ φ, set u_{ik} = 0 for i ∉ I_k and distribute the membership among the classes in I_k so that Σ_{i∈I_k} u_{ik} = 1;
step S2403, update the cluster center matrix V^{(b+1)} according to the following formula:

$$V_i^{(b+1)} = \frac{\sum_{k=1}^{To} TC\_R(k)\left(u_{ik}^{(b)}\right)^m Col(k)}{\sum_{k=1}^{To} TC\_R(k)\left(u_{ik}^{(b)}\right)^m}$$

where V_i^{(b)} denotes the class center of the i-th class obtained by the b-th iterative computation, TC_R(k) is the frequency of appearance of the k-th color, Col(k) is the color value of the k-th sample, and To is the number of clustering samples;
step S2404, if ||V^{(b+1)} − V^{(b)}|| ≤ ε or b ≥ B, stop and output U^{(b)} and V^{(b+1)}; otherwise let b = b + 1 and go to step S2402;
the color image is represented in the CIELAB color space, the color space is quantized into several levels of quantized colors, and the frequency of each quantized color appearing in the color image is counted as follows:
step 101, for each quantized cuboid region of the CIELAB color space, the length, width and height correspond to the quantization intervals of the three coordinates, with coordinate ranges [L_m, L_M], [a_m, a_M] and [b_m, b_M];
step 102, if a pixel color (l, a, b) satisfies L_m ≤ l ≤ L_M, a_m ≤ a ≤ a_M and b_m ≤ b ≤ b_M, that quantized color is counted as appearing once;
the specific process of calculating the similarity between a candidate region and the dominant color information of the background image in step 3 is:
step S301, match the dominant colors of the candidate region with the dominant colors of the background image one by one according to the color-distance principle;
step S302, compute in turn the color difference ΔC(c1, c2) of each matched dominant-color pair, where c1 and c2 denote the c1-th dominant color of the candidate region and the c2-th dominant color of the background image paired with it;
step S303, calculate the dominant-color similarity distance between the candidate region and the background image using the following formula:

$$D_c = \sum_{c_1=1}^{c\_n} f_n\left[P(c_1), P(c_2)\right]\,\Delta C(c_1, c_2)$$

where c_n is the number of dominant colors in the candidate region, f_n[P(c1), P(c2)] is the average of P(c1) and P(c2), P(c1) denotes the statistical information of color c1 in the candidate region, i.e. the percentage of pixels it covers out of the total, and P(c2) denotes the statistical information of color c2 in the background image;
step S304, calculate the similarity coefficient S_c between the candidate region and the background image color information:
S_c = 1 − D_c
wherein the similarity coefficient S_c indicates the degree of similarity.
2. The method of claim 1, wherein in step 5 the camouflage candidate color corresponding to the smallest distance is taken as the determined camouflage implementation color.
3. An adaptive camouflage online design system, characterized by comprising a central machine and an image acquisition device, and implementing the method of any one of claims 1 to 2.
CN202010025181.9A 2020-01-10 2020-01-10 Adaptive camouflage online design method and system Active CN111798539B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010025181.9A CN111798539B (en) 2020-01-10 2020-01-10 Adaptive camouflage online design method and system

Publications (2)

Publication Number Publication Date
CN111798539A CN111798539A (en) 2020-10-20
CN111798539B true CN111798539B (en) 2022-03-22

Family

ID=72805842


Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112509068A (en) * 2020-10-29 2021-03-16 北京达佳互联信息技术有限公司 Image dominant color recognition method and device, electronic equipment and storage medium
CN112396570B (en) * 2020-12-04 2024-02-20 中国核电工程有限公司 Camouflage design method
CN112561844B (en) * 2020-12-08 2023-05-02 中国人民解放军陆军炮兵防空兵学院 Automatic generation method of digital camouflage pattern fused with texture structure
CN112541232B (en) * 2020-12-28 2024-02-02 中国航空工业集团公司沈阳飞机设计研究所 Design method for low-visibility mark outside aircraft
CN112950733B (en) * 2021-03-01 2022-10-28 哈尔滨工程大学 Multi-terrain camouflage pattern generation method suitable for ground equipment
CN113284198A (en) * 2021-05-13 2021-08-20 稿定(厦门)科技有限公司 Automatic image color matching method and device
CN113870095B (en) * 2021-06-25 2022-12-20 中国人民解放军陆军工程大学 Deception target reconnaissance system method based on camouflage patch camouflage
CN114881537B (en) * 2022-06-20 2023-04-18 中国电子科技集团公司第二十八研究所 Facility security assessment quantification calculation method and device based on index system
CN115464557A (en) * 2022-08-15 2022-12-13 深圳航天科技创新研究院 Method for adjusting mobile robot operation based on path and mobile robot

Citations (1)

Publication number Priority date Publication date Assignee Title
CN110120080A (en) * 2019-04-12 2019-08-13 青岛九维华盾科技研究院有限公司 A method of quickly generating standard pattern-painting mass-tone

Family Cites Families (1)

Publication number Priority date Publication date Assignee Title
US6944331B2 (en) * 2001-10-26 2005-09-13 National Instruments Corporation Locating regions in a target image using color matching, luminance pattern matching and hue plane pattern matching




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant