CN112950490B - Unmanned aerial vehicle remote sensing mapping image enhancement processing method - Google Patents

Unmanned aerial vehicle remote sensing mapping image enhancement processing method

Info

Publication number
CN112950490B
CN112950490B
Authority
CN
China
Prior art keywords
image
pixel points
sub
pixel
remote sensing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110099444.5A
Other languages
Chinese (zh)
Other versions
CN112950490A (en
Inventor
朱旭红
周佳玮
陈洪建
陈科明
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang Mingzhou Surveying and Mapping Institute
Original Assignee
Ningbo Yinzhou Surveying And Mapping Institute
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ningbo Yinzhou Surveying And Mapping Institute filed Critical Ningbo Yinzhou Surveying And Mapping Institute
Priority to CN202110099444.5A priority Critical patent/CN112950490B/en
Publication of CN112950490A publication Critical patent/CN112950490A/en
Application granted granted Critical
Publication of CN112950490B publication Critical patent/CN112950490B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/0002 Inspection of images, e.g. flaw detection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10024 Color image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10032 Satellite or aerial image; Remote sensing

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Quality & Reliability (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Processing (AREA)

Abstract

The invention provides an unmanned aerial vehicle remote sensing mapping image enhancement processing method, comprising: S1, performing graying processing on a remote sensing mapping image to obtain a grayed image; S2, dividing the grayed image to obtain a plurality of sub-images; S3, for a single sub-image, detecting the pixel points to be enhanced among the pixel points of the sub-image; S4, performing enhancement processing on the pixel points to be enhanced to obtain enhanced pixel points; and S5, performing the operations of steps S3 and S4 on all the sub-images to obtain the enhanced remote sensing mapping image. By screening the pixel points that need enhancement processing, the invention avoids the over-enhancement that easily occurs when all pixel points are enhanced, as in traditional image enhancement; excessive enhancement easily turns normal pixel points into noise points.

Description

Unmanned aerial vehicle remote sensing mapping image enhancement processing method
Technical Field
The invention relates to the field of image processing, in particular to an unmanned aerial vehicle remote sensing mapping image enhancement processing method.
Background
Unmanned aerial vehicle remote sensing mapping combines advanced unmanned aerial vehicle technology, remote sensing sensor technology, telemetry and remote control technology, communication technology, GPS differential positioning technology and remote sensing application technology to automatically, intelligently and rapidly acquire spatial remote sensing mapping images of territorial resources, the natural environment, earthquake disaster areas and the like. After a remote sensing mapping image is obtained, it generally needs to be enhanced. In the prior art, the remote sensing image is generally subjected to global enhancement processing, and the enhancement effect is poor.
Disclosure of Invention
In view of the above problem, the present invention aims to provide an unmanned aerial vehicle remote sensing mapping image enhancement processing method, which includes:
S1, performing graying processing on the remote sensing mapping image to obtain a grayed image;
S2, dividing the grayed image to obtain a plurality of sub-images;
S3, for a single sub-image, detecting pixel points to be enhanced among the pixel points in the sub-image;
S4, performing enhancement processing on the pixel points to be enhanced to obtain enhanced pixel points;
S5, performing the operations of step S3 and step S4 on all the sub-images to obtain the enhanced remote sensing mapping image.
Compared with the prior art, the invention has the advantages that:
by screening the pixel points that need enhancement processing, the method avoids the over-enhancement that easily occurs when all pixel points are enhanced, as in traditional image enhancement modes; excessive enhancement easily turns normal pixel points into noise points.
Drawings
The invention is further illustrated by means of the attached drawings, but the embodiments in the drawings do not constitute any limitation of the invention; for a person skilled in the art, other drawings can be obtained from the following drawings without inventive effort.
Fig. 1 is a diagram of an exemplary embodiment of an unmanned aerial vehicle remote sensing mapping image enhancement processing method according to the present invention.
Detailed Description
Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the accompanying drawings are illustrative only for the purpose of explaining the present invention, and are not to be construed as limiting the present invention.
As shown in the embodiment in fig. 1, the invention provides an unmanned aerial vehicle remote sensing mapping image enhancement processing method, which includes:
S1, performing graying processing on the remote sensing mapping image to obtain a grayed image;
S2, dividing the grayed image to obtain a plurality of sub-images;
S3, for a single sub-image, detecting pixel points to be enhanced among the pixel points in the sub-image;
S4, performing enhancement processing on the pixel points to be enhanced to obtain enhanced pixel points;
S5, performing the operations of step S3 and step S4 on all the sub-images to obtain the enhanced remote sensing mapping image.
By screening the pixel points that need enhancement processing, the invention avoids the over-enhancement that easily occurs when all pixel points are enhanced, as in traditional image enhancement modes; excessive enhancement easily turns normal pixel points into noise points.
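The S1-S5 pipeline above can be sketched end to end. This is a minimal illustration under stated assumptions, not the patent's actual method: the helper names are invented, the graying uses the weighted average given later in the description, the division is a single fixed quadrant split, and the per-pixel "enhancement" is a toy brightness scaling standing in for the shearlet-based step.

```python
import numpy as np

def to_grayscale(rgb):
    # S1: graying via the weighted average given later in the description.
    return 0.115 * rgb[..., 0] + 0.588 * rgb[..., 1] + 0.297 * rgb[..., 2]

def divide_into_subimages(gray):
    # S2: placeholder division into four equal quadrants (one iteration only).
    h, w = gray.shape
    return [gray[:h // 2, :w // 2], gray[:h // 2, w // 2:],
            gray[h // 2:, :w // 2], gray[h // 2:, w // 2:]]

def enhance_subimage(sub):
    # S3 + S4: toy screening and enhancement; only pixels darker than the
    # sub-image mean are treated as "pixels to be enhanced" and brightened.
    out = sub.copy()
    mask = out < out.mean()
    out[mask] = out[mask] * 1.2
    return out

def enhance_image(rgb):
    # S5: apply S3/S4 to every sub-image and reassemble the image.
    gray = to_grayscale(rgb)
    subs = [enhance_subimage(s) for s in divide_into_subimages(gray)]
    top = np.hstack([subs[0], subs[1]])
    bottom = np.hstack([subs[2], subs[3]])
    return np.vstack([top, bottom])

img = np.random.rand(8, 8, 3)
result = enhance_image(img)
```

Since the toy enhancement only multiplies non-negative values by 1.2, the output is pixelwise no smaller than the grayed input, which is a convenient sanity check.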
In one embodiment, graying the remote sensing mapping image to obtain a grayed image includes:
converting the remote sensing mapping image into an intermediate grayscale image p_mid using a weighted average method;
Establishing a model to be solved:
[The four equations defining the model Umo appear only as formula images in the source and are not reproduced here.]
where β_1, β_2, β_3 respectively denote the first, second and third preset weight parameters, and β_1 + β_2 + β_3 = 1; α_1 and α_2 denote the preset fourth and fifth weight parameters, and α_1 + α_2 = 1; Umo denotes the model to be solved; pidU denotes the set of all pixel points in p_mid; f_u denotes the pixel value of pixel point u in the grayed image, and F_u denotes the pixel value of pixel point u in p_mid; nei_u denotes the set of all pixel points in a neighborhood of size c × c around pixel point u; f_v denotes the pixel value, in the grayed image, of a pixel point v in nei_u; qz denotes a value function: if w_u is greater than w_v for all elements w of colrU, qz takes the value 0.1, otherwise qz takes the value -0.1, where colrU = {L, a, b} and L, a, b respectively denote the luminance component, the a component and the b component of the Lab color model; w_u and w_v respectively denote the component values of component w for pixel points u and v in the Lab color model corresponding to the remote sensing mapping image; bdU and xdU denote two subsets of pidU such that pidU = bdU ∪ xdU, and for the pixel points in bdU the component values of the three Lab components corresponding to the remote sensing mapping image are equal; if w_a is greater than w_b for all elements of colrU, the corresponding indicator term (shown only as a formula image in the source) takes the value 0.5, otherwise it takes the value 1; ph denotes the standard deviation of the pixel values in p_mid of all the pixel points in nei_u; t denotes a control parameter with value range [9.5, 10.5];
Solving Umo, i.e. finding for each pixel point the value at which Umo attains its maximum, yields the grayed image.
The above embodiment effectively mitigates a problem of traditional image graying: the absolute difference of pixel values between adjacent pixel points tends to shrink, so a large amount of information is lost. Some information loss is unavoidable when a color image is converted to grayscale. In this embodiment, however, the remote sensing mapping image is first converted into an intermediate grayscale image by the traditional graying method, and the model to be solved is then established to obtain the final grayed image, so that the ratio of pixel values between adjacent pixel points stays as close as possible to the corresponding ratio in the remote sensing mapping image. The information content of the grayed image is thus effectively retained, providing more information for subsequent image processing and improving its accuracy. Specifically, the model considers not only the difference between the pixel values in the grayed image to be solved and those in the intermediate grayscale image, but also the differences between each pixel point and its neighbors in both pixel value and the Lab color space. Graying is thereby achieved while preserving as much of the image information as possible.
In one embodiment, converting the remote sensing mapping image into the intermediate grayscale image p_mid using a weighted average method includes:
converting the remote sensing mapping image into the intermediate grayscale image p_mid using the following formula:
p_mid(d) = 0.115 R(d) + 0.588 G(d) + 0.297 B(d)
where p_mid(d) denotes the pixel value of pixel point d in the intermediate grayscale image, and R(d), G(d), B(d) respectively denote the component values of the red, green and blue components of pixel point d in the RGB color model corresponding to the remote sensing mapping image.
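The weighted-average formula above transcribes directly into code. The function name `weighted_average_gray` is ours, and note that the weights (0.115, 0.588, 0.297) follow the patent text rather than the common ITU-R BT.601 coefficients (0.299, 0.587, 0.114).

```python
import numpy as np

# p_mid(d) = 0.115*R(d) + 0.588*G(d) + 0.297*B(d), as stated in the text.
def weighted_average_gray(rgb):
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    return 0.115 * r + 0.588 * g + 0.297 * b

pixel = np.array([[[100.0, 200.0, 50.0]]])   # a single RGB pixel
p_mid = weighted_average_gray(pixel)
# 0.115*100 + 0.588*200 + 0.297*50 = 143.95
```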
In one embodiment, dividing the grayed image to obtain a plurality of sub-images includes:
dividing the grayed image in an iterative manner:
for iteration 1, divide the grayed image into 4 sub-images of equal area and store the numbers of the sub-images in a set hk_0;
calculate the partition parameter of each sub-image in hk_0 separately; if the partition parameter is less than a preset partition threshold or the area of the sub-image is less than a preset area threshold, the sub-image is not divided further; store the numbers of the sub-images in hk_0 that still require iterative division in a set nd_0;
for the nth iteration, divide each sub-image whose number is in the set nd_(n-1), in the same way as the grayed image was divided, into 4 sub-images of equal area, and store the resulting sub-images in a set hk_n;
calculate the partition parameter of each sub-image in hk_n separately; if the partition parameter is less than the preset partition threshold or the area of the sub-image is less than the preset area threshold, the sub-image is not divided further; store the numbers of the sub-images in hk_n that still require iterative division in a set nd_n.
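The iterative division above can be sketched as follows. The sets hk_n and nd_n are represented implicitly by the `final` and `pending` lists, and `division_parameter` is a hypothetical stand-in (plain variance) for the patent's fid, whose formula is not reproduced in the source.

```python
import numpy as np

def division_parameter(sub):
    # Assumption: pixel-value variance as a proxy for the patent's fid.
    return float(np.var(sub))

def iterative_divide(gray, fid_threshold=0.01, area_threshold=16):
    final, pending = [], [gray]
    while pending:
        next_pending = []
        for sub in pending:
            h, w = sub.shape
            # divide into 4 sub-images of (as near as possible) equal area
            quads = [sub[:h // 2, :w // 2], sub[:h // 2, w // 2:],
                     sub[h // 2:, :w // 2], sub[h // 2:, w // 2:]]
            for q in quads:
                # stop dividing when the parameter or the area is too small
                if division_parameter(q) < fid_threshold or q.size < area_threshold:
                    final.append(q)
                else:
                    next_pending.append(q)
        pending = next_pending
    return final

rng = np.random.default_rng(0)
subs = iterative_divide(rng.random((32, 32)))
total = sum(s.size for s in subs)   # every pixel lands in exactly one sub-image
```

Because each split partitions its input exactly, the sub-images always tile the original image regardless of where the iteration stops.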
In one embodiment, the partition parameter is calculated as follows:
[The two equations defining the partition parameter appear only as formula images in the source and are not reproduced here.]
where fid denotes the partition parameter; dw denotes a value function that takes only the numerical value of the variable in brackets; hf denotes the variance of the pixel values of all pixel points in the sub-image; χ_1 and χ_2 respectively denote the seventh and eighth weight parameters, and χ_1 + χ_2 = 1; numz denotes the total number of pixel points in the sub-image; f_e denotes the pixel value of the e-th pixel point in the sub-image; sib denotes the signal-to-noise ratio of the sub-image; and xiz denotes the noise coefficient.
In the above embodiment, the image is not divided into equal-area sub-images as in conventional division, because if a sub-image is too small, or contains only foreground or only background pixel points, it is easily over-enhanced or under-enhanced. Moreover, pixel points must be screened before enhancement; if a sub-image is too large, too many pixel points must be traversed, leading to many useless detections and a slow detection speed. Dividing the image as described above alleviates both problems. When the partition parameter is calculated, both the variance of the pixel values in the current sub-image and its noise are considered. The applicant's research found that if a sub-image is relatively smooth, for example containing only background or only foreground pixel points, the variance of its pixel values is smaller and, correspondingly, its signal-to-noise ratio is larger.
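Since the fid formula survives only as images, the following is a guessed sketch consistent with the prose: fid grows with the pixel-value variance hf and shrinks as the signal-to-noise ratio sib grows, mixed by weights χ_1 + χ_2 = 1. Every functional form here, including the SNR estimate, is an assumption rather than the patent's equation.

```python
import numpy as np

def partition_parameter(sub, chi1=0.5, chi2=0.5):
    # Assumed shape: high variance and low SNR should both push fid up,
    # so a smooth background/foreground patch is not divided further.
    hf = float(np.var(sub))                 # variance of pixel values
    noise = hf if hf > 0 else 1e-12
    sib = float(np.mean(sub) ** 2) / noise  # a simple SNR estimate (assumption)
    return chi1 * hf + chi2 * (1.0 / (1.0 + sib))

flat = np.full((8, 8), 0.5)                       # smooth: low variance, high SNR
textured = np.linspace(0, 1, 64).reshape(8, 8)    # varying pixel values
```

Under this sketch a textured patch scores higher than a flat one, matching the rationale that smooth sub-images need no further division.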
In one embodiment, the detecting a pixel to be enhanced for a pixel in the sub-image to obtain a pixel to be enhanced includes:
performing edge detection on the sub-image to obtain its edge pixel points, and storing all the edge pixel points of the sub-image in an edge pixel point set barU;
for each edge pixel point h, storing all the pixel points whose similarity to h is greater than a preset similarity threshold in a similarity set simU_h;
taking the union of the similarity sets of all the edge pixel points to obtain the set simU_all of all pixel points whose similarity to an edge pixel point is greater than the preset similarity threshold;
taking, as the pixel points to be enhanced, the pixel points in barU and simU_all whose pixel values are less than a preset pixel value threshold.
Through the similarity calculation, pixel points that resemble the edge pixel points in distance, position, gradient and so on can be added to the set to be enhanced; for some pixel points, for example those lying between two edge pixel points, this achieves an edge-connection effect. After enhancement, the pixel points to be enhanced differ substantially from their original values, and all the enhanced pixel points can be regarded as forming an optimized edge image of the image. Screening the pixel points in simU_all against the pixel value threshold avoids over-enhancing pixel points that are already very similar to the edge pixel points, which would otherwise turn them into noise points.
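The detection step can be sketched as below. The edge detector (a gradient-magnitude threshold) and the similarity test (pixel-value closeness) are deliberate simplifications standing in for the patent's edge detection and simdx similarity; the set names mirror barU and simU_all from the text.

```python
import numpy as np

def detect_pixels_to_enhance(sub, edge_thresh=0.3, sim_thresh=0.05, value_thresh=0.5):
    gy, gx = np.gradient(sub)
    grad = np.hypot(gx, gy)
    # barU: edge pixel points, here via a simple gradient-magnitude threshold
    bar_u = set(zip(*np.where(grad > edge_thresh)))
    # simU_all: union of non-edge pixels "similar" to some edge pixel
    sim_all = set()
    for (i, j) in bar_u:
        close = np.abs(sub - sub[i, j]) < sim_thresh
        sim_all |= set(zip(*np.where(close & (grad <= edge_thresh))))
    candidates = bar_u | sim_all
    # keep only candidates whose pixel value is below the value threshold
    return {p for p in candidates if sub[p] < value_thresh}

sub = np.zeros((6, 6))
sub[:, 3:] = 1.0                      # a vertical step edge down the middle
picked = detect_pixels_to_enhance(sub)
```

On this step image, the edge columns and their similar neighbors become candidates, and the value threshold then keeps only the dark side of the edge, i.e. columns 0 through 2.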
In one embodiment, the similarity is calculated by:
for a non-edge pixel point k in the sub-image, the similarity simdx_(k,h) between k and an edge pixel point h is calculated as follows:
[The similarity formula appears only as a formula image in the source and is not reproduced here.]
where dil(k, h) denotes the Euclidean distance between k and h; te_k and te_h denote the gradient amplitudes of k and h respectively; f_k and f_h denote the pixel values of k and h respectively; all pixel points of the sub-image whose Euclidean distance to h is less than dil(k, h) are stored in a set dilU_h; dilf_1 denotes the standard deviation of the Euclidean distances between the pixel points in dilU_h and h, tef_1 the standard deviation of their gradient-amplitude differences from h, and ff_1 the standard deviation of their pixel-value differences from h; neihU denotes the set of pixel points in the m × m neighborhood of pixel point h; dil(dw, h) denotes the Euclidean distance between a pixel point dw in neihU and h; te_dw denotes the gradient amplitude of dw and f_dw its pixel value; dilf_2 denotes the standard deviation of the Euclidean distances between the pixel points in neihU and h, tef_2 the standard deviation of their gradient-amplitude differences from h, and ff_2 the standard deviation of their pixel-value differences from h.
The similarity calculation considers not only the Euclidean distance between pixel points but also their differences in pixel value and gradient amplitude: the more similar two pixel points are, the smaller the Euclidean distance, the gradient-amplitude difference and the pixel-value difference, and therefore the larger the value of simdx_(k,h). Selecting the pixel points whose similarity exceeds the preset similarity threshold into the candidate set to be enhanced thus improves the effectiveness of detecting the pixel points to be enhanced.
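Because the simdx_(k,h) formula itself is only an image in the source, this sketch assumes a plausible form with the stated behaviour: similarity shrinks as the Euclidean distance, the gradient-amplitude difference and the pixel-value difference grow, each divided by a scale parameter standing in for the standard deviations dilf, tef, ff. The exponential form is our assumption, not the patent's formula.

```python
import numpy as np

def simdx(k, h, te, f, dil_sd=1.0, te_sd=1.0, f_sd=1.0):
    # Assumed similarity: exp of minus the sum of three normalized differences.
    d = np.hypot(k[0] - h[0], k[1] - h[1])   # Euclidean distance dil(k, h)
    dte = abs(te[k] - te[h])                 # gradient-amplitude difference
    df = abs(f[k] - f[h])                    # pixel-value difference
    return float(np.exp(-(d / dil_sd + dte / te_sd + df / f_sd)))

f = np.array([[0.1, 0.2, 0.9],
              [0.1, 0.2, 0.9],
              [0.1, 0.2, 0.9]])
gy, gx = np.gradient(f)
te = np.hypot(gx, gy)                        # gradient amplitudes

near = simdx((0, 0), (1, 0), te, f)          # close, same column: high similarity
far = simdx((0, 0), (1, 2), te, f)           # farther, different value/gradient
```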
In one embodiment, the enhancing the pixel to be enhanced includes:
enhancement processing is performed on the pixel points to be enhanced using the non-subsampled shearlet transform (NSST) algorithm.
While embodiments of the invention have been shown and described, it will be understood by those skilled in the art that: various changes, modifications, substitutions and alterations can be made to the embodiments without departing from the principles and spirit of the invention, the scope of which is defined by the claims and their equivalents.

Claims (4)

1. An unmanned aerial vehicle remote sensing mapping image enhancement processing method is characterized by comprising the following steps:
s1, carrying out graying processing on the remote sensing mapping image to obtain a grayed image;
s2, dividing the gray image to obtain a plurality of sub-images;
s3, for a single sub-image, detecting pixel points to be enhanced in the sub-image to obtain pixel points to be enhanced;
s4, performing enhancement processing on the pixel points to be enhanced to obtain enhanced pixel points;
s5, performing the operations of the step S3 and the step S4 on all the sub-images to obtain an enhanced remote sensing mapping image;
wherein graying the remote sensing mapping image to obtain a grayed image comprises:
converting the remote sensing mapping image into an intermediate grayscale image p_mid using a weighted average method;
Establishing a model to be solved:
[The four equations defining the model Umo appear only as formula images in the source and are not reproduced here.]
where β_1, β_2, β_3 respectively denote the first, second and third preset weight parameters, and β_1 + β_2 + β_3 = 1; α_1 and α_2 denote the preset fourth and fifth weight parameters, and α_1 + α_2 = 1; Umo denotes the model to be solved; pidU denotes the set of all pixel points in p_mid; f_u denotes the pixel value of pixel point u in the grayed image, and F_u denotes the pixel value of pixel point u in p_mid; nei_u denotes the set of all pixel points in a neighborhood of size c × c around pixel point u; f_v denotes the pixel value, in the grayed image, of a pixel point v in nei_u; qz denotes a value function: if w_u is greater than w_v for all elements w of colrU, qz takes the value 0.1, otherwise qz takes the value -0.1, where colrU = {L, a, b} and L, a, b respectively denote the luminance component, the a component and the b component of the Lab color model; w_u and w_v respectively denote the component values of component w for pixel points u and v in the Lab color model corresponding to the remote sensing mapping image; bdU and xdU denote two subsets of pidU such that pidU = bdU ∪ xdU, and for the pixel points in bdU the component values of the three Lab components corresponding to the remote sensing mapping image are equal; if w_a is greater than w_b for all elements of colrU, the corresponding indicator term (shown only as a formula image in the source) takes the value 0.5, otherwise it takes the value 1; ph denotes the standard deviation of the pixel values in p_mid of all the pixel points in nei_u; t denotes a control parameter with value range [9.5, 10.5];
Solving Umo, i.e. finding for each pixel point the value at which Umo attains its maximum, yields the grayed image.
2. The unmanned aerial vehicle remote sensing mapping image enhancement processing method according to claim 1, wherein converting the remote sensing mapping image into the intermediate grayscale image p_mid using a weighted average method comprises:
converting the remote sensing mapping image into the intermediate grayscale image p_mid using the following formula:
p_mid(d) = 0.115 R(d) + 0.588 G(d) + 0.297 B(d)
where p_mid(d) denotes the pixel value of pixel point d in the intermediate grayscale image, and R(d), G(d), B(d) respectively denote the component values of the red, green and blue components of pixel point d in the RGB color model corresponding to the remote sensing mapping image.
3. The unmanned aerial vehicle remote sensing mapping image enhancement processing method according to claim 1, wherein dividing the grayed image to obtain a plurality of sub-images comprises:
dividing the grayed image in an iterative manner:
for iteration 1, dividing the grayed image into 4 sub-images of equal area and storing the numbers of the sub-images in a set hk_0;
calculating the partition parameter of each sub-image in hk_0 separately; if the partition parameter is less than a preset partition threshold or the area of the sub-image is less than a preset area threshold, the sub-image is not divided further; storing the numbers of the sub-images in hk_0 that still require iterative division in a set nd_0;
for the nth iteration, dividing each sub-image whose number is in the set nd_(n-1), in the same way as the grayed image was divided, into 4 sub-images of equal area, and storing the resulting sub-images in a set hk_n;
calculating the partition parameter of each sub-image in hk_n separately; if the partition parameter is less than the preset partition threshold or the area of the sub-image is less than the preset area threshold, the sub-image is not divided further; storing the numbers of the sub-images in hk_n that still require iterative division in a set nd_n.
4. The method for enhancing the unmanned aerial vehicle remote sensing mapping image according to claim 3, wherein the partition parameters are calculated in the following manner:
[The two equations defining the partition parameter appear only as formula images in the source and are not reproduced here.]
where fid denotes the partition parameter; dw denotes a value function that takes only the numerical value of the variable in brackets; hf denotes the variance of the pixel values of all pixel points in the sub-image; χ_1 and χ_2 respectively denote the seventh and eighth weight parameters, and χ_1 + χ_2 = 1; numz denotes the total number of pixel points in the sub-image; f_e denotes the pixel value of the e-th pixel point in the sub-image; and sib denotes the signal-to-noise ratio of the sub-image.
CN202110099444.5A 2021-01-25 2021-01-25 Unmanned aerial vehicle remote sensing mapping image enhancement processing method Active CN112950490B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110099444.5A CN112950490B (en) 2021-01-25 2021-01-25 Unmanned aerial vehicle remote sensing mapping image enhancement processing method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110099444.5A CN112950490B (en) 2021-01-25 2021-01-25 Unmanned aerial vehicle remote sensing mapping image enhancement processing method

Publications (2)

Publication Number Publication Date
CN112950490A CN112950490A (en) 2021-06-11
CN112950490B true CN112950490B (en) 2022-07-19

Family

ID=76236601

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110099444.5A Active CN112950490B (en) 2021-01-25 2021-01-25 Unmanned aerial vehicle remote sensing mapping image enhancement processing method

Country Status (1)

Country Link
CN (1) CN112950490B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114898210B (en) * 2022-05-10 2023-03-03 清研灵智信息咨询(北京)有限公司 Neural network-based remote sensing image target identification method

Citations (4)

Publication number Priority date Publication date Assignee Title
CN104537615A (en) * 2014-12-04 2015-04-22 大连理工大学 Local Retinex enhancement algorithm based on HSV color spaces
CN104778669A (en) * 2015-04-16 2015-07-15 北京邮电大学 Fast image denoising method and device
CN108154478A (en) * 2016-12-02 2018-06-12 航天星图科技(北京)有限公司 A kind of remote sensing image processing method
CN112233037A (en) * 2020-10-23 2021-01-15 新相微电子(上海)有限公司 Image enhancement system and method based on image segmentation

Family Cites Families (1)

Publication number Priority date Publication date Assignee Title
US20110280494A1 (en) * 2009-01-20 2011-11-17 Koninklijke Philips Electronics N.V. Method and apparatus for generating enhanced images


Non-Patent Citations (1)

Title
Contrast-Enhanced Grayscale Conversion Algorithm for Color Images; Liu Mei et al.; Journal of Changchun University of Science and Technology (Natural Science Edition); 2018-10-15 (No. 05); full text *

Also Published As

Publication number Publication date
CN112950490A (en) 2021-06-11

Similar Documents

Publication Publication Date Title
CN110414387B (en) Lane line multi-task learning detection method based on road segmentation
US8224085B2 (en) Noise reduced color image using panchromatic image
CN108416292B (en) Unmanned aerial vehicle aerial image road extraction method based on deep learning
CN110648316B (en) Steel coil end face edge detection method based on deep learning
EP1217580A2 (en) Method and apparatus for measuring color-texture distance and image segmentation based on said measure
CN112907460B (en) Remote sensing image enhancement method
US8041110B2 (en) Pixel interpolation method
CN105046701B (en) Multi-scale salient target detection method based on construction graph
CN109376641B (en) Moving vehicle detection method based on unmanned aerial vehicle aerial video
CN113077486B (en) Method and system for monitoring vegetation coverage rate in mountainous area
CN112950490B (en) Unmanned aerial vehicle remote sensing mapping image enhancement processing method
CN110728706A (en) SAR image fine registration method based on deep learning
EP1641285A1 (en) Image processing device for processing image having different color components arranged, image processing program, electronic camera, and image processing method
CN115937552A (en) Image matching method based on fusion of manual features and depth features
JP2004523839A (en) Dynamic chain-based thresholding
CN113068011B (en) Image sensor, image processing method and system
JP4649498B2 (en) Color correction method and system for image data
US10789687B2 (en) Image processing method and image processor performing the same
CN110796716B (en) Image coloring method based on multiple residual error network and regularized transfer learning
CN116310524A (en) Classification component information processing application system
CN114419081A (en) Image semantic segmentation method and system and readable storage medium
CN117893611B (en) Image sensor dirt detection method and device and computer equipment
CN112836708B (en) Image feature detection method based on Gram matrix and F norm
CN114445364B (en) Fundus image microaneurysm region detection method and imaging method thereof
CN117994503A (en) Heterogeneous remote sensing image target detection method based on iterative fusion

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CP01 Change in the name or title of a patent holder

Address after: 7 / F, block a, Lishi financial building, 63 Huifeng East Road, Yinzhou District, Ningbo City, Zhejiang Province, 315100

Patentee after: Zhejiang Mingzhou Surveying and Mapping Institute

Address before: 7 / F, block a, Lishi financial building, 63 Huifeng East Road, Yinzhou District, Ningbo City, Zhejiang Province, 315100

Patentee before: Ningbo Yinzhou surveying and Mapping Institute
