CN110047042A - Local enhancement method and system for space relative-navigation target images - Google Patents

Local enhancement method and system for space relative-navigation target images

Info

Publication number
CN110047042A
CN110047042A
Authority
CN
China
Prior art keywords
image
color
target
reflecting component
orbit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910181836.9A
Other languages
Chinese (zh)
Inventor
朱卫红
王大轶
史纪鑫
葛东明
邓润然
邹元杰
刘绍奎
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Institute of Spacecraft System Engineering
Original Assignee
Beijing Institute of Spacecraft System Engineering
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Institute of Spacecraft System Engineering
Priority to CN201910181836.9A
Publication of CN110047042A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00: Image enhancement or restoration
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/10: Image acquisition modality
    • G06T 2207/10016: Video; Image sequence

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses a local enhancement method and system for space relative-navigation target images. The method comprises the following steps: (1) acquiring the multi-channel image sequence of the on-orbit target; (2) obtaining Gaussian functions of different scales; (3) obtaining the reflectance components at three different scales; (4) obtaining the colour recovery factor from the proportional relationship among the three colour channels of the input multi-channel image sequence of the on-orbit target, and substituting the colour recovery factor into the reflectance components at the three scales; (5) obtaining the multi-scale fused reflectance model with colour correction; (6) applying image gain compensation to the colour-corrected multi-scale fused reflectance model to obtain the enhanced image. The invention reduces the influence of illumination on target imaging quality, can provide more reasonable and higher-quality image data for subsequent relative navigation, and thereby improves the success rate of target recognition and the accuracy of dynamic parameter identification.

Description

Local enhancement method and system for space relative-navigation target images
Technical field
The invention belongs to the technical fields of spacecraft relative navigation, on-orbit target perception, computer vision and image processing, and in particular relates to a local enhancement method and system for space relative-navigation target images.
Background technique
In recent years, with the development of on-orbit servicing technology for spacecraft, the major space-faring nations have successively proposed on-orbit servicing concepts and demonstration-and-verification plans, in which spacecraft perception is a key technology.
Current means of spacecraft perception include vision cameras, laser radar and infrared cameras. Among them, vision-based spacecraft perception is widely used in spaceflight; especially in medium- and short-range relative navigation, visual perception has become an indispensable technical means. However, because of the characteristics of spacecraft orbits, the space lighting environment is complex. In some cases, insufficient illumination, uneven illumination or the attitude of the space relative-navigation target severely degrades the imaging quality of the space target, which increases the difficulty of subsequent vision-based target recognition and dynamic parameter identification.
Summary of the invention
The technical problem solved by the present invention is to overcome the deficiencies of the prior art and provide a local enhancement method and system for space relative-navigation target images, which reduces the influence of illumination on target imaging quality, can provide more reasonable and higher-quality image data for subsequent relative navigation, and thereby improves the success rate of target recognition and the accuracy of dynamic parameter identification.
The object of the invention is achieved by the following technical solutions. According to one aspect of the present invention, a local enhancement method for space relative-navigation target images is provided, the method comprising the following steps: (1) acquiring the multi-channel image sequence of the on-orbit target; (2) establishing the multi-scale incident illumination and obtaining Gaussian functions of different scales from it; (3) removing, according to the Retinex algorithm, the incident component from the multi-channel image sequence of the on-orbit target to obtain the reflectance component, and obtaining the reflectance components at three different scales from the Gaussian functions of different scales; (4) obtaining the colour recovery factor from the proportional relationship among the three colour channels of the input multi-channel image sequence of the on-orbit target, and substituting the colour recovery factor into the reflectance components at the three scales to obtain reflectance components at three scales containing the colour recovery factor; (5) weighting and fusing the reflectance components of step (4) containing the colour recovery factor to obtain the multi-scale fused reflectance model with colour correction; (6) applying image gain compensation to the colour-corrected multi-scale fused reflectance model of step (5) to obtain the enhanced image.
In the above local enhancement method for space relative-navigation target images, in step (1), acquiring the multi-channel image sequence of the on-orbit target comprises: obtaining, from a simulation system or an on-orbit vision system, the multi-channel image sequence of the on-orbit target with time tags and frame-origin information.
In the above local enhancement method for space relative-navigation target images, in step (2), the multi-scale incident illumination is given by:

L_{i,j}(x, y) = G_j(x, y) * S_i(x, y);

where L_{i,j} is the incident illumination; the subscript i denotes the i-th channel of the original image, i = 1, 2, 3, corresponding to the red, green and blue channels respectively; the subscript j = 1, 2, 3 denotes the three different scales; S_i is the original image in the on-orbit target image sequence; * denotes convolution; x and y are the horizontal and vertical coordinates of a pixel in an image of the multi-channel image sequence of the on-orbit target; and G_j(x, y) is the Gaussian function at scale j.
In the above local enhancement method for space relative-navigation target images, the Gaussian function at scale j is G_j(x, y) = K·exp(-(x² + y²)/σ_j²), where σ_j is the standard deviation, i.e. the scale parameter: the smaller its value, the more the current pixel is influenced by its immediate neighbourhood, so local detail becomes clearer, but the colour rendition deteriorates and colour deviation easily appears; conversely, the colour is more natural but local detail is less clear. K is the gain coefficient.
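For illustration, a minimal NumPy sketch of how the scale-dependent Gaussian surround functions G_j(x, y) might be constructed; the truncation rule and the three σ values (15, 80, 250) are assumptions chosen for readability, not values fixed by the patent.

```python
import numpy as np

def gaussian_surround(sigma, truncate=3.0):
    """Normalised Gaussian surround G(x, y) = K * exp(-(x^2 + y^2) / sigma^2),
    where K is chosen so that the kernel sums to 1."""
    half = int(truncate * sigma)
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    g = np.exp(-(x ** 2 + y ** 2) / sigma ** 2)
    return g / g.sum()

# A small sigma favours local detail, a large sigma favours natural colour rendition.
sigmas = (15, 80, 250)   # illustrative scale values; the patent does not fix them
kernels = [gaussian_surround(s) for s in sigmas]
```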
In the above local enhancement method for space relative-navigation target images, in step (3), the reflectance components at the three different scales are obtained from the Gaussian functions of different scales by the following formula:

ln(R_{i,j}(x, y)) = ln(S_i(x, y)) - ln(L_{i,j}(x, y));

where ln denotes the natural logarithm and R_{i,j} is the reflectance component of the i-th channel at the j-th scale.
In the above local enhancement method for space relative-navigation target images, in step (4), the formula of the colour recovery factor is as follows:

C_i(x, y) = β·ln(α·S_i(x, y) / Σ_{k=1}^{3} S_k(x, y));

where the adjustment factors are α = 125 and β = 1.

The reflectance components at the three scales containing the colour recovery factor are obtained by multiplying the reflectance components at the three scales by the colour recovery factor, i.e. C_i(x, y)·R_{i,j}(x, y).
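A hedged sketch of this step: the colour recovery factor is computed from the channel proportions of the original image and then multiplied into the per-scale reflectance components; the array layout (H x W x 3), the epsilon guard and the function names are assumptions, not the patent's notation.

```python
import numpy as np

def color_recovery_factor(S, alpha=125.0, beta=1.0, eps=1e-6):
    """C_i(x, y) = beta * ln(alpha * S_i(x, y) / sum_k S_k(x, y)),
    computed per channel from the original RGB image S (H x W x 3)."""
    S = S.astype(np.float64)
    channel_sum = S.sum(axis=2, keepdims=True) + eps
    return beta * np.log(alpha * S / channel_sum + eps)

def apply_color_recovery(C, reflectances):
    """Multiply each per-scale reflectance component R_{i,j} by the factor C_i."""
    return [C * R for R in reflectances]
```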
In the above local enhancement method for space relative-navigation target images, in step (5), the multi-scale fused reflectance model with colour correction is obtained by the following formula:

R_i(x, y) = Σ_{j=1}^{3} ω_j·C_i(x, y)·R_{i,j}(x, y);

where R_i(x, y) is the multi-scale fused reflectance model of the i-th channel and the weights are ω_j = 1/3.
In the above local enhancement method for space relative-navigation target images, applying image gain compensation to the colour-corrected multi-scale fused reflectance model of step (5) to obtain the enhanced image comprises: applying a linear grey-scale stretch to the multi-scale fused reflectance model R_i(x, y) of the i-th channel so that the output image is mapped to the colour value range (0-255), giving the enhanced image R'_i(x, y) of the i-th channel.
In the above local enhancement method for space relative-navigation target images, the colour value range is 0-255;

the enhanced image R'_i(x, y) of the i-th channel is obtained by the following formula:

R'_i(x, y) = 255·(R_i(x, y) - R_{i,min}) / (R_{i,max} - R_{i,min});

where R_{i,max} and R_{i,min} are respectively the maximum and minimum of the multi-scale fused reflectance model R_i(x, y) of the i-th channel.
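A minimal sketch of the per-channel linear grey-scale stretch described above, assuming the fused reflectance model is a floating-point H x W x 3 array; the epsilon term guarding against a constant channel is an added assumption.

```python
import numpy as np

def gain_compensation(R, eps=1e-12):
    """Map each channel of the fused reflectance model R_i(x, y) linearly to [0, 255]."""
    out = np.empty_like(R, dtype=np.float64)
    for i in range(R.shape[2]):
        r_min, r_max = R[..., i].min(), R[..., i].max()
        out[..., i] = 255.0 * (R[..., i] - r_min) / (r_max - r_min + eps)
    return out.astype(np.uint8)
```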
According to another aspect of the present invention, a local enhancement system for space relative-navigation target images is further provided, comprising: a first module for acquiring the multi-channel image sequence of the on-orbit target; a second module for establishing the multi-scale incident illumination and obtaining Gaussian functions of different scales from it; a third module for removing, according to the Retinex algorithm, the incident component from the multi-channel image sequence of the on-orbit target to obtain the reflectance component, and obtaining the reflectance components at three different scales from the Gaussian functions of different scales; a fourth module for obtaining the colour recovery factor from the proportional relationship among the three colour channels of the input multi-channel image sequence of the on-orbit target, and substituting the colour recovery factor into the reflectance components at the three scales to obtain reflectance components at three scales containing the colour recovery factor; a fifth module for weighting and fusing the reflectance components containing the colour recovery factor to obtain the multi-scale fused reflectance model with colour correction; and a sixth module for applying image gain compensation to the colour-corrected multi-scale fused reflectance model to obtain the enhanced image.
Compared with the prior art, the present invention has the following advantages:

The present invention reduces the influence of illumination on target imaging quality, can provide more reasonable and higher-quality image data for subsequent relative navigation, and thereby improves the success rate of target recognition and the accuracy of dynamic parameter identification.
Detailed description of the invention
Various other advantages and benefits will become clear to those of ordinary skill in the art by reading the following detailed description of the preferred embodiments. The drawings are only for the purpose of illustrating the preferred embodiments and are not to be considered a limitation of the present invention. Throughout the drawings, the same reference numerals denote the same parts. In the drawings:
Fig. 1 is a flow chart of the local enhancement method for space relative-navigation target images provided by an embodiment of the present invention;

Fig. 2 shows an enhancement-effect comparison for a space target provided by an embodiment of the present invention, where (a) is an original image from the on-orbit target image sequence and (b) is the enhanced target image;

Fig. 3(a) is a schematic diagram of the target contour obtained by detecting the target on the original image, provided by an embodiment of the present invention;

Fig. 3(b) is a schematic diagram of the target contour obtained by detecting the target on the enhanced image, provided by an embodiment of the present invention.
Specific embodiment
Exemplary embodiments of the present disclosure are described in more detail below with reference to the accompanying drawings. Although the drawings show exemplary embodiments of the disclosure, it should be understood that the disclosure may be implemented in various forms and should not be limited by the embodiments set forth here. Rather, these embodiments are provided so that the present disclosure will be understood more thoroughly and its scope fully conveyed to those skilled in the art. It should be noted that, unless they conflict, the embodiments of the present invention and the features in the embodiments may be combined with each other. The present invention is described in detail below with reference to the accompanying drawings and embodiments.
Fig. 1 is a flow chart of the local enhancement method for space relative-navigation target images provided by an embodiment of the present invention. As shown in Fig. 1, the method comprises the following steps:
(1) acquiring the multi-channel image sequence of the on-orbit target;

(2) establishing the multi-scale incident illumination and obtaining Gaussian functions of different scales from it;

(3) removing, according to the Retinex algorithm, the incident component from the multi-channel image sequence of the on-orbit target to obtain the reflectance component; obtaining the reflectance components at three different scales from the Gaussian functions of different scales;

(4) obtaining the colour recovery factor from the proportional relationship among the three colour channels of the input multi-channel image sequence of the on-orbit target, and substituting the colour recovery factor into the reflectance components at the three scales to obtain reflectance components at three scales containing the colour recovery factor;

(5) weighting and fusing the reflectance components of step (4) containing the colour recovery factor to obtain the multi-scale fused reflectance model with colour correction;

(6) applying image gain compensation to the colour-corrected multi-scale fused reflectance model of step (5) to obtain the enhanced image.
The relevant steps, concepts and specific algorithms are described in detail below:
(1) Binocular-vision time-series images

From a simulation system or an on-orbit vision system, an image sequence with time tags and frame-origin information is obtained, usually a three-channel image sequence in RGB format.
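As a hedged illustration only, the sketch below reads such a tagged RGB sequence with OpenCV, pulling frames together with millisecond timestamps from a recorded video; the container format and the use of OpenCV are assumptions, since the text only requires a three-channel sequence with time and frame-origin tags.

```python
import cv2  # any frame source (simulator output, camera stream) would serve equally well

def read_rgb_sequence(video_path):
    """Yield (frame_index, timestamp_ms, RGB frame) triples from a recorded sequence."""
    cap = cv2.VideoCapture(video_path)
    idx = 0
    while True:
        ok, bgr = cap.read()
        if not ok:
            break
        t_ms = cap.get(cv2.CAP_PROP_POS_MSEC)  # time tag of the current frame
        yield idx, t_ms, cv2.cvtColor(bgr, cv2.COLOR_BGR2RGB)
        idx += 1
    cap.release()
```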
(2) Image enhancement model based on the Retinex (retina-cortex) theory

According to the Retinex algorithm, the incident component L is removed from the multi-channel image sequence S of the on-orbit target to obtain the reflectance component R, which captures the intrinsic appearance of the target.

According to the Retinex algorithm, the illumination of a space-target model can be described with the irradiance model of a global illumination model; that is, for each point (x, y) in the multi-channel image sequence of the on-orbit target, the pixel value S can be expressed as:

S(x, y) = R(x, y)·L(x, y)   (1)

where R(x, y) is the reflectance, which is independent of the illumination and corresponds to the material, shape and texture information of the space target, and L(x, y) is the total illumination incident on the object surface. The Retinex algorithm removes the incident component L from the image S by computation to obtain the reflectance component R, i.e. the intrinsic appearance of the target. The difficulty of the algorithm is that both R and L are unknown, so recovering the reflectance and incident components from the original image alone is an under-determined problem; in general R ∈ [0, 1] and S ≤ L.

Under these conditions, different illumination-estimation methods have been proposed. One first estimates the illumination component l(x, y) = log(L(x, y)) of the image from the logarithmic image s(x, y) = log(S(x, y)); the log-domain reflectance r(x, y) = s(x, y) - l(x, y) then follows, and finally R(x, y) = exp(r(x, y)) is obtained.
(3) Estimation model of the incident illumination based on a Gaussian surround function

The illumination model is established using a Gaussian function as the surround function:

L_i(x, y) = G(x, y) * S_i(x, y)   (2)

where G(x, y) is the Gaussian function, * denotes convolution, and the subscript i is the channel number of the image, i = 1, 2, 3, for the red, green and blue channels respectively. σ is the standard deviation, i.e. the scale parameter: the smaller its value, the more the current pixel is influenced by its immediate neighbourhood, so local detail becomes clearer, but the colour rendition deteriorates and colour deviation easily appears; conversely, the colour is more natural but local detail is less clear. K is the gain coefficient.

Substituting (2) into S(x, y) = R(x, y)·L(x, y) gives the reflectance component R_i(x, y) = S_i(x, y) / (G(x, y) * S_i(x, y)); taking logarithms yields log(R_i(x, y)) = log(S_i(x, y)) - log(G(x, y) * S_i(x, y)), i.e. the influence of illumination is eliminated by subtracting the illumination component from the original image in the logarithmic domain.
Specifically, the key to Retinex image enhancement is establishing the illumination model; however, estimating the illumination component from the original image is an ill-posed problem. The present invention uses the centre/surround algorithm, which currently gives the best enhancement results, is easy to implement, and is the most widely used illumination-component estimation model. Here the Gaussian function is used as the surround function, so the illumination model is

L_i(x, y) = G(x, y) * S_i(x, y)

with the Gaussian surround function

G(x, y) = K·exp(-(x² + y²)/σ²)   (3)

where * denotes convolution, the subscript i is the channel number of the image, i = 1, 2, 3, for the red, green and blue channels respectively, σ is the standard deviation (the scale parameter) and K is the gain coefficient. The smaller σ is, the more the current pixel is influenced by its immediate neighbourhood: local detail becomes clearer, but the colour rendition deteriorates and colour deviation easily appears; conversely, the colour is more natural but local detail is less clear. A single-scale Retinex algorithm therefore finds it difficult to take both detail and colour rendition into account.
Substituting (2) into (1) gives:

R_i(x, y) = S_i(x, y) / (G(x, y) * S_i(x, y))   (4)

The above formula can be calculated in the logarithmic domain, which turns the division into a subtraction; that is, the influence of illumination is eliminated by subtracting the illumination component from the original image:

log(R_i(x, y)) = log(S_i(x, y)) - log(G(x, y) * S_i(x, y))   (5)
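A minimal single-scale Retinex sketch corresponding to equations (2) to (5), using Gaussian filtering from SciPy as the surround convolution; the epsilon guard against log(0) is an added assumption.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def single_scale_retinex(S, sigma, eps=1e-6):
    """log(R_i) = log(S_i) - log(G * S_i), per colour channel, for one scale sigma."""
    S = S.astype(np.float64) + eps
    illumination = np.stack(
        [gaussian_filter(S[..., i], sigma) for i in range(S.shape[2])], axis=2)
    return np.log(S) - np.log(illumination + eps)
```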
(4) Multi-scale Retinex image enhancement model

In formula (3), σ takes the values σ_1, σ_2, σ_3, which correspond respectively to low-frequency, mid-frequency and high-frequency enhancement of the image; from σ_1, σ_2, σ_3 the surround functions G_j(x, y), j = 1, 2, 3, are obtained.

From G_j(x, y) and log(R_i(x, y)) = log(S_i(x, y)) - log(G(x, y) * S_i(x, y)) one obtains

R_i(x, y) = Σ_{j=1}^{3} ω_j [log(S_i(x, y)) - log(G_j(x, y) * S_i(x, y))]

with weights ω_j = 1/3.
The derivation of step (3) shows that a single-scale model cannot take both local detail and colour into account, so the present invention uses the multi-scale Retinex algorithm. The reflectance model of the image in channel i is:

R_i(x, y) = Σ_{j=1}^{N} ω_j [log(S_i(x, y)) - log(G_j(x, y) * S_i(x, y))]   (6)

where N is the number of scales, ω_j is the weight of the j-th scale (generally taken as the equal weight ω_j = 1/N), and G_j is the Gaussian surround function at the j-th scale.

Usually N is taken as 3; the three scales (σ_1, σ_2, σ_3) of the image enhancement model correspond respectively to low-frequency, mid-frequency and high-frequency enhancement of the image, which gives a satisfactory enhancement result.
(5) Correction of colour distortion

The colour recovery factor is computed from the proportional relationship among the three colour channels of the input multi-channel image sequence S of the on-orbit target; the output image is then corrected with the obtained colour recovery factor to eliminate the colour-distortion problem.

The colour recovery factor is computed as

C_i(x, y) = β·log(α·S_i(x, y) / Σ_{k=1}^{3} S_k(x, y))

with adjustment factors α = 125 and β = 1.

Substituting the colour recovery factor into formula (6) gives

R_i(x, y) = C_i(x, y)·Σ_{j=1}^{3} ω_j [log(S_i(x, y)) - log(G_j(x, y) * S_i(x, y))]

which eliminates the colour-distortion problem.
Because the multi-scale enhancement is computed separately for the R, G and B channels, the proportional relationship among the three colours of the enhanced image may differ from that of the original image, which leads to colour distortion. The colour therefore has to be restored after image enhancement. The specific approach is to compute the colour recovery factor from the proportional relationship among the three colour channels of the input original image, and then to correct the output image with the obtained factor, eliminating the colour-distortion problem. The formula for the colour recovery factor C is:

C_i(x, y) = f( S_i(x, y) / Σ_{k=1}^{3} S_k(x, y) )   (7)

where f is a mapping function.

Comparison over many experiments shows that the colour recovery works best when the mapping function is a logarithmic function, i.e.:

C_i(x, y) = β·log(α·S_i(x, y) / Σ_{k=1}^{3} S_k(x, y))   (8)

Empirically, α = 125 and β = 1. Combining (8) with (6) gives:

R_i(x, y) = C_i(x, y)·Σ_{j=1}^{3} ω_j [log(S_i(x, y)) - log(G_j(x, y) * S_i(x, y))]   (9)
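A sketch of equations (8) and (9), reusing the multi_scale_retinex sketch above: the colour recovery factor multiplies the fused multi-scale reflectance; the epsilon guard is an assumption.

```python
import numpy as np

def msrcr(S, sigmas=(15, 80, 250), alpha=125.0, beta=1.0, eps=1e-6):
    """Multi-scale Retinex with colour restoration:
    C_i = beta * log(alpha * S_i / sum_k S_k); output = C_i * MSR_i."""
    S = S.astype(np.float64) + eps
    msr = multi_scale_retinex(S, sigmas)
    C = beta * np.log(alpha * S / S.sum(axis=2, keepdims=True))
    return C * msr
```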
(6) Gain compensation of the image

A linear grey-scale stretch is applied to the reflectance component R_i(x, y) of channel i so that the output image is mapped to the colour value range (0-255), that is:

R'_i(x, y) = 255·(R_i(x, y) - R_{i,min}) / (R_{i,max} - R_{i,min})

where R'_i(x, y) is the output image of the i-th colour channel after linear grey-scale stretching, R_i(x, y) is the enhanced image of the i-th colour channel, and R_{i,max} and R_{i,min} are respectively the maximum and minimum of the image before stretching.
Specifically, the image is enhanced with the multi-scale algorithm, but because logarithmic computations are used, the pixel values of the enhanced image no longer lie within the displayable range. To transform them back into the conventional colour range one could take the anti-logarithm of the enhanced image, but because the enhanced image is the reflectance image, which reflects the image characteristics, with the illumination image subtracted, this approach is not ideal. Here a comparatively simple and effective automatic gain compensation is used: a linear grey-scale stretch maps the output image to the colour value range (0-255), that is:

R'_i(x, y) = 255·(R_i(x, y) - R_{i,min}) / (R_{i,max} - R_{i,min})

where R'_i(x, y) is the output image of the i-th colour channel after linear grey-scale stretching, R_i(x, y) is the enhanced image of the i-th colour channel, and R_{i,max} and R_{i,min} are respectively the maximum and minimum of the image before stretching.
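Putting the earlier sketches together, one possible, purely illustrative per-frame pipeline before the enhanced frames are handed to the relative-navigation front end; the file name is hypothetical.

```python
# Illustrative wiring of the sketches above into a per-frame pipeline.
for idx, t_ms, frame in read_rgb_sequence("target_sequence.mp4"):  # hypothetical file name
    fused = msrcr(frame)                 # multi-scale Retinex with colour restoration
    enhanced = gain_compensation(fused)  # linear stretch to the 0-255 colour range
    # 'enhanced' would then feed target detection and contour extraction (cf. Fig. 2, Fig. 3)
```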
Fig. 2 shows an enhancement-effect comparison for a space target provided by an embodiment of the present invention, where (a) is an original image from the on-orbit target image sequence and (b) is the enhanced target image.

Comparing Fig. 2(a) and Fig. 2(b), it can be seen that after the image enhancement method proposed by the present invention is applied, the brightness of the shadowed part of the target in the on-orbit target image sequence is clearly enhanced, while the colours of the celestial background and of the target itself remain well consistent with the original image sequence, which demonstrates that the method described here is effective.
Fig. 3(a) is a schematic diagram of the target contour obtained by detecting the target on the original image, provided by an embodiment of the present invention; Fig. 3(b) is a schematic diagram of the target contour obtained by detecting the target on the enhanced image, provided by an embodiment of the present invention.

From Fig. 3(a) and Fig. 3(b) it can be seen that, after local enhancement of the target, the target contour obtained by the target detection algorithm is clearer and more complete, so a more ideal image input can be provided for subsequent relative navigation, which further increases the recognition range of the target and improves the accuracy of identifying the target's position, velocity and attitude.
This embodiment further provides a local enhancement system for space relative-navigation target images, comprising: a first module for acquiring the multi-channel image sequence of the on-orbit target; a second module for establishing the multi-scale incident illumination and obtaining Gaussian functions of different scales from it; a third module for removing, according to the Retinex algorithm, the incident component from the multi-channel image sequence of the on-orbit target to obtain the reflectance component, and obtaining the reflectance components at three different scales from the Gaussian functions of different scales; a fourth module for obtaining the colour recovery factor from the proportional relationship among the three colour channels of the input multi-channel image sequence of the on-orbit target, and substituting the colour recovery factor into the reflectance components at the three scales to obtain reflectance components at three scales containing the colour recovery factor; a fifth module for weighting and fusing the reflectance components containing the colour recovery factor from the fourth module to obtain the multi-scale fused reflectance model with colour correction; and a sixth module for applying image gain compensation to the colour-corrected multi-scale fused reflectance model from the fifth module to obtain the enhanced image.
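As a rough structural sketch only, the six modules can be mapped onto a small class that delegates to the function sketches given earlier; the class and method names and the division of work are assumptions, not the patent's interface.

```python
class LocalEnhancementSystem:
    """Illustrative mapping of the six modules onto the earlier function sketches."""

    def __init__(self, sigmas=(15, 80, 250)):
        self.sigmas = sigmas  # scales used by the second and third modules

    def acquire(self, source):                  # first module
        return read_rgb_sequence(source)

    def reflectances(self, S):                  # second and third modules
        return [single_scale_retinex(S, s) for s in self.sigmas]

    def color_restore(self, S, reflectances):   # fourth module
        C = color_recovery_factor(S)
        return [C * R for R in reflectances]

    def fuse(self, restored):                   # fifth module
        return sum(R / len(restored) for R in restored)

    def gain(self, fused):                      # sixth module
        return gain_compensation(fused)
```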
For the problem that the space lighting environment makes target recognition and dynamic parameter identification difficult, this embodiment reduces the influence of illumination on target imaging quality, can provide more reasonable and higher-quality image data for subsequent relative navigation, and thereby improves the success rate of target recognition and the accuracy of dynamic parameter identification.
The embodiments described above are preferred specific embodiments of the present invention; the usual variations and alternatives made by those skilled in the art within the scope of the technical solution of the present invention shall all be included within the protection scope of the present invention.

Claims (10)

1. A local enhancement method for space relative-navigation target images, characterized in that the method comprises the following steps:

(1) acquiring the multi-channel image sequence of the on-orbit target;

(2) establishing the multi-scale incident illumination and obtaining Gaussian functions of different scales from it;

(3) removing, according to the Retinex algorithm, the incident component from the multi-channel image sequence of the on-orbit target to obtain the reflectance component; obtaining the reflectance components at three different scales from the Gaussian functions of different scales;

(4) obtaining the colour recovery factor from the proportional relationship among the three colour channels of the input multi-channel image sequence of the on-orbit target, and substituting the colour recovery factor into the reflectance components at the three scales to obtain reflectance components at three scales containing the colour recovery factor;

(5) weighting and fusing the reflectance components of step (4) containing the colour recovery factor to obtain the multi-scale fused reflectance model with colour correction;

(6) applying image gain compensation to the colour-corrected multi-scale fused reflectance model of step (5) to obtain the enhanced image.
2. The local enhancement method for space relative-navigation target images according to claim 1, characterized in that: in step (1), acquiring the multi-channel image sequence of the on-orbit target comprises: obtaining, from a simulation system or an on-orbit vision system, the multi-channel image sequence of the on-orbit target with time tags and frame-origin information.
3. The local enhancement method for space relative-navigation target images according to claim 1, characterized in that: in step (2), the multi-scale incident illumination is given by:

L_{i,j}(x, y) = G_j(x, y) * S_i(x, y);

where L_{i,j} is the incident illumination; the subscript i denotes the i-th channel of the original image, i = 1, 2, 3, corresponding to the red, green and blue channels respectively; the subscript j = 1, 2, 3 denotes the three different scales; S_i is the original image in the on-orbit target image sequence; * denotes convolution; x and y are the horizontal and vertical coordinates of a pixel in an image of the multi-channel image sequence of the on-orbit target; and G_j(x, y) is the Gaussian function at scale j.
4. The local enhancement method for space relative-navigation target images according to claim 3, characterized in that: the Gaussian function at scale j is G_j(x, y) = K·exp(-(x² + y²)/σ_j²), where σ_j is the standard deviation, i.e. the scale parameter: the smaller its value, the more the current pixel is influenced by its immediate neighbourhood, so local detail becomes clearer, but the colour rendition deteriorates and colour deviation easily appears; conversely, the colour is more natural but local detail is less clear; and K is the gain coefficient.
5. The local enhancement method for space relative-navigation target images according to claim 4, characterized in that: in step (3), the reflectance components at the three different scales are obtained from the Gaussian functions of different scales by the following formula:

ln(R_{i,j}(x, y)) = ln(S_i(x, y)) - ln(L_{i,j}(x, y));

where ln denotes the natural logarithm and R_{i,j} is the reflectance component of the i-th channel at the j-th scale.
6. The local enhancement method for space relative-navigation target images according to claim 5, characterized in that: in step (4), the formula of the colour recovery factor is as follows:

C_i(x, y) = β·ln(α·S_i(x, y) / Σ_{k=1}^{3} S_k(x, y));

where the adjustment factors are α = 125 and β = 1;

the reflectance components at the three scales containing the colour recovery factor are obtained by multiplying the reflectance components at the three scales by the colour recovery factor, i.e. C_i(x, y)·R_{i,j}(x, y).
7. The local enhancement method for space relative-navigation target images according to claim 6, characterized in that: in step (5), the multi-scale fused reflectance model with colour correction is obtained by the following formula:

R_i(x, y) = Σ_{j=1}^{3} ω_j·C_i(x, y)·R_{i,j}(x, y);

where R_i(x, y) is the multi-scale fused reflectance model of the i-th channel and the weights are ω_j = 1/3.
8. The local enhancement method for space relative-navigation target images according to claim 7, characterized in that: applying image gain compensation to the colour-corrected multi-scale fused reflectance model of step (5) to obtain the enhanced image comprises: applying a linear grey-scale stretch to the multi-scale fused reflectance model R_i(x, y) of the i-th channel so that the output image is mapped to the colour value range, giving the enhanced image R'_i(x, y) of the i-th channel.
9. The local enhancement method for space relative-navigation target images according to claim 8, characterized in that: the colour value range is 0-255;

the enhanced image R'_i(x, y) of the i-th channel is obtained by the following formula:

R'_i(x, y) = 255·(R_i(x, y) - R_{i,min}) / (R_{i,max} - R_{i,min});

where R_{i,max} and R_{i,min} are respectively the maximum and minimum of the multi-scale fused reflectance model R_i(x, y) of the i-th channel.
10. A local enhancement system for space relative-navigation target images, characterized by comprising:

a first module for acquiring the multi-channel image sequence of the on-orbit target;

a second module for establishing the multi-scale incident illumination and obtaining Gaussian functions of different scales from it;

a third module for removing, according to the Retinex algorithm, the incident component from the multi-channel image sequence of the on-orbit target to obtain the reflectance component, and obtaining the reflectance components at three different scales from the Gaussian functions of different scales;

a fourth module for obtaining the colour recovery factor from the proportional relationship among the three colour channels of the input multi-channel image sequence of the on-orbit target, and substituting the colour recovery factor into the reflectance components at the three scales to obtain reflectance components at three scales containing the colour recovery factor;

a fifth module for weighting and fusing the reflectance components containing the colour recovery factor from the fourth module to obtain the multi-scale fused reflectance model with colour correction; and

a sixth module for applying image gain compensation to the colour-corrected multi-scale fused reflectance model from the fifth module to obtain the enhanced image.
CN201910181836.9A 2019-03-11 2019-03-11 Local enhancement method and system for space relative-navigation target images Pending CN110047042A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910181836.9A 2019-03-11 2019-03-11 Local enhancement method and system for space relative-navigation target images (CN110047042A)

Publications (1)

Publication Number Publication Date
CN110047042A (en) 2019-07-23

Family

ID=67273667

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910181836.9A Pending CN110047042A (en) 2019-03-11 2019-03-11 A kind of space Relative Navigation target image local enhancement methods and system

Country Status (1)

Country Link
CN (1) CN110047042A (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080107333A1 (en) * 2006-11-08 2008-05-08 Amir Mazinani Method and apparatus for color image correction
CN104091179A (en) * 2014-07-01 2014-10-08 北京工业大学 Intelligent blumeria graminis spore picture identification method
CN105913455A (en) * 2016-04-11 2016-08-31 南京理工大学 Local image enhancement-based object tracking method
CN107330871A (en) * 2017-06-29 2017-11-07 西安工程大学 The image enchancing method of insulator automatic identification is run under bad weather condition
CN109215042A (en) * 2018-09-28 2019-01-15 吉林电力股份有限公司科技开发分公司 A kind of photovoltaic battery panel hot spot effect detection system based on computer vision and its calculation method

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
吴迪 (Wu Di) et al., "Recent research progress in image dehazing", 《自动化学报》 (Acta Automatica Sinica) *
肖进胜 (Xiao Jinsheng) et al., "Fast image enhancement algorithm based on the fusion of different color spaces", 《自动化学报》 (Acta Automatica Sinica) *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116612199A (en) * 2023-07-20 2023-08-18 中国科学院光电技术研究所 On-orbit combined color calibration method for deep space exploration color camera
CN116612199B (en) * 2023-07-20 2024-02-06 中国科学院光电技术研究所 On-orbit combined color calibration method for deep space exploration color camera

Similar Documents

Publication Publication Date Title
Golts et al. Unsupervised single image dehazing using dark channel prior loss
Zhuang et al. Underwater image enhancement with hyper-laplacian reflectance priors
CN103914699B A kind of method of the image enhancement of the automatic lip gloss based on color space
CN102830793B (en) Sight tracing and equipment
CN107123088B (en) A kind of method of automatic replacement photo background color
CN104318569B (en) Space salient region extraction method based on depth variation model
CN103430208A (en) Image processing device, image processing method, and control program
CN107798661B (en) Self-adaptive image enhancement method
Wang et al. Variational single nighttime image haze removal with a gray haze-line prior
CN110853119B (en) Reference picture-based makeup transfer method with robustness
CN112508814B (en) Image tone restoration type defogging enhancement method based on unmanned aerial vehicle at low altitude visual angle
US20230059499A1 (en) Image processing system, image processing method, and non-transitory computer readable medium
Tang et al. A local flatness based variational approach to retinex
CN112991222A (en) Image haze removal processing method and system, computer equipment, terminal and application
CN113436284A (en) Image processing method and device, computer equipment and storage medium
CN110047042A (en) A kind of space Relative Navigation target image local enhancement methods and system
CN113052783A (en) Face image fusion method based on face key points
CN104240197B (en) A kind of erasing method for keeping contrast, colour consistency and gray-scale pixels feature
CN113935917A (en) Optical remote sensing image thin cloud removing method based on cloud picture operation and multi-scale generation countermeasure network
Finlayson et al. Lookup-table-based gradient field reconstruction
CN113506230B (en) Photovoltaic power station aerial image dodging processing method based on machine vision
CN114187380B (en) Color transfer method based on visual saliency and channel attention mechanism
Varkonyi-Koczy et al. High-dynamic-range image reproduction methods
Qiao et al. UIE-FSMC: Underwater Image Enhancement Based on Few-Shot Learning and Multi-Color Space
CN114882346A (en) Underwater robot target autonomous identification method based on vision

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20190723