CN111739039B - Rapid centroid positioning method, system and device based on edge extraction - Google Patents


Info

Publication number: CN111739039B (granted publication of application CN202010779297.1A; earlier publication CN111739039A)
Authority: CN (China)
Prior art keywords: celestial body, target celestial body, image, edge, centroid
Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Original language: Chinese (zh)
Inventors: 宋小春, 刘辉, 刘泽文, 王青, 陈萍萍
Current and original assignee: Beijing Institute of Control and Electronic Technology (the listed assignees may be inaccurate; Google has not performed a legal analysis and makes no representation as to the accuracy of the list)
Application filed by Beijing Institute of Control and Electronic Technology under application number CN202010779297.1A

Classifications

All classes fall under G06T (image data processing or generation) and G06T7/00 (image analysis) within G06 (computing; calculating or counting) of section G (physics):

    • G06T7/11 — Region-based segmentation
    • G06T7/13 — Edge detection
    • G06T7/136 — Segmentation or edge detection involving thresholding
    • G06T7/181 — Segmentation or edge detection involving edge growing or edge linking
    • G06T7/20 — Analysis of motion
    • G06T7/70 — Determining position or orientation of objects or cameras
    • G06T7/90 — Determination of colour characteristics
    • G06T2207/30204 — Indexing scheme for image analysis; subject of image: marker

Abstract

The invention belongs to the field of image processing and centroid positioning, and particularly relates to a rapid centroid positioning method, system and device based on edge extraction, aiming to solve the problem that the prior art cannot quickly and accurately extract the centroid of a small target celestial body. The method comprises: performing binarization segmentation of the target celestial body image; extracting and marking the edges of the binarized target celestial body image; calculating, through a marking algorithm, the centroid coordinates of the largest edge region in the marked edge-region image; converting multiple centroid coordinates into astronomical coordinates and fitting a curve through them; and, if the slope of the fitted trajectory changes too sharply or the fitted trajectory deviates too far from the actual trajectory, starting a multispectral camera to acquire multispectral superimposed images and re-extracting the centroid to obtain the final centroid coordinates of the target celestial body. While preserving accuracy, the invention greatly reduces the computational load, increases the calculation speed, and solves the problem of rapidly locating a target centroid in wide-area imaging.

Description

Rapid centroid positioning method, system and device based on edge extraction
Technical Field
The invention belongs to the technical field of image processing and centroid positioning, and particularly relates to a method, a system and a device for rapidly positioning a centroid based on edge extraction.
Background
An important goal of deep space exploration is to find ways to exploit and utilize space resources to address the resource and environmental challenges that humanity increasingly faces. Detecting and impacting asteroids is the basis for acquiring cosmic information and achieving the ultimate aims of human deep space exploration.
Traditional asteroid centroid positioning mainly locates a surface target on the asteroid by a gray-level-based estimation method, performing centroid detection directly from gray-weighted information within a given image region. Because all information in the region is used, the method is robust to noise and outliers; however, its computational load is large and its speed slow, making it difficult to meet the real-time requirements of the system.
Some articles also propose simplifying the surface target into a line target to speed up centroid positioning via a gradient-based estimation method, which generally comprises precise edge extraction followed by ellipse fitting. Precise edge extraction in turn involves gradient estimation, non-maximum suppression, threshold segmentation, sub-pixel estimation and similar steps; these computations are complex and severely degrade the real-time performance of the fitting algorithm.
In general, traditional asteroid centroid positioning methods are computationally heavy and slow, while existing edge-extraction-based methods have a complex precise-edge-extraction stage that severely limits the real-time performance of the subsequent fitting algorithm. The field still urgently needs a method that achieves rapid asteroid centroid positioning without sacrificing accuracy.
Disclosure of Invention
To solve the above problem in the prior art, namely that existing techniques cannot rapidly and accurately extract the centroid of a small target celestial body, the invention provides a rapid centroid positioning method based on edge extraction, comprising:
Step S10: obtain a target celestial body image and perform binarization segmentation by the maximum inter-class variance method to obtain a binarized target celestial body image;
Step S20: extract the edges of the binarized target celestial body image and mark them to obtain a marked edge-region image;
Step S30: take the largest edge region in the marked edge-region image as the edge region of the target celestial body, and calculate the centroid coordinates of that edge region through a marking algorithm to obtain the centroid coordinates of the target celestial body;
Step S40: at a set time interval, obtain the centroid coordinates of the target celestial body multiple times through steps S10 to S30, convert them into astronomical coordinates, and fit a curve through them as discrete points to obtain the fitted trajectory of the target celestial body; obtain the running trajectory of the target celestial body from the ephemeris;
Step S50: calculate the distance between the fitted trajectory and the running trajectory of the target celestial body at time t, and the slope of the tangent to the fitted trajectory at each time;
Step S60: if the distance at time t, or the rate of change of the slope between times t and t+1, exceeds a set threshold, start the multispectral camera, insert a set number of operations between times t+1 and t+2 that acquire multispectral superimposed images of the target celestial body, recalculate the centroid coordinates, and obtain the final centroid coordinates of the target celestial body.
In some preferred embodiments, in step S20, the edges of the binarized target celestial body image are extracted as follows:
Step S211: set the upper-left corner of the binarized target celestial body image to coordinates (0, 0);
Step S212: scan horizontally and vertically in turn over the range 0 ≤ x < Nh−1 and 0 ≤ y < Ml−1; if the pixel value of the current point (x, y) is 255, go to step S213; here Ml and Nh are the numbers of columns and rows of the binarized target celestial body image;
Step S213: check whether the pixel values of all 8 neighbors of the current point (x, y) are 255; if so, set the pixel value of (x, y) to 0; if not, go to step S214;
Step S214: check whether the whole binarized target celestial body image has been traversed; if not, go back to step S212; if so, edge extraction is complete.
In some preferred embodiments, in step S20, edge marking is performed to obtain the marked edge-region image as follows:
Step S221: find a seed point with gray value 255 in the edge-extracted image and rewrite its pixel value to the mark value Snum = 254;
Step S222: find the pixels with value 255 among the 8 neighbors of the seed point, rewrite their values to the same mark value as the seed point, and push them onto a stack;
Step S223: find the pixels with value 255 among the 8 neighbors of the pixels on the stack, push them onto the stack, rewrite their values to the same mark value as the seed point, and pop the processed pixels off the stack;
Step S224: if pixels remain on the stack, go back to step S223; otherwise go to step S225;
Step S225: let Snum = Snum − 1 and check whether the whole edge-extracted image has been traversed; if not, go back to step S221; otherwise, edge marking is complete.
In some preferred embodiments, in step S30, the centroid coordinates of the edge region of the target celestial body are calculated through a marking algorithm to obtain the centroid coordinates of the target celestial body, as follows:
Let the gray value of the pixel at row $x$ and column $y$ of the edge image of the target celestial body be $f(x, y)$. Its $(p+q)$-order Cartesian geometric moment is

$m_{pq} = \sum_{x=1}^{H} \sum_{y=1}^{W} x^p y^q f(x, y)$

wherein W and H are the width and height of the edge image of the target celestial body, respectively;
the centroid coordinates $(x_c, y_c)$ of the target celestial body are:

$x_c = m_{10} / m_{00}, \quad y_c = m_{01} / m_{00}$
In some preferred embodiments, step S10 is preceded by:
setting the celestial bodies near the target celestial body as a reference celestial body group; obtaining the gradual trend of the main light source's color across the reference group from the colors of their images; and, according to that trend, enhancing the main light source color of the part of the target celestial body on the decreasing side of the trend, or weakening it on the increasing side.
In some preferred embodiments, the step of extracting the edges of the binarized target celestial body image in step S20 is followed by a step of edge information supplementation, as follows:
obtain the spin direction and spin period of the target celestial body, mark the edge point set of a high-resolution face of the target celestial body at time t1, and calculate the time t2 at which that edge point set appears on a low-resolution face;
at time t2, supplement the edge point set at the corresponding position with the edge point set of the high-resolution face recorded at time t1.
In another aspect, the invention provides a rapid centroid positioning system based on edge extraction, comprising an input module, an image segmentation module, an edge extraction and marking module, a centroid coordinate calculation module, a trajectory acquisition module, a centroid coordinate correction module, and an output module;
the input module is configured to acquire and input a target celestial body image;
the image segmentation module is configured to perform binarization segmentation of the target celestial body image by the maximum inter-class variance method to obtain a binarized target celestial body image;
the edge extraction and marking module is configured to extract the edges of the binarized target celestial body image and mark them, obtaining a marked edge-region image;
the centroid coordinate calculation module is configured to take the largest edge region in the marked edge-region image as the edge region of the target celestial body and calculate its centroid coordinates through a marking algorithm, obtaining the centroid coordinates of the target celestial body;
the trajectory acquisition module is configured to obtain the centroid coordinates of the target celestial body multiple times, at a set time interval, through the input, image segmentation, edge extraction and marking, and centroid coordinate calculation modules, to convert them into astronomical coordinates and fit a curve through them as discrete points, obtaining the fitted trajectory of the target celestial body, and to obtain the running trajectory of the target celestial body from the ephemeris;
the centroid coordinate correction module is configured to calculate the distance between the fitted and running trajectories of the target celestial body at time t and the slope of the tangent to the fitted trajectory at each time; if the distance at time t, or the rate of change of the slope between times t and t+1, exceeds a set threshold, it starts the multispectral camera, inserts a set number of multispectral-superimposed-image acquisitions between times t+1 and t+2, recalculates the centroid coordinates, and obtains the final centroid coordinates of the target celestial body;
the output module is configured to output the obtained centroid coordinates of the target celestial body.
In a third aspect, the invention provides a storage device storing a plurality of programs adapted to be loaded and executed by a processor to implement the rapid centroid positioning method based on edge extraction described above.
In a fourth aspect, the invention provides a processing apparatus comprising a processor adapted to execute programs and a storage device adapted to store a plurality of programs, the programs being adapted to be loaded and executed by the processor to implement the rapid centroid positioning method based on edge extraction described above.
The invention has the following beneficial effects:
(1) The rapid centroid positioning method based on edge extraction simplifies the surface target of centroid extraction into a line target, greatly reducing the computational load and increasing the calculation speed while preserving accuracy, and solves the problem of rapidly locating a target centroid in wide-area imaging.
(2) The method converts the extracted centroid coordinates of the target celestial body into astronomical coordinates and fits a curve through them; when the slope of the fitted trajectory changes too sharply, or the fitted trajectory deviates too far from the actual running trajectory of the target celestial body, it starts a multispectral camera, acquires multispectral superimposed images of the target celestial body, and recalculates the centroid coordinates, greatly improving the accuracy and precision of centroid extraction.
Drawings
Other features, objects and advantages of the present application will become more apparent from the following detailed description of non-limiting embodiments, made with reference to the accompanying drawings, in which:
FIG. 1 is a schematic flow chart of the rapid centroid positioning method based on edge extraction according to the present invention.
Detailed Description
The present application will be described in further detail with reference to the following drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the relevant invention and not restrictive of the invention. It should be noted that, for convenience of description, only the portions related to the related invention are shown in the drawings.
It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict. The present application will be described in detail below with reference to the embodiments with reference to the attached drawings.
The invention relates to a rapid centroid positioning method based on edge extraction, comprising:
Step S10: obtain a target celestial body image and perform binarization segmentation by the maximum inter-class variance method to obtain a binarized target celestial body image;
Step S20: extract the edges of the binarized target celestial body image and mark them to obtain a marked edge-region image;
Step S30: take the largest edge region in the marked edge-region image as the edge region of the target celestial body, and calculate the centroid coordinates of that edge region through a marking algorithm to obtain the centroid coordinates of the target celestial body;
Step S40: at a set time interval, obtain the centroid coordinates of the target celestial body multiple times through steps S10 to S30, convert them into astronomical coordinates, and fit a curve through them as discrete points to obtain the fitted trajectory of the target celestial body; obtain the running trajectory of the target celestial body from the ephemeris;
Step S50: calculate the distance between the fitted trajectory and the running trajectory of the target celestial body at time t, and the slope of the tangent to the fitted trajectory at each time;
Step S60: if the distance at time t, or the rate of change of the slope between times t and t+1, exceeds a set threshold, start the multispectral camera, insert a set number of operations between times t+1 and t+2 that acquire multispectral superimposed images of the target celestial body, recalculate the centroid coordinates, and obtain the final centroid coordinates of the target celestial body.
To describe the rapid centroid positioning method based on edge extraction more clearly, each step of an embodiment of the invention is detailed below with reference to FIG. 1.
The rapid centroid positioning method based on edge extraction comprises steps S10 to S30, detailed as follows:
Step S10: acquire a target celestial body image and perform binarization segmentation by the maximum inter-class variance method to obtain a binarized target celestial body image.
Maximum inter-class variance (Otsu) algorithm: suppose the original image has $L = 256$ gray levels, the number of pixels with gray value $i$ is $n_i$, and the total number of pixels is $Num$. The probability of each gray value is then given by formula (1):

$p_i = n_i / Num$ (1)

In image segmentation, the gray levels are divided into two classes by a threshold $T$: $c_0 = \{0, 1, \dots, T\}$ and $c_1 = \{T+1, T+2, \dots, L-1\}$. The probabilities $w_0, w_1$ and the mean gray values $u_0, u_1$ of $c_0$ and $c_1$ are given by formulas (2) and (3):

$w_0 = \sum_{i=0}^{T} p_i, \quad w_1 = \sum_{i=T+1}^{L-1} p_i = 1 - w_0$ (2)

$u_0 = \frac{1}{w_0}\sum_{i=0}^{T} i\,p_i, \quad u_1 = \frac{1}{w_1}\sum_{i=T+1}^{L-1} i\,p_i$ (3)

wherein the overall mean gray value is $u = w_0 u_0 + w_1 u_1$.
The variances of classes $c_0$ and $c_1$ are given by formula (4):

$\sigma_0^2 = \frac{1}{w_0}\sum_{i=0}^{T} (i - u_0)^2\,p_i, \quad \sigma_1^2 = \frac{1}{w_1}\sum_{i=T+1}^{L-1} (i - u_1)^2\,p_i$ (4)

The inter-class variance is defined by formula (5):

$\sigma_B^2(T) = w_0 (u_0 - u)^2 + w_1 (u_1 - u)^2 = w_0 w_1 (u_0 - u_1)^2$ (5)

The value of $T$ at which $\sigma_B^2(T)$ attains its maximum is the optimal segmentation threshold.
In one embodiment of the invention, binarization segmentation by the maximum inter-class variance method is carried out as follows:
Step S11: obtain the histogram of the target celestial body image and initialize the segmentation threshold $T = 0$, the maximum inter-class variance $\sigma_{B,\max}^2 = 0$, and the optimal threshold $m = 0$.
Step S12: split the histogram into left and right parts at $T$ and obtain the probabilities $w_0, w_1$ and the mean gray values $u_0, u_1$ of the two parts.
Step S13: calculate the inter-class variance $\sigma_B^2(T)$; if $\sigma_B^2(T) > \sigma_{B,\max}^2$, then set $\sigma_{B,\max}^2 = \sigma_B^2(T)$ and $m = T$.
Step S14: let $T = T + 1$; if $T < 256$, return to step S12; otherwise assign 255 to the region with gray value greater than $m$ and 0 to the other regions, completing the segmentation.
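Steps S11 to S14 can be sketched in NumPy as follows. This is a minimal illustration, not the patent's implementation; the function name and the assumption of an 8-bit single-channel image are the author's of this sketch.

```python
import numpy as np

def otsu_binarize(image):
    """Scan every threshold T, keep the one maximizing the between-class
    variance w0*w1*(u0-u1)^2 (formula 5), then binarize: 255 where the
    gray value exceeds the best threshold m, 0 elsewhere."""
    hist = np.bincount(image.ravel(), minlength=256).astype(float)
    p = hist / hist.sum()                  # gray-level probabilities (formula 1)
    best_var, m = 0.0, 0
    for T in range(256):
        w0 = p[:T + 1].sum()               # class c0 probability
        w1 = 1.0 - w0                      # class c1 probability
        if w0 == 0.0 or w1 == 0.0:
            continue                       # empty class: variance undefined
        u0 = (np.arange(T + 1) * p[:T + 1]).sum() / w0
        u1 = (np.arange(T + 1, 256) * p[T + 1:]).sum() / w1
        var_b = w0 * w1 * (u0 - u1) ** 2   # between-class variance (formula 5)
        if var_b > best_var:
            best_var, m = var_b, T
    return np.where(image > m, 255, 0).astype(np.uint8)
```

On a bimodal image the loop lands between the two modes, matching the analytical argmax of formula (5).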
Step S10 is preceded by:
setting the celestial bodies near the target celestial body as a reference celestial body group, obtaining the gradual trend of the main light source's color across the reference group from the colors of their images, and, according to that trend, enhancing the main light source color of the part of the target celestial body on the decreasing side of the trend or weakening it on the increasing side.
If the composition of the celestial bodies near the target celestial body is similar to that of the target, the colors of their captured images are similar to the target's and their relative distances are close, so they can serve as the reference celestial body group.
Through image analysis of the surrounding celestial bodies, the gradual trend of the main light source illuminating the target's environment is obtained, and the image color of the target celestial body is corrected accordingly; this removes the color differences caused by the varying distances between points on the target's surface and the main light source.
Step S20: extract the edges of the binarized target celestial body image and mark them to obtain a marked edge-region image.
The edges of the binarized target celestial body image are extracted as follows:
Step S211: set the upper-left corner of the binarized target celestial body image to coordinates (0, 0);
Step S212: scan horizontally and vertically in turn over the range 0 ≤ x < Nh−1 and 0 ≤ y < Ml−1; if the pixel value of the current point (x, y) is 255, go to step S213; here Ml and Nh are the numbers of columns and rows of the binarized target celestial body image;
Step S213: check whether the pixel values of all 8 neighbors of the current point (x, y) are 255; if so, set the pixel value of (x, y) to 0; if not, go to step S214;
Step S214: check whether the whole binarized target celestial body image has been traversed; if not, go back to step S212; if so, edge extraction is complete.
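Steps S211 to S214 amount to clearing interior foreground pixels so only the boundary survives. A minimal NumPy sketch follows; names are illustrative. Two assumptions of this sketch, not the patent: it works on a copy (the patent modifies in place, which makes the result depend on scan order), and border pixels are kept as-is since their 8-neighborhood is incomplete.

```python
import numpy as np

def extract_edges(binary):
    """Clear any 255-valued pixel whose 8 neighbors are all 255
    (an interior pixel), leaving only edge pixels set."""
    out = binary.copy()
    nh, ml = binary.shape                  # Nh rows, Ml columns
    for x in range(1, nh - 1):
        for y in range(1, ml - 1):
            if binary[x, y] == 255:
                # 3x3 neighborhood centered on (x, y); all nine cells 255
                # means the eight neighbors are all 255 too
                block = binary[x - 1:x + 2, y - 1:y + 2]
                if np.count_nonzero(block == 255) == 9:
                    out[x, y] = 0
    return out
```

Applied to a solid disk of 255s, this leaves a one-pixel-wide ring, which is exactly the "line target" the method fits.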
To extract the features of different regions, connected regions are usually marked first. Connected-region marking means that pixels in the image satisfying a given connectivity rule (such as 8-neighborhood connectivity) are denoted by the same mark number. Mark merging refers to merging mark units that have different mark numbers but are connected into the same mark number. Here, the extracted edge regions need to be marked.
"carry on the edge marking, obtain the edge area picture marked", its method is:
step S221, searching a seed point with the gray level of 255 in the image after the edge extraction, and rewriting the pixel value of the seed point into a mark value Snum = 254; marking the values Snum in the sequence of the marked areas in the image, and subtracting 1 from 254; wherein, Snum =254,253, … …, 1;
step S222, searching pixel points with a pixel value of 255 in 8 neighborhoods of the seed point, rewriting the pixel values into label values same as the seed point and storing the label values in a stack area;
step S223, searching pixel points with a pixel value of 255 in 8 neighborhoods of the pixels in the stack area, storing the pixel points in the stack area, rewriting the pixel values of the pixel points into label values which are the same as those of the seed points, and releasing the pixel points out of the stack area;
step S224, judging whether pixel points exist in the stack area, and if yes, jumping to step S223; otherwise, jumping to step S225;
step S225, enabling Snum = Snum-1, judging whether the full image after the edge extraction is traversed, and if not, skipping to the step S221; otherwise, edge marking is completed.
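The stack-based seed filling of steps S221 to S225 is a standard flood fill. A minimal sketch, assuming the edge image fits in memory and using illustrative names:

```python
import numpy as np

def label_edges(edge_img):
    """Flood-fill each 8-connected set of 255-valued pixels with a
    mark value, starting at Snum = 254 and decrementing per region,
    using an explicit stack as in steps S221-S225."""
    img = edge_img.astype(int).copy()      # int so mark values never clash with 255
    rows, cols = img.shape
    snum = 254
    for sx in range(rows):
        for sy in range(cols):
            if img[sx, sy] == 255:         # new seed point (step S221)
                img[sx, sy] = snum
                stack = [(sx, sy)]
                while stack:               # steps S222-S224
                    x, y = stack.pop()
                    for dx in (-1, 0, 1):
                        for dy in (-1, 0, 1):
                            nx, ny = x + dx, y + dy
                            if 0 <= nx < rows and 0 <= ny < cols and img[nx, ny] == 255:
                                img[nx, ny] = snum
                                stack.append((nx, ny))
                snum -= 1                  # step S225
    return img
```

After labeling, the largest region (the mark value with the most pixels) is taken as the edge of the target celestial body, as in step S30.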
To improve the accuracy and efficiency of edge detection, the extraction of the edges of the binarized target celestial body image in step S20 is followed by a step of edge information supplementation, as follows:
obtain the spin direction and spin period of the target celestial body, mark the edge point set of a high-resolution face of the target celestial body at time t1, and calculate the time t2 at which that edge point set appears on a low-resolution face;
at time t2, supplement the edge point set at the corresponding position with the edge point set of the high-resolution face recorded at time t1.
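The timing calculation can be illustrated with a toy sketch. The patent does not specify how t2 is derived from the spin parameters; the half-period assumption below (the high-resolution face is on the opposite, low-resolution side after half a spin) is purely the sketch's, as is the function name.

```python
def reappearance_time(t1, spin_period):
    """Illustrative only: assume the face imaged at high resolution at
    time t1 rotates to the low-resolution side after half a spin period,
    so its marked edge point set can be reused at t2 = t1 + P / 2."""
    return t1 + spin_period / 2.0
```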
Step S30: take the largest edge region in the marked edge-region image as the edge region of the target celestial body, and calculate the centroid coordinates of that edge region through a marking algorithm to obtain the centroid coordinates of the target celestial body.
The largest of the marked edge regions is selected as the edge region of the target celestial body.
Let the gray value of the pixel at row $x$ and column $y$ of the edge image of the target celestial body be $f(x, y)$. Its $(p+q)$-order Cartesian geometric moment is given by formula (6):

$m_{pq} = \sum_{x=1}^{H} \sum_{y=1}^{W} x^p y^q f(x, y)$ (6)

wherein W and H are the width and height of the edge image of the target celestial body, respectively;
the centroid coordinates $(x_c, y_c)$ of the target celestial body are given by formula (7):

$x_c = m_{10} / m_{00}, \quad y_c = m_{01} / m_{00}$ (7)

wherein $m_{10}$ is the value of formula (6) with $p = 1$ and $q = 0$, $m_{01}$ the value with $p = 0$ and $q = 1$, and $m_{00}$ the value with $p = 0$ and $q = 0$.
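Formulas (6) and (7) can be computed vectorized in a few lines. A minimal sketch, following the surrounding convention that x indexes rows and y indexes columns (the function name is illustrative):

```python
import numpy as np

def centroid(edge_img):
    """Geometric moments m00, m10, m01 of the edge image (formula 6)
    and the centroid (m10/m00, m01/m00) from formula (7)."""
    f = edge_img.astype(float)
    h, w = f.shape
    x = np.arange(h).reshape(-1, 1)        # row index, broadcast over columns
    y = np.arange(w).reshape(1, -1)        # column index, broadcast over rows
    m00 = f.sum()
    m10 = (x * f).sum()
    m01 = (y * f).sum()
    return m10 / m00, m01 / m00
```

Because only edge pixels are nonzero, the sums run over the "line target" alone, which is the source of the method's speed advantage over gray-weighted centroiding of the full surface.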
The accuracy of the target celestial body's centroid coordinates directly affects the navigation of the spacecraft; during the spacecraft's flight toward the target celestial body, step S30 is followed by:
Step S40: at a set time interval, obtain the centroid coordinates of the target celestial body multiple times, convert them into astronomical coordinates, and fit a curve through them as discrete points to obtain the fitted trajectory of the target celestial body; obtain the running trajectory of the target celestial body from the ephemeris;
Step S50: calculate the distance between the fitted trajectory and the running trajectory of the target celestial body at time t, and the slope of the tangent to the fitted trajectory at each time;
Step S60: if the distance at time t, or the rate of change of the slope between times t and t+1, exceeds a set threshold, start the multispectral camera, insert an operation between times t+1 and t+2 that acquires a multispectral superimposed image of the target celestial body, and recalculate the centroid coordinates.
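The decision in steps S50 and S60 can be sketched as follows. The patent does not specify the curve model; this sketch stands in a 1-D polynomial fit (NumPy `poly1d`) for it, and all names and the scalar-coordinate simplification are the sketch's assumptions.

```python
import numpy as np

def needs_multispectral(fit_coeffs, actual_point, t, dt,
                        dist_thresh, slope_thresh):
    """Evaluate the fitted trajectory y = P(t) at time t, compare its
    distance to the ephemeris point and the change of its tangent
    slope between t and t + dt against set thresholds (steps S50-S60)."""
    poly = np.poly1d(fit_coeffs)
    dpoly = poly.deriv()                          # tangent slope of the fit
    dist = abs(poly(t) - actual_point)            # track separation at time t
    slope_change = abs(dpoly(t + dt) - dpoly(t))  # slope change between t and t+dt
    return bool(dist > dist_thresh or slope_change > slope_thresh)
```

When this returns True, the multispectral camera would be started and the centroid recomputed from the superimposed image, as step S60 describes.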
In one embodiment of the present invention, a multi-lens multispectral camera with 4 to 9 lenses is selected. Each lens is fitted with a filter that passes only a narrow spectral band (such as infrared or ultraviolet light of different bands); the lenses photograph the same target simultaneously, so a single image records target image information from several different spectral bands at once. The multispectral camera can superpose the visible-light and invisible-light images of the target celestial body, increasing the information contained in the image and greatly improving the accuracy of subsequent edge extraction and centroid coordinate calculation.
Because the images acquired by the multispectral camera must be superposed, image processing takes longer and consumes more energy. The multispectral camera is therefore started only when the images are not clear enough or the accuracy of the obtained centroid coordinates is low, which avoids excessive energy consumption.
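A minimal sketch of the band superposition, assuming the per-band images are already co-registered; the patent does not specify the fusion rule, so a simple weighted average is used here as an illustration:

```python
import numpy as np

def superpose_bands(band_images, weights=None):
    """Superpose co-registered narrow-band images from a multi-lens
    multispectral camera into a single 8-bit image.

    band_images: list of 2-D arrays of identical shape, one per lens/band.
    weights    : optional per-band weights; equal weights by default.
    """
    stack = np.stack([b.astype(np.float64) for b in band_images])
    if weights is None:
        weights = np.ones(len(band_images))
    weights = np.asarray(weights, dtype=np.float64)
    # Weighted average over the band axis, normalized so weights sum to 1.
    fused = np.tensordot(weights / weights.sum(), stack, axes=1)
    return np.clip(fused, 0, 255).astype(np.uint8)
```

Superposing a constant 100-valued band with a constant 200-valued band at equal weight yields a constant 150-valued image.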
The fast centroid positioning system based on edge extraction of the second embodiment of the present invention is based on the fast centroid positioning method based on edge extraction described above. The centroid positioning system includes an input module, an image segmentation module, an edge extraction and marking module, a centroid coordinate calculation module, a trajectory acquisition module, a centroid coordinate correction module, and an output module;
the input module is configured to acquire and input a target celestial body image;
the image segmentation module is configured to perform image binarization segmentation on the target celestial body image by the maximum inter-class variance method to obtain a binarized target celestial body image;
the edge extraction and marking module is configured to extract the edges of the binarized target celestial body image and perform edge marking to obtain a marked edge-region image;
the centroid coordinate calculation module is configured to take the largest edge region in the marked edge-region image as the edge region of the target celestial body and calculate the centroid coordinates of that edge region through a marking algorithm, obtaining the centroid coordinates of the target celestial body;
the trajectory acquisition module is configured to obtain the centroid coordinates of the target celestial body multiple times at a set time interval through the input module, the image segmentation module, the edge extraction and marking module, and the centroid coordinate calculation module, convert them into astronomical coordinates, and perform curve fitting on these as discrete points to obtain the fitted trajectory of the target celestial body; and to obtain the actual trajectory of the target celestial body through the ephemeris;
the centroid coordinate correction module is configured to calculate the distance between the fitted trajectory and the ephemeris trajectory of the target celestial body at time t, and the slope of the tangent to the fitted trajectory at each time; if the distance at time t, or the rate of change of the slope between times t and t+1, is greater than a set threshold, the multispectral camera is started, a set number of operations of acquiring superposed multispectral images of the target celestial body are added between times t+1 and t+2, and the centroid coordinates of the target celestial body are recalculated to obtain the final centroid coordinates;
the output module is configured to output the obtained centroid coordinates of the target celestial body.
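The maximum inter-class variance (Otsu) binarization performed by the image segmentation module can be sketched as follows; the function name is illustrative:

```python
import numpy as np

def otsu_binarize(gray):
    """Binarize a grayscale image with the maximum inter-class variance
    (Otsu) method. Foreground pixels become 255, background pixels 0.

    gray: 2-D uint8 array.
    """
    hist = np.bincount(gray.ravel(), minlength=256).astype(np.float64)
    prob = hist / hist.sum()
    omega = np.cumsum(prob)                  # class-0 probability up to t
    mu = np.cumsum(prob * np.arange(256))    # cumulative mean up to t
    mu_t = mu[-1]                            # global mean
    # Between-class variance for every candidate threshold t.
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_t * omega - mu) ** 2 / (omega * (1.0 - omega))
    sigma_b = np.nan_to_num(sigma_b)         # empty classes contribute 0
    t = int(np.argmax(sigma_b))              # threshold maximizing sigma_b
    return np.where(gray > t, 255, 0).astype(np.uint8)
```

On a bimodal image the chosen threshold falls between the two modes, separating the bright target from the dark background.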
Those skilled in the art can clearly understand that, for convenience and brevity of description, the specific working process of the system described above and the related description may refer to the corresponding process in the foregoing method embodiment, and are not repeated here.
It should be noted that the fast centroid positioning system based on edge extraction provided in the foregoing embodiment is only illustrated by the division of the above functional modules. In practical applications, the functions may be allocated to different functional modules as needed; that is, the modules or steps in the embodiment of the present invention may be further decomposed or combined. For example, the modules of the foregoing embodiment may be combined into one module, or further split into multiple sub-modules, so as to complete all or part of the functions described above. The names of the modules and steps involved in the embodiments of the present invention are only for distinguishing the respective modules or steps and are not to be construed as unduly limiting the present invention.
A storage device according to a third embodiment of the present invention stores a plurality of programs adapted to be loaded and executed by a processor to implement the fast centroid positioning method based on edge extraction described above.
A processing apparatus according to a fourth embodiment of the present invention includes a processor adapted to execute programs and a storage device adapted to store a plurality of programs, the programs being adapted to be loaded and executed by the processor to implement the fast centroid positioning method based on edge extraction described above.
Those skilled in the art can clearly understand that, for convenience and brevity of description, the specific working processes of the storage device and the processing device described above and the related descriptions may refer to the corresponding processes in the foregoing method embodiment, and are not repeated here.
Those of skill in the art will appreciate that the various illustrative modules and method steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or a combination of both. Programs corresponding to the software modules and method steps may reside in random access memory (RAM), flash memory, read-only memory (ROM), electrically programmable ROM, electrically erasable programmable ROM, registers, a hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art. To clearly illustrate this interchangeability of hardware and software, the various illustrative components and steps have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends on the particular application and the design constraints imposed on the solution. Skilled artisans may implement the described functionality in different ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
The terms "first," "second," and the like are used for distinguishing between similar elements and not necessarily for describing or implying a particular order or sequence.
The terms "comprises," "comprising," or any other similar term are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus.
The technical solutions of the present invention have thus been described with reference to the preferred embodiments shown in the drawings, but those skilled in the art will readily understand that the scope of the present invention is obviously not limited to these specific embodiments. Equivalent changes or substitutions of the related technical features may be made without departing from the principle of the present invention, and the technical solutions after such changes or substitutions will fall within the protection scope of the present invention.

Claims (8)

1. A fast centroid positioning method based on edge extraction, characterized by comprising the following steps:
step S10, obtaining a target celestial body image and performing image binarization segmentation by the maximum inter-class variance method to obtain a binarized target celestial body image;
step S20, extracting the edges of the binarized target celestial body image and performing edge marking to obtain a marked edge-region image;
step S30, taking the largest edge region in the marked edge-region image as the edge region of the target celestial body and calculating the centroid coordinates of that edge region through a marking algorithm, obtaining the centroid coordinates of the target celestial body:
setting the gray value of the pixel points of the target celestial body edge image as f(x, y), wherein x and y are respectively the row and column indices of a pixel point in the image; the (p+q)-order Cartesian geometric moment m_pq of the image is:

m_pq = Σ_{x=1}^{H} Σ_{y=1}^{W} x^p · y^q · f(x, y)

wherein W and H are respectively the width and height of the target celestial body edge image;

the centroid coordinates (x0, y0) of the target celestial body are:

x0 = m10 / m00,  y0 = m01 / m00;
step S40, obtaining the centroid coordinates of the target celestial body multiple times at a set time interval by the method of steps S10-S30, converting them into astronomical coordinates, and performing curve fitting on these as discrete points to obtain the fitted trajectory of the target celestial body; obtaining the actual trajectory of the target celestial body through the ephemeris;
step S50, calculating the distance between the fitted trajectory and the ephemeris trajectory of the target celestial body at time t, and the slope of the tangent to the fitted trajectory at each time;
step S60, if the distance at time t, or the rate of change of the slope between times t and t+1, is greater than a set threshold, starting the multispectral camera, adding a set number of operations of acquiring superposed multispectral images of the target celestial body between times t+1 and t+2, and recalculating the centroid coordinates of the target celestial body to obtain the final centroid coordinates of the target celestial body.
2. The fast centroid positioning method based on edge extraction according to claim 1, characterized in that in step S20, "extracting the edges of the binarized target celestial body image" is performed as follows:
step S211, setting the coordinates of the upper-left corner of the binarized target celestial body image to (0, 0);
step S212, scanning horizontally and vertically in sequence within the range 0 ≤ x < Nh-1 and 0 ≤ y < Ml-1, and jumping to step S213 if the pixel value of the current point (x, y) is 255; wherein Ml and Nh are the numbers of columns and rows of the binarized target celestial body image;
step S213, judging whether the pixel values of the 8 neighborhood points of the current point (x, y) are all 255; if so, setting the pixel value of the current point (x, y) to 0; if not, jumping to step S214;
step S214, judging whether the binarized target celestial body image has been fully traversed; if not, jumping to step S212; if so, the edge extraction is complete.
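For illustration only (not part of the claim): steps S211-S214 amount to clearing interior foreground pixels, keeping only foreground pixels with at least one background neighbor. A straightforward sketch, restricted to non-border pixels so that all 8 neighbors exist:

```python
import numpy as np

def extract_edges(binary):
    """Edge extraction from a binarized image: a foreground pixel (255)
    whose 8 neighbours are all foreground is an interior point and is
    cleared; other foreground pixels remain as edge points. Decisions are
    taken on the original image, so clearing order does not matter.
    """
    out = binary.copy()
    nh, ml = binary.shape
    for x in range(1, nh - 1):
        for y in range(1, ml - 1):
            if binary[x, y] == 255:
                window = binary[x - 1:x + 2, y - 1:y + 2]
                if np.all(window == 255):   # all 8 neighbours foreground
                    out[x, y] = 0           # interior point, not an edge
    return out
```

Applied to a solid 255-valued square, only the outer ring of pixels survives.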
3. The fast centroid positioning method based on edge extraction according to claim 1 or 2, characterized in that in step S20, "performing edge marking to obtain a marked edge-region image" is carried out as follows:
step S221, searching the edge-extracted image for a seed point with gray value 255 and rewriting the pixel value of the seed point to the label value Snum = 254;
step S222, searching the 8-neighborhood of the seed point for pixels with value 255, rewriting their values to the same label value as the seed point, and storing them in a stack;
step S223, searching the 8-neighborhoods of the pixels in the stack for pixels with value 255, pushing them onto the stack, rewriting their values to the same label value as the seed point, and popping the processed pixels off the stack;
step S224, judging whether any pixels remain in the stack; if so, jumping to step S223; otherwise, jumping to step S225;
step S225, setting Snum = Snum - 1 and judging whether the entire edge-extracted image has been traversed; if not, jumping to step S221; otherwise, the edge marking is complete.
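For illustration only, the stack-based seed filling of steps S221-S225 can be sketched as follows (labels start at 254 and decrease by one per region; the sketch does not guard against more than 254 regions):

```python
import numpy as np

def label_edge_regions(edge_img):
    """Rewrite each 8-connected region of 255-valued pixels to a distinct
    label value, starting at 254 and decreasing by one per region.
    """
    img = edge_img.copy()
    h, w = img.shape
    snum = 254
    for sx in range(h):
        for sy in range(w):
            if img[sx, sy] == 255:          # found a new seed point
                img[sx, sy] = snum
                stack = [(sx, sy)]
                while stack:                # grow the region via 8-neighbours
                    x, y = stack.pop()
                    for dx in (-1, 0, 1):
                        for dy in (-1, 0, 1):
                            nx, ny = x + dx, y + dy
                            if 0 <= nx < h and 0 <= ny < w and img[nx, ny] == 255:
                                img[nx, ny] = snum
                                stack.append((nx, ny))
                snum -= 1                   # next region gets the next label
    return img
```

Two disconnected edge pixels end up with labels 254 and 253 respectively, in scan order.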
4. The fast centroid positioning method based on edge extraction according to claim 1, characterized in that step S10 is preceded by:
setting the celestial bodies near the target celestial body as a reference celestial body group, obtaining the gradient trend of the dominant light source color within the reference celestial body group from the image colors of the group, and, according to this trend, enhancing the dominant light source color of the target celestial body on the side where the trend decreases, or weakening it on the side where the trend increases.
5. The fast centroid positioning method based on edge extraction according to claim 1, characterized in that step S20 further comprises a step of edge information supplement after "extracting the edges of the binarized target celestial body image", performed as follows:
obtaining the spin direction and spin period of the target celestial body, marking the edge point set of the high-resolution face of the target celestial body at time t1, and calculating the time t2 at which this edge point set will appear on the low-resolution face;
at time t2, supplementing the edge point set at the corresponding position with the edge point set of the high-resolution face of the target celestial body from time t1.
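For illustration only, and assuming the high- and low-resolution faces are diametrically opposite (the claim does not fix the phase relation), the supplement step might be sketched as:

```python
import numpy as np

def schedule_supplement(t1, spin_period):
    """Time t2 at which the face imaged at high resolution at t1 has
    rotated to the low-resolution side; half a spin period is assumed
    here, which depends on the viewing geometry."""
    return t1 + spin_period / 2.0

def supplement_edges(edge_img, saved_points):
    """Merge the edge point set saved at t1 into the edge image at t2.

    saved_points: iterable of (row, col) pixel coordinates, already mapped
    to their positions in the t2 image.
    """
    out = edge_img.copy()
    for x, y in saved_points:
        out[x, y] = 255    # restore the missing edge point
    return out
```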
6. A fast centroid positioning system based on edge extraction, characterized in that, based on the fast centroid positioning method based on edge extraction according to any one of claims 1-5, the centroid positioning system comprises an input module, an image segmentation module, an edge extraction and marking module, a centroid coordinate calculation module, a trajectory acquisition module, a centroid coordinate correction module, and an output module;
the input module is configured to acquire and input a target celestial body image;
the image segmentation module is configured to perform image binarization segmentation on the target celestial body image by the maximum inter-class variance method to obtain a binarized target celestial body image;
the edge extraction and marking module is configured to extract the edges of the binarized target celestial body image and perform edge marking to obtain a marked edge-region image;
the centroid coordinate calculation module is configured to take the largest edge region in the marked edge-region image as the edge region of the target celestial body and calculate the centroid coordinates of that edge region through a marking algorithm, obtaining the centroid coordinates of the target celestial body:
setting the gray value of the pixel points of the target celestial body edge image as f(x, y), wherein x and y are respectively the row and column indices of a pixel point in the image; the (p+q)-order Cartesian geometric moment m_pq of the image is:

m_pq = Σ_{x=1}^{H} Σ_{y=1}^{W} x^p · y^q · f(x, y)

wherein W and H are respectively the width and height of the target celestial body edge image;

the centroid coordinates (x0, y0) of the target celestial body are:

x0 = m10 / m00,  y0 = m01 / m00;
the trajectory acquisition module is configured to obtain the centroid coordinates of the target celestial body multiple times at a set time interval through the input module, the image segmentation module, the edge extraction and marking module, and the centroid coordinate calculation module, convert them into astronomical coordinates, and perform curve fitting on these as discrete points to obtain the fitted trajectory of the target celestial body; and to obtain the actual trajectory of the target celestial body through the ephemeris;
the centroid coordinate correction module is configured to calculate the distance between the fitted trajectory and the ephemeris trajectory of the target celestial body at time t, and the slope of the tangent to the fitted trajectory at each time; if the distance at time t, or the rate of change of the slope between times t and t+1, is greater than a set threshold, the multispectral camera is started, a set number of operations of acquiring superposed multispectral images of the target celestial body are added between times t+1 and t+2, and the centroid coordinates of the target celestial body are recalculated to obtain the final centroid coordinates of the target celestial body;
the output module is configured to output the obtained centroid coordinates of the target celestial body.
7. A storage device in which a plurality of programs are stored, characterized in that the programs are adapted to be loaded and executed by a processor to implement the fast centroid positioning method based on edge extraction according to any one of claims 1-5.
8. A processing apparatus comprising a processor adapted to execute programs and a storage device adapted to store a plurality of programs, characterized in that the programs are adapted to be loaded and executed by the processor to implement the fast centroid positioning method based on edge extraction according to any one of claims 1-5.
