CN111008659A - Point trace condensing method based on pixel point association - Google Patents
- Publication number
- CN111008659A (application CN201911219760.0A)
- Authority
- CN
- China
- Prior art keywords
- point
- trace
- upper edge
- pixel
- lower edge
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/74—Image or video pattern matching; Proximity measures in feature spaces
- G06V10/75—Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
- G06V10/751—Comparing pixel values or logical combinations thereof, or feature values having positional relevance, e.g. template matching
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/89—Radar or analogous systems specially adapted for specific applications for mapping or imaging
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02A—TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
- Y02A90/00—Technologies having an indirect contribution to adaptation to climate change
- Y02A90/10—Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation
Landscapes
- Engineering & Computer Science (AREA)
- Remote Sensing (AREA)
- Radar, Positioning & Navigation (AREA)
- Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- General Physics & Mathematics (AREA)
- Computing Systems (AREA)
- Medical Informatics (AREA)
- Software Systems (AREA)
- General Health & Medical Sciences (AREA)
- Evolutionary Computation (AREA)
- Databases & Information Systems (AREA)
- Multimedia (AREA)
- Artificial Intelligence (AREA)
- Electromagnetism (AREA)
- Health & Medical Sciences (AREA)
- Computer Networks & Wireless Communication (AREA)
- Radar Systems Or Details Thereof (AREA)
Abstract
The invention discloses a trace point condensation method based on pixel point association, belonging to the technical field of water area radars and comprising the following steps: S10, receiving data sent by the radar; S20, forming a radar signal diagram from each frame of the received radar signal, with distance on the Y axis and azimuth angle on the X axis; S30, scanning the pixel points of each column of the radar signal diagram from bottom to top and marking the points on the lower edge and the upper edge of a trace; S40, associating the points of the upper edge to form an upper-edge simulated trace, and associating the points of the lower edge to form a lower-edge simulated trace; and S50, matching the upper-edge simulated trace with the lower-edge simulated trace to form a radar trace. The method avoids trace adhesion and the loss of ship tracks during track matching: even when the radar treats the reflected spots of several nearby ships as a single target, the traces of the different ships can still be matched, so that no part of a ship's track is lost.
Description
Technical Field
The invention belongs to the technical field of water area radars, and particularly relates to a trace point condensation method based on pixel point association.
Background
Existing ship trace condensation methods use distance information to cluster bright points that lie close together. When two or more ships are close to each other, the radar spots they reflect become very close or even stick together, and the existing methods then cluster those bright points into a single target. Because ships usually move very slowly, once this happens the existing condensation algorithm treats the reflected spots of several ships as one target for a long time, and part of the ship tracks is lost during track matching.
A trace point condensation method that avoids trace adhesion and the resulting loss of ship tracks during track matching is therefore urgently needed.
Disclosure of Invention
The object of the invention is to provide a trace point condensation method that avoids trace adhesion and the loss of ship tracks during track matching.
A trace point condensation method based on pixel point association comprises the following steps:
s10, receiving data sent by the radar;
S20, forming a radar signal diagram from each frame of the received radar signal, with distance on the Y axis and azimuth angle on the X axis;
S30, scanning the pixel points of each column of the radar signal diagram from bottom to top and marking the points of the lower edge and the upper edge of a trace; a point on the lower edge is where the signal turns from dark to bright, and a point on the upper edge is where it turns from bright to dark; "dark" means below a threshold and "bright" means above the threshold;
S40, associating the points of the upper edge to form an upper-edge simulated trace, and associating the points of the lower edge to form a lower-edge simulated trace;
and S50, matching the upper-edge simulated trace with the lower-edge simulated trace to form a radar trace.
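Step S20 above can be sketched in code: each azimuth sweep of a frame becomes one column of the signal image, with range bins as rows. This is a minimal illustration only; the function name `build_signal_image` and the input layout are assumptions, not part of the patent.

```python
import numpy as np

def build_signal_image(sweeps):
    # Stack one 1-D amplitude array per azimuth sweep as columns,
    # so X (columns) is azimuth angle and Y (rows) is range, as in S20.
    return np.stack([np.asarray(s) for s in sweeps], axis=1)

# Three azimuth sweeps, each with three range bins:
img = build_signal_image([[0, 5, 9], [1, 6, 8], [0, 4, 7]])
print(img.shape)  # (3, 3): 3 range bins (rows) x 3 azimuth sweeps (columns)
```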
Further, the steps of matching the upper-edge simulated trace with the lower-edge simulated trace are as follows:
S51, searching for upper edge points and lower edge points from left to right;
S52, matching the pixel points of the upper edge one-to-one with the pixel points of the lower edge;
S53, checking whether the pixel to the left of a pixel point in the upper or lower edge has a matching point; if so, associating this pixel with that matching point; if not, generating a new simulation batch number with this pixel point as its starting point;
S54, checking whether the pixel to the right of a pixel point in the upper or lower edge has a matching point; if so, associating that matching point with this pixel; if not, ending the simulation batch number with this pixel point as its ending point;
S55, checking whether any of the 6 pixels to the right of a pixel point in the upper or lower edge is an upper or lower edge point; if so, associating them; if not, ending the corresponding simulation batch number;
and S56, fusing and matching the batch numbers of the upper edge points and the lower edge points, the simulation batch numbers of the upper edge following those of the lower edge to generate the radar trace batch numbers.
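Step S56 might be sketched as follows, with each batch represented by the set of azimuth columns it spans. The overlap criterion used to decide which lower-edge batch number an upper-edge batch "follows" is an assumption for illustration; the patent does not specify it, and the name `fuse_batches` is not from the patent.

```python
def fuse_batches(upper_batches, lower_batches):
    # Map each upper-edge batch index to the index of the lower-edge
    # batch it shares the most azimuth columns with (assumed criterion).
    fused = {}
    for ui, ub in enumerate(upper_batches):
        best, best_overlap = None, 0
        for li, lb in enumerate(lower_batches):
            overlap = len(set(ub) & set(lb))
            if overlap > best_overlap:
                best, best_overlap = li, overlap
        fused[ui] = best  # the upper batch follows this lower batch number
    return fused

# Upper batch 0 spans columns 1-3, upper batch 1 spans 10-11:
print(fuse_batches([[1, 2, 3], [10, 11]], [[1, 2], [9, 10, 11]]))  # {0: 0, 1: 1}
```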
Further, the matching of pixels adjacent to a point is shown in FIG. 1, in which an upper edge point is defined as a blue point and a lower edge point is defined as a yellow point.
The beneficial effects of the invention are as follows: the invention provides a trace point condensation method that avoids trace adhesion and the loss of ship tracks during track matching; even when the radar treats the reflected spots of several nearby ships as a single target, the traces of the different ships can still be matched, so that no part of a ship's track is lost.
Drawings
FIG. 1 is a flow chart of the trace point condensation method based on pixel point association;
FIG. 2 is a radar signal diagram;
FIG. 3 shows the points of the lower edge and the points of the upper edge;
FIG. 4 shows the matching of the points of the lower edge with the points of the upper edge;
FIG. 5 shows checking the pixels to the left of the lower edge points and upper edge points for matching points;
FIG. 6 shows the association of pixel points in the upper edge points or lower edge points;
FIG. 7 shows the 6 pixel points to the right of a pixel point in the upper edge points or lower edge points;
FIG. 8 shows the matching of the simulation batch numbers of the upper edge with those of the lower edge.
Detailed Description
The trace point condensation method based on pixel point association comprises the following steps (FIG. 1):
S10, receiving data sent by the radar;
S20, forming a radar signal diagram (FIG. 2) from each frame of the received radar signal, with distance on the Y axis and azimuth angle on the X axis;
S30, scanning the pixel points of each column of the radar signal diagram from bottom to top and marking the points on the lower edge and the upper edge of a trace; a point on the lower edge is where the signal turns from dark to bright, and a point on the upper edge is where it turns from bright to dark; "dark" means below a threshold and "bright" means above the threshold (FIG. 3);
S40, associating the points of the upper edge to form an upper-edge simulated trace, and associating the points of the lower edge to form a lower-edge simulated trace;
and S50, matching the upper-edge simulated trace with the lower-edge simulated trace to form a radar trace.
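The bottom-to-top column scan of step S30 can be sketched as follows. This is a minimal illustration assuming row 0 is the top of the image; the function name, the choice of the last bright pixel as the upper-edge point, and the boolean-mask output are assumptions, not from the patent.

```python
import numpy as np

def mark_edges(img, threshold):
    # "Bright" means at or above the threshold, "dark" means below it.
    bright = np.asarray(img) >= threshold
    lower = np.zeros_like(bright)
    upper = np.zeros_like(bright)
    rows, cols = bright.shape
    for x in range(cols):
        prev = False  # below the bottom row, everything counts as dark
        for y in range(rows - 1, -1, -1):  # scan each column bottom-to-top
            cur = bright[y, x]
            if cur and not prev:
                lower[y, x] = True       # dark turned bright: lower edge
            elif prev and not cur:
                upper[y + 1, x] = True   # bright turned dark: mark the
                                         # last bright pixel as upper edge
            prev = cur
        if prev:
            upper[0, x] = True  # a bright blob reaching the top row
    return lower, upper

# Two columns, four range bins (row 0 = top):
lower, upper = mark_edges([[0, 0], [9, 0], [9, 9], [0, 9]], threshold=5)
```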
The steps of matching the upper-edge simulated trace with the lower-edge simulated trace are as follows:
S51, searching for upper edge points and lower edge points from left to right;
S52, matching the pixel points of the upper edge one-to-one with the pixel points of the lower edge (FIG. 4);
S53, checking whether the pixel to the left of a pixel point in the upper or lower edge has a matching point; if so, associating this pixel with that matching point; if not, generating a new simulation batch number with this pixel point as its starting point (FIG. 5);
S54, checking whether the pixel to the right of a pixel point in the upper or lower edge has a matching point; if so, associating that matching point with this pixel; if not, ending the simulation batch number with this pixel point as its ending point (FIG. 6);
S55, checking whether any of the 6 pixels to the right of a pixel point in the upper or lower edge is an upper or lower edge point; if so, associating them; if not, ending the corresponding simulation batch number;
and S56, fusing and matching the batch numbers of the upper edge points and the lower edge points (FIG. 7), the simulation batch numbers of the upper edge following those of the lower edge to generate the radar trace batch numbers (FIG. 8).
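The left-to-right association with a 6-pixel gap tolerance (steps S53 to S55) can be sketched as grouping edge-point column indices into runs, each run standing for one simulation batch number. The one-dimensional simplification and the name `group_edge_runs` are assumptions for illustration only.

```python
def group_edge_runs(xs, max_gap=6):
    # Group sorted column indices of edge points into runs; a run ends
    # when the next edge point lies more than max_gap pixels to the
    # right (the 6-pixel window of step S55).
    runs = []
    for x in sorted(xs):
        if runs and x - runs[-1][-1] <= max_gap:
            runs[-1].append(x)   # associate with the current batch number
        else:
            runs.append([x])     # start a new simulation batch number
    return runs

# Edge points at columns 1-4 and 15-16 yield two batches:
print(group_edge_runs([1, 2, 4, 15, 16, 3]))  # [[1, 2, 3, 4], [15, 16]]
```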
The above description is only a preferred embodiment of the present invention and is not intended to limit its technical scope; any minor modifications, equivalent changes, or refinements made to the above embodiment in accordance with the technical spirit of the present invention remain within the technical scope of the present invention.
Claims (3)
1. A trace point condensation method based on pixel point association, characterized by comprising the following steps:
S10, receiving data sent by the radar;
S20, forming a radar signal diagram from each frame of the received radar signal, with distance on the Y axis and azimuth angle on the X axis;
S30, scanning the pixel points of each column of the radar signal diagram from bottom to top and marking the points on the lower edge and the upper edge of a trace, wherein a point on the lower edge is where the signal turns from dark to bright and a point on the upper edge is where it turns from bright to dark, "dark" meaning below a threshold and "bright" meaning above the threshold;
S40, associating the points of the upper edge to form an upper-edge simulated trace, and associating the points of the lower edge to form a lower-edge simulated trace;
and S50, matching the upper-edge simulated trace with the lower-edge simulated trace to form a radar trace.
2. The trace point condensation method based on pixel point association according to claim 1, wherein the steps of matching the upper-edge simulated trace with the lower-edge simulated trace are as follows:
S51, searching for upper edge points and lower edge points from left to right;
S52, matching the pixel points of the upper edge one-to-one with the pixel points of the lower edge;
S53, checking whether the pixel to the left of a pixel point in the upper or lower edge has a matching point; if so, associating this pixel with that matching point; if not, generating a new simulation batch number with this pixel point as its starting point;
S54, checking whether the pixel to the right of a pixel point in the upper or lower edge has a matching point; if so, associating that matching point with this pixel; if not, ending the simulation batch number with this pixel point as its ending point;
S55, checking whether any of the 6 pixels to the right of a pixel point in the upper or lower edge is an upper or lower edge point; if so, associating them; if not, ending the corresponding simulation batch number;
and S56, fusing and matching the batch numbers of the upper edge points and the lower edge points, the simulation batch numbers of the upper edge following those of the lower edge to generate the radar trace batch numbers.
3. The trace point condensation method based on pixel point association according to claim 2, wherein pixels adjacent to a point are matched as shown in FIG. 1, in which an upper edge point is defined as a blue point and a lower edge point is defined as a yellow point.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201911219760.0A CN111008659B (en) | 2019-12-16 | 2019-12-16 | Point trace condensing method based on pixel point association |
Publications (2)
Publication Number | Publication Date |
---|---|
CN111008659A true CN111008659A (en) | 2020-04-14 |
CN111008659B CN111008659B (en) | 2023-04-07 |
Family
ID=70113938
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201911219760.0A Active CN111008659B (en) | 2019-12-16 | 2019-12-16 | Point trace condensing method based on pixel point association |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111008659B (en) |
- 2019-12-16: CN application CN201911219760.0A granted as patent CN111008659B (active)
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20110012778A1 (en) * | 2008-12-10 | 2011-01-20 | U.S. Government As Represented By The Secretary Of The Army | Method and system for forming very low noise imagery using pixel classification |
CN104991235A (en) * | 2015-06-15 | 2015-10-21 | 南京航空航天大学 | Method for rapid tracking target based on radar trace points |
CN105093215A (en) * | 2015-08-31 | 2015-11-25 | 西安电子科技大学 | Doppler information based method for tracking low-altitude low-speed small target through radar |
Non-Patent Citations (2)
Title |
---|
JI Jun: "Research on Intelligent Target Feature Extraction Methods for Radar Plots", Information Technology * |
GUO Jianhui et al.: "Research on Plot Fusion Technology for Multifunction Phased Array Radar", Modern Radar * |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113702964A (en) * | 2021-08-23 | 2021-11-26 | 中国北方工业有限公司 | Radar adaptive area aggregation method based on track information |
CN113702964B (en) * | 2021-08-23 | 2023-09-26 | 中国北方工业有限公司 | Radar self-adaptive region aggregation method based on track information |
Also Published As
Publication number | Publication date |
---|---|
CN111008659B (en) | 2023-04-07 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||