CN106597406B - Radar target recognition method based on decision-level fusion of one-dimensional range profiles under multiple viewing angles - Google Patents

Radar target recognition method based on decision-level fusion of one-dimensional range profiles under multiple viewing angles

Info

Publication number
CN106597406B
CN106597406B
Authority
CN
China
Prior art keywords
one-dimensional range profile
template matching
viewing angle
target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201611100076.7A
Other languages
Chinese (zh)
Other versions
CN106597406A (en)
Inventor
Yang Xueling (杨学岭)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
724 Research Institute Of China Shipbuilding Corp
Original Assignee
724th Research Institute of CSIC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 724th Research Institute of CSIC filed Critical 724th Research Institute of CSIC
Priority to CN201611100076.7A priority Critical patent/CN106597406B/en
Publication of CN106597406A publication Critical patent/CN106597406A/en
Application granted granted Critical
Publication of CN106597406B publication Critical patent/CN106597406B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/02 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
    • G01S7/41 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Radar Systems Or Details Thereof (AREA)
  • Image Analysis (AREA)

Abstract

The present invention relates to an implementation method for radar target recognition based on decision-level fusion of one-dimensional range profiles acquired under multiple viewing angles. The method is mainly suited to conventional wideband coherent surveillance radars operating in a cooperative system. Its main flow is: first, preprocess the one-dimensional range profile data under each viewing angle; set the extraction threshold of the target energy accumulation area of the one-dimensional range profile under each viewing angle; extract the target energy accumulation area under each viewing angle; compute the radar target attitude under each viewing angle; perform template matching on the target one-dimensional range profile under each viewing angle; then carry out a fused decision on the per-view template matching results based on improved D-S evidence theory; finally, perform radar target recognition. The method is simple to implement in engineering, achieves good decision-level fusion performance, and rests on a solid theoretical foundation; its multi-view recognition accuracy exceeds the single-view recognition accuracy by 10% or more.

Description

Radar target recognition method based on decision-level fusion of one-dimensional range profiles under multiple viewing angles
Technical field
The present invention is a radar target recognition method based on one-dimensional range profiles under multiple viewing angles, implemented on conventional wideband coherent surveillance radar units operating in a cooperative system, and realizes coarse classification of radar targets.
Background art
On the basis of one-dimensional range profiles under multiple viewing angles, radar target recognition is realized through decision-level fusion, and the recognition information is used to classify radar targets. Combined with upgrades to existing detection equipment, the cooperative target recognition capability can be greatly enhanced on the basis of the equipment already in place.
Much current research on one-dimensional range profiles is carried out under a single viewing angle. For example, the April 2013 Xidian University dissertation "Research on radar high range resolution profile target recognition technology" proposed a truncated stick-breaking hidden Markov model (TSB-HMM) based on time-domain features and established a layered recognition method on that model; it identifies one-dimensional range profiles by combining time-domain features and power spectrum features with the TSB-HMM, thereby realizing radar target recognition under a single viewing angle.
Unlike the methods proposed in other documents, the present invention aims to realize the classification of radar targets through decision-level fusion on the basis of one-dimensional range profiles acquired under multiple viewing angles.
Summary of the invention
The purpose of the present invention is to provide a radar target recognition method based on decision-level fusion of one-dimensional range profiles under multiple viewing angles within a cooperative system, so as to classify radar targets effectively. By means of the invention, a naval formation can effectively classify radar targets in various motion states within line of sight, and the recognition accuracy with multi-view one-dimensional range profiles is improved by 10% over the recognition accuracy with single-view one-dimensional range profiles.
The technical solution of the invention is as follows:
First, the one-dimensional range profile data under each viewing angle are preprocessed, including rejection of bad profiles and non-coherent accumulation. An extraction threshold for the target energy accumulation area is set on the non-coherently accumulated profile. The leading and trailing edges of the energy accumulation area are extracted for each viewing angle, delimiting the target energy accumulation area. The attitude angle of the radar target under each viewing angle is computed from the target course and azimuth information. The central moments of the target energy accumulation area under each viewing angle are computed; using the target attitude information, central-moment template matching is performed with the maximum correlation coefficient, and energy accumulation area template matching is likewise performed with the maximum correlation coefficient. A decision-level fusion judgment is made on the central-moment and energy-area template matching results under each viewing angle, and each view's template matching result is output. On the basis of the per-view template matching outputs, the per-view attitude information and profile signal-to-noise ratio are combined to carry out a fused decision on the per-view template matching results based on improved D-S evidence theory. Finally, radar target recognition is performed according to the maximum-correlation decision criterion.
Compared with the prior art, the remarkable advantages of the present invention are:
The method of comprehensively judging the central-moment and energy-area template matching results under each viewing angle with a decision-level fusion criterion performs per-view one-dimensional range profile template matching quickly and efficiently, and is adaptive, computationally light, and efficient in operation. The fused decision on per-view template matching based on improved D-S evidence theory accurately and efficiently eliminates the influence of target aspect sensitivity, translation sensitivity, and amplitude sensitivity of one-dimensional range profiles caused by the different detection ranges and different target attitudes under the various viewing angles. The proposal and engineering implementation of the invention are of high application value in the field of radar target recognition.
The present invention is described in further detail below with reference to the accompanying drawings.
Brief description of the drawings
Fig. 1 is the data flow chart of the invention.
Fig. 2 is a schematic diagram of the central-moment template matching output of the invention.
Fig. 3 is a schematic diagram of the energy accumulation area template matching output of the invention.
Fig. 4 is a schematic diagram of the per-view one-dimensional range profile template matching output of the invention.
Fig. 5 is a schematic diagram of the target one-dimensional range profile template matching based on improved D-S evidence theory of the invention.
Specific embodiment
The specific implementation steps of the radar target recognition method based on decision-level fusion of one-dimensional range profiles under multiple viewing angles are as follows (see Fig. 1):
(1) Preprocessing of the one-dimensional range profile data under each viewing angle, as follows:
Compute the kurtosis matrix K of the one-dimensional range profiles, find its largest element max(K) and the corresponding profile, and form the set of remaining profiles {x_i} and the outlier kurtosis set {k_i}:
k_i = E(X_i - μ_i)^4 / σ_i^4 - 3
where N is the number of one-dimensional range profiles, X_i is the i-th remaining profile sample, μ_i is the sample mean, σ_i is the sample standard deviation, and E(X_i - μ_i)^4 is the 4th-order central moment of the i-th sample. If k_i is non-positive, the i-th profile is judged to be an abnormal profile and its sample data are set to 0. The profile corresponding to the largest element of the kurtosis matrix is taken as the reference profile, alignment is carried out according to the minimum-entropy spectral estimation criterion, and non-coherent accumulation is performed on the aligned profiles.
I(x_k) = -log p_k
where x_k is a one-dimensional range profile sample value, I(x_k) is its information content, X is the discrete random variable, p_k is the probability of the event X = x_k, and {x_k} is the one-dimensional range profile sample data set.
The non-coherent accumulation result s_i for the i-th range cell after alignment is obtained by accumulating the N aligned profiles cell by cell, where N is the number of one-dimensional range profiles, giving the accumulated profile P(1, n).
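As a rough illustration of step (1), the sketch below (Python with NumPy; all function and variable names are our own, not from the patent) rejects abnormal profiles by excess kurtosis, aligns the remainder to the highest-kurtosis profile, and sums them non-coherently. The alignment here uses a simple correlation peak as a stand-in for the minimum-entropy criterion described above.

    import numpy as np

    def preprocess_profiles(profiles):
        """Sketch of step (1): reject abnormal profiles by excess kurtosis,
        align to the highest-kurtosis profile, then accumulate non-coherently.
        profiles: (N, n) array, one high-resolution range profile per row."""
        mu = profiles.mean(axis=1, keepdims=True)
        sigma = profiles.std(axis=1, keepdims=True)
        # excess kurtosis k_i = E(X_i - mu_i)^4 / sigma_i^4 - 3; non-positive -> abnormal
        k = ((profiles - mu) ** 4).mean(axis=1) / sigma.squeeze() ** 4 - 3.0
        cleaned = profiles.copy()
        cleaned[k <= 0] = 0.0                       # zero out abnormal profiles
        base = cleaned[np.argmax(k)]                # highest-kurtosis profile as reference
        aligned = np.empty_like(cleaned)
        for i, p in enumerate(cleaned):
            # correlation-peak alignment (stand-in for the minimum-entropy search)
            corr = np.fft.ifft(np.fft.fft(base) * np.conj(np.fft.fft(p))).real
            aligned[i] = np.roll(p, int(np.argmax(corr)))
        return aligned.sum(axis=0)                  # non-coherent accumulation P(1, n)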
(2) Setting the extraction threshold of the target energy accumulation area of the one-dimensional range profile under each viewing angle, as follows:
Compute the mean and deviation of the leading one-eighth and the trailing one-eighth of the non-coherently accumulated profile P(1, n) from (1). The smaller of the two means and the smaller of the two deviations are taken as the noise mean and noise deviation of the accumulated profile, and the extraction threshold is set as a weighted sum of the noise mean and noise deviation:
gate = k*(mean(X) + 6*std(X))
where gate is the local threshold, X is the noise-section data of the one-dimensional range profile, and k is a constant coefficient.
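A minimal sketch of the threshold computation, assuming P(1, n) is the accumulated profile from step (1) and leaving k as a tunable constant (its value is not fixed by the text):

    import numpy as np

    def extraction_gate(p, k=1.0):
        """Sketch of step (2): use the leading and trailing eighths of the
        accumulated profile as noise-only sections and set
        gate = k * (noise_mean + 6 * noise_std)."""
        n = p.size
        head, tail = p[: n // 8], p[-(n // 8):]
        noise_mean = min(head.mean(), tail.mean())
        noise_std = min(head.std(), tail.std())
        return k * (noise_mean + 6.0 * noise_std)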
(3) Extraction of the target energy accumulation area of the one-dimensional range profile under each viewing angle, as follows:
The non-coherently accumulated profile P(1, n) is segmented with the threshold gate, giving the segmentation result P_g(1, n), in which range cells at or above the threshold keep their values and the remaining cells are set to 0.
Within the segmented P_g(1, n), extract the leading and trailing edge positions of the above-threshold part. Around these edge positions, take a moving average over 10 range cells; if the local mean is less than 0.3 times the largest mean, the cells of that region are multiplied by a weight of 0.1, and the boundary of the target energy accumulation area is then re-delimited against the threshold. After the boundary is delimited, start from the maximum value and merge connected regions in both directions; if the spacing between connected regions exceeds a given range, they are considered not to belong to the same target and merging stops in that direction. The result is the target energy accumulation area P'_g(1, n); its smallest range cell index is the leading edge of the energy accumulation area and its largest range cell index is the trailing edge.
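The following sketch illustrates one possible reading of step (3); the 10-cell moving average and the 0.3 and 0.1 factors come from the text, while the maximum allowed gap between connected regions (max_gap) is an assumed parameter:

    import numpy as np

    def energy_accumulation_area(p, gate, max_gap=10):
        """Sketch of step (3): threshold the accumulated profile, down-weight
        weak edge neighbourhoods found by a 10-cell moving average, then grow
        the target region outward from the strongest cell until the gap
        between above-threshold cells exceeds max_gap."""
        pg = np.where(p > gate, p, 0.0)
        smoothed = np.convolve(pg, np.ones(10) / 10.0, mode="same")
        pg = np.where(smoothed < 0.3 * smoothed.max(), pg * 0.1, pg)
        pg = np.where(pg > gate, pg, 0.0)           # re-apply the threshold
        above = np.flatnonzero(pg > 0)
        peak = int(np.argmax(pg))
        lo = hi = peak
        for idx in above[above < peak][::-1]:       # grow towards smaller range cells
            if lo - idx > max_gap:
                break
            lo = idx
        for idx in above[above > peak]:             # grow towards larger range cells
            if idx - hi > max_gap:
                break
            hi = idx
        return lo, hi, pg[lo:hi + 1]                # leading edge, trailing edge, P'_g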
(4) Computation of the radar target attitude angle under each viewing angle, as follows:
where θ is the radar target attitude angle under a given viewing angle, α is the radar target course under that viewing angle, and β is the target azimuth under that viewing angle.
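The attitude-angle formula itself is not reproduced in this text; purely for illustration, the sketch below uses one common convention, the aspect angle between the target course α and the radar line of sight β folded into [0°, 180°], which may differ from the formula actually claimed:

    def aspect_angle(alpha_deg, beta_deg):
        """Illustrative aspect angle between target course and radar bearing,
        folded into [0, 180] degrees (an assumed convention, not the patent's
        exact formula)."""
        theta = abs(alpha_deg - beta_deg) % 360.0
        return min(theta, 360.0 - theta)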
(5) One-dimensional range profile template matching under each viewing angle, as follows:
1) Central-moment template matching, outputting the central-moment template matching result, with the following steps:
a. Compute the central moments of the target energy accumulation area of the one-dimensional range profile under each viewing angle:
where P'_g(1, n) is the target energy accumulation area under the given viewing angle, N is the data length of P'_g(1, n), and μ_p is the p-th order central moment of P'_g(1, n).
b. Normalize the central moments to obtain the p-th order normalized central moments of P'_g(1, n):
c. Take the computed normalized central moments of the target energy accumulation area as the feature vector F. Because one normalized central moment is identically equal to 1 and another is identically equal to 0, these two are not used as features; the maximum order p in the invention is 6, and F is therefore formed from the remaining normalized central moments up to order 6.
d. Using the maximum correlation coefficient, correlate the normalized central-moment feature vectors of the ship templates within ±15° of the target attitude with the normalized central-moment feature vector of the target energy accumulation area, and output the central-moment template matching results of the 3 ship classes with the largest correlation coefficients, as shown in Fig. 2.
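A compact sketch of steps a-d, again with assumed names: it computes normalized central moments of P'_g as the feature vector and scores candidate templates by correlation coefficient (selecting the templates within ±15° of the target attitude is not shown):

    import numpy as np

    def moment_features(p_g, orders=range(3, 7)):
        """Normalized central moments of the target energy accumulation area,
        orders 3 to 6, used here as the feature vector F (orders 1 and 2 are
        omitted because they are identically 0 and 1 after normalization)."""
        mu, sigma = p_g.mean(), p_g.std()
        return np.array([((p_g - mu) ** q).mean() / sigma ** q for q in orders])

    def best_matches(feature, template_features, top_k=3):
        """Correlation coefficient between F and each template feature vector;
        returns the top_k (template index, coefficient) pairs."""
        corrs = [np.corrcoef(feature, t)[0, 1] for t in template_features]
        order = np.argsort(corrs)[::-1][:top_k]
        return [(int(i), corrs[i]) for i in order]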
2) Energy accumulation area template matching: using the maximum correlation coefficient, correlate the ship template energy accumulation areas within ±15° of the target attitude with the target energy accumulation area, and output the energy accumulation area template matching results of the 3 ship classes with the largest correlation coefficients, as shown in Fig. 3.
3) Decision-level fusion judgment of the central-moment and energy accumulation area template matching results under each viewing angle, with the following steps (a small illustrative sketch follows these steps):
a. Determine whether the ship types output by the central-moment template matching and the energy accumulation area template matching contain consistent entries; for consistent entries, the central-moment matching confidence and the energy-area matching confidence are combined with weights.
b. Sort the matching results after the consistency check by correlation coefficient in descending order, and take the 3 largest matching results as the ship matching result determined by decision-level fusion.
c. The results determined in a and b are taken as the target one-dimensional range profile template matching results under the given viewing angle, as shown in Fig. 4.
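A minimal sketch of the fusion judgment in steps a-c, using dictionaries mapping ship type to matching confidence; the weighting of consistent entries (w) is an assumed parameter:

    def fuse_view_matches(moment_matches, energy_matches, w=0.5, top_k=3):
        """Sketch of step (5)-3: weight the confidences of ship types found by
        both matchers, keep the rest as-is, and return the top_k results sorted
        by correlation coefficient."""
        fused = {}
        for ship, c in moment_matches.items():
            if ship in energy_matches:
                fused[ship] = w * c + (1.0 - w) * energy_matches[ship]
            else:
                fused[ship] = c
        for ship, c in energy_matches.items():
            fused.setdefault(ship, c)
        return sorted(fused.items(), key=lambda kv: kv[1], reverse=True)[:top_k]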
(6) Target one-dimensional range profile template matching fusion based on improved D-S evidence theory, as follows (see Fig. 5):
1) Compute the basic probability assignments {A_i, i = 1, 2, ..., 12} determined by the template matching output types under each viewing angle.
where θ_i is the attitude information, Snr_i is the signal-to-noise ratio information, and q_i is the confidence of the template matching type output under each viewing angle.
2) Fuse {A_i, i = 1, 2, ..., 12} according to the Dempster combination rule.
3) Sort the fused probability assignments according to the maximum-correlation principle.
4) Carry out the fused decision on the one-dimensional range profile template matching based on improved D-S evidence theory.
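Step 2) relies on Dempster's combination rule; the sketch below combines two basic probability assignments over singleton ship-type hypotheses (how the patent builds the assignments from θ_i, Snr_i and q_i, and what its specific improvement to D-S theory is, are not reproduced here):

    def dempster_combine(m1, m2):
        """Dempster's rule for two basic probability assignments defined on
        singleton hypotheses: agreeing masses reinforce, disagreeing masses go
        to conflict, and the result is renormalized by (1 - conflict)."""
        combined, conflict = {}, 0.0
        for a, pa in m1.items():
            for b, pb in m2.items():
                if a == b:
                    combined[a] = combined.get(a, 0.0) + pa * pb
                else:
                    conflict += pa * pb
        if conflict >= 1.0:
            raise ValueError("total conflict: evidence cannot be combined")
        return {a: p / (1.0 - conflict) for a, p in combined.items()}

To fuse the assignments from all per-view outputs, the rule would be applied pairwise in sequence, for example with functools.reduce(dempster_combine, assignments).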
(7) Radar target recognition, as follows:
On the basis of (6), the 3 top-ranked ship types and their corresponding confidences are output as the radar target recognition result.

Claims (3)

1. A radar target recognition method based on decision-level fusion of one-dimensional range profiles under multiple viewing angles, characterized in that: the kurtosis matrix K of the one-dimensional range profiles is computed, its largest element max(K) and the corresponding profile are found, the set of remaining profiles {x_i} and the outlier kurtosis set {k_i} are formed, and abnormal profiles are rejected by the outlier-kurtosis method; the profile corresponding to the largest element of the kurtosis matrix is taken as the reference profile, the profiles are aligned according to the minimum-entropy estimation criterion, and non-coherent accumulation is performed on the aligned profiles;
the mean and deviation of the leading one-eighth and the trailing one-eighth of the non-coherently accumulated profile are computed, the smaller mean and the smaller deviation are taken as the noise mean and noise deviation of the accumulated profile, and the extraction threshold of the target energy accumulation area under each viewing angle is set in the form gate = k*(mean(X) + 6*std(X)), where mean(X) is the noise mean, std(X) is the noise standard deviation, and k is a constant coefficient; the target profile boundary is delimited against the threshold by a moving-average method and the target energy accumulation area is extracted; the attitude angle of the radar target is computed from the course and azimuth of the radar target; the central moments of the target energy accumulation area under each viewing angle are computed, and the normalized central moments of orders 2 through 6 are used to construct the feature vector F; central-moment template matching is performed by correlating, with the maximum correlation coefficient, the normalized central-moment feature vectors of the ship templates within ±15° of the target attitude with the normalized central-moment feature vector of the target energy accumulation area; energy accumulation area template matching is performed by correlating, with the maximum correlation coefficient, the ship template energy accumulation areas within ±15° of the target attitude with the target energy accumulation area; a decision-level fusion judgment is made on the central-moment and energy-area template matching results under each viewing angle, and the result of the judgment is taken as the target one-dimensional range profile template matching result under that viewing angle; the basic probability assignments {A_i, i = 1, 2, ..., 12} determined by the per-view template matching output types are constructed from the attitude information θ_i, the signal-to-noise ratio information Snr_i, and the confidence q_i of the per-view template matching output, and target one-dimensional range profile template matching is carried out based on improved D-S evidence theory; finally, radar target recognition is performed; with this method, the multi-view recognition accuracy of one-dimensional range profiles exceeds the single-view recognition accuracy by 10%.
2. The radar target recognition method based on decision-level fusion of one-dimensional range profiles under multiple viewing angles according to claim 1, characterized in that: the central-moment template matching and the energy accumulation area template matching under each viewing angle are comprehensively judged using a decision-level fusion criterion; the method checks the consistency of the central-moment and energy-area template matching results, fuses the central-moment template matching with the energy accumulation area template matching, and performs per-view one-dimensional range profile template matching in combination with the correlation coefficient method, reducing the influence of the target aspect sensitivity caused by the different target attitudes under the different viewing angles, and of template library data redundancy, on the per-view template matching accuracy.
3. The radar target recognition method based on decision-level fusion of one-dimensional range profiles under multiple viewing angles according to claim 1, characterized in that: in the fused decision method for per-view target one-dimensional range profile template matching based on improved D-S evidence theory, the target attitude information, the per-view profile signal-to-noise ratio, and the per-view template matching output confidence are used to construct the basic probability assignments {A_i, i = 1, 2, ..., 12} determined by the per-view template matching output types, where θ_i is the attitude information, Snr_i is the signal-to-noise ratio information, and q_i is the confidence of the per-view template matching type output, reducing the influence of the translation sensitivity and target aspect sensitivity caused by the different detection ranges and different target attitudes under the different viewing angles on the template matching accuracy.
CN201611100076.7A 2016-12-02 2016-12-02 Radar target recognition method based on decision-level fusion of one-dimensional range profiles under multiple viewing angles Active CN106597406B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201611100076.7A CN106597406B (en) 2016-12-02 2016-12-02 Radar target recognition method based on decision-level fusion of one-dimensional range profiles under multiple viewing angles

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201611100076.7A CN106597406B (en) 2016-12-02 2016-12-02 Radar target recognition method based on decision-level fusion of one-dimensional range profiles under multiple viewing angles

Publications (2)

Publication Number Publication Date
CN106597406A CN106597406A (en) 2017-04-26
CN106597406B true CN106597406B (en) 2019-03-29

Family

ID=58596962

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201611100076.7A Active CN106597406B (en) 2016-12-02 2016-12-02 Radar target recognition method based on decision-level fusion of one-dimensional range profiles under multiple viewing angles

Country Status (1)

Country Link
CN (1) CN106597406B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109508582A (en) * 2017-09-15 2019-03-22 中国公路工程咨询集团有限公司 The recognition methods of remote sensing image and device
CN110412548B (en) * 2019-07-20 2023-08-01 中国船舶集团有限公司第七二四研究所 Radar multi-target recognition method based on high-resolution one-dimensional range profile
CN110940959B (en) * 2019-12-13 2022-05-24 中国电子科技集团公司第五十四研究所 Man-vehicle classification and identification method for low-resolution radar ground target
CN111190156B (en) * 2020-01-08 2022-04-22 中国船舶重工集团公司第七二四研究所 Radar and photoelectric based low-slow small target and sea surface small target identification method
CN115166748A (en) * 2022-07-08 2022-10-11 上海埃威航空电子有限公司 Flight target identification method based on information fusion

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH102961A (en) * 1996-06-14 1998-01-06 Tech Res & Dev Inst Of Japan Def Agency Automatic classification and discrimination apparatus for target
CN101598783A (en) * 2009-07-08 2009-12-09 西安电子科技大学 Based on distance by radar under the strong noise background of PPCA model as statistical recognition method
CN104459663A (en) * 2014-11-27 2015-03-25 中国船舶重工集团公司第七二四研究所 Naval vessel and cargo vessel classification method based on high-resolution one-dimensional range profile
CN105116397A (en) * 2015-08-25 2015-12-02 西安电子科技大学 Radar high-resolution range profile target recognition method based on MMFA model
CN106019255A (en) * 2016-07-22 2016-10-12 中国船舶重工集团公司第七二四研究所 Radar target type recognition method based on one-dimensional image data layer fusion under multiple viewing angles

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
Multi-decision fusion recognition of ground targets based on HRRP; Fan Yingrui et al.; Journal of Sichuan Ordnance; 2015-10-31; Vol. 36, No. 10; pp. 125-128
Research on image understanding methods based on information fusion; Hu Liangmei; China Doctoral Dissertations Full-text Database, Information Science and Technology; 2006-08-15; No. 08; full text
Correlation matching recognition of ship target one-dimensional range profiles over multiple aspect angles; Sun Jianbo et al.; Ship Electronic Engineering; 2011-12-31; Vol. 31, No. 11; pp. 49-52
Research on radar target fusion recognition; Fu Yaowen; China Doctoral Dissertations Full-text Database, Information Science and Technology; 2004-03-15; No. 01; full text

Also Published As

Publication number Publication date
CN106597406A (en) 2017-04-26

Similar Documents

Publication Publication Date Title
CN106597406B (en) Radar target recognition method based on decision-level fusion of one-dimensional range profiles under multiple viewing angles
Jalal et al. Robust human activity recognition from depth video using spatiotemporal multi-fused features
US8611604B2 (en) Object detection device
Tome et al. The 1st competition on counter measures to finger vein spoofing attacks
Ikemura et al. Real-time human detection using relational depth similarity features
Xu et al. A people counting system based on head-shoulder detection and tracking in surveillance video
Li et al. Rapid and robust human detection and tracking based on omega-shape features
CN106251332B (en) SAR image airport target detection method based on edge feature
Malof et al. A large-scale multi-institutional evaluation of advanced discrimination algorithms for buried threat detection in ground penetrating radar
CN102663374B (en) Multi-class Bagging gait recognition method based on multi-characteristic attribute
CN102521565A (en) Garment identification method and system for low-resolution video
Ming et al. Activity recognition from RGB-D camera with 3D local spatio-temporal features
US9317765B2 (en) Human image tracking system, and human image detection and human image tracking methods thereof
Reichman et al. On choosing training and testing data for supervised algorithms in ground-penetrating radar data for buried threat detection
CN108171193A (en) Polarization SAR Ship Target Detection method based on super-pixel local message measurement
CN109901130A (en) A kind of rotor wing unmanned aerial vehicle detection and recognition methods converting and improve 2DPCA based on Radon
CN102928822A (en) Radar target length calculation method based on high-resolution one-dimensional range profiles
Tajbakhsh et al. Automatic polyp detection from learned boundaries
CN106019255A (en) Radar target type recognition method based on one-dimensional image data layer fusion under multiple viewing angles
Yang et al. Multiscenario open-set gait recognition based on radar micro-Doppler signatures
CN110412548A (en) Radar multi-target recognition method based on high-resolution one-dimensional range profile
Alujaim et al. Human motion detection using planar array FMCW Radar through 3D point clouds
WO2022184699A1 (en) Anti-spoofing for contactless fingerprint readers
Iwashita et al. Gait identification using invisible shadows: robustness to appearance changes
CN117665807A (en) Face recognition method based on millimeter wave multi-person zero sample

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
CP01 Change in the name or title of a patent holder

Address after: 210003 No. 346, Zhongshan North Road, Jiangsu, Nanjing

Patentee after: 724 Research Institute of China Shipbuilding Corp.

Address before: 210003 No. 346, Zhongshan North Road, Jiangsu, Nanjing

Patentee before: 724TH RESEARCH INSTITUTE OF CHINA SHIPBUILDING INDUSTRY Corp.

CP01 Change in the name or title of a patent holder
CB03 Change of inventor or designer information

Inventor after: Yang Xueling

Inventor after: Hao Yangang

Inventor before: Yang Xueling

CB03 Change of inventor or designer information