CN105424059B - Wide baseline near infrared camera position and orientation estimation method - Google Patents
- Publication number
- CN105424059B CN105424059B CN201510750446.0A CN201510750446A CN105424059B CN 105424059 B CN105424059 B CN 105424059B CN 201510750446 A CN201510750446 A CN 201510750446A CN 105424059 B CN105424059 B CN 105424059B
- Authority
- CN
- China
- Prior art keywords
- datum mark
- camera
- line
- datum
- coordinate system
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Fee Related
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C25/00—Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass
Landscapes
- Engineering & Computer Science (AREA)
- Manufacturing & Machinery (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Length Measuring Devices By Optical Means (AREA)
Abstract
The invention discloses a wide-baseline near-infrared camera pose estimation method, aimed at solving the poor practicality of existing methods. The technical solution is: within the large-scene region of the cameras' shared field of view along the shipboard landing runway, select six symmetrically distributed pairs of reference points, and measure the world coordinates of the reference points precisely with a total station; during calibration, place marker lights at the reference-point positions, and compute the pose of each camera accurately by detecting the marker lights. Considering that a near-infrared camera cannot capture a usable photo of a calibration board, that visible light in the complex natural scene of the UAV landing site affects calibration results, and the characteristics of the infrared camera itself, infrared laser lamps are used as cooperative marker lights, and optical filters are mounted on the infrared cameras. Six pairs of reference points are set on the two sides of the landing runway, and the spatial coordinates of the reference points are measured with high accuracy using a total station. Tests show that the calibration result is accurate, with a reprojection error on the image of 0.05 pixels or less.
Description
Technical field
The present invention relates to a wide-baseline near-infrared camera pose estimation method, and more particularly to a wide-baseline near-infrared camera pose estimation method for a UAV autonomous carrier-landing guidance system.
Background technology
Vision-based autonomous carrier-landing navigation for UAVs is a current research hotspot in navigation technology; it has roots in bionics and, compared with traditional inertial and satellite navigation, offers system autonomy, strong interference resistance, and high accuracy. In UAV autonomous landing navigation, the navigation cameras are mounted alongside the landing runway. The fixed-focus cameras in the navigation system are easily affected by ambient light during tracking and localization; to avoid this influence, existing vision navigation systems use near-infrared cameras to capture light of specific wavelengths emitted by marker points. Furthermore, to obtain the correspondence between three-dimensional spatial coordinates and camera image coordinates, the cameras must be calibrated, and the calibration accuracy directly affects the accuracy of optical localization.
Common calibration methods require placing a calibration reference in front of the camera to be calibrated, usually a calibration board of known shape and size, such as a checkerboard or cross pattern. However, because a near-infrared camera cannot capture the texture of an ordinary optical calibration board, it cannot be calibrated with the boards used for visible-light cameras. Moreover, the vision navigation system must cover a sufficiently long stretch of the landing runway, so the cameras need telephoto lenses; since a telephoto lens has a shallow depth of field, accurate focusing is difficult, which makes the traditional approach of photographing a precision calibration target very hard.
Document " A camera calibration method for large field optical measurement,
Optik 123 (2013) 6553-6558 " proposes a kind of scaling method for the wide baseline multicamera system of large scene, the party
Method calculates the internal reference matrix of camera first with gridiron pattern scaling board, then calculates phase in the way of place placement index point
The outer ginseng matrix of machine, the projection of wherein index point being placed on camera image are to be similar to be parallel to the transverse matrix of image,
It is possible thereby to simplify the calculating of longitudinal direction distortion coefficients of camera lens.For unmanned plane autonomous landing on the ship system, the meeting of visible light in scene
Generate greatly interference.And for infrared camera, common scaling method needs to place calibration reference before camera to be calibrated
Object is usually demarcated with the scaling board of known form and size, such as gridiron pattern, cross, but due to close red
Outer camera can not obtain the texture information of ordinary optical camera calibration plate, so the calibration of general visible video camera can not be used
Plate demarcates near-infrared video camera.And vision navigation system needs to cover the landing runway of enough ranges, so needing to taking the photograph
Camera installs telephoto lens additional, since the telephoto lens depth of field is short, it is difficult to accurate focusing, to the method band of tradition shooting precision calibration object
Prodigious difficulty is carried out.
Summary of the invention
To overcome the poor practicality of existing methods, the present invention provides a wide-baseline near-infrared camera pose estimation method. The method selects six symmetrically distributed pairs of reference points in the large-scene region of the cameras' shared field of view along the landing runway, and measures the world coordinates of the reference points precisely with a total station; during calibration, marker lights are placed at the reference-point positions, and the pose of each camera is computed accurately by detecting the marker lights. The method takes into account that a near-infrared camera cannot capture a usable calibration-board photo, so that traditional calibration methods cannot be used for pose estimation, as well as the effect of visible light on calibration in the complex natural scene of the UAV landing site and the characteristics of the infrared camera itself. Infrared laser lamps are therefore used as cooperative marker lights, and optical filters are mounted on the infrared cameras. Six pairs of reference points are set on the two sides of the landing runway, and their spatial coordinates are measured with high accuracy using a total station. Tests show that the calibration result is accurate, with a reprojection error on the image of 0.05 pixels or less.
The technical solution adopted by the present invention to solve the technical problem is a wide-baseline near-infrared camera pose estimation method, characterized by comprising the following steps:
Step 1: Establish a world coordinate system for the UAV autonomous landing runway. The near-infrared cameras are placed symmetrically on the two sides of the landing runway. The midpoint of the camera baseline is chosen as the world origin; the X axis runs along the runway centerline, the Y axis is perpendicular to the runway direction, and the Z axis points vertically upward; the coordinate system follows the right-hand rule.
To estimate the transformation matrix between three-dimensional world coordinates and two-dimensional camera image coordinates, several correspondences between 3-D points and image coordinates are needed, so 12 reference points are selected on the two sides of the runway:
Reference points 1, 2: the direction from point 2 to point 1 is the positive Y axis; the midpoint of the line l_12 joining points 1 and 2 is the world origin; l_12 is 15 m long.
Reference points 3, 4: the line l_34 joining points 3 and 4 is parallel to l_12; the perpendicular distance between l_12 and l_34 is 50 m; l_34 is 15 m long.
Reference points 5, 6: the line l_56 joining points 5 and 6 is parallel to l_12; the perpendicular distance between l_12 and l_56 is 100 m; l_56 is 15 m long.
Reference points 7, 8: the line l_78 joining points 7 and 8 is parallel to l_12; the perpendicular distance between l_12 and l_78 is 150 m; l_78 is 15 m long.
Reference points 9, 10: the line l_910 joining points 9 and 10 is parallel to l_12; the perpendicular distance between l_12 and l_910 is 200 m; l_910 is 15 m long.
Reference points 11, 12: the line l_1112 joining points 11 and 12 is parallel to l_12; the perpendicular distance between l_12 and l_1112 is 300 m; l_1112 is 15 m long.
Some of the 12 reference points are raised above ground level: points 7 and 8 are elevated points at a height of 2.8 m, and points 11 and 12 are elevated points at a height of 2.6 m.
After the points have been selected, a total-station prism is placed at each reference point. The total station is set up at the world origin with its forward direction aligned with the runway centerline, and the coordinates of the 12 reference points in the total-station coordinate system are measured precisely with the total station.
Step 2: Near-infrared laser lamps are chosen as the cooperative marker lights for the near-infrared cameras in the landing navigation system, and optical filters are mounted on the camera lenses. After each cooperative marker light is switched on, pictures are taken with the cameras in the landing navigation system, and the image coordinates of the reference points are obtained by combining manual point selection with spot detection.
Step 3: Assume correspondences between coordinate points X in the three-dimensional world coordinate system and camera image coordinate points x; the camera matrix P can then be determined. For each correspondence X <-> x, with x = (x, y, w)^T in homogeneous coordinates, the following relation is derived:

    [  0^T       -w·X^T    y·X^T ] [ P^1 ]
    [  w·X^T      0^T     -x·X^T ] [ P^2 ]  =  0
                                   [ P^3 ]

where P^{iT} is the i-th row of the matrix P, a 4-vector.
Step 4: Using the camera transformation matrix P computed in the first three steps, the two-dimensional image coordinates are related to the coordinates in the three-dimensional world coordinate system through the formula x = PX.
The beneficial effects of the invention are as follows: the method selects six symmetrically distributed pairs of reference points in the large-scene region of the cameras' shared field of view along the landing runway, and measures the world coordinates of the reference points precisely with a total station; during calibration, marker lights are placed at the reference-point positions, and the pose of each camera is computed accurately by detecting the marker lights. The method takes into account that a near-infrared camera cannot capture a usable calibration-board photo, so that traditional calibration methods cannot be used for pose estimation, as well as the effect of visible light on calibration in the complex natural scene of the UAV landing site and the characteristics of the infrared camera itself. Infrared laser lamps are used as cooperative marker lights, and optical filters are mounted on the infrared cameras. Six pairs of reference points are set on the two sides of the landing runway, and their spatial coordinates are measured with high accuracy using a total station. Tests show that the calibration result is accurate, with a reprojection error on the image of 0.05 pixels or less.
The present invention is described in detail below with reference to a specific embodiment.
Detailed description of embodiments
The wide-baseline near-infrared camera pose estimation method of the invention proceeds as follows:
1. Selection of reference points and coordinate measurement.
Camera pose estimation is in essence the computation of the transformation matrix between the world coordinate system and the camera coordinate system, so a world coordinate system must first be established for the UAV autonomous landing runway. In the present invention the world coordinate system is chosen as follows: the cameras are placed symmetrically on the two sides of the landing runway; the midpoint of the camera baseline is chosen as the world origin; the X axis runs along the runway centerline, the Y axis is perpendicular to the runway direction, and the Z axis points vertically upward; the coordinate system follows the right-hand rule.
To estimate the transformation matrix between three-dimensional world coordinates and two-dimensional camera image coordinates, several correspondences between 3-D points and image coordinates are needed. In the present invention, 12 reference points (6 pairs) are selected on the two sides of the runway, as follows:
Reference points 1, 2: the direction from point 2 to point 1 is the positive Y axis; the midpoint of the line l_12 joining points 1 and 2 is the world origin; l_12 is 15 m long.
Reference points 3, 4: the line l_34 joining points 3 and 4 is parallel to l_12; the perpendicular distance between l_12 and l_34 is 50 m; l_34 is 15 m long.
Reference points 5, 6: the line l_56 joining points 5 and 6 is parallel to l_12; the perpendicular distance between l_12 and l_56 is 100 m; l_56 is 15 m long.
Reference points 7, 8: the line l_78 joining points 7 and 8 is parallel to l_12; the perpendicular distance between l_12 and l_78 is 150 m; l_78 is 15 m long.
Reference points 9, 10: the line l_910 joining points 9 and 10 is parallel to l_12; the perpendicular distance between l_12 and l_910 is 200 m; l_910 is 15 m long.
Reference points 11, 12: the line l_1112 joining points 11 and 12 is parallel to l_12; the perpendicular distance between l_12 and l_1112 is 300 m; l_1112 is 15 m long.
In a large-scene calibration environment, in order to guarantee detection accuracy both on the ground and in the air, some of the reference points are raised above ground level: points 7 and 8 are elevated points at a height of 2.8 m, and points 11 and 12 are elevated points at a height of 2.6 m.
The reference points are laid out with a tape measure. After the points have been selected, a total-station prism is placed at each reference point; the total station is set up at the world origin with its forward direction aligned with the runway centerline, and the coordinates of the 12 reference points in the total-station coordinate system are measured precisely with the total station. In this way the total-station coordinate system coincides with the world coordinate system, which simplifies the conversion between the two.
2. Photograph the cooperative marker lights at the reference points and obtain their image coordinates.
The purpose of this step is to determine the image coordinates of the reference points in the pictures captured by the cameras. First, a cooperative marker light is placed at each reference point. Considering the various weather conditions the landing runway may experience, the background interference of the large scene, and the long distances involved, the present invention selects near-infrared laser lamps as the cooperative marker lights for the near-infrared cameras in the landing navigation system, and mounts optical filters on the camera lenses. After each cooperative marker light is switched on, pictures are taken with the cameras of the landing navigation system; since the reference points are selected on the two sides of the landing runway, every cooperative marker light appears in the image captured by each camera. Exploiting the imaging characteristics of the near-infrared camera, the image coordinates of the reference points are obtained by combining manual point selection with spot detection: in a purpose-built interactive software tool, the position of each marker light is clicked with the mouse, and the center of the bright spot is then computed by intensity-weighted averaging over the 5×5 neighborhood of the clicked point; this center is taken as the image coordinate of the reference point.
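The intensity-weighted averaging step can be sketched as follows; the function and variable names are illustrative, not from the patent:

```python
import numpy as np

# Intensity-weighted centroid over the 5x5 neighborhood of a clicked pixel:
# each pixel's coordinates are weighted by its grayscale intensity, so the
# result is the sub-pixel center of the bright spot.
def spot_center(image, click_row, click_col, half=2):
    r0, r1 = click_row - half, click_row + half + 1
    c0, c1 = click_col - half, click_col + half + 1
    patch = image[r0:r1, c0:c1].astype(float)
    rows, cols = np.mgrid[r0:r1, c0:c1]       # coordinate grids of the patch
    total = patch.sum()
    if total == 0:                            # guard against an all-dark patch
        return float(click_row), float(click_col)
    return (rows * patch).sum() / total, (cols * patch).sum() / total
```

This is why the manual click need only be approximate: the weighted average pulls the coordinate onto the spot's true center.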
3. Calibrate the camera parameters.
The simplest formulation assumes correspondences between 3-D coordinate points X and camera image coordinate points x; given enough correspondences between X and x, the camera matrix P can be determined with the Direct Linear Transformation (DLT) method. For each correspondence X <-> x, with x = (x, y, w)^T in homogeneous coordinates, the following relation can be derived:

    [  0^T       -w·X^T    y·X^T ] [ P^1 ]
    [  w·X^T      0^T     -x·X^T ] [ P^2 ]  =  0
                                   [ P^3 ]

where P^{iT} is the i-th row of the matrix P, a 4-vector.
4. Estimate the camera pose.
Steps 1-3 above complete the estimation of the camera transformation matrix P. The reprojection error of P can then be computed through the formula x = PX: the measured world coordinates of the reference points are projected onto the camera image using the estimated camera transformation matrix P, and the pixel distance between each reprojected pixel and the originally detected pixel is computed, giving a quantitative evaluation of the camera pose estimation accuracy. The formula x = PX also directly relates the two-dimensional image coordinates to coordinates in the three-dimensional world coordinate system.
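The reprojection-error check can be sketched as follows (illustrative names; the 0.05-pixel figure quoted in the text is the value this kind of check is reported to reach):

```python
import numpy as np

# Mean reprojection error: project each measured world point with the
# estimated P, dehomogenize, and measure the Euclidean pixel distance
# to the detected image coordinate.
def reprojection_error(P, world_pts, image_pts):
    errors = []
    for (X, Y, Z), (u, v) in zip(world_pts, image_pts):
        x = P @ np.array([X, Y, Z, 1.0])
        du, dv = x[0] / x[2] - u, x[1] / x[2] - v
        errors.append(np.hypot(du, dv))    # per-point pixel distance
    return float(np.mean(errors))
```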
Claims (1)
1. A wide-baseline near-infrared camera pose estimation method, characterized by comprising the following steps:
Step 1: establishing a world coordinate system for the UAV autonomous landing runway, the near-infrared cameras being placed symmetrically on the two sides of the landing runway; choosing the midpoint of the camera baseline as the world origin, with the X axis along the runway centerline, the Y axis perpendicular to the runway direction, and the Z axis pointing vertically upward, the coordinate system following the right-hand rule;
to estimate the transformation matrix between three-dimensional world coordinates and two-dimensional camera image coordinates, several correspondences between 3-D points and image coordinates are needed, so 12 reference points are selected on the two sides of the runway:
reference points 1, 2: the direction from point 2 to point 1 is the positive Y axis; the midpoint of the line l_12 joining points 1 and 2 is the world origin; l_12 is 15 m long;
reference points 3, 4: the line l_34 joining points 3 and 4 is parallel to l_12; the perpendicular distance between l_12 and l_34 is 50 m; l_34 is 15 m long;
reference points 5, 6: the line l_56 joining points 5 and 6 is parallel to l_12; the perpendicular distance between l_12 and l_56 is 100 m; l_56 is 15 m long;
reference points 7, 8: the line l_78 joining points 7 and 8 is parallel to l_12; the perpendicular distance between l_12 and l_78 is 150 m; l_78 is 15 m long;
reference points 9, 10: the line l_910 joining points 9 and 10 is parallel to l_12; the perpendicular distance between l_12 and l_910 is 200 m; l_910 is 15 m long;
reference points 11, 12: the line l_1112 joining points 11 and 12 is parallel to l_12; the perpendicular distance between l_12 and l_1112 is 300 m; l_1112 is 15 m long;
some of the 12 reference points are raised above ground level: points 7 and 8 are elevated points at a height of 2.8 m, and points 11 and 12 are elevated points at a height of 2.6 m;
after the points have been selected, a total-station prism is placed at each reference point, the total station is set up at the world origin with its forward direction aligned with the runway centerline, and the coordinates of the 12 reference points in the total-station coordinate system are measured precisely with the total station;
Step 2: choosing near-infrared laser lamps as the cooperative marker lights for the near-infrared cameras in the landing navigation system, and mounting optical filters on the camera lenses; after each cooperative marker light is switched on, taking pictures with the cameras in the landing navigation system, and obtaining the image coordinates of the reference points by combining manual point selection with spot detection;
Step 3: assuming correspondences between coordinate points X in the three-dimensional world coordinate system and camera image coordinate points x, the camera matrix P can be determined; for each correspondence X <-> x, with x = (x, y, w)^T in homogeneous coordinates, the following relation is derived:

    [  0^T       -w·X^T    y·X^T ] [ P^1 ]
    [  w·X^T      0^T     -x·X^T ] [ P^2 ]  =  0
                                   [ P^3 ]

where P^{iT} is the i-th row of the matrix P, a 4-vector;
Step 4: using the camera matrix P computed in the first three steps, relating the two-dimensional image coordinates to the coordinates in the three-dimensional world coordinate system through the formula x = PX.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201510750446.0A CN105424059B (en) | 2015-11-06 | 2015-11-06 | Wide baseline near infrared camera position and orientation estimation method |
Publications (2)
Publication Number | Publication Date |
---|---|
CN105424059A CN105424059A (en) | 2016-03-23 |
CN105424059B true CN105424059B (en) | 2018-10-16 |
Family
ID=55502430
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201510750446.0A Expired - Fee Related CN105424059B (en) | 2015-11-06 | 2015-11-06 | Wide baseline near infrared camera position and orientation estimation method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN105424059B (en) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105890590A (en) * | 2016-04-12 | 2016-08-24 | 西北工业大学 | UAV (unmanned aerial vehicle) remote optical landing guidance system based on infrared laser lamps and multi-camera array |
CN106228534B (en) * | 2016-07-08 | 2019-05-17 | 众趣(北京)科技有限公司 | Relationship scaling method between a kind of shaft and camera based on constrained global optimization |
CN106444792A (en) * | 2016-09-18 | 2017-02-22 | 中国空气动力研究与发展中心高速空气动力研究所 | Infrared visual recognition-based unmanned aerial vehicle landing positioning system and method |
CN110764117B (en) * | 2019-10-31 | 2022-10-11 | 成都圭目机器人有限公司 | Method for calibrating relative position of detection robot antenna and sensor based on total station |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103940364A (en) * | 2014-05-04 | 2014-07-23 | 赵鸣 | Subway tunnel relative deformation photogrammetry method |
CN104200086A (en) * | 2014-08-25 | 2014-12-10 | 西北工业大学 | Wide-baseline visible light camera pose estimation method |
CN104215239A (en) * | 2014-08-29 | 2014-12-17 | 西北工业大学 | Vision-based autonomous unmanned plane landing guidance device and method |
CN104637053A (en) * | 2015-01-29 | 2015-05-20 | 西北工业大学 | Method for calibrating wide baseline multi-array camera system |
- 2015-11-06 CN CN201510750446.0A patent/CN105424059B/en not_active Expired - Fee Related
Non-Patent Citations (3)
Title |
---|
"A calibration methods for Vision measuring system with large view field"; Feng Guanming; 2011 4th International Congress on Image and Signal Processing; 2011-12-31; pages 1377-1380 *
"Accurate Installation Method and Precision Analysis for Vision Measurement of Remote Falling Point"; Meilian Liu; International Conference on Information Science and Technology; 2011-03-28; pages 751-754 *
"Moving-target detection and tracking on a UAV platform and its vision-aided landing system" (in Chinese); Zhang Heng; China Doctoral Dissertations Full-text Database, Information Science and Technology; 2010-04-15; pages 119 and 136 *
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN104200086B (en) | Wide-baseline visible light camera pose estimation method | |
CN104197928B (en) | Multi-camera collaboration-based method for detecting, positioning and tracking unmanned aerial vehicle | |
CN106651953B (en) | A kind of vehicle position and orientation estimation method based on traffic sign | |
CN104215239B (en) | Guidance method using vision-based autonomous unmanned plane landing guidance device | |
CN111540048B (en) | Fine live-action three-dimensional modeling method based on space-ground fusion | |
CN102419178B (en) | Mobile robot positioning system and method based on infrared road sign | |
CN104864889B (en) | A kind of robot odometer correction system and method for view-based access control model | |
CN107121125B (en) | A kind of communication base station antenna pose automatic detection device and method | |
CN105424059B (en) | Wide baseline near infrared camera position and orientation estimation method | |
CN107316325A (en) | A kind of airborne laser point cloud based on image registration and Image registration fusion method | |
CN108088414A (en) | A kind of monocular distance measuring method | |
CN106774431A (en) | One kind mapping unmanned plane route planning method and device | |
CN105913410A (en) | Long-distance moving object height measurement apparatus and method based on machine vision | |
CN110009682B (en) | Target identification and positioning method based on monocular vision | |
CN109801302A (en) | A kind of ultra-high-tension power transmission line foreign matter detecting method based on binocular vision | |
CN108335337B (en) | method and device for generating orthoimage picture | |
CN106019264A (en) | Binocular vision based UAV (Unmanned Aerial Vehicle) danger vehicle distance identifying system and method | |
CN101532841A (en) | Method for navigating and positioning aerocraft based on landmark capturing and tracking | |
CN106408601A (en) | GPS-based binocular fusion positioning method and device | |
US20130113897A1 (en) | Process and arrangement for determining the position of a measuring point in geometrical space | |
CN107607091A (en) | A kind of method for measuring unmanned plane during flying flight path | |
CN108180888A (en) | A kind of distance detection method based on rotating pick-up head | |
Crispel et al. | All-sky photogrammetry techniques to georeference a cloud field | |
CN113340272B (en) | Ground target real-time positioning method based on micro-group of unmanned aerial vehicle | |
CN109035343A (en) | A kind of floor relative displacement measurement method based on monitoring camera |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | |
CF01 | Termination of patent right due to non-payment of annual fee | |
Granted publication date: 20181016 Termination date: 20191106 |