CN110006423B - Self-adaptive inertial navigation and visual combined navigation method - Google Patents
- Publication number: CN110006423B
- Application number: CN201910270894.9A
- Authority
- CN
- China
- Prior art keywords
- matrix
- covariance matrix
- navigation
- error
- state
- Prior art date
- Legal status: Active (an assumption, not a legal conclusion)
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C11/00—Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/10—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
- G01C21/12—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
- G01C21/16—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
- G01C21/165—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
Abstract
The invention discloses a self-adaptive inertial navigation and vision combined navigation method that designs an adaptive unscented Kalman filter to combine inertial navigation and vision on the basis of existing inertial/visual integrated navigation methods. The method considers the noise caused by camera vibration due to road-surface bumping while the vehicle runs: an adaptive factor reflecting the camera vibration noise level is designed, and the Kalman filter gain and error covariance matrix in the integrated navigation algorithm are adaptively adjusted according to the noise level, ensuring the reliability of the positioning result; the method is therefore a robust positioning method. Because errors caused by vibration noise require neither prior modeling compensation nor closed-loop trajectory correction, the method adapts well to the environment. On the basis of the 15-dimensional inertial error state, the filter state is augmented with the camera poses, realizing a tight coupling of inertial navigation and vision and yielding higher positioning accuracy.
Description
Technical Field
The invention belongs to the technical field of automatic control, and particularly relates to a self-adaptive inertial navigation and visual combined navigation method.
Background
Compared with traditional vehicles, the unmanned vehicle, an intelligent vehicle built from technologies such as environment perception, navigation and positioning, motion control, and path planning, has developed rapidly in recent years and attracted wide attention from society. Among these technologies, navigation and positioning is the key foundation for the normal operation of the whole system.
At present, the satellite/inertial integrated navigation systems widely used on unmanned vehicles can obtain accurate vehicle positioning in open environments. In satellite-signal-denied environments such as urban canyons, however, signal blockage and multipath effects prevent the satellite navigation system from effectively correcting the accumulated errors of inertial navigation; the positioning accuracy of the integrated system degrades rapidly, and positioning may fail altogether. A combination of inertial navigation and vision can therefore be adopted, using the complementary characteristics of the two sensors to realize navigation and positioning in satellite-denied environments. However, current mainstream methods do not consider the measurement noise that road bumpiness may introduce, which easily causes the positioning error of the navigation system to diverge.
Disclosure of Invention
In view of this, the present invention provides an adaptive inertial navigation and visual combined navigation method that ensures the reliability of the integrated navigation system's positioning result and offers good robustness.
A self-adaptive inertial navigation and vision combined navigation method comprises the following steps:
step 0, initializing a system nominal state matrix, a system error state matrix, and an initial error covariance matrix P_{0|0};
step 1, resolving the strapdown inertial navigation solution to obtain the system nominal state matrix, and meanwhile calculating a system transfer matrix F_d and a system process noise variance matrix Q_d according to the kinematic and dynamic relations;
step 2, using the system transfer matrix F_d and the system process noise variance matrix Q_d to carry out one-step prediction of the system error state;
step 3, performing feature extraction and feature matching on the visual images acquired by the camera, and removing mismatched points with the single-point RANSAC method to obtain feature matching pairs among three consecutive image frames;
step 4, calculating the 2n+1 sigma points, where n represents the system state dimension;
step 5, correcting the system nominal state with the 2n+1 sigma points and predicting the observation by combining the feature matching result;
step 6, updating the filter observation prediction covariance matrix of the current, i-th, navigation solution and the cross-covariance matrix of observation and state;
step 7, acquiring the actual covariance matrix P_{f,i} of the observation of the i-th navigation solution and the prediction covariance matrix P_{c,i}, wherein the actual covariance matrix P_{f,i} is obtained from the filter innovation Δ_i, and the prediction covariance matrix P_{c,i} is the filter observation prediction covariance matrix updated in step 6;
then the diagonals of the two matrices are extracted with the diagonal operator:
D_{f,i} = diag(P_{f,i}), D_{c,i} = diag(P_{c,i})
the corresponding diagonal elements of D_{f,i} and D_{c,i} are divided and averaged to obtain the adaptive factor α_i of the i-th navigation solution,
wherein α represents the dispersion degree of the sigma points and β describes the distribution information of the state quantity;
step 8, obtaining the mean m_α of the adaptive factor and judging:
when α_i ≤ m_α, computing the Kalman filter gain K_k and updating the error covariance matrix P_{k|k},
wherein P_{k|k-1} represents the error covariance matrix of the error state obtained in step 2;
when m_α < α_i ≤ m_α + σ_α, adjusting the error covariance matrix;
when m_α + σ_α < α_i ≤ m_α + σ_α², adjusting both the error covariance matrix and the Kalman filter gain;
when α_i > m_α + σ_α², rapidly adjusting the error covariance matrix;
and finally estimating the system error state;
step 9, using the system error state obtained in step 8 to correct the system nominal state and obtain the positioning result; steps 1 to 9 are repeated to obtain continuous vehicle navigation and positioning information.
Further, in step 0, the camera poses at the two most recent moments are added to the system error state matrix; in step 9, after the positioning result is obtained, the historical camera poses in the system error state matrix are updated with the camera poses at the two most recent moments, and the error covariance matrix is then updated; the method then returns to step 1 for the next positioning solution.
The invention has the following beneficial effects:
On the basis of existing inertial/visual integrated navigation methods, the invention designs an adaptive unscented Kalman filter to combine inertial navigation and vision. The method considers the noise caused by camera vibration due to road-surface bumping while the vehicle runs: an adaptive factor reflecting the camera vibration noise level is designed, and the Kalman filter gain and error covariance matrix in the integrated navigation algorithm are adaptively adjusted according to the noise level, ensuring the reliability of the positioning result; the method is therefore a robust positioning method. Because errors caused by vibration noise require neither prior modeling compensation nor closed-loop trajectory correction, the method also adapts well to the environment.
On the basis of the 15-dimensional inertial error state, the filter state is augmented with the camera poses, realizing the tight coupling of inertial navigation and vision and yielding higher positioning accuracy.
Drawings
FIG. 1 is a flow chart of a method for adaptive inertial navigation and visual integrated navigation according to the present invention.
Detailed Description
The invention is described in detail below by way of example with reference to the accompanying drawings.
Camera vibration noise in existing inertial/visual integrated navigation algorithms easily degrades the accuracy of the positioning result or even prevents positioning; to address this, an adaptive algorithm can be designed that adaptively adjusts the error covariance matrix and the Kalman filter gain of the filter. The invention provides an adaptive inertial navigation and visual combined navigation method; the specific process, shown in FIG. 1, is as follows:
Step 1, initialize the system nominal state matrix, the system error state matrix, and the initial error covariance matrix P_{0|0}. Meanwhile, calculate the weights of the 2n+1 sigma points,
wherein λ = α²(n + κ) − n; α determines the dispersion degree of the sigma points and usually takes a small positive value; κ is chosen as 3 − n; β describes the distribution information of the state quantities, with optimal value 2; and n is the system state dimension.
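By way of illustration, the weight calculation of step 1 may be sketched in Python; the function name and default parameters are illustrative assumptions, and the standard scaled unscented transform is assumed:

```python
import numpy as np

def ukf_weights(n, alpha=1e-3, beta=2.0, kappa=None):
    """Mean (wm) and covariance (wc) weights for the 2n+1 sigma points
    of the scaled unscented transform."""
    if kappa is None:
        kappa = 3.0 - n  # the text chooses kappa = 3 - n
    lam = alpha ** 2 * (n + kappa) - n  # lambda = alpha^2 (n + kappa) - n
    wm = np.full(2 * n + 1, 1.0 / (2.0 * (n + lam)))
    wc = wm.copy()
    wm[0] = lam / (n + lam)
    wc[0] = lam / (n + lam) + (1.0 - alpha ** 2 + beta)
    return wm, wc

# 27-dimensional state as used by the filter described later
wm, wc = ukf_weights(27)
```

Whatever the dimension, the mean weights sum to one, which gives a quick sanity check on any implementation.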
Step 2, resolve the strapdown inertial navigation solution to obtain the system nominal state matrix. Meanwhile, calculate the system transfer matrix F_d and the system process noise variance matrix Q_d according to the kinematic and dynamic relations.
Step 3, update the filter in time: use the system matrix F_d and the process noise variance matrix Q_d obtained in step 2 to carry out one-step prediction of the system error state, with the formula
P_{k|k-1} = F_d P_{k-1|k-1} F_d^T + Q_d
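A minimal sketch of this time update, under the assumption of a linear error-state propagation (the variable names are illustrative):

```python
import numpy as np

def predict_error_state(dx, P, Fd, Qd):
    """One-step prediction of the error state and its covariance:
    P_{k|k-1} = F_d P_{k-1|k-1} F_d^T + Q_d."""
    dx_pred = Fd @ dx
    P_pred = Fd @ P @ Fd.T + Qd
    return dx_pred, P_pred

# toy 2-state example with a constant-velocity-style transition
Fd = np.array([[1.0, 0.1],
               [0.0, 1.0]])
Qd = 0.01 * np.eye(2)
dx_pred, P_pred = predict_error_state(np.zeros(2), np.eye(2), Fd, Qd)
```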
Step 4, perform feature extraction and feature matching on the visual images acquired by the camera, and remove mismatched points with the single-point RANSAC method to obtain the feature matching pairs (p_1, p_2, p_3)_i among three consecutive image frames.
Step 5, combining the weights obtained in step 1 with the one-step error-state prediction obtained in step 3, calculate the 2n+1 sigma points by the unscented transform.
Step 6, correct the system nominal state with the 2n+1 sigma points obtained in step 5, and predict the observation by combining the feature matching result obtained in step 4,
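The sigma-point generation of step 5 can be sketched as follows; this is an illustrative implementation of the standard unscented transform, and the interleaved point ordering is a free choice of this sketch:

```python
import numpy as np

def sigma_points(x, P, lam):
    """2n+1 sigma points around mean x with covariance P:
    chi_0 = x, chi = x +/- the columns of sqrt((n + lam) P)."""
    n = x.size
    S = np.linalg.cholesky((n + lam) * P)  # matrix square root
    pts = [x]
    for j in range(n):
        pts.append(x + S[:, j])
        pts.append(x - S[:, j])
    return np.array(pts)

pts = sigma_points(np.zeros(3), np.eye(3), lam=0.0)
```

The points are symmetric about the mean, so their average recovers the state estimate exactly.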
wherein h(·) is the transition function from the system state to the observation, a nonlinear function representing the epipolar geometry constraint and the trifocal tensor constraint among the three consecutive image frames.
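The epipolar part of h(·) can be illustrated with a small sketch; the relative pose (R, t) and the normalized image points below are hypothetical, and the trifocal-tensor term is omitted for brevity:

```python
import numpy as np

def skew(t):
    """Skew-symmetric matrix [t]x such that [t]x v = t x v."""
    return np.array([[0.0, -t[2], t[1]],
                     [t[2], 0.0, -t[0]],
                     [-t[1], t[0], 0.0]])

def epipolar_residual(p1, p2, R, t):
    """Residual of the epipolar constraint p2^T E p1 = 0 with E = [t]x R,
    for normalized homogeneous image points p1, p2."""
    E = skew(t) @ R
    return float(p2 @ E @ p1)

# pure translation along the camera x-axis: a correctly matched point
# shifted only in x satisfies the constraint exactly
R = np.eye(3)
t = np.array([1.0, 0.0, 0.0])
res = epipolar_residual(np.array([0.0, 0.0, 1.0]),
                        np.array([-0.5, 0.0, 1.0]), R, t)
```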
Step 7, using the results of steps 5 and 6, update the filter observation prediction covariance matrix and the cross-covariance matrix of observation and state.
Step 8, using the covariance matrices obtained in step 7, calculate the Kalman filter gain K_k of the filter and update the error covariance matrix P_{k|k} to obtain the optimal estimate of the filter.
Step 9, using the system error state obtained in step 8, correct the system nominal state to obtain the positioning result. Steps 2 to 9 are repeated to obtain continuous vehicle navigation and positioning information.
The above process constitutes the conventional steps of existing unscented-Kalman-filter-based inertial/visual integrated navigation. On this basis, the invention designs an adaptive factor reflecting the camera vibration noise level and adjusts the Kalman filter gain and the error covariance matrix of the filter by different strategies to obtain an adaptive, robust navigation and positioning result. Specifically, step 8 is replaced by the following two steps:
Step 8(a), obtain the actual covariance matrix P_{f,i} of the observation and the prediction covariance matrix P_{c,i}, where the actual covariance matrix P_{f,i} is obtained from the filter innovation Δ_i and the prediction covariance matrix P_{c,i} is obtained in step 7.
Then the diagonals of the two matrices are extracted with the diagonal operator:
D_{f,i} = diag(P_{f,i}), D_{c,i} = diag(P_{c,i})
Further, the corresponding diagonal elements of D_{f,i} and D_{c,i} are divided and averaged to obtain the adaptive factor α_i.
In the ideal case the actual observation statistics match the prediction, so the mean m_α of the adaptive factor is known to be 1 and its variance σ_α² is 2-5.
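The adaptive factor of step 8(a) may be sketched as below. How P_{f,i} is formed from the innovation Δ_i is not spelled out in this text, so the windowed average of innovation outer products used here is an assumption:

```python
import numpy as np

def adaptive_factor(innovations, P_pred_obs):
    """alpha_i: mean ratio of the diagonals of the actual innovation
    covariance (windowed estimate) and the predicted observation covariance."""
    P_fact = innovations.T @ innovations / innovations.shape[0]
    D_f = np.diag(P_fact)          # D_{f,i} = diag(P_{f,i})
    D_c = np.diag(P_pred_obs)      # D_{c,i} = diag(P_{c,i})
    return float(np.mean(D_f / D_c))

# when the innovations match the predicted statistics, alpha_i is close to 1
rng = np.random.default_rng(0)
innov = rng.standard_normal((20000, 2))
alpha_i = adaptive_factor(innov, np.eye(2))
```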
Step 8(b), according to the camera's vibration level, use the adaptive factor obtained in step 8(a) to adaptively adjust the Kalman filter gain K_k and the error covariance matrix of the filter. Because both the Kalman filter gain and the error covariance matrix influence the filter estimate, and their influence is significant, the adaptive adjustment strategy is as follows: under slight vibration, only the error covariance matrix is adjusted; under moderate vibration, the Kalman filter gain is reduced to prevent error divergence and the error covariance matrix is adjusted; under severe vibration, the observation is unreliable and adjusting the Kalman filter gain would strongly distort the result, so only the error covariance matrix is adjusted, with a fast adjustment coefficient β_i introduced to adjust it rapidly; β_i ranges over 1.5-3 according to tuning experience. With these adjustments the optimal estimate of the filter is obtained, as follows.
Normal case, α_i ≤ m_α: no adjustment is made.
Slight vibration, m_α < α_i ≤ m_α + σ_α: only the error covariance matrix is adjusted.
Moderate vibration, m_α + σ_α < α_i ≤ m_α + σ_α²: both the error covariance matrix and the Kalman filter gain are adjusted.
Severe vibration, α_i > m_α + σ_α²: only the error covariance matrix is adjusted, with the fast adjustment coefficient β_i enabling rapid adjustment.
Finally, the system error state is estimated.
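The four-branch strategy above can be condensed into a sketch. The thresholds follow the text (m_α = 1, σ_α² in 2-5, β_i in 1.5-3), but the exact scaling applied inside each branch is an assumption of this sketch, since the corresponding formulas are not reproduced here:

```python
import numpy as np

def adapt(alpha_i, K, P, m_a=1.0, var_a=3.0, beta_i=2.0):
    """Adaptive adjustment of Kalman gain K and error covariance P."""
    sig = np.sqrt(var_a)
    if alpha_i <= m_a:                # normal: no adjustment
        return K, P
    if alpha_i <= m_a + sig:          # slight vibration: inflate P only
        return K, alpha_i * P
    if alpha_i <= m_a + var_a:        # moderate: shrink K and inflate P
        return K / alpha_i, alpha_i * P
    return K, beta_i * alpha_i * P    # severe: fast inflation via beta_i

K1, P1 = adapt(0.5, np.ones((2, 2)), np.eye(2))    # normal case: unchanged
K2, P2 = adapt(10.0, np.ones((2, 2)), np.eye(2))   # severe case: P inflated
```

Inflating P while leaving K untouched in the severe branch reflects the text's reasoning that the gain should not be driven by unreliable observations.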
particularly, when a filter is initialized, the camera poses at the two latest moments are added on the basis of the inertial navigation 15-dimensional error quantity to form a 27-dimensional filter state quantity, so that the close combination of inertial navigation and vision is realized. And simultaneously, updating the system error state and the error covariance matrix when the single solution is finished in the step 9 so as to ensure that the filter only keeps the camera poses at the latest two moments in the cyclic iteration process.
A simulation experiment was carried out with the KITTI dataset sequence "2011_10_03_drive_0027". The vehicle trajectory in the selected data is 540 meters long, the total elapsed time is 78 seconds, and the average speed is about 25 kilometers per hour. The sequence contains typical scenes: one severe vibration, one long period of moderate vibration, and several slight vibrations. The navigation and positioning results of the existing method and of the proposed adaptive inertial navigation and visual combined navigation method are as follows:
algorithm | Root mean square error | End point error |
The invention | 2.352 Rice | 1.513 m |
Existing methods | 4.002 m | 6.448 Rice |
According to the simulation results, compared with the existing method, the proposed adaptive inertial navigation and visual combined navigation method improves the navigation positioning accuracy to a root-mean-square error of 2.352 meters. The comparison of end-point errors also shows that the proposed method does not exhibit error divergence.
In summary, the above description is only a preferred embodiment of the present invention, and is not intended to limit the scope of the present invention. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.
Claims (2)
1. A self-adaptive inertial navigation and vision combined navigation method comprises the following steps:
step 0, initializing a system nominal state matrix, a system error state matrix, and an initial error covariance matrix P_{0|0};
step 1, resolving the strapdown inertial navigation solution to obtain the system nominal state matrix, and meanwhile calculating a system transfer matrix F_d and a system process noise variance matrix Q_d according to the kinematic and dynamic relations;
step 2, using the system transfer matrix F_d and the system process noise variance matrix Q_d to carry out one-step prediction of the system error state;
step 3, performing feature extraction and feature matching on the visual images acquired by the camera, and removing mismatched points with the single-point RANSAC method to obtain feature matching pairs among three consecutive image frames;
step 4, calculating the 2n+1 sigma points, where n represents the system state dimension;
step 5, correcting the system nominal state matrix with the 2n+1 sigma points and predicting the observation by combining the feature matching result;
step 6, updating the filter observation prediction covariance matrix of the current, i-th, navigation solution and the cross-covariance matrix of observation and state;
characterized by further comprising the following steps:
step 7, acquiring the actual covariance matrix P_{f,i} of the filter observation of the i-th navigation solution and the prediction covariance matrix P_{c,i}, wherein the actual covariance matrix P_{f,i} is obtained from the filter innovation Δ_i, and the prediction covariance matrix P_{c,i} is the filter observation prediction covariance matrix updated in step 6;
then extracting the diagonals of the actual covariance matrix P_{f,i} and the prediction covariance matrix P_{c,i} with the diagonal operator:
D_{f,i} = diag(P_{f,i}), D_{c,i} = diag(P_{c,i})
dividing the corresponding diagonal elements of D_{f,i} and D_{c,i} and taking the mean to obtain the adaptive factor α_i of the i-th navigation solution,
wherein α represents the dispersion degree of the sigma points and β describes the distribution information of the state quantity;
step 8, obtaining the mean m_α of the adaptive factor and judging:
when α_i ≤ m_α, computing the Kalman filter gain K_k and updating the error covariance matrix P_{k|k},
wherein P_{k|k-1} represents the error covariance matrix of the error state obtained in step 2;
when m_α < α_i ≤ m_α + σ_α, adjusting the error covariance matrix,
wherein σ_α² represents the variance of the adaptive factor;
when m_α + σ_α < α_i ≤ m_α + σ_α², adjusting both the error covariance matrix and the Kalman filter gain;
when α_i > m_α + σ_α², rapidly adjusting the error covariance matrix;
and finally estimating the system error state.
2. The adaptive inertial navigation and visual combined navigation method according to claim 1, wherein in step 0 the camera poses at the two most recent moments are added to the system error state matrix; in step 9, after the positioning result is obtained, the historical camera poses in the system error state matrix are updated with the camera poses at the two most recent moments, and the error covariance matrix is then updated; the method then returns to step 1 for the next positioning solution.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title
---|---|---|---
CN201910270894.9A | 2019-04-04 | 2019-04-04 | Self-adaptive inertial navigation and visual combined navigation method
Publications (2)
Publication Number | Publication Date
---|---
CN110006423A | 2019-07-12
CN110006423B | 2020-11-06
Legal Events
Code | Title
---|---
PB01 | Publication
SE01 | Entry into force of request for substantive examination
GR01 | Patent grant