CN103606007A - Target identification method and apparatus based on Internet of Things - Google Patents

Target identification method and apparatus based on Internet of Things

Info

Publication number
CN103606007A
CN103606007A
Authority
CN
China
Prior art keywords
internet
things
neural network
target
target identification
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201310590706.3A
Other languages
Chinese (zh)
Other versions
CN103606007B (en)
Inventor
李炯城
丁胜培
李桂愉
黄海艺
肖恒辉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Planning and Designing Institute of Telecommunications Co Ltd
Original Assignee
Guangdong Planning and Designing Institute of Telecommunications Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Planning and Designing Institute of Telecommunications Co Ltd filed Critical Guangdong Planning and Designing Institute of Telecommunications Co Ltd
Priority to CN201310590706.3A priority Critical patent/CN103606007B/en
Publication of CN103606007A publication Critical patent/CN103606007A/en
Application granted granted Critical
Publication of CN103606007B publication Critical patent/CN103606007B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Image Analysis (AREA)

Abstract

The invention discloses a target identification method and apparatus based on the Internet of Things. The target identification method comprises: obtaining the measured values that each sensor in the Internet of Things acquires for a predetermined target; training a BP neural network by taking the obtained measured values as its input, wherein the hyperbolic tangent transfer function of the BP neural network is approximated by a polynomial; and performing pattern recognition on the predetermined target according to the trained BP neural network. According to the invention, polynomial approximation is introduced into the Internet of Things field: by combining it with the Gauss bracket, a polynomial approximation of the transfer function is obtained that preserves computational accuracy while lowering computational complexity, so that the neural network model can be realized rapidly in the resource-constrained environment of the sensors. Real-time performance is thereby substantially improved, and target identification based on the Internet of Things is better realized.

Description

Target identification method and apparatus based on the Internet of Things
Technical field
The present invention relates to the field of Internet of Things technology, and in particular to a target identification method based on the Internet of Things and a target identification apparatus based on the Internet of Things.
Background technology
In a practical Internet of Things system, various feature quantities are monitored and the resulting information is fused to obtain a consistent interpretation of the target; all of this depends on the sensor system. Because of the inherent learning ability and adaptability of neural networks, the BP neural network is widely used in practical engineering, in fields such as weather forecasting, image processing, automatic control, combinatorial optimization and video segmentation. However, the BP neural network requires hardware with strong computing power, which prevents sensors with weak computing hardware from using it. Moreover, when the dimension of the input feature vector is large, the neural network not only becomes structurally complex, but its training time is greatly prolonged and its real-time performance deteriorates. Thus, in the neural network of a multi-sensor fusion Internet of Things platform, the computation of the transfer function has high complexity, which greatly limits the application prospects of multi-sensor fusion Internet of Things in target identification.
In the field of target identification in the Internet of Things, both the traditional classic BP neural network and improved BP neural networks require hardware with strong computing power. The sensors in the Internet of Things, however, constitute a computation-resource-constrained environment: a sensor node has no CPU and therefore cannot call the hyperbolic tangent function of the programming-language library, and hardware this weak cannot bear the related computation, which severely restricts the target recognition capability of the Internet of Things.
Summary of the invention
Based on this, the present invention provides a target identification method based on the Internet of Things and a target identification apparatus based on the Internet of Things.
A target identification method based on the Internet of Things comprises the following steps:
obtaining the measured values that each sensor in an Internet of Things system acquires for an intended target;
taking the obtained measured values as the input of a BP neural network and training the BP neural network, wherein the hyperbolic tangent (tanh) transfer function of the BP neural network is approximated by a polynomial;
performing pattern recognition on the intended target according to the trained BP neural network.
Compared with the general art, the target identification method based on the Internet of Things of the present invention introduces polynomial approximation into the Internet of Things field. By combining the Gauss bracket, a polynomial approximation of the transfer function is derived that guarantees computational accuracy while reducing computational complexity, so that the neural network model can be realized quickly in the resource-constrained environment of the sensors. Real-time performance is thereby greatly improved, and target identification based on the Internet of Things is better realized.
A target identification apparatus based on the Internet of Things comprises an acquisition module, a training module and an identification module;
the acquisition module is configured to obtain the measured values that each sensor in an Internet of Things system acquires for an intended target;
the training module is configured to take the obtained measured values as the input of a BP neural network and to train the BP neural network, wherein the tanh transfer function of the BP neural network is approximated by a polynomial;
the identification module is configured to perform pattern recognition on the intended target according to the trained BP neural network.
Compared with the general art, the target identification apparatus based on the Internet of Things of the present invention introduces polynomial approximation into the Internet of Things field. By combining the Gauss bracket, a polynomial approximation of the transfer function is derived that guarantees computational accuracy while reducing computational complexity, so that the neural network model can be realized quickly in the resource-constrained environment of the sensors. Real-time performance is thereby greatly improved, and target identification based on the Internet of Things is better realized.
Brief description of the drawings
Fig. 1 is a schematic flowchart of the target identification method based on the Internet of Things of the present invention;
Fig. 2 is a schematic flowchart of a preferred embodiment for realizing the target identification method based on the Internet of Things of the present invention;
Fig. 3 is a schematic structural diagram of the target identification apparatus based on the Internet of Things of the present invention.
Embodiment
To further set forth the technical means adopted by the present invention and the effects obtained, the technical solution of the present invention is described clearly and completely below in conjunction with the accompanying drawings and preferred embodiments.
Refer to Fig. 1, a schematic flowchart of the target identification method based on the Internet of Things of the present invention.
The target identification method based on the Internet of Things of the present invention comprises the following steps:
S101: obtaining the measured values that each sensor in the Internet of Things system acquires for the intended target;
S102: taking the obtained measured values as the input of a BP neural network and training the BP neural network, wherein the tanh transfer function of the BP neural network is approximated by a polynomial;
S103: performing pattern recognition on the intended target according to the trained BP neural network.
In step S101, the measured values that each sensor in the Internet of Things system acquires for the intended target are obtained. In one embodiment, the measured values include temperature, humidity, pressure and flow.
The measured values may include any of the various measurements that the sensors in the Internet of Things system acquire for the intended target; the wider the range of measurement types, the higher the later accuracy of target identification.
In a practical Internet of Things system, various feature quantities (such as temperature, humidity, pressure and flow) are monitored, and this information is fused to obtain a consistent interpretation and description of the target; all of this depends on the sensor system.
In step S102, the transfer function originally used in the neural network of the multi-sensor fusion Internet of Things platform is:
$$\tanh(x)=\frac{e^{x}-e^{-x}}{e^{x}+e^{-x}}=\frac{e^{x}-\frac{1}{e^{x}}}{e^{x}+\frac{1}{e^{x}}}$$
When x = 9, 1 − tanh(9) = 3.05 × 10^-8, whereas under float (floating-point) precision the smallest interval a computer can represent is 1.1921 × 10^-7; this can be confirmed by typing eps('single') in MATLAB. Moreover, since tanh(x) is an odd function, i.e. tanh(−x) = −tanh(x), it is only necessary to approximate e^x for x ∈ [0, 9]. From the characteristics of the exponential function e^x it is known that:
$$e^{x}=\begin{cases}e^{\lfloor x\rfloor}\,e^{\,x-\lfloor x\rfloor}, & x-\lfloor x\rfloor\le 0.5\\[2pt] e^{\lceil x\rceil}\,e^{\,x-\lceil x\rceil}, & x-\lfloor x\rfloor> 0.5\end{cases}\qquad(1)$$

where ⌊x⌋ is the Gauss bracket, denoting the largest integer not greater than x, and ⌈x⌉ denotes the smallest integer not less than x.
If e^1, e^2, ..., e^9 are computed in advance and stored in a register array as constants, denoted a = {a_1, ..., a_9}, then according to (1) the computation of e^x for x ∈ [0, 9] only requires computing e^x for x ∈ [-0.5, 0.5]. A Taylor polynomial expansion is adopted as the approximate solution. The farther x lies from the Taylor expansion point, the larger the truncation error, so restricting the variable range to |x| ≤ 0.5 by the above technique goes a long way toward reducing the error. As the order increases, the error upper bound decreases rapidly, and the actual error decreases even faster; for example, when n = 7 the upper bound of r_n has dropped to 1.1 × 10^-7, less than the computer's minimum float interval.
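As a worked illustration of the reduction in (1) (an added example, not part of the original description): for x = 3.7 the fractional part exceeds 0.5, so the ceiling is used and the stored constant a_4 = e^4 is looked up,

$$e^{3.7}=e^{\lceil 3.7\rceil}\,e^{\,3.7-\lceil 3.7\rceil}=a_{4}\,e^{-0.3},\qquad 3.7-\lceil 3.7\rceil=-0.3\in[-0.5,0.5],$$

leaving only e^{-0.3} to be evaluated with the Taylor polynomial.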
A similar analysis shows that when the independent variable lies in [-0.5, 0], i.e. equivalently for x ∈ [0, 0.5], e^{-x} can be expanded as the following Taylor polynomial:
$$e^{-x}=1-x+\frac{x^{2}}{2!}-\frac{x^{3}}{3!}+\cdots\qquad(0\le x\le 0.5)$$
The truncation error is therefore
$$r_{n}=(-1)^{n+1}\frac{x^{n+1}}{(n+1)!}+(-1)^{n+2}\frac{x^{n+2}}{(n+2)!}+\cdots$$
For brevity, define the following symbol:
$$u_{m}\triangleq\frac{x^{m}}{m!},$$
from which one obtains:
$$\frac{u_{m+1}}{u_{m}}=\frac{x}{m+1}.$$
For any fixed x, we have:
$$\lim_{m\to\infty}\frac{u_{m+1}}{u_{m}}=0.$$
By d'Alembert's ratio test, the series converges. In addition, the monotonicity of {u_m} is obvious, so r_n is the sum of an alternating series whose general term decreases monotonically to zero. From the properties of alternating series it is known that:
$$|r_{n}|\le\frac{x^{n+1}}{(n+1)!}\le\frac{1}{2^{\,n+1}\,(n+1)!}$$
It can be seen that e^x has a lower error upper bound when the independent variable lies in [-0.5, 0] than in [0, 0.5]. From the above, when n = 7 the upper bound of r_n drops to 0.97 × 10^-7. Therefore, in the present invention, it suffices to take n = 7 to reach the required precision.
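For reference, the stated bound follows by direct arithmetic (an added verification, not part of the original text): with n = 7 and x ≤ 0.5,

$$|r_{7}|\le\frac{(1/2)^{8}}{8!}=\frac{1}{256\times 40320}=\frac{1}{10\,321\,920}\approx 0.97\times 10^{-7}.$$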
Now define the following functions:
[formula image: definition of G(x), the approximation of e^x on [0, 9] built from the stored constants a_⌊x⌋, a_⌈x⌉ and the degree-7 Taylor polynomial on [-0.5, 0.5]]

$$f(x)\triangleq\begin{cases}G(x), & x\ge 0\\ -G(-x), & x<0\end{cases}$$
where a_⌊x⌋ and a_⌈x⌉ can be read from the register array stored above. From the definition of the above functions, we have:
$$e^{0.5}-\lim_{x\to 0.5^{-}}f(x)=1.03\times 10^{-7}$$
$$e^{0.5}-\lim_{x\to 0.5^{+}}f(x)=2.50\times 10^{-7}$$
The maximum error appears at x = 8.5 and compares with the exact value as follows:

$$\lim_{x\to 8.5^{-}}f(x)=4914.768534,\qquad \lim_{x\to 8.5^{+}}f(x)=4914.768097,\qquad e^{8.5}=4914.768840.$$
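(As an added cross-check: e^{8.5} = e^{9} · e^{-0.5} ≈ 8103.0839 × 0.6065307 ≈ 4914.7688, which agrees with the exact value quoted above.)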
It can be seen that even at the point of maximum error there are seven significant digits, which is already the largest number of significant digits a computer can represent under float precision. Therefore, for the transfer function tanh(x) of the BP neural network one may use
$$F(x)\triangleq\begin{cases}\dfrac{f(x)-\frac{1}{f(x)}}{f(x)+\frac{1}{f(x)}}, & x\in[-9,9]\\[6pt] 1, & x\in(-\infty,-9]\cup[9,+\infty)\end{cases}$$
The maximum error of this approximation appears at x = 0.5 and is not more than 1.2 × 10^-7.
In one embodiment, the polynomial is as follows:
$$F(x)\triangleq\begin{cases}\dfrac{f(x)-\frac{1}{f(x)}}{f(x)+\frac{1}{f(x)}}, & x\in[-9,9]\\[6pt] 1, & x\in(-\infty,-9]\cup[9,+\infty)\end{cases}$$

wherein

[formula image: definition of G(x)]

$$f(x)\triangleq\begin{cases}G(x), & x\ge 0\\ -G(-x), & x<0\end{cases}$$

wherein ⌊x⌋ is the Gauss bracket, denoting the largest integer not greater than x, ⌈x⌉ denotes the smallest integer not less than x, and a_⌊x⌋ and a_⌈x⌉ can be read from the stored register array.
Approximating the tanh transfer function of the neural network with a polynomial not only guarantees the approximation accuracy but also greatly reduces the computational complexity: the computation involves only polynomial additions and multiplications, so the Internet of Things platform gains better real-time performance.
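The following C sketch illustrates one possible reading of the scheme described above; it is the editor's illustration, not the patented implementation, and all identifiers (A, exp_taylor7, exp_approx, tanh_approx) are invented for the example. It assumes float precision, the stored constants a_k = e^k for k = 0..9, the degree-7 Taylor polynomial on [-0.5, 0.5], and clamping to ±1 for |x| ≥ 9.

/*
 * Minimal sketch of the polynomial tanh approximation described above.
 * Editor's illustration only, not the patented implementation.
 */
#include <math.h>
#include <stdio.h>

/* "Register array" of pre-computed constants a[k] = e^k, k = 0..9. */
static const float A[10] = {
    1.0f,       2.7182817f, 7.389056f,  20.085537f, 54.59815f,
    148.41316f, 403.4288f,  1096.6332f, 2980.958f,  8103.084f
};

/* Degree-7 Taylor polynomial of e^t for |t| <= 0.5 (Horner form:
   only additions and multiplications, no library calls). */
static float exp_taylor7(float t)
{
    return 1.0f + t * (1.0f + t * (1.0f / 2 + t * (1.0f / 6 + t * (1.0f / 24
         + t * (1.0f / 120 + t * (1.0f / 720 + t * (1.0f / 5040)))))));
}

/* Approximate e^x for x in [0, 9): reduce x to the nearest stored
   integer (floor or ceiling) so the residual lies in [-0.5, 0.5]. */
static float exp_approx(float x)
{
    int k = (int)x;                 /* Gauss bracket: floor(x) for x >= 0 */
    if (x - (float)k > 0.5f)
        k += 1;                     /* use the ceiling instead */
    return A[k] * exp_taylor7(x - (float)k);
}

/* Polynomial approximation F(x) of tanh(x). */
float tanh_approx(float x)
{
    float s  = (x < 0.0f) ? -1.0f : 1.0f;
    float ax = s * x;
    if (ax >= 9.0f)
        return s;                   /* 1 - tanh(9) is already below float eps */
    float e = exp_approx(ax);       /* e is roughly exp(|x|) */
    return s * (e - 1.0f / e) / (e + 1.0f / e);
}

int main(void)
{
    /* Spot check against the library tanh. */
    for (float x = -9.5f; x <= 9.5f; x += 1.9f)
        printf("x=%6.2f  approx=%10.7f  libm=%10.7f\n",
               x, tanh_approx(x), tanhf(x));
    return 0;
}

Apart from one division for 1/e, the evaluation in this sketch uses only table lookups, additions and multiplications, which is what makes such a transfer function feasible on a node that cannot call the library hyperbolic tangent.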
In step S103, as one embodiment, the intended target includes images and video.
Through the Internet of Things system, pattern recognition can be carried out on a variety of intended targets. An artificial neural network is formed by interconnecting a large number of basic neurons; it can perform distributed parallel processing and nonlinear transformation, has a strong ability to learn and generalize, is robust and parallel, and also guarantees a given precision. Applying the BP neural network on the platform has the following advantages: 1. the input-output relation can remain a nonlinear monotone relation; 2. the function is very smooth, which suits gradient-based solving; 3. the neural network converges well. Therefore, the target identification method based on the Internet of Things of the present invention is especially suitable for pattern recognition of images and video, and is easy to popularize.
Taking the BP algorithm as an example, the method of the present invention is applied for the approximation. The experimental procedure of multi-sensor fusion Internet of Things based on the polynomially approximated BP neural network is thus (a code sketch of this pipeline follows the list):
Use multiple sensors to measure various feature quantities such as vibration, temperature, humidity, pressure and flow, and take the measured quantities as the input;
Use the polynomially approximated BP neural network to train, learn and generalize, and obtain the relevant parameters;
Use the obtained parameters to perform pattern recognition or classification on the target.
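A short C sketch of the last step only (classification of one fused measurement vector) is given below. It is an illustrative example, not taken from the patent: the network size, the five feature names and all weight values are made up, and the weights are assumed to have already been obtained by the training step of the list above. For brevity the library tanhf() is called; on an actual sensor node it would be replaced by the polynomial approximation F(x) sketched earlier.

/*
 * Illustrative on-node classification step (editor's example, not from
 * the patent). Weights are hypothetical placeholders assumed to come
 * from the preceding training step; on the sensor, tanhf() would be
 * replaced by the polynomial transfer function F(x).
 */
#include <math.h>
#include <stdio.h>

#define N_IN  5   /* vibration, temperature, humidity, pressure, flow */
#define N_HID 4
#define N_OUT 2   /* e.g. target class A vs. class B */

static const float W1[N_HID][N_IN + 1] = {   /* last column = bias */
    { 0.4f, -0.7f,  0.2f,  0.1f, -0.3f,  0.05f },
    {-0.2f,  0.5f, -0.6f,  0.3f,  0.1f, -0.10f },
    { 0.1f,  0.2f,  0.4f, -0.5f,  0.6f,  0.20f },
    {-0.3f, -0.1f,  0.3f,  0.2f, -0.4f,  0.00f }
};
static const float W2[N_OUT][N_HID + 1] = {
    { 0.6f, -0.4f,  0.5f,  0.2f, -0.1f },
    {-0.5f,  0.3f, -0.2f,  0.4f,  0.1f }
};

/* Forward pass of the trained BP network on one fused measurement vector. */
static void classify(const float in[N_IN], float out[N_OUT])
{
    float hid[N_HID];
    for (int j = 0; j < N_HID; ++j) {
        float s = W1[j][N_IN];                     /* bias term */
        for (int i = 0; i < N_IN; ++i)
            s += W1[j][i] * in[i];
        hid[j] = tanhf(s);                         /* transfer function */
    }
    for (int k = 0; k < N_OUT; ++k) {
        float s = W2[k][N_HID];
        for (int j = 0; j < N_HID; ++j)
            s += W2[k][j] * hid[j];
        out[k] = tanhf(s);
    }
}

int main(void)
{
    /* Normalised readings from the node's sensors (made-up values). */
    const float reading[N_IN] = { 0.12f, 0.55f, 0.40f, 0.31f, 0.08f };
    float score[N_OUT];
    classify(reading, score);
    printf("scores: %f %f -> class %d\n",
           score[0], score[1], score[0] > score[1] ? 0 : 1);
    return 0;
}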
Refer to Fig. 2, a schematic flowchart of a preferred embodiment for realizing the target identification method based on the Internet of Things of the present invention.
Compared with the general art, the target identification method based on the Internet of Things of the present invention introduces polynomial approximation into the Internet of Things field. By combining the Gauss bracket, a polynomial approximation of the transfer function is derived that guarantees computational accuracy while reducing computational complexity, so that the neural network model can be realized quickly in the resource-constrained environment of the sensors. Real-time performance is thereby greatly improved, and target identification based on the Internet of Things is better realized.
Refer to Fig. 3, a schematic structural diagram of the target identification apparatus based on the Internet of Things of the present invention.
The target identification apparatus based on the Internet of Things of the present invention comprises an acquisition module 301, a training module 302 and an identification module 303.
The acquisition module 301 is configured to obtain the measured values that each sensor in the Internet of Things system acquires for the intended target;
The training module 302 is configured to take the obtained measured values as the input of a BP neural network and to train the BP neural network, wherein the tanh transfer function of the BP neural network is approximated by a polynomial;
The identification module 303 is configured to perform pattern recognition on the intended target according to the trained BP neural network.
In one embodiment, the measured values include temperature, humidity, pressure and flow.
The measured values may include any of the various measurements that the sensors in the Internet of Things system acquire for the intended target; the wider the range of measurement types, the higher the later accuracy of target identification.
In one embodiment, the polynomial is as follows:
$$F(x)\triangleq\begin{cases}\dfrac{f(x)-\frac{1}{f(x)}}{f(x)+\frac{1}{f(x)}}, & x\in[-9,9]\\[6pt] 1, & x\in(-\infty,-9]\cup[9,+\infty)\end{cases}$$

wherein

[formula image: definition of G(x)]

$$f(x)\triangleq\begin{cases}G(x), & x\ge 0\\ -G(-x), & x<0\end{cases}$$

wherein ⌊x⌋ is the Gauss bracket, denoting the largest integer not greater than x, and ⌈x⌉ denotes the smallest integer not less than x.
Approximating the tanh transfer function of the neural network with a polynomial not only guarantees the approximation accuracy but also greatly reduces the computational complexity: the computation involves only polynomial additions and multiplications, so the Internet of Things platform gains better real-time performance.
In one embodiment, the intended target includes images and video.
Through the Internet of Things system, pattern recognition can be carried out on a variety of intended targets. An artificial neural network is formed by interconnecting a large number of basic neurons; it can perform distributed parallel processing and nonlinear transformation, has a strong ability to learn and generalize, is robust and parallel, and also guarantees a given precision. Applying the BP neural network on the platform has the following advantages: 1. the input-output relation can remain a nonlinear monotone relation; 2. the function is very smooth, which suits gradient-based solving; 3. the neural network converges well. Therefore, the target identification method based on the Internet of Things of the present invention is especially suitable for pattern recognition of images and video, and is easy to popularize.
Compared with the general art, the target identification apparatus based on the Internet of Things of the present invention introduces polynomial approximation into the Internet of Things field. By combining the Gauss bracket, a polynomial approximation of the transfer function is derived that guarantees computational accuracy while reducing computational complexity, so that the neural network model can be realized quickly in the resource-constrained environment of the sensors. Real-time performance is thereby greatly improved, and target identification based on the Internet of Things is better realized.
The above embodiments express only several implementations of the present invention, and their description is relatively specific and detailed, but they should not therefore be construed as limiting the scope of the patent claims. It should be pointed out that a person of ordinary skill in the art can make several variations and improvements without departing from the concept of the invention, and these all fall within the protection scope of the present invention. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (8)

1. A target identification method based on the Internet of Things, characterized by comprising the following steps:
obtaining the measured values that each sensor in an Internet of Things system acquires for an intended target;
taking the obtained measured values as the input of a BP neural network and training the BP neural network, wherein the tanh transfer function of the BP neural network is approximated by a polynomial;
performing pattern recognition on the intended target according to the trained BP neural network.
2. The target identification method based on the Internet of Things according to claim 1, characterized in that, in the step of obtaining the measured values that each sensor in the Internet of Things system acquires for the intended target, the measured values comprise temperature, humidity, pressure and flow.
3. The target identification method based on the Internet of Things according to claim 1, characterized in that, in the step in which the tanh transfer function of the BP neural network is approximated by a polynomial, the polynomial is as follows:
$$F(x)\triangleq\begin{cases}\dfrac{f(x)-\frac{1}{f(x)}}{f(x)+\frac{1}{f(x)}}, & x\in[-9,9]\\[6pt] 1, & x\in(-\infty,-9]\cup[9,+\infty)\end{cases}$$

wherein

[formula image: definition of G(x)]

$$f(x)\triangleq\begin{cases}G(x), & x\ge 0\\ -G(-x), & x<0\end{cases}$$

wherein ⌊x⌋ is the Gauss bracket, denoting the largest integer not greater than x, and ⌈x⌉ denotes the smallest integer not less than x.
4. The target identification method based on the Internet of Things according to claim 1, characterized in that, in the step of performing pattern recognition on the intended target, the intended target comprises images and video.
5. A target identification apparatus based on the Internet of Things, characterized by comprising an acquisition module, a training module and an identification module;
wherein the acquisition module is configured to obtain the measured values that each sensor in an Internet of Things system acquires for an intended target;
the training module is configured to take the obtained measured values as the input of a BP neural network and to train the BP neural network, wherein the tanh transfer function of the BP neural network is approximated by a polynomial;
and the identification module is configured to perform pattern recognition on the intended target according to the trained BP neural network.
6. The target identification apparatus based on the Internet of Things according to claim 5, characterized in that the measured values comprise temperature, humidity, pressure and flow.
7. The target identification apparatus based on the Internet of Things according to claim 5, characterized in that the polynomial is as follows:
$$F(x)\triangleq\begin{cases}\dfrac{f(x)-\frac{1}{f(x)}}{f(x)+\frac{1}{f(x)}}, & x\in[-9,9]\\[6pt] 1, & x\in(-\infty,-9]\cup[9,+\infty)\end{cases}$$

wherein

[formula image: definition of G(x)]

$$f(x)\triangleq\begin{cases}G(x), & x\ge 0\\ -G(-x), & x<0\end{cases}$$

wherein ⌊x⌋ is the Gauss bracket, denoting the largest integer not greater than x, and ⌈x⌉ denotes the smallest integer not less than x.
8. The target identification apparatus based on the Internet of Things according to claim 5, characterized in that the intended target comprises images and video.
CN201310590706.3A 2013-11-20 2013-11-20 Target identification method based on Internet of Things and device Active CN103606007B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310590706.3A CN103606007B (en) 2013-11-20 2013-11-20 Target identification method based on Internet of Things and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201310590706.3A CN103606007B (en) 2013-11-20 2013-11-20 Target identification method based on Internet of Things and device

Publications (2)

Publication Number Publication Date
CN103606007A true CN103606007A (en) 2014-02-26
CN103606007B CN103606007B (en) 2016-11-16

Family

ID=50124227

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310590706.3A Active CN103606007B (en) 2013-11-20 2013-11-20 Target identification method based on Internet of Things and device

Country Status (1)

Country Link
CN (1) CN103606007B (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1945602A (en) * 2006-07-07 2007-04-11 华中科技大学 Characteristic selecting method based on artificial nerve network
CN101329169A (en) * 2008-07-28 2008-12-24 中国航空工业第一集团公司北京航空制造工程研究所 Neural network modeling approach of electron-beam welding consolidation zone shape factor
CN101561427A (en) * 2009-05-15 2009-10-21 江苏大学 Pig house environment harmful gas multi-point measurement system based on CAN field bus
CN102759430A (en) * 2012-06-28 2012-10-31 北京自动化控制设备研究所 BP (Back Propagation) neural network based high-precision correction and test method for resonance cylinder pressure sensor
CN102749471A (en) * 2012-07-13 2012-10-24 兰州交通大学 Short-term wind speed and wind power prediction method

Also Published As

Publication number Publication date
CN103606007B (en) 2016-11-16

Similar Documents

Publication Publication Date Title
CN107368845A (en) A kind of Faster R CNN object detection methods based on optimization candidate region
CN109902705A (en) A kind of object detection model to disturbance rejection generation method and device
CN105046277A (en) Robust mechanism research method of characteristic significance in image quality evaluation
CN102222313B (en) Urban evolution simulation structure cell model processing method based on kernel principal component analysis (KPCA)
CN104599292A (en) Noise-resistant moving target detection algorithm based on low rank matrix
CN105546352A (en) Natural gas pipeline tiny leakage detection method based on sound signals
CN103793926B (en) Method for tracking target based on sample reselection procedure
CN105023013B (en) The object detection method converted based on Local standard deviation and Radon
CN105354860A (en) Box particle filtering based extension target CBMeMBer tracking method
CN103335814A (en) Inclination angle measurement error data correction system and method of experimental model in wind tunnel
CN104732546A (en) Non-rigid SAR image registration method based on region similarity and local spatial constraint
CN102708294A (en) Self-adaptive parameter soft measuring method on basis of semi-supervised local linear regression
CN110458046A (en) A kind of human body motion track analysis method extracted based on artis
CN105138983A (en) Pedestrian detection method based on weighted part model and selective search segmentation
CN103607772A (en) Taylor positioning algorithm based on LMBP (Levenberg-Marquardt Back Propagation) neural network
CN106155540A (en) Electronic brush pen form of a stroke or a combination of strokes treating method and apparatus
CN105424043A (en) Motion state estimation method based on maneuver judgment
CN102708277B (en) Snow depth Based Inverse Design Method based on ant group algorithm
Loseille et al. Anisotropic adaptive simulations in aerodynamics
CN105044531A (en) Dynamic signal parameter identification method based on EKF and FSA
CN103390265A (en) Texture image denoising filter based on fractional order evolution equation
CN103606007A (en) Target identification method and apparatus based on Internet of Things
CN116542254A (en) Wind tunnel test data anomaly decision method and device, electronic equipment and storage medium
CN105005686A (en) Probability prediction type target tracking method
CN108280244A (en) A kind of exact algorithm of material constitutive relation without iteration

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant