CN107144281B - Unmanned aerial vehicle indoor positioning system and positioning method based on cooperative targets and monocular vision - Google Patents


Info

Publication number
CN107144281B
CN107144281B
Authority
CN
China
Prior art keywords
module
flight
unmanned aerial
aerial vehicle
marker
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201710530501.4A
Other languages
Chinese (zh)
Other versions
CN107144281A (en)
Inventor
卢翔
王明明
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Efy Intelligent Control Tianjin Tech Co ltd
Original Assignee
Efy Intelligent Control Tianjin Tech Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Efy Intelligent Control Tianjin Tech Co ltd
Priority to CN201710530501.4A
Publication of CN107144281A
Application granted
Publication of CN107144281B
Legal status: Active (current)
Anticipated expiration legal status


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20 Instruments for performing navigational calculations
    • G01C21/206 Instruments for performing navigational calculations specially adapted for indoor navigation
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10 Simultaneous control of position or course in three dimensions
    • G05D1/101 Simultaneous control of position or course in three dimensions specially adapted for aircraft
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D30/00 Reducing energy consumption in communication networks
    • Y02D30/70 Reducing energy consumption in communication networks in wireless communication networks

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Navigation (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention discloses an unmanned aerial vehicle indoor positioning system based on cooperative targets and monocular vision. The system comprises a plurality of markers arranged indoors and carrying coded information, a monocular vision system module, a sensing module and a flight control system. The monocular vision system module comprises a monocular camera and a vision information processing unit; the monocular camera acquires marker information during flight, and the vision information processing unit acquires and processes the video data and transmits the processing result to the flight control system. The invention uses markers carrying coded information as guidance: a single marker in the field of view is sufficient as the cooperative target for determining the pose of the unmanned aerial vehicle relative to that marker. Because the markers carry coded information, they are easy to distinguish from the environment and from one another, they are suitable for batch production, and no great effort is needed to design markers that stand out from the environment while remaining mutually distinguishable.

Description

Unmanned aerial vehicle indoor positioning system and positioning method based on cooperative targets and monocular vision
Technical Field
The invention relates to the technical field of unmanned aerial vehicle control systems, in particular to an unmanned aerial vehicle indoor positioning system and method based on cooperative targets and monocular vision.
Background
An unmanned aerial vehicle is an aircraft with no pilot on board. By combining technologies such as satellite positioning, remote sensing, geospatial information, aerospace engineering, automatic control and computer-aided analysis, unmanned aerial vehicles can serve many fields, including land administration, surveying and mapping, forestry, transportation, water conservancy and military applications.
Unmanned aerial vehicle navigation systems are one of the key components of an unmanned aerial vehicle system. At present, the navigation technologies used by unmanned aerial vehicles mainly comprise inertial navigation, satellite navigation, Doppler navigation, visual navigation, terrain-aided navigation and geomagnetic navigation. In practical applications, a suitable navigation system must be selected according to the flight environment of the unmanned aerial vehicle, the task requirements, and the application range and operating conditions of each navigation technology.
In recent years, research on unmanned aerial vehicle navigation technology has progressed greatly, but most navigation technologies are suitable only for outdoor use and cannot be applied to navigation in unknown indoor environments. The indoor navigation environment is unknown and complex: the unmanned aerial vehicle has no prior knowledge of the size of the environment, the shape and distribution of obstacles, or the presence of artificial reference objects; obstacles may be randomly placed or occlude one another; and indoor lighting changes with the viewing angle of the unmanned aerial vehicle. Some domestic research has been carried out on indoor navigation control of unmanned aerial vehicles, but a unified theoretical framework has not yet been formed, and several important theoretical and technical problems remain unsolved, chiefly environmental modeling, unmanned aerial vehicle positioning, navigation controller design and real-time motion control.
Disclosure of Invention
The invention aims to overcome the technical defects of the prior art and provides an unmanned aerial vehicle indoor positioning system based on cooperative targets and monocular vision.
The technical solution adopted to achieve the purpose of the invention is as follows:
An unmanned aerial vehicle indoor positioning system based on cooperative targets and monocular vision comprises a plurality of markers arranged indoors and carrying coded information, a monocular vision system module, a sensing module and a flight control system.
The monocular vision system module comprises a monocular camera and a vision information processing unit; the monocular camera acquires marker information during flight, and the vision information processing unit acquires and processes the video data and transmits the processing result to the flight control system.
The sensing module comprises a gyroscope, an accelerometer and a relative height measuring module.
The flight control system comprises a sensor data fusion module based on an extended Kalman filter; this module receives the processing result and the flight state information from the sensing module and fuses them to generate a flight instruction.
The output of the sensor data fusion module based on the extended Kalman filter is connected to the input of the flight controller, and the output of the flight controller is connected to the input of the data transmission module.
The relative height measuring module is a laser ranging module.
The markers are block-shaped, plate-shaped or sheet-shaped, and the coded information of different markers is mutually distinguishable.
The system further comprises a flight state indication module and a remote controller, both of which are communicatively connected to the data transmission module.
The positioning method of the unmanned aerial vehicle indoor positioning system comprises the following steps; a code sketch of the marker detection and coordinate lookup in steps 1) and 2) follows the list:
1) The monocular camera acquires a video stream containing a marker and transmits it to the vision information processing unit, which extracts the characteristic information of the coded marker captured in the field of view;
2) The marker's code and its coordinates in the world coordinate system are obtained from the characteristic information;
3) The relative pose of the camera with respect to the marker is calculated by combining the marker coordinates, the corner features and the monocular camera parameters, from which the pose data and indoor navigation data of the unmanned aerial vehicle in the indoor world coordinate system are obtained;
4) The sensor data fusion module based on the extended Kalman filter receives the pose data, the indoor navigation data and the flight state information from the sensing module, fuses them, and generates a flight instruction according to the set flight trajectory.
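As an illustration of steps 1) and 2), the following Python/OpenCV sketch detects coded markers in one camera frame and resolves each decoded code to stored indoor coordinates. It assumes ArUco-style markers and a hypothetical ID-to-coordinate table; the patent does not name a specific code family, and the exact ArUco calls vary slightly between OpenCV versions.

```python
import cv2
import numpy as np

# Hypothetical table recorded when the markers were printed and placed:
# marker ID -> (x, y, z) of the marker centre in the indoor world frame, in metres.
MARKER_WORLD_COORDS = {
    0: np.array([0.0, 0.0, 0.0]),
    1: np.array([2.0, 0.0, 0.0]),
    2: np.array([2.0, 2.0, 0.0]),
}

# Assumed code family: a predefined 4x4 ArUco dictionary with 50 codes.
ARUCO_DICT = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)

def detect_markers(frame):
    """Step 1: extract corner features and decode marker IDs from one video frame."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    corners, ids, _rejected = cv2.aruco.detectMarkers(gray, ARUCO_DICT)
    return corners, ids

def lookup_world_coords(ids):
    """Step 2: map each decoded marker code to its stored world coordinates."""
    if ids is None:
        return {}
    return {int(i): MARKER_WORLD_COORDS[int(i)]
            for i in ids.flatten() if int(i) in MARKER_WORLD_COORDS}
```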
The flight command is transmitted to the unmanned aerial vehicle power module through the flight controller and the data transmission module, and the flight controller is based on a μC/OS-II system.
The positioning method further comprises a remote control step, in which a remote control instruction from the remote control module is transmitted to the unmanned aerial vehicle power module through the data transmission module and is also input to the sensor data fusion module.
Compared with the prior art, the invention has the beneficial effects that:
The invention uses markers carrying coded information as guidance: a single marker in the field of view is sufficient as the cooperative target for determining the pose of the unmanned aerial vehicle relative to that marker. Because the markers carry coded information, they are easy to distinguish from the environment, they are suitable for batch production, and no great effort is needed to design markers that stand out from the environment while remaining distinguishable from one another. Each marker carries its code in its own pattern, rather than being assigned a code only after the cooperative markers have been laid out on the indoor floor, so once the code is read the specific coordinates of the marker can be obtained by information matching, which improves the computation speed. Markers with coded information thus effectively support real-time, high-precision positioning within, and interaction with, the environment.
Drawings
FIG. 1 is a system block diagram of the unmanned aerial vehicle indoor positioning system based on cooperative targets and monocular vision according to the present invention;
FIG. 2 is a diagram of the indoor navigation data solution based on cooperative targets and monocular vision according to the present invention;
FIG. 3 is a schematic diagram of the sensor data fusion module based on the extended Kalman filter according to the present invention.
Detailed Description
The invention is described in further detail below with reference to the drawings and the specific examples. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the invention.
As shown in the figures, the unmanned aerial vehicle indoor positioning system based on cooperative targets and monocular vision comprises a plurality of markers carrying coded information, arranged on the indoor floor or at a certain height, a monocular vision system module, a sensing module and a flight control system.
The monocular vision system module comprises a monocular camera and a vision information processing unit. The monocular camera acquires marker information during flight and transmits the video stream to the vision information processing unit over a CAN data bus. The vision information processing unit is a processing system with dual CPU and GPU processing cores; it acquires and processes the video data and transmits the processing result to the flight control system. The monocular camera is carried on board, and the dual-core processing system of the vision information processing unit performs the data analysis, processing and calculation, which effectively improves the computation speed.
The sensing module comprises a gyroscope, an accelerometer and a relative height measuring module, and also includes conventional sensors such as a magnetometer and a barometer; providing multiple sensors of different types further improves control stability. The relative height measuring module is a laser ranging module.
The flight control system comprises a sensor data fusion module based on an extended Kalman filter; this module receives the processing result and the flight state information from the sensing module and fuses them to generate a flight instruction.
The extended Kalman filter is implemented in software. The keys to sensor fusion based on this algorithm are the construction of the system model and the handling of marker switching, i.e. the handover of the monocular vision solution from one marker to the next among the multiple markers, which ensures a smooth transition and stable flight of the unmanned aerial vehicle while markers are switched.
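As an illustration of the fusion idea only, the sketch below maintains a small position/velocity state that is propagated with inertial acceleration and corrected whenever the vision module delivers a position solved from a cooperative marker. The six-state model, the noise values and the class name are assumptions for the sketch, not the patent's filter; with this linear model the extended Kalman filter reduces to a standard Kalman update.

```python
import numpy as np

class SimplePositionEKF:
    """Toy position/velocity filter: IMU prediction, marker-vision correction."""

    def __init__(self, dt):
        self.dt = dt
        self.x = np.zeros(6)                 # state: [px, py, pz, vx, vy, vz]
        self.P = np.eye(6)                   # state covariance
        self.Q = np.eye(6) * 0.01            # process noise (assumed value)
        self.R = np.eye(3) * 0.05            # vision measurement noise (assumed value)
        self.F = np.eye(6)                   # constant-velocity transition matrix
        self.F[0:3, 3:6] = np.eye(3) * dt

    def predict(self, accel_world):
        """Propagate with body acceleration already rotated into the world frame."""
        self.x = self.F @ self.x
        self.x[3:6] += accel_world * self.dt
        self.P = self.F @ self.P @ self.F.T + self.Q

    def update_vision(self, pos_meas):
        """Correct with a position solved from whichever cooperative marker is in view."""
        H = np.hstack([np.eye(3), np.zeros((3, 3))])   # we measure position only
        y = pos_meas - H @ self.x                      # innovation
        S = H @ self.P @ H.T + self.R
        K = self.P @ H.T @ np.linalg.inv(S)            # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(6) - K @ H) @ self.P
```

Because every marker's measurement is already expressed in the common indoor world frame, switching from one marker to another simply changes which measurement feeds update_vision, which is what keeps the transition smooth.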
The invention uses markers carrying coded information as guidance: a single marker in the field of view is sufficient as the cooperative target for determining the pose of the unmanned aerial vehicle relative to that marker. Because the markers carry coded information, they are easy to distinguish from the environment, they are suitable for batch production, and no great effort is needed to design markers that stand out from the environment while remaining distinguishable from one another. Each marker carries its code in its own pattern, rather than being assigned a code only after the cooperative markers have been laid out on the indoor floor, so once the code is read the specific coordinates of the marker can be obtained by information matching, which improves the computation speed. Markers with coded information thus effectively support real-time, high-precision positioning within, and interaction with, the environment.
In the unmanned aerial vehicle control system, the output of the sensor data fusion module based on the extended Kalman filter is connected to the input of the flight controller, the output of the flight controller is connected to the input of the data transmission module, and the data transmission module delivers the flight instruction to the unmanned aerial vehicle power module, enabling terrain-following flight over various terrain and autonomous take-off and landing.
The markers are block-shaped, plate-shaped or sheet-shaped, and the coded information of different markers is mutually distinguishable. Markers and their coded information are generated together: once the binary code is determined, dedicated software can directly print or generate the corresponding picture while storing the code and the coordinate information of the marker. Since the image of each marker itself contains its code, every marker picture differs from the others, and the marker picture library therefore consists of all coded template pictures. Each marker has a fixed number of corner points whose geometric relations satisfy invariants under projective transformation, and this characteristic information is used to decide whether a detected pattern is one of the designed markers.
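The design-time generation described above can be sketched as follows: coded marker images are produced for printing, and each marker's code is stored together with the indoor coordinates where it will be placed, so that a decoded marker can later be matched to its position. ArUco markers and a CSV table are assumed stand-ins for the patent's unspecified coding scheme and storage format (cv2.aruco.drawMarker is named generateImageMarker in newer OpenCV releases).

```python
import csv
import cv2

# Assumed code family, matching the flight-time detection sketch above.
ARUCO_DICT = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)

def generate_marker_set(placements, side_pixels=400, table_path="marker_table.csv"):
    """placements: list of (marker_id, x, y, z) chosen when laying out the room."""
    with open(table_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["id", "x", "y", "z"])
        for marker_id, x, y, z in placements:
            img = cv2.aruco.drawMarker(ARUCO_DICT, marker_id, side_pixels)
            cv2.imwrite(f"marker_{marker_id:03d}.png", img)   # print-ready picture
            writer.writerow([marker_id, x, y, z])             # stored code + coordinates

# Example: three markers laid on the floor of a small test room (coordinates in metres).
generate_marker_set([(0, 0.0, 0.0, 0.0), (1, 2.0, 0.0, 0.0), (2, 2.0, 2.0, 0.0)])
```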
To improve controllability and human-machine interaction, the system further comprises a flight state indication module and a remote controller, both communicatively connected to the data transmission module. The ground station communicates with the data transmission module to obtain the flight state of the unmanned aerial vehicle and displays it on a mobile phone, tablet or computer; when abnormal control occurs, an early warning can be given through the colour and flashing frequency of the navigation lights, voice prompts, and so on.
The invention further discloses a positioning method of the unmanned aerial vehicle indoor positioning system, comprising the following steps; a sketch of the pose calculation in step 3) follows the list:
1) The monocular camera acquires a video stream containing a marker and transmits it to the vision information processing unit, which extracts the characteristic information of the coded marker captured in the field of view;
2) The marker's code and its coordinates in the world coordinate system are obtained from the characteristic information;
3) The relative pose of the camera with respect to the marker is calculated by combining the marker coordinates, the corner features and the monocular camera parameters, from which the pose data and indoor navigation data of the unmanned aerial vehicle in the indoor world coordinate system are obtained;
4) The sensor data fusion module based on the extended Kalman filter receives the pose data, the indoor navigation data and the flight state information from the sensing module, fuses them, and generates a flight instruction according to the set flight trajectory.
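The pose calculation of step 3) can be sketched with a standard perspective-n-point (PnP) solution: the four detected corners of one marker, their world coordinates derived from the stored marker centre, and the calibrated camera intrinsics yield the camera pose relative to the marker and hence in the indoor world frame. The marker side length, the flat axis-aligned placement and the function names below are illustrative assumptions.

```python
import cv2
import numpy as np

MARKER_SIZE = 0.20   # marker side length in metres (assumed)

def marker_corners_world(center, size=MARKER_SIZE):
    """3D corners of a floor-mounted, axis-aligned square marker around its centre."""
    h = size / 2.0
    offsets = np.array([[-h,  h, 0.0], [ h,  h, 0.0],
                        [ h, -h, 0.0], [-h, -h, 0.0]])
    return offsets + np.asarray(center, dtype=np.float64)

def camera_pose_from_marker(image_corners, marker_center, K, dist_coeffs):
    """image_corners: 4x2 pixel corners, in the same order as the world corners."""
    obj_pts = marker_corners_world(marker_center)
    img_pts = np.asarray(image_corners, dtype=np.float64)
    ok, rvec, tvec = cv2.solvePnP(obj_pts, img_pts, K, dist_coeffs)
    if not ok:
        return None
    R, _ = cv2.Rodrigues(rvec)            # rotation: world frame -> camera frame
    R_wc = R.T                            # camera frame -> world frame
    cam_pos_world = -R_wc @ tvec.reshape(3)
    return cam_pos_world, R_wc
```

The returned world-frame position is the measurement that the extended-Kalman-filter update of step 4) consumes; the rotation gives the attitude of the camera and, through its known mounting, of the unmanned aerial vehicle.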
The flight command is transmitted to the unmanned aerial vehicle power module through the flight controller and the data transmission module; the flight controller is based on μC/OS-II. The system also comprises a remote control module and a ground station, such as a mobile phone, tablet or computer, and the remote control module is communicatively connected to the unmanned aerial vehicle power module through the data transmission module. Using the task scheduling provided by μC/OS-II, the flight controller evaluates, calculates and processes the collected sensor information, the stored states and data, and the uplink remote-control or ground-station planning commands sent from the radio control terminal, then outputs instructions to the servo actuators (the steering gear system) to control the control surfaces and engine throttle of the unmanned aerial vehicle, thereby controlling its flight.
In this monocular-vision-based position control approach, the flight path can be planned in advance in software or planned online from the perceived environment. Recognition of the cooperative target markers provides feedback of the position and attitude of the unmanned aerial vehicle in the indoor environment, which is the basis for flying along the preset or online-corrected trajectory. The monocular vision position estimate is used to control the position loop (the outer loop) of the unmanned aerial vehicle; however, because the vehicle has highly dynamic characteristics, it cannot fly stably on vision information alone, which is why the vision data are fused with the inertial measurements.
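To illustrate how the fused estimate, rather than raw vision alone, drives the position ring, the sketch below shows a simple outer-loop PID that converts the position error into bounded velocity setpoints for the faster inner attitude loop. The gains, the velocity-setpoint interface and the saturation limit are assumptions for illustration, not the patent's controller design.

```python
import numpy as np

class PositionOuterLoop:
    """Position-ring PID producing velocity setpoints for the inner attitude loop."""

    def __init__(self, kp=1.2, ki=0.05, kd=0.4, v_max=1.0):
        self.kp, self.ki, self.kd, self.v_max = kp, ki, kd, v_max
        self.integral = np.zeros(3)
        self.prev_err = np.zeros(3)

    def step(self, pos_setpoint, pos_estimate, dt):
        """pos_estimate comes from the fused filter, not from vision alone."""
        err = np.asarray(pos_setpoint, dtype=float) - np.asarray(pos_estimate, dtype=float)
        self.integral += err * dt
        deriv = (err - self.prev_err) / dt
        self.prev_err = err
        v_cmd = self.kp * err + self.ki * self.integral + self.kd * deriv
        # Saturate so the inner loop is never asked for aggressive manoeuvres.
        return np.clip(v_cmd, -self.v_max, self.v_max)
```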
The foregoing is merely a preferred embodiment of the present invention. It should be noted that those skilled in the art may make modifications and adaptations without departing from the principles of the present invention, and such modifications and adaptations are also intended to fall within the scope of the present invention.

Claims (8)

1. An unmanned aerial vehicle indoor positioning system based on cooperative targets and monocular vision, characterized by comprising a plurality of indoor markers carrying coded information, a monocular vision system module, a sensing module and a flight control system, wherein
the monocular vision system module comprises a monocular camera and a vision information processing unit, the monocular camera is used for acquiring marker information during flight, and the vision information processing unit is used for acquiring and processing the video data and transmitting a processing result to the flight control system;
the sensing module comprises a gyroscope, an accelerometer and a relative height measuring module; and
the flight control system comprises a sensor data fusion module based on an extended Kalman filter, the sensor data fusion module receiving the processing result and the flight state information from the sensing module and fusing them to generate a flight instruction.
2. The unmanned aerial vehicle indoor positioning system of claim 1, wherein the output of the extended Kalman filter based sensor data fusion module is connected to the input of a flight controller, and wherein the output of the flight controller is connected to the input of the data transmission module.
3. The unmanned aerial vehicle indoor positioning system of claim 1, wherein the relative height measurement module is a laser ranging module.
4. The unmanned aerial vehicle indoor positioning system of claim 1, wherein the markers are in the form of a block, a plate or a sheet, and the coded information of different markers is mutually distinguishable.
5. The unmanned aerial vehicle indoor positioning system of claim 2, further comprising a flight status indication module and a remote controller, both communicatively coupled to the data transmission module.
6. A positioning method of the unmanned aerial vehicle indoor positioning system according to claim 1, comprising the following steps:
1) the monocular camera acquires a video stream containing a marker and transmits it to the vision information processing unit, and the vision information processing unit extracts the characteristic information of the coded marker captured in the field of view;
2) the marker's code and its coordinates in the world coordinate system are obtained from the characteristic information;
3) the relative pose of the camera with respect to the marker is calculated by combining the marker coordinates, the corner features and the monocular camera parameters, from which the pose data and indoor navigation data of the unmanned aerial vehicle in the indoor world coordinate system are obtained;
4) the sensor data fusion module based on the extended Kalman filter receives the pose data, the indoor navigation data and the flight state information from the sensing module, fuses them, and generates a flight instruction according to a set flight trajectory.
7. The positioning method of claim 6, wherein the flight command is transmitted to the unmanned aerial vehicle power module via a flight controller and a data transmission module, the flight controller being based on a μC/OS-II system.
8. The positioning method according to claim 6, further comprising a remote control step, wherein a remote control command of the remote control module is transmitted to the unmanned aerial vehicle power module via the data transmission module and is input to the sensor data fusion module.
CN201710530501.4A 2017-06-30 2017-06-30 Unmanned aerial vehicle indoor positioning system and positioning method based on cooperative targets and monocular vision Active CN107144281B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710530501.4A CN107144281B (en) 2017-06-30 2017-06-30 Unmanned aerial vehicle indoor positioning system and positioning method based on cooperative targets and monocular vision

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710530501.4A CN107144281B (en) 2017-06-30 2017-06-30 Unmanned aerial vehicle indoor positioning system and positioning method based on cooperative targets and monocular vision

Publications (2)

Publication Number Publication Date
CN107144281A CN107144281A (en) 2017-09-08
CN107144281B true CN107144281B (en) 2023-09-12

Family

ID=59785556

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710530501.4A Active CN107144281B (en) 2017-06-30 2017-06-30 Unmanned aerial vehicle indoor positioning system and positioning method based on cooperative targets and monocular vision

Country Status (1)

Country Link
CN (1) CN107144281B (en)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108592914A (en) * 2018-04-08 2018-09-28 河南科技学院 The positioning of complex region inspecting robot, navigation and time service method under no GPS scenario
WO2020000386A1 (en) * 2018-06-29 2020-01-02 深圳市大疆创新科技有限公司 Flight control method, device and system, and storage medium
CN109405821B (en) * 2018-09-21 2020-01-03 北京三快在线科技有限公司 Method and device for positioning and target equipment
CN109977924A (en) * 2019-04-15 2019-07-05 北京麦飞科技有限公司 For real time image processing and system on the unmanned plane machine of crops
CN110261877B (en) * 2019-06-26 2024-06-11 南京航空航天大学 Ground-air collaborative visual navigation method and device based on improved graph optimization SLAM
CN110446159B (en) * 2019-08-12 2020-11-27 上海工程技术大学 System and method for accurate positioning and autonomous navigation of indoor unmanned aerial vehicle
CN111486841B (en) * 2020-05-19 2022-04-05 西安因诺航空科技有限公司 Unmanned aerial vehicle navigation positioning method based on laser positioning system
CN113720331B (en) * 2020-12-25 2023-12-19 北京理工大学 Multi-camera fused unmanned aerial vehicle in-building navigation positioning method
CN112859923B (en) * 2021-01-25 2022-02-18 西北工业大学 Unmanned aerial vehicle vision formation flight control system

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2071353A2 (en) * 2007-12-14 2009-06-17 The Boeing Company System and methods for autonomous tracking and surveillance
CN101667032A (en) * 2008-09-03 2010-03-10 中国科学院自动化研究所 Vision-based target tracking system using unmanned helicopter
CN106527485A (en) * 2016-12-14 2017-03-22 天津文林科技有限公司 Unmanned aerial vehicle (UAV) fixed-point flight control system based on photo-electro-mechanical integrated visual positioning

Also Published As

Publication number Publication date
CN107144281A (en) 2017-09-08

Similar Documents

Publication Publication Date Title
CN107144281B (en) Unmanned aerial vehicle indoor positioning system and positioning method based on cooperative targets and monocular vision
WO2018032457A1 (en) Systems and methods for augmented stereoscopic display
CN110426046B (en) Unmanned aerial vehicle autonomous landing runway area obstacle judging and tracking method
CN103175524B (en) A kind of position of aircraft without view-based access control model under marking environment and attitude determination method
CN105222760A (en) The autonomous obstacle detection system of a kind of unmanned plane based on binocular vision and method
CN104298248A (en) Accurate visual positioning and orienting method for rotor wing unmanned aerial vehicle
CN103353758A (en) Indoor robot navigation device and navigation technology thereof
CN110097620A (en) High-precision map creation system based on image and three-dimensional laser
CN112558608A (en) Vehicle-mounted machine cooperative control and path optimization method based on unmanned aerial vehicle assistance
Li et al. UAV autonomous landing technology based on AprilTags vision positioning algorithm
CN102190081A (en) Vision-based fixed point robust control method for airship
CN116540784B (en) Unmanned system air-ground collaborative navigation and obstacle avoidance method based on vision
CN114923477A (en) Multi-dimensional space-ground collaborative map building system and method based on vision and laser SLAM technology
CN104965513A (en) Son hopping robot recovery system and recovery method
CN109116861A (en) A kind of unmanned aerial vehicle ground control system
CN110309883A (en) A kind of unmanned plane autonomic positioning method of view-based access control model SLAM
CN115826622B (en) Night co-location method for unmanned aerial vehicle group
CN112556695A (en) Indoor positioning and three-dimensional modeling method and system, electronic equipment and storage medium
CN115755575A (en) ROS-based double-tripod-head unmanned aerial vehicle autonomous landing method
CN112904895B (en) Image-based airplane guiding method and device
Zhou et al. Multi-robot real-time cooperative localization based on high-speed feature detection and two-stage filtering
CN113959412A (en) Unmanned aerial vehicle docking process active visual navigation method based on infrared detection
WO2021072766A1 (en) Flight control method and system, unmanned aerial vehicle, and storage medium
Ming et al. Laser tracking leader-follower automatic cooperative navigation system for UAVs
Zhong et al. A vision-based auxiliary system of multirotor unmanned aerial vehicles for autonomous rendezvous and docking

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant