CN113867387A - Unmanned aerial vehicle autonomous landing course identification method


Info

Publication number
CN113867387A
CN113867387A
Authority
CN
China
Prior art keywords
straight line
unmanned aerial
aerial vehicle
connected domain
course
Prior art date
Legal status
Granted
Application number
CN202111135612.8A
Other languages
Chinese (zh)
Other versions
CN113867387B (en)
Inventor
李忠威
李威
魏大洲
史传飞
王荣阳
曲国远
Current Assignee
China Aeronautical Radio Electronics Research Institute
Original Assignee
China Aeronautical Radio Electronics Research Institute
Priority date
Filing date
Publication date
Application filed by China Aeronautical Radio Electronics Research Institute
Priority claimed from application CN202111135612.8A
Publication of CN113867387A
Application granted
Publication of CN113867387B
Legal status: Active

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10Simultaneous control of position or course in three dimensions
    • G05D1/101Simultaneous control of position or course in three dimensions specially adapted for aircraft


Abstract

The invention discloses a method for identifying the course of an unmanned aerial vehicle during autonomous landing. The method is implemented as a computer program running on the unmanned aerial vehicle and comprises camera image input, target area detection and tracking, straight-line detection, line-type judgment, quadrant judgment and course-angle calculation. This vision-based course identification method overcomes the traditional method's susceptibility to the geomagnetic environment and provides accurate course data to the unmanned aerial vehicle during autonomous landing.

Description

Unmanned aerial vehicle autonomous landing course identification method
Technical Field
The invention relates to the technical field of unmanned aerial vehicle navigation and image processing, in particular to an unmanned aerial vehicle autonomous landing course identification method based on vision.
Background
The traditional course identification method resolves the course from data output by a gyroscope and a magnetometer, but the result carries a large error, depends heavily on sensor accuracy and has poor robustness, so the reliability of the course data is difficult to guarantee in the final landing phase. With the progress of artificial intelligence in computer vision and the adoption of unmanned aerial vehicle technology in many fields, vision-based autonomous landing of unmanned aerial vehicles has become a research hotspot. Autonomous landing relies mainly on autonomous navigation, of which course identification is an important component, but current course identification technology is still immature and requires continued research.
The vision-based course recognition method builds on image processing: a monocular camera acquires the target pattern image in real time, image processing extracts the relevant information, and the course of the unmanned aerial vehicle is finally calculated.
Disclosure of Invention
The main purpose of the invention is to provide a vision-based method for identifying the course of an unmanned aerial vehicle during autonomous landing. A monocular camera acquires the target pattern image in real time and the target area is detected and tracked; the detected target area then serves as input for image processing, straight-line detection, line-type judgment and course-quadrant judgment, which together resolve the course and provide accurate course data for autonomous landing. Thanks to the unique target pattern design with inner and outer layers, the icon remains detectable even though its apparent size changes with height: during landing, the course recognition system automatically switches between the inner and outer patterns according to height, so the course is resolved accurately throughout.
The invention aims to be realized by the following technical scheme:
an autonomous landing course identification method of an unmanned aerial vehicle is realized by a computer program, runs on the unmanned aerial vehicle, and comprises the following steps:
step 1: after the unmanned aerial vehicle flies into the autonomous landing area, detecting and tracking images acquired by a camera at the bottom of the unmanned aerial vehicle in real time, and finding out a target image contained in the images as the input of the step 2;
step 2: performing linear extraction on the target area;
step 3: judging the line types, where the more numerous class of lines are the horizontal lines and the less numerous class are the vertical lines;
step 4: judging the quadrant to which the course angle belongs, based on two conditions: the sign of the horizontal line's slope, and whether the vertical line lies to the left or the right of the horizontal line;
step 5: calculating the inclination angles of all lines with the arctangent formula, deriving each line's rotation angle from its type and quadrant, and taking the weighted average of the rotation angles of all detected lines as the final course angle.
As a preferred scheme of the present invention, in step 1 a cascade classifier performs sliding-window classification to find the target image contained in the camera image. A confidence value [formula shown as an image in the original publication] governs autonomous switching between detection and tracking: p_i is the confidence of the target boxes of the i-th overlap region; the system enters the tracking state if the confidence is large enough and the detection state if it is too small.
As a preferred scheme of the present invention, the straight-line extraction of step 2 comprises Gaussian smoothing, binarization, connected-domain detection and removal, and Canny edge detection on the target region, followed by straight-line extraction with the Hough transform.
As a preferred scheme of the present invention, when the unmanned aerial vehicle is above the height threshold, step 1 takes the image inside the outer-ring connected domain as the target area, and the connected-domain removal of step 2 removes the outer-ring connected domain, the inner-ring connected domain and the image inside the inner-ring connected domain;
when the unmanned aerial vehicle is below the threshold, step 1 takes the image inside the inner-ring connected domain as the target area, and the connected-domain removal of step 2 removes the inner-ring connected domain.
As a preferred scheme of the present invention, in the line-type judgment of step 3 the detected first line serves as the reference, and each remaining line is judged in turn by its included angle with the first line; lines whose included angle is less than 45 degrees form one class and the rest form the other.
As a preferred scheme of the invention, the quadrant judgment of step 4 is specifically: (1) when the endpoint of the vertical line is to the left of the horizontal line and the horizontal line's slope is negative, the course angle lies in the first quadrant, i.e. 0-90 degrees; (2) when the endpoint is to the left and the slope is positive, the second quadrant, i.e. 90-180 degrees; (3) when the endpoint is to the right and the slope is negative, the third quadrant, i.e. 180-270 degrees; (4) when the endpoint is to the right and the slope is positive, the fourth quadrant, i.e. 270-359 degrees.
The invention has the beneficial effects that:
1. the vision-based unmanned aerial vehicle autonomous landing course identification method provided by the invention overcomes the defect that the traditional method is influenced by the geomagnetic environment, and can provide accurate course data for the unmanned aerial vehicle during autonomous landing.
2. The vision-based unmanned aerial vehicle autonomous landing course identification method provided by the invention is a novel approach based on quadrant division and quadrant judgment, from which a specific course angle can be determined.
3. The method automatically switches between the inner and outer rings as the height changes during landing, ensuring accurate course data at all heights; with added lighting equipment, course recognition also works under insufficient illumination.
Drawings
Fig. 1 is an overall flowchart of an autonomous landing course identification method of an unmanned aerial vehicle.
FIG. 2 is a schematic view of a heading identification icon according to the present invention.
FIG. 3 is a schematic diagram of the present invention employing a classifier.
Fig. 4 is a schematic diagram of detecting tracking state switching in the present invention.
Fig. 5 is a flow of line detection in the present invention.
FIG. 6 is a diagram illustrating the straight line extraction result of the outer ring detection state in the present invention.
Fig. 7 is a diagram illustrating the linear extraction result of the inner loop detection state in the present invention.
FIG. 8 is a pseudo code diagram of an efficient straight line type determination algorithm according to the present invention.
FIG. 9 is a quadrant division schematic diagram according to the present invention.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings and examples.
Referring to fig. 1, the unmanned aerial vehicle autonomous landing course identification method is implemented as a computer program running on the unmanned aerial vehicle and comprises camera image input, target area detection and tracking, straight-line detection, type judgment, quadrant judgment and course-angle calculation. The specific steps are as follows:
Step 1: after the unmanned aerial vehicle flies into the autonomous landing area, the images acquired by the camera at the bottom of the vehicle are detected and tracked in real time, and the target image contained in them is found and used as the input of step 2.
As shown in fig. 2, the target image comprises an outer-ring connected domain and an inner-ring connected domain, each containing two horizontal lines and one vertical line. To ease binarization of the image, the two ring connected domains share one pixel value and the horizontal and vertical lines share another.
To improve runtime performance, this embodiment uses a cascade classifier with sliding-window classification to find the target image: the closer the vehicle is to the landing point, the higher the classifier's rejection rate and the smaller the computation, so the cascade can run at high speed, as shown in fig. 3. A confidence value [formula shown as an image in the original publication] drives autonomous switching between detection and tracking to achieve the best speed, where p_i is the confidence of the target boxes aggregated into the i-th overlap region; if the confidence is large enough the system enters the tracking state, and if it is too small it enters the detection state. The detection/tracking switching is illustrated in fig. 4.
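The confidence-gated switch between detection and tracking described above can be sketched as a simple two-state machine. The thresholds below are assumptions, since the patent only states that a sufficiently large confidence enters tracking and a too-small one returns to detection:

```python
# Hysteresis sketch of the detection/tracking switch. T_TRACK and
# T_DETECT are assumed thresholds, not values from the patent.
T_TRACK = 0.7   # confidence needed to enter the tracking state (assumed)
T_DETECT = 0.3  # confidence below which we fall back to detection (assumed)

def next_state(state: str, p_i: float) -> str:
    """p_i is the confidence of the target boxes aggregated in the
    i-th overlap region; returns the next state, 'detect' or 'track'."""
    if state == "detect" and p_i >= T_TRACK:
        return "track"
    if state == "track" and p_i <= T_DETECT:
        return "detect"
    return state
```

Using two thresholds instead of one keeps the system from oscillating between states when the confidence hovers near a single cut-off.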
During target image detection and tracking, the system automatically switches between the inner and outer rings according to the landing height. The switching threshold is h' = w2 * D / (2 * P_min * tan(theta / 2)), where w2 is the side length of the inner-ring area in the target image, D is the vertical resolution of the camera, P_min is the minimum pixel resolution of the pattern required for visual measurement, and theta is the vertical included angle between the camera's optical axis and the airframe. When the vehicle is above the switching threshold h', the image inside the outer-ring connected domain is taken as the target area; below h', the image inside the inner-ring connected domain is used.
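As a worked example of this switching threshold, the helper below is a sketch; the unit choices (metres for w2, pixels for D and P_min, radians for theta) are assumptions, not stated in the patent:

```python
import math

def switch_height(w2: float, d: float, p_min: float, theta: float) -> float:
    """Height threshold h' = w2 * D / (2 * P_min * tan(theta / 2)).
    w2: side length of the inner-ring area; d: vertical resolution of
    the camera in pixels; p_min: minimum pixel resolution required for
    the pattern; theta: vertical angle in radians."""
    return w2 * d / (2.0 * p_min * math.tan(theta / 2.0))
```

For example, with w2 = 1 m, D = 1080 px, P_min = 20 px and theta = 90 degrees, tan(theta/2) = 1 and h' = 1080 / 40 = 27 m.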
Step 2: the target area detected and tracked in step 1 is used as input for straight-line extraction.
As shown in fig. 5, the straight-line extraction applies Gaussian smoothing, binarization, connected-domain detection and removal, and Canny edge detection to the target region, and then extracts straight lines with the Hough transform. The connected-domain removal strategy depends on the flight height. Above the threshold, in the outer-ring detection state, the outer-ring connected domain, the inner-ring connected domain and the image inside the inner-ring connected domain are removed so they do not interfere with line detection; in fig. 6 the left side is the detected outer-ring target area and the right side the result after connected-domain removal and line detection. Below the threshold, in the inner-ring detection state, interference is eliminated by removing the inner-ring connected domain, as shown in fig. 7.
Step 3: the lines detected in step 2 are classified, the more numerous class being the horizontal lines and the less numerous class the vertical lines.
In the line-type judgment, the detected first line serves as the reference, and each remaining line is assigned by its included angle with the first line: lines whose included angle is less than 45 degrees form one class and the rest form the other. The class with more lines is taken as horizontal and the other as vertical; in fig. 6 and fig. 7 the red lines are horizontal and the blue lines vertical. Pseudocode for the algorithm is shown in fig. 8.
Step 4: the quadrant of the course angle is judged from two conditions: the sign of the horizontal line's slope, and whether the vertical line lies to the left or the right of the horizontal line.
The division into angle quadrants is shown in fig. 9. The four quadrants are 0-90, 90-180, 180-270 and 270-359 degrees, excluding angles that fall on the coordinate axes. Let quadrant denote the quadrant type, k_H the slope of the horizontal line, and (P_H1_X, P_H1_Y), (P_H2_X, P_H2_Y), (P_V_X, P_V_Y) the coordinates of two points on the horizontal line and one point on the vertical line.
[The piecewise quadrant-selection expression appears as an image in the original publication.]
The side test is which_side = (P_H1_X - P_V_X) * (P_H2_Y - P_V_Y) - (P_H1_Y - P_V_Y) * (P_H2_X - P_V_X), whose sign indicates whether the vertical line lies to the left or to the right of the horizontal line.
The quadrant judgment is specifically: (1) when the endpoint of the vertical line is to the left of the horizontal line and the horizontal line's slope is negative, the course angle lies in the first quadrant, i.e. 0-90 degrees; (2) when the endpoint is to the left and the slope is positive, the second quadrant, i.e. 90-180 degrees; (3) when the endpoint is to the right and the slope is negative, the third quadrant, i.e. 180-270 degrees; (4) when the endpoint is to the right and the slope is positive, the fourth quadrant, i.e. 270-359 degrees.
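The side test and the four prose cases above can be sketched as follows. Which sign of which_side corresponds to "left" depends on the image coordinate convention, so the mapping below (positive means left) is an assumption:

```python
def which_side(ph1, ph2, pv):
    """Cross-product side test from the description: its sign tells on
    which side of the horizontal line ph1-ph2 the vertical-line point pv lies."""
    return ((ph1[0] - pv[0]) * (ph2[1] - pv[1])
            - (ph1[1] - pv[1]) * (ph2[0] - pv[0]))

def heading_quadrant(ph1, ph2, pv, k_h):
    """Quadrant of the course angle from the side test and the sign of
    the horizontal line's slope k_h. which_side > 0 is assumed to mean
    the vertical line lies to the left of the horizontal line."""
    left = which_side(ph1, ph2, pv) > 0
    if left:
        return 1 if k_h < 0 else 2   # 0-90 or 90-180 degrees
    return 3 if k_h < 0 else 4       # 180-270 or 270-359 degrees
```

With the left/right convention fixed once against the actual camera frame, the two binary conditions select exactly one of the four quadrants.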
Step 5: the inclination angles of all lines are computed with the arctangent formula, each line's rotation angle is derived from its type and quadrant, and the final course beta is the weighted average of the rotation angles of all detected lines:
[formula shown as an image in the original publication]
where k_Hi is the slope of the i-th horizontal line, k_Vi is the slope of the i-th vertical line, N is the total number of detected lines (the number of vertical lines plus the number of horizontal lines), H_num is the number of horizontal lines and V_num is the number of vertical lines.
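Since the averaging formula itself appears only as an image, the sketch below is a reconstruction under stated assumptions: a horizontal line's in-quadrant rotation is atan(|k_H|), a vertical line's is 90 degrees minus atan(|k_V|), and the final course adds the quadrant's base angle to the plain mean of all N = H_num + V_num rotation angles:

```python
import math

def heading_angle(h_slopes, v_slopes, quadrant):
    """Assumed reconstruction of step 5. h_slopes are the slopes k_Hi of
    the horizontal lines, v_slopes the slopes k_Vi of the vertical lines,
    and quadrant is 1-4 from the quadrant judgment of step 4."""
    base = (quadrant - 1) * 90.0  # quadrant offset: 0, 90, 180 or 270 degrees
    rotations = [math.degrees(math.atan(abs(k))) for k in h_slopes]
    rotations += [90.0 - math.degrees(math.atan(abs(k))) for k in v_slopes]
    # All N = H_num + V_num detected lines contribute equally to the mean.
    return base + sum(rotations) / len(rotations)
```

For example, two horizontal lines of slope 1 and one vertical line of slope -1 in the first quadrant each contribute a 45-degree rotation, giving a course of 45 degrees; averaging over every detected line damps the noise of any single Hough segment.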
The above-mentioned embodiments are intended to illustrate the objects, technical solutions and advantages of the present invention in further detail, and it should be understood that the above-mentioned embodiments are only exemplary embodiments of the present invention, and are not intended to limit the present invention, and any modifications, equivalents, improvements and the like made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (6)

1. An unmanned aerial vehicle autonomous landing course identification method, implemented by a computer program running on the unmanned aerial vehicle, characterized by comprising the following steps:
step 1: after the unmanned aerial vehicle flies into the autonomous landing area, detecting and tracking images acquired by a camera at the bottom of the unmanned aerial vehicle in real time, and finding out a target image contained in the images as the input of the step 2;
step 2: performing linear extraction on the target area;
step 3: judging the line types, where the more numerous class of lines are the horizontal lines and the less numerous class are the vertical lines;
step 4: judging the quadrant to which the course angle belongs, based on two conditions: the sign of the horizontal line's slope, and whether the vertical line lies to the left or the right of the horizontal line;
step 5: calculating the inclination angles of all lines with the arctangent formula, deriving each line's rotation angle from its type and quadrant, and taking the weighted average of the rotation angles of all detected lines as the final course angle.
2. The unmanned aerial vehicle autonomous landing course identification method according to claim 1, wherein in step 1 a cascade classifier performs sliding-window classification to find the target image contained in the camera image; a confidence value [formula shown as an image in the original publication] governs autonomous switching between detection and tracking, where p_i is the confidence of the target boxes of the i-th overlap region; the system enters the tracking state if the confidence is large enough and the detection state if it is too small.
3. The unmanned aerial vehicle autonomous landing course identification method according to claim 1, wherein the straight-line extraction of step 2 comprises Gaussian smoothing, binarization, connected-domain detection and removal, and Canny edge detection on the target area, followed by straight-line extraction with the Hough transform.
4. The unmanned aerial vehicle autonomous landing course identification method according to claim 3, wherein when the unmanned aerial vehicle is above the height threshold, step 1 takes the image inside the outer-ring connected domain as the target area, and the connected-domain removal of step 2 removes the outer-ring connected domain, the inner-ring connected domain and the image inside the inner-ring connected domain;
when the unmanned aerial vehicle is below the threshold, step 1 takes the image inside the inner-ring connected domain as the target area, and the connected-domain removal of step 2 removes the inner-ring connected domain.
5. The unmanned aerial vehicle autonomous landing course identification method according to claim 1, wherein in the line-type judgment of step 3 the detected first line serves as the reference, and each remaining line is judged in turn by its included angle with the first line; lines whose included angle is less than 45 degrees form one class and the rest form the other.
6. The unmanned aerial vehicle autonomous landing course identification method according to claim 1, wherein the quadrant judgment of step 4 is specifically: (1) when the endpoint of the vertical line is to the left of the horizontal line and the horizontal line's slope is negative, the course angle lies in the first quadrant, i.e. 0-90 degrees; (2) when the endpoint is to the left and the slope is positive, the second quadrant, i.e. 90-180 degrees; (3) when the endpoint is to the right and the slope is negative, the third quadrant, i.e. 180-270 degrees; (4) when the endpoint is to the right and the slope is positive, the fourth quadrant, i.e. 270-359 degrees.
CN202111135612.8A 2021-09-27 2021-09-27 Unmanned aerial vehicle autonomous landing course recognition method Active CN113867387B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111135612.8A CN113867387B (en) 2021-09-27 2021-09-27 Unmanned aerial vehicle autonomous landing course recognition method


Publications (2)

Publication Number Publication Date
CN113867387A (en) 2021-12-31
CN113867387B CN113867387B (en) 2024-04-12

Family

ID=78991149


Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6782742B1 (en) * 1999-01-18 2004-08-31 Saab Ab Redundant system for the indication of heading and attitude in an aircraft
CN109949361A (en) * 2018-12-16 2019-06-28 内蒙古工业大学 A kind of rotor wing unmanned aerial vehicle Attitude estimation method based on monocular vision positioning
CN110176030A (en) * 2019-05-24 2019-08-27 中国水产科学研究院 A kind of autoegistration method, device and the electronic equipment of unmanned plane image
CN110335293A (en) * 2019-07-12 2019-10-15 东北大学 A kind of long-time method for tracking target based on TLD frame
CN110488848A (en) * 2019-08-23 2019-11-22 中国航空无线电电子研究所 Unmanned plane vision guide it is autonomous drop method and system
CN113029148A (en) * 2021-03-06 2021-06-25 西南交通大学 Inertial navigation indoor positioning method based on course angle accurate correction


Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
张梁, 徐锦法, 夏青元, 于永军: "Ground target feature recognition and position and attitude estimation for unmanned aerial vehicles", Journal of National University of Defense Technology, no. 01, 28 February 2015 *
王小洪: "Research on landing guidance for unmanned aerial vehicles with combined visual features and cooperative target optimization", China Master's Theses Full-text Database, Engineering Science and Technology II, pages 1-38 *
闫明, 杜佩, 王惠林, 高贤娟, 张正, 刘栋: "Ground multi-target localization algorithm for airborne electro-optical systems", Journal of Applied Optics, no. 04 *



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant