CN104808685A - Vision auxiliary device and method for automatic landing of unmanned aerial vehicle - Google Patents


Info

Publication number
CN104808685A
Authority
CN
China
Legal status: Pending (an assumption, not a legal conclusion; Google has not performed a legal analysis)
Application number
CN201510204419.3A
Other languages
Chinese (zh)
Inventor
江晟
贾宏光
厉明
马经纬
李银海
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Changchun Institute of Optics Fine Mechanics and Physics of CAS
Original Assignee
Changchun Institute of Optics Fine Mechanics and Physics of CAS
Application filed by Changchun Institute of Optics Fine Mechanics and Physics of CAS
Priority to CN201510204419.3A
Publication of CN104808685A

Landscapes

  • Traffic Control Systems (AREA)

Abstract

The invention discloses a vision auxiliary device and method for the automatic landing of an unmanned aerial vehicle (UAV), and relates to the field of unmanned aerial vehicles. It addresses the shortcomings of existing vision auxiliary devices, which sense the landing attitude of the UAV by acquiring and recognizing the runway and the horizon through an airborne camera: lacking a multi-scale analysis mechanism, they can hardly meet the landing requirements, their attitude and position calibration is poor, and they provide no deep prediction of the UAV's motion behavior. The vision auxiliary device comprises a two-axis turntable, a turntable motor, a power module, a monocular fixed-focus camera, a vision auxiliary information processing module and an information storage module. The vision auxiliary information processing module comprises a landing trajectory re-planning module, a data comparator, an information filter, an image-based computer vision processor, a position-based computer vision processor, a first feedback correction device, a second feedback correction device, an information fusion parameter extractor and an RS485 communication interface. The vision auxiliary device is simple in structure, low in power consumption, high in efficiency and good in compatibility.

Description

Vision auxiliary device and vision-assisted method for the autonomous landing of an unmanned aerial vehicle
Technical field
The present invention relates to the technical field of unmanned aerial vehicles, and in particular to a vision auxiliary device and a vision-assisted method for the autonomous landing of a UAV.
Background technology
The acquisition of UAV position and attitude information, together with cooperative control, is the core of UAV autonomous landing technology. For a UAV to land smoothly as scheduled after completing its mission, it must be able to perceive the touchdown area from long range and adjust its pitch attitude and heading; after it perceives the approach (the phase in which the aircraft lines up with the runway while descending), its position and attitude information must be obtained in real time and, combined with the heading guidance of the UAV flight control system, the aircraft glides along the planned navigation path; when the flare distance is planned, every flight parameter of the UAV must satisfy the touchdown conditions, after which the engine is shut down and the aircraft lands along the flare trajectory; finally, during the rollout and braking phase, the UAV must track the runway line, avoid running off the runway, and complete braking within the designated area. Accurately obtaining these parameters is therefore the key to realizing this technology.
The vision auxiliary devices currently used during UAV autonomous landing typically acquire and recognize the runway and the horizon with an airborne camera in order to perceive the landing attitude, but they still have several shortcomings. First, environment perception lacks a multi-scale analysis mechanism: the whole descent is a series of continuous operations in which the field-of-view resolution progressively improves and each operational phase requires different key parameters, yet most current vision-aided systems use features of a single scale and can hardly satisfy the actual demands of this process. Second, attitude and position calibration is poor: the calibration of the camera parameters is not well coupled to the UAV platform, relying mostly on off-line calibration or on isolated self-calibration against markers during flight, which fails to exploit the UAV's own onboard equipment and degrades the precision of the extracted vision parameters. Finally, there is no deep prediction of the UAV's motion behavior: current vision auxiliary devices focus on collecting parameters and rarely analyze or mine them in depth, although deeper mining of the parameter data could further improve the precision and effectiveness of vision-assisted landing.
Summary of the invention
To solve the problems of the existing vision auxiliary devices that acquire and recognize the runway and the horizon with an airborne camera, namely the lack of a multi-scale analysis mechanism that makes the landing requirements difficult to meet, the poor attitude and position calibration, and the absence of deep prediction of UAV motion behavior, the present invention provides a vision auxiliary device and a vision-assisted method for UAV autonomous landing.
The technical solution adopted by the present invention is as follows.
The vision auxiliary device for UAV autonomous landing of the present invention comprises:
a two-axis turntable mounted at the nose of the UAV;
a turntable motor connected with the two-axis turntable, for driving the two-axis turntable;
a power module connected with the turntable motor;
a monocular fixed-focus camera, a vision auxiliary information processing module and an information storage module, all connected with the power module; the monocular fixed-focus camera is connected with the vision auxiliary information processing module, which in turn is connected with the information storage module and with the UAV flight control system; the information storage module stores the encoded video information output by the vision auxiliary information processing module.
The turntable motor, the power module, the monocular fixed-focus camera, the vision auxiliary information processing module and the information storage module are all fixed on the two-axis turntable.
The vision auxiliary information processing module comprises: a landing trajectory re-planning module, a data comparator, an information filter, an image-based computer vision processor, a position-based computer vision processor, a first feedback correction device, a second feedback correction device, an information fusion parameter extractor and an RS485 communication interface.
The image-based computer vision processor is connected with the first feedback correction device, and the position-based computer vision processor with the second feedback correction device. The UAV flight control system is connected with the data comparator, the monocular fixed-focus camera, the information filter, the second feedback correction device, the information fusion parameter extractor, the landing trajectory re-planning module and the image-based computer vision processor. The information fusion parameter extractor is connected with the first feedback correction device, the onboard sensors of the UAV, the data comparator, the monocular fixed-focus camera, the information filter and the second feedback correction device. The landing trajectory re-planning module is connected with the onboard sensors, the data comparator, the monocular fixed-focus camera and the information filter. The vision auxiliary information processing module and the UAV flight control system are connected through the RS485 communication interface, which realizes data communication between the two.
Further, the power module supplies 12 V to the turntable motor and the monocular fixed-focus camera.
Further, the power module supplies 5 V to the vision auxiliary information processing module and the information storage module.
Further, the vision auxiliary information processing module uses a quad-core Cortex-A9 as its processing core chip, operating at 5 V.
Further, the information storage module uses a Samsung MB-MP32D storage device, operating at 5 V.
The present invention also provides a vision-assisted method using the vision auxiliary device for UAV autonomous landing, comprising the following steps:
(1) the information database and the historically evaluated data are input into the vision auxiliary information processing module; at the same time the landing trajectory re-planning module uses the knowledge base and the raw information obtained from the onboard sensors of the UAV, the data comparator, the monocular fixed-focus camera and the information filter to correct the landing trajectory of the UAV in real time;
(2) the inertial navigation information, GPS information and starlight information obtained by the onboard sensors are transmitted to the information fusion parameter extractor, together with the reference information obtained by the data comparator, the visual information obtained by the monocular fixed-focus camera and the reference information obtained by the information filter;
(3) the image-based computer vision processor obtains the image information of the UAV and converts it into an image-based visual control signal, which it transmits to the first feedback correction device; the first feedback correction device forwards this signal to the information fusion parameter extractor;
(4) the position-based computer vision processor obtains the position information of the UAV and converts it into a position-based visual control signal, which it transmits to the second feedback correction device; the second feedback correction device forwards this signal to the information fusion parameter extractor;
(5) the information fusion parameter extractor integrates the received inertial navigation, GPS, starlight, reference and visual information; the fused parameters are analyzed, learned and discriminated to parse out the parameters used by the UAV flight control system for flight management control, and the extractor communicates with the autonomous landing flight control module through the RS485 communication interface, assisting the flight control system with the control work assigned to each landing phase;
(6) finally the UAV flight control system uses the information provided by the vision auxiliary device, combines it with nonlinear control and UAV attitude updating, and controls the servos of the UAV to complete all the work of autonomous landing.
Further, in step (1) the raw information refers to: the inertial navigation information, GPS information and starlight information obtained by the onboard sensors, together with the reference information obtained by the data comparator, the visual information obtained by the monocular fixed-focus camera and the reference information obtained by the information filter.
Further, in step (5) the landing phases are perception approach, glide alignment, flare and touchdown, and rollout braking.
The beneficial effects of the present invention are as follows. The relative position and attitude angles of a UAV are navigation parameters indispensable to its stable flight and safe landing. The object of the invention is to exploit visual information during landing: through analysis and processing it acquires position and attitude information in real time, realizes tracking and prediction of the landing trajectory, and assists the flight control system with the perception approach, glide alignment and rollout braking tasks of autonomous landing, while being simple in structure and offering fast processing and high precision.
Aiming at the problems that a vision auxiliary device must solve during UAV autonomous landing, the present invention constructs a corresponding device structure and designs an overall scheme: during autonomous landing, the camera and the information processing system collect and calibrate the UAV's landing position and attitude parameters, predict the flight trajectory, provide state identification, and assist the flight control system in completing the perception approach, glide alignment and landing braking tasks.
Through a fast and efficient vision information processing system, the present invention realizes autonomous collection, analysis and cooperative control of the information of each landing phase; it is intelligent and autonomous, and can complete parameter collection and analysis without the support of external equipment.
Through coupling-based camera calibration and parameter extraction combined with multi-scale feature detection, the present invention collects position and attitude information throughout the autonomous landing process, and can simultaneously obtain, analyze and predict the motion trajectory of the UAV, assisting the flight control system with the perception approach, glide alignment and rollout braking functions.
The present invention can be coupled with the UAV flight control system to improve the precision of the obtained UAV position and attitude; it can also, when external communication equipment is jammed or fails, independently perceive the environment and obtain the parameters the flight control system needs to land.
The present invention is simple in structure and low in power consumption; it can be mounted online at the nose of the UAV and, after its interior and exterior parameters are calibrated, connected to the flight control system through various communication interfaces, realizing efficient real-time parameter collection and accurate state judgement, with the advantage of good compatibility.
Accompanying drawing explanation
Fig. 1 is a block diagram of the vision auxiliary landing device of the present invention.
Fig. 2 is a process schematic of the vision-assisted method for UAV autonomous landing of the present invention.
Fig. 3 is a schematic diagram of the division of the vision-assisted landing into phases.
Fig. 4 is a schematic diagram of the yaw steering range during vision-assisted landing.
Fig. 5 shows the calibration effect of the flare and touchdown phase.
Embodiment
The present invention is described in further detail below with reference to the accompanying drawings.
As shown in Fig. 1, the vision auxiliary device for UAV autonomous landing of the present invention consists mainly of a two-axis turntable, a turntable motor, a power module, a monocular fixed-focus camera, a vision auxiliary information processing module and an information storage module. The two-axis turntable is mounted at the nose of the UAV, and the turntable motor, power module, monocular fixed-focus camera, vision auxiliary information processing module and information storage module are all fixed on it. The turntable motor is connected with the two-axis turntable and drives it. The power module is connected with the turntable motor, the monocular fixed-focus camera, the vision auxiliary information processing module and the information storage module and supplies them with power: 12 V for the turntable motor and the camera, and 5 V for the processing and storage modules. The monocular fixed-focus camera is connected with the vision auxiliary information processing module, which is connected with the information storage module; the storage module stores the encoded video information output by the processing module. The processing module carries an RS485 communication interface through which it exchanges data with the UAV flight control system. Every UAV carries a flight control system; it is prior art, consists mainly of an autonomous landing flight control module, and realizes the nonlinear control of the UAV's servos and the updating of the UAV state.
As shown in Fig. 2, the vision auxiliary information processing module consists mainly of the landing trajectory re-planning module, the data comparator, the information filter, the image-based computer vision processor, the position-based computer vision processor, the first feedback correction device, the second feedback correction device and the information fusion parameter extractor. The image-based computer vision processor is connected with the first feedback correction device, and the position-based computer vision processor with the second. The UAV flight control system is connected with the data comparator, the monocular fixed-focus camera, the information filter, the second feedback correction device, the information fusion parameter extractor, the landing trajectory re-planning module and the image-based computer vision processor. The information fusion parameter extractor is connected with the first feedback correction device, the onboard sensors, the data comparator, the monocular fixed-focus camera, the information filter and the second feedback correction device. The landing trajectory re-planning module is connected with the onboard sensors, the data comparator, the monocular fixed-focus camera and the information filter.
In the present embodiment, the vision auxiliary information processing module uses a quad-core Cortex-A9 as its processing core chip at 5 V, and the information storage module uses a Samsung MB-MP32D storage device at 5 V.
As shown in Fig. 2, the vision-assisted method for UAV autonomous landing of the present invention is realized on the basis of the above vision auxiliary device and comprises the following steps:
(1) the information database and the historically evaluated data are input into the vision auxiliary information processing module; at the same time the landing trajectory re-planning module uses the knowledge base and the raw information obtained from the onboard sensors, the data comparator, the monocular fixed-focus camera and the information filter (the raw information being the inertial navigation information, GPS information and starlight information obtained by the onboard sensors, together with the reference information obtained by the data comparator, the visual information obtained by the camera and the reference information obtained by the information filter) to correct the landing trajectory of the UAV in real time;
(2) the inertial navigation information, GPS information and starlight information obtained by the onboard sensors are transmitted to the information fusion parameter extractor, together with the reference information obtained by the data comparator, the visual information obtained by the monocular fixed-focus camera and the reference information obtained by the information filter;
(3) the image-based computer vision processor obtains the image information of the UAV and converts it into an image-based visual control signal, which it transmits to the first feedback correction device; the first feedback correction device forwards this signal to the information fusion parameter extractor;
(4) the position-based computer vision processor obtains the position information of the UAV and converts it into a position-based visual control signal, which it transmits to the second feedback correction device; the second feedback correction device forwards this signal to the information fusion parameter extractor;
(5) the information fusion parameter extractor integrates the received inertial navigation, GPS, starlight, reference and visual information; the fused parameters are analyzed, learned and discriminated to parse out the key parameters used by the UAV flight control system for flight management control (the flight parameters of the UAV: angular velocity, acceleration, magnetic vector, position, velocity, barometric altitude and so on); the extractor communicates with the autonomous landing flight control module through the RS485 communication interface and assists the flight control system with the control work assigned to each landing phase;
(6) finally the UAV flight control system uses the information provided by the vision auxiliary device, combines it with nonlinear control and UAV attitude updating, and controls the servos of the UAV to complete all the work of autonomous landing.
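Step (5) does not fix a particular fusion algorithm. As a hedged illustration only, and not the patent's method, inverse-variance weighting is one common way to combine independent estimates of the same quantity, such as GPS, inertial and vision-derived altitude; every name and number below is hypothetical:

```python
def fuse_estimates(estimates):
    """Inverse-variance weighted fusion of independent scalar estimates.

    estimates: list of (value, variance) pairs, e.g. from GPS, inertial
    navigation and the vision processors. Returns (fused value, fused
    variance). A generic sketch, not the patent's fusion algorithm.
    """
    weights = [1.0 / var for _, var in estimates]
    total = sum(weights)
    fused = sum(v * w for (v, _), w in zip(estimates, weights)) / total
    return fused, 1.0 / total

# Hypothetical altitude estimates: GPS (noisy) and vision (tighter).
fused_alt, fused_var = fuse_estimates([(102.0, 25.0), (98.0, 4.0)])
```

The fused value is pulled toward the lower-variance (vision) estimate, and the fused variance is smaller than either input, which is the point of fusing the sensor streams.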
The division of the autonomous landing into phases by the vision auxiliary device for UAV autonomous landing of the present invention is described below, together with a detailed description of the device and the vision-assisted method in each phase.
(1) Phase division
By analyzing the characteristics of UAV autonomous landing, the landing process is divided into phases, the key parameters of each phase are determined, and the multi-scale features are designed accordingly. As shown in Fig. 3, the present invention, in accordance with the different tasks in the autonomous landing process, divides it into four phases: perception approach, glide alignment, flare and touchdown, and rollout braking.
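The four phases above form a simple forward progression; as an illustrative sketch only (the enum and function names are my own, not the patent's), they can be modelled as:

```python
from enum import Enum

class LandingPhase(Enum):
    """The four landing phases of Fig. 3, in order."""
    PERCEPTION_APPROACH = 1   # detect runway, line up from long range
    GLIDE_ALIGNMENT = 2       # track the glide path, correct yaw/crosswind
    FLARE_AND_TOUCHDOWN = 3   # level off along the flare curve
    ROLLOUT_BRAKING = 4       # track the runway line and brake

def next_phase(phase):
    """Advance to the following landing phase (terminal at rollout)."""
    if phase is LandingPhase.ROLLOUT_BRAKING:
        return phase
    return LandingPhase(phase.value + 1)
```

The flight control system would hold each phase until its exit conditions (decision height, alignment error, touchdown) are met before advancing.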
(2) Perception approach
The perception approach phase mainly involves selecting a camera suited to the perception characteristics and calibrating the camera's interior orientation elements.
1) Camera selection for the perception characteristics
After accepting the return instruction, the UAV begins to descend from its cruising altitude. When it drops to the predetermined altitude or viewing distance (for example a height of 600 m and a viewing distance of 6000 m), it starts to scan its field of view and obtain the runway bearing so as to begin the approach. At this point the runway line is recognized with low precision, and the multi-scale features used to obtain the UAV pose are the horizon line and the overall image of the airfield. As the resolving power improves, the vision auxiliary device progressively obtains the course beacon parameters to correct the runway bearing, and notifies the UAV flight control system to reduce the flying speed gradually, without letting the glide speed fall below 1.3 to 1.5 times the stall speed of the UAV.
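The glide-speed floor stated above (1.3 to 1.5 times the stall speed) can be written as a small check. This is only a sketch of that constraint, with hypothetical function names and example numbers:

```python
def min_approach_speed(stall_speed, margin=1.3):
    """Lowest permissible glide speed; the text keeps the glide speed
    at 1.3 to 1.5 times the UAV's stall speed (margin in [1.3, 1.5])."""
    return margin * stall_speed

def speed_ok(speed, stall_speed, margin=1.3):
    """True if the current speed respects the stall-margin floor."""
    return speed >= min_approach_speed(stall_speed, margin)
```

For example, with a hypothetical 20 m/s stall speed, a 26 m/s glide is acceptable at the 1.3 margin, while 25 m/s is not.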
The autonomous landing process requires the device to start operating and perceptually identify the airfield at a viewing distance of no more than 8000 m and a height in the range of 400 m to 1000 m, and to extract the key parameters by multi-scale feature detection, assisting the flight control system until the UAV completes the landing. The angle between the UAV's glide path and the ground, the glide angle, is 3° to 5°; considering the projective transformation, the size of the runway in the vertical projection direction perpendicular to the glide path can usually satisfy the perspective conditions.
The present invention selects a monocular fixed-focus camera, designed for the horizontal projection and the demands of the different landing phases. In the perception approach phase the approach requirement is the most demanding; at that point:
f = (8×10⁻³ × 6 × 7500) / 20 = 18 mm        (1)
This calculation gives a focal length of 18 mm, with a corresponding field of view no greater than 16.2° × 12.2°, which satisfies the image processing requirements of the whole subsequent vision-assisted landing. According to the optical system design and the image processing requirements, the field of view should be as small as possible in order to improve the resolving power and hence the recognition precision; calculation shows that a minimum field of view of 7.6° × 5.7° guarantees that the target stays within the identifiable region. To increase the calculation margin, a lens with an 8° × 6° field of view and a corresponding focal length of 36 mm is chosen for image acquisition; the image size calculated from the imaging model is then 18 × 13 pixels, which satisfies the image processing requirements.
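Formula (1) can be reproduced under one plausible reading of its constants: an 8 µm pixel pitch, a 6-pixel minimum target span, a 7500 m viewing distance and a 20 m target size. That interpretation is an assumption; only the arithmetic itself is taken from the text:

```python
def required_focal_length_mm(pixel_pitch_mm, min_pixels, distance_m, target_size_m):
    """Pinhole-model focal length needed so a target of target_size_m at
    distance_m spans at least min_pixels on a sensor with the given
    pixel pitch: f = p * k * D / W (the reading of formula (1) is an
    assumption).
    """
    return pixel_pitch_mm * min_pixels * distance_m / target_size_m

# Reproduces formula (1): f = (8e-3 mm * 6 * 7500 m) / 20 m = 18 mm.
f = required_focal_length_mm(8e-3, 6, 7500, 20)
```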
2) Interior orientation element calibration
For the vision auxiliary device to realize the perception approach, the relation between the pixel positions of the monocular fixed-focus camera image and the positions in the perceived scene must be established. Concretely, according to the camera model, the model parameters of the monocular fixed-focus camera are solved and its interior orientation elements are calibrated, after which the parameter coordinates corresponding to each feature can be obtained. The interior orientation calibration algorithm of the present invention is:
α(n) = arctan( n·sin α / (N − n + n·cos α) )        (2)
β(m) = arctan( m·sin β / (M − m + m·cos β) )        (3)
In formula (2), n is the image pixel coordinate in the X direction, N is the total number of image pixels in the X direction, α is the view angle of the X-direction cross-section, and α(n) is the angle from the X-direction origin to the n-th pixel. In formula (3), m is the image pixel coordinate in the Y direction, M is the total number of image pixels in the Y direction, β is the view angle of the Y-direction cross-section, and β(m) is the angle from the Y-direction origin to the m-th pixel.
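The pixel-to-angle mapping of formulas (2) and (3) can be sketched directly. The function below is an illustration of the stated formula; the pixel count and view angle in the example are assumed values:

```python
import math

def pixel_to_angle(k, total, view_angle_rad):
    """Angle from the image origin to the k-th pixel along one axis,
    following formulas (2)/(3):

        angle(k) = arctan( k*sin(a) / (total - k + k*cos(a)) )

    where `total` is the pixel count and `view_angle_rad` the view
    angle of that axis's cross-section.
    """
    return math.atan2(k * math.sin(view_angle_rad),
                      total - k + k * math.cos(view_angle_rad))

# Sanity checks with an assumed 480-pixel axis and 6-degree view angle:
# the origin pixel maps to 0 and the last pixel maps to the full angle.
beta = math.radians(6.0)
a0 = pixel_to_angle(0, 480, beta)
a_end = pixel_to_angle(480, 480, beta)
```

At k = total the formula reduces to arctan(tan β) = β, so the mapping spans exactly the calibrated view angle.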
(3) Glide alignment
The glide alignment phase mainly involves multi-scale feature extraction and attitude information acquisition.
1) Multi-scale feature extraction: once the UAV completes the perception approach task, its yaw direction must be kept within a controllable range. As shown in Fig. 4, the height-keeping system in the UAV flight control system is disconnected, and the landing navigation signal obtained in the perception approach assists the flight control system in generating the navigation path and beginning the glide, with a corresponding glide angle of 3° to 5°. As the field-of-view resolving power keeps improving, multi-scale feature detection gradually yields accurate course beacon parameters, now dominated by the airfield runway line; combined with the flight control system's automatic correction of the UAV's yaw angle and of the trajectory deviation caused by crosswind interference, the airspeed vector is lined up with the runway centre line and the speed is reduced, completing the course alignment. As the viewing distance closes further, the airfield image grows larger and the runway line features inside it become gradually clearer, providing more accurate yaw information.
2) Attitude information acquisition: to extract the position and attitude information of the UAV, the horizon line and the airfield runway line must be detected by combining multi-scale feature extraction techniques.
A, Pitch and roll attitude parameter acquisition based on the fast Hough transform (a technique for identifying geometric shapes in an image)
The present invention uses the fast Hough transform to detect the multi-scale key line features in the image and so extract the pitch and roll attitude parameters. The image is first binarized, then edge detection is applied; the Hough transform is performed on the edge detection result to obtain the line detection result, which is analyzed according to the multi-scale features; finally the pitch and roll attitude parameters are obtained using the calibration parameters.
The concrete computation of the pitch and roll attitude parameters is as follows:
I, the established monocular fixed-focus camera reads in a 256-level grayscale image, which is preprocessed;
II, the size of the Hough transform accumulator is determined from the multi-scale feature analysis and the image size, and memory is allocated;
III, the image is transformed, the Hough transform is performed, and the result is stored in the Hough transform accumulator;
IV, a threshold is obtained by an adaptive algorithm from the prior information and the weather, and the accumulator cells whose accumulated values do not satisfy the threshold are reset;
V, the cell with the maximum accumulated value in the Hough transform accumulator is searched for;
VI, through a record-query mechanism, the lines in the image that can be used for pitch attitude detection are found quickly;
VII, the pitch and roll attitude parameters of the UAV are obtained through calibration.
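Steps II to V above can be sketched with a minimal, unoptimized Hough accumulator. This toy version omits the preprocessing, adaptive thresholding and record-query mechanism; the synthetic edge and all names are assumptions for illustration:

```python
import math

def hough_lines(points, width, height, n_theta=180):
    """Minimal Hough transform over edge points [(x, y), ...].

    Votes are accumulated in (theta, rho) space (steps II-III), and the
    cell with the most votes is returned (step V). Returns
    (theta_degrees, rho).
    """
    acc = {}
    for x, y in points:
        for t in range(n_theta):
            theta = math.radians(t)
            rho = int(round(x * math.cos(theta) + y * math.sin(theta)))
            acc[(t, rho)] = acc.get((t, rho), 0) + 1
    return max(acc, key=acc.get)

# A horizontal edge y = 5 (an artificial "horizon"): its normal is
# vertical, so the accumulator peak should sit at theta = 90 deg, rho = 5.
edge = [(x, 5) for x in range(40)]
theta_deg, rho = hough_lines(edge, 40, 10)
```

A real implementation would vote into a preallocated 2-D array sized from the image diagonal, as step II describes, rather than a dictionary.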
B, Yaw attitude acquisition based on multi-scale gradient matrix features
To obtain the yaw angle of the UAV, the present invention detects the airfield runway line through multi-scale gradient matrix features and extracts the corresponding information. First an adaptive algorithm is used to obtain the gradient matrix T; then the gradient sum of each row is computed, giving the gradient sum S of row n, and half of the maximum gradient sum S_max is used as a threshold to reject invalid gradient sums. The remaining gradients are then merged into connected regions, avoiding over-screening, and the connected regions are screened again: features within a certain length range are selected for detection and connection, and the features that satisfy the conditions undergo pattern recognition. The course beacon parameters can then be obtained by combining the perspective transform with the calibration algorithm, realizing the yaw angle extraction.
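The row-gradient screening described above, rejecting rows whose gradient sum falls below half of S_max, can be sketched on a toy image. This illustrates that single step only, not the full yaw-extraction pipeline, and the image values are invented:

```python
def candidate_rows(image):
    """Per-row horizontal-gradient screening, as in the yaw extraction:
    sum |I[r][c+1] - I[r][c]| over each row, then keep the rows whose
    gradient sum exceeds half of the maximum S_max.
    """
    sums = [sum(abs(row[c + 1] - row[c]) for c in range(len(row) - 1))
            for row in image]
    s_max = max(sums)
    return [r for r, s in enumerate(sums) if s > s_max / 2]

# Toy grayscale image: row 1 holds a strong vertical edge (a runway
# line mark), the other rows are nearly flat.
img = [[0, 0, 0, 0],
       [0, 0, 9, 9],
       [0, 0, 0, 1]]
rows = candidate_rows(img)
```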
C. Position information acquisition combined with the pixel-angle map
After the pitch and yaw attitude information is obtained, the real information parameters are obtained through the perspective transform equations of the pixel-angle map, shown in formulas (4) and (5). For an arbitrary pixel in the actual scene, the two-dimensional pixel angle corresponding to image pixel coordinates (n, m) is obtained from the calibration parameters, and the scene coordinate can then be calculated from the height H and tilt angle δ of the monocular fixed-focus camera (δ being the angle between the optical axis and the vertical); for the Y direction:
Y_n(m) = H · tan[ δ + arctan( m·sinβ / (M − m + m·cosβ) ) ]    (5)
In formulas (4) and (5), H and δ are respectively the height and tilt angle of the monocular fixed-focus camera; n and m are the image pixel coordinates in the X and Y directions; N and M are the total pixel counts of the image in the X and Y directions; α and β are the view angles of the X- and Y-direction cross sections; α(n) is the angle from the X-direction origin to the n-th pixel, and β(m) is the angle from the Y-direction origin to the m-th pixel.
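Formula (5) can be evaluated directly; the sketch below is illustrative, with the function name and the camera parameters (30 m height, 60° tilt, 6° vertical view angle, 480 rows) chosen as assumptions:

```python
import math

def ground_y(m, H, delta, beta, M):
    """Formula (5): ground-plane Y coordinate of the scene point imaged
    at row pixel m, for a camera at height H, tilt angle delta (optical
    axis vs. vertical), Y-direction view angle beta over M pixels."""
    return H * math.tan(delta + math.atan(
        m * math.sin(beta) / (M - m + m * math.cos(beta))))

# camera 30 m up, tilted 60 deg from vertical, 6 deg vertical view, 480 rows
y0 = ground_y(0, 30.0, math.radians(60.0), math.radians(6.0), 480)
```

At m = 0 the arctan term vanishes and the formula reduces to H·tan δ, as expected; rows further down the image (larger m) map to more distant ground points.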
(4) Flare and landing
The flare and landing phase mainly involves multi-level two-dimensional code positioning analysis, and trajectory tracking and state identification.
Combining the glide alignment with prior information, the UAV landing speed at this stage is judged to be in the range of 45 m/s to 70 m/s, descending along a glide angle of about −3°, which determines the corresponding descent rate. The glide angle is then further reduced so that the UAV trajectory flares along a curve; the calibrated effect of the flare landing is shown in Figure 5.
The flare decision height suitable for the UAV of the present invention is 20 m to 40 m. Once the UAV detects the flare decision height, the UAV flight control system automatically increases the angle of attack to bend the UAV trajectory, finally making the airspeed vector parallel to the ground to complete the flare. The UAV then touches down on the runway and brakes during rollout, with a guaranteed braking distance of ≤ 500 m.
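The geometric relations above are simple enough to sketch; the function names are illustrative, and the descent rate follows from the approach speed and glide angle stated in the text:

```python
import math

def descent_rate(speed_mps, glide_angle_deg):
    """Vertical descent rate implied by the approach speed and glide angle."""
    return speed_mps * math.sin(math.radians(abs(glide_angle_deg)))

def should_flare(height_m, lo=20.0, hi=40.0):
    """Flare decision window of 20-40 m from the text: inside it the flight
    controller starts increasing the angle of attack to bend the trajectory."""
    return lo <= height_m <= hi

rate = descent_rate(45.0, -3.0)   # ~2.4 m/s at the low end of the speed range
```

At 45-70 m/s and a −3° glide angle this gives a descent rate of roughly 2.4-3.7 m/s before the flare reduces it toward zero.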
A. Multi-level two-dimensional code positioning analysis
During the flare and landing phase, the UAV's viewing distance decreases rapidly during descent, so the ground image continually expands until it fills the field of view, and the features transform accordingly; multi-scale features must therefore be designed specifically to adapt to this rapid expansion. At an initial flare height of 30 m, with a monocular fixed-focus camera having an 8°×6° lens and 640×480 resolution, the glide angle is relatively large and the corresponding field of view covers about 17 m × 7 m. The runway fills the field of view, so features inside the runway must be designed; at the same time, to allow switching between feature levels, the maximum spacing of the airfield's double course lines should not exceed 7 m. Combining the characteristics of two-dimensional codes, the runway is arranged accordingly; since the features must withstand 360° rotation, scaling, and translation, the QR code is chosen as the reference. Its main feature settings include the following:
I. Position detection patterns and their separators: used to locate the QR code;
II. Alignment patterns: compensate for image distortion and the like by assisting correction;
III. Timing patterns: define the QR code grid so that the identification information can be arranged;
IV. Format information: sets the error correction level of the QR code;
V. Data area: a binary black-and-white grid encoding the content; by interpreting the content, the positioning analysis of the UAV is realized;
VI. Error correction codewords: correct errors caused by damage to the QR code.
The present invention considers that during the UAV descent the field of view gradually narrows to a 1.5 m range, so corresponding multi-level QR code images must be arranged within this scale range. The QR code cell size is set to 5 cm, with an overall size of 105 cm, which guarantees that throughout the descent the features never fill the field of view and cause recognition failure. Through the multi-level QR code positioning analysis, the current absolute position of the UAV is obtained, and the parameters from the image correction process are used to obtain the UAV pose parameters.
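A hypothetical sketch of the final positioning step, assuming a near-nadir view over flat ground, a QR payload that encodes the marker's surveyed ground position, and illustrative function and parameter names (the patent's actual correction and pose computation are not reproduced here):

```python
def uav_position_from_qr(payload_xy, qr_center_px, qr_size_px,
                         qr_size_m, img_center_px=(320, 240)):
    """The decoded QR payload gives the marker's ground position; the
    marker's pixel size gives a metres-per-pixel scale, and the offset
    of the image centre from the marker centre gives the UAV's
    horizontal position relative to the marker."""
    scale = qr_size_m / qr_size_px            # metres per pixel
    dx = (img_center_px[0] - qr_center_px[0]) * scale
    dy = (img_center_px[1] - qr_center_px[1]) * scale
    return payload_xy[0] + dx, payload_xy[1] + dy

# marker at (100, 50) m, 1.05 m wide, seen 210 px wide, centred at (300, 260)
x, y = uav_position_from_qr((100.0, 50.0), (300, 260), 210, 1.05)
```

The 1.05 m marker size matches the 105 cm overall QR size stated in the text.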
B. Trajectory tracking and state identification
Through the multi-level QR code positioning analysis, the present invention registers the UAV position and attitude information in the spatial coordinate system, and tracks the UAV motion trajectory using the UAV coordinate C_k, velocity V_k, state U_k, tightness S_k (i.e., the distribution rate of existence), and tightness change ∇S_k, so as to describe its flight state T_k. The state feature vector at time k can be expressed as:
T_k = (C_k, V_k, U_k, S_k, ∇S_k)    (6)
Under normal circumstances, the UAV moves within the predetermined region at a speed within a certain range, and the boundedness of its acceleration limits how much the position can change between two adjacent observation instants. Therefore, after the UAV stably appears in the detection region, the UAV coordinates at the first two observation instants can be used to determine the initial velocity of the moving target, that is:
V_x,0 = x_c,1 − x_c,0;  V_y,0 = y_c,1 − y_c,0;  V_z,0 = z_c,1 − z_c,0    (7)
In formula (7), V_x,0, V_y,0, and V_z,0 are respectively the initial velocities in the x, y, and z directions, the UAV coordinates at the first two observation instants being C_0 = (x_c,0, y_c,0, z_c,0) and C_1 = (x_c,1, y_c,1, z_c,1). Meanwhile, owing to the relative stability of the motion, the tightness of the UAV state cannot change greatly between two consecutive observation instants, and the state tightness is set according to this principle. After time k, the estimates of the UAV's current trajectory and state, the observed current state features, the possible states and their distribution, and the estimated state features from the previous observation instant are combined, and the best match is selected according to a minimum-matching-distance criterion, thereby realizing identification of the UAV trajectory and state.
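Formula (7) and the minimum-matching-distance selection can be sketched as follows; the constant-velocity prediction and a unit observation interval are simplifying assumptions, and only the coordinate component C_k of the state vector is matched here:

```python
import math

def initial_velocity(c0, c1):
    """Formula (7): per-axis initial velocity from the first two observed
    positions (unit observation interval assumed)."""
    return tuple(b - a for a, b in zip(c0, c1))

def best_match(predicted, candidates):
    """Minimum-matching-distance criterion: pick the candidate observation
    closest to the predicted position."""
    return min(candidates, key=lambda c: math.dist(c, predicted))

c0, c1 = (0.0, 0.0, 30.0), (2.0, 0.0, 29.0)
v0 = initial_velocity(c0, c1)
predicted = tuple(p + v for p, v in zip(c1, v0))  # constant-velocity step
match = best_match(predicted, [(10.0, 5.0, 20.0), (4.1, 0.2, 27.9)])
```

In the full scheme the tightness S_k, its change ∇S_k, and the state U_k would enter the matching distance alongside the coordinates.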
(5) Rollout braking function
After the UAV touches down, the vision auxiliary device of the present invention captures the airfield runway and uses the adaptive algorithm to extract the multi-scale gradient matrix features of the runway; combined with the calibration parameters, it detects whether the UAV is rolling along the runway. The key features of the rollout region are then compared to analyze whether the UAV's rollout braking phase can be completed within the designated area, preventing it from running off the runway. At the same time, the brake system within the UAV flight control system is controlled so that the braking process is smoother, reliable, and stable.
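The ≤ 500 m braking-distance requirement stated earlier can be checked with elementary kinematics; the constant-deceleration model and function name below are assumptions for illustration:

```python
def braking_ok(v_touchdown_mps, decel_mps2, limit_m=500.0):
    """Constant-deceleration rollout check: the stopping distance
    v^2 / (2a) must fit within the designated braking area."""
    return v_touchdown_mps ** 2 / (2.0 * decel_mps2) <= limit_m
```

For example, at 2.5 m/s² a 45 m/s touchdown stops in about 405 m (within bound), while 70 m/s would need about 980 m (out of bound).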

Claims (8)

1. A vision auxiliary device for autonomous landing of an unmanned aerial vehicle (UAV), characterized by comprising:
a two-axis turntable mounted at the nose position of the UAV;
a turntable motor connected with the two-axis turntable, for driving the two-axis turntable to move;
a power module connected with the turntable motor;
a monocular fixed-focus camera, a vision auxiliary information processing module, and an information storage module, each connected with the power module; the monocular fixed-focus camera is connected with the vision auxiliary information processing module; the vision auxiliary information processing module is connected with the information storage module and with the UAV flight control system, respectively; and the information storage module stores the coded video information output by the vision auxiliary information processing module;
wherein the turntable motor, the power module, the monocular fixed-focus camera, the vision auxiliary information processing module, and the information storage module are all fixed on the two-axis turntable;
the vision auxiliary information processing module comprises: a landing trajectory re-planning module, a data comparator, an information filter, a computer vision processor based on image control, a computer vision processor based on position control, a first feedback corrector, a second feedback corrector, an information fusion parameter extractor, and an RS485 communication interface;
the computer vision processor based on image control is connected with the first feedback corrector, and the computer vision processor based on position control is connected with the second feedback corrector; the UAV flight control system is connected with the data comparator, the monocular fixed-focus camera, the information filter, the second feedback corrector, the information fusion parameter extractor, the landing trajectory re-planning module, and the computer vision processor based on image control; the information fusion parameter extractor is connected with the first feedback corrector, the UAV onboard sensors, the data comparator, the monocular fixed-focus camera, the information filter, and the second feedback corrector, respectively; the landing trajectory re-planning module is connected with the UAV onboard sensors, the data comparator, the monocular fixed-focus camera, and the information filter, respectively; and the vision auxiliary information processing module is connected with the UAV flight control system through the RS485 communication interface to realize data communication between the two.
2. The vision auxiliary device for autonomous UAV landing according to claim 1, characterized in that the power module supplies 12 V power to the turntable motor and the monocular fixed-focus camera.
3. The vision auxiliary device for autonomous UAV landing according to claim 1, characterized in that the power module supplies 5 V power to the vision auxiliary information processing module and the information storage module.
4. The vision auxiliary device for autonomous UAV landing according to claim 1, characterized in that the vision auxiliary information processing module adopts a quad-core Cortex-A9 as its core processing chip, operating at 5 V.
5. The vision auxiliary device for autonomous UAV landing according to claim 1, characterized in that the information storage module adopts a Samsung MB-MP32D for storage, operating at 5 V.
6. A vision auxiliary method using the vision auxiliary device for autonomous UAV landing according to claim 1, characterized in that the method comprises the following steps:
(1) inputting the information database and the historically evaluated data into the vision auxiliary information processing module, while the landing trajectory re-planning module uses the knowledge base and the raw information obtained from the UAV onboard sensors, the data comparator, the monocular fixed-focus camera, and the information filter to correct the UAV landing trajectory in real time;
(2) transmitting the inertial navigation information, GPS information, and starlight information obtained by the UAV onboard sensors to the information fusion parameter extractor, while the reference information obtained by the data comparator, the visual information obtained by the monocular fixed-focus camera, and the reference information obtained by the information filter are also transmitted to the information fusion parameter extractor;
(3) the computer vision processor based on image control obtains the UAV image information and converts it into an image-based visual control signal, and transmits this signal to the first feedback corrector, which in turn transmits it to the information fusion parameter extractor;
(4) the computer vision processor based on position control obtains the UAV position information and converts it into a position-based visual control signal, and transmits this signal to the second feedback corrector, which in turn transmits it to the information fusion parameter extractor;
(5) the information fusion parameter extractor integrates the received inertial navigation information, GPS information, starlight information, visual information, and the two sets of reference information; the fused parameters are analyzed, learned from, and discriminated, so as to parse out the parameters used by the UAV flight control system for flight management control; communication with the UAV autonomous landing flight control module is carried out through the RS485 communication interface, assisting the UAV flight control system in dividing the corresponding control work among the different landing phases;
(6) finally, the UAV flight control system uses the information provided by the vision auxiliary device, combined with nonlinear control and UAV attitude updating, to control the UAV's servos to complete all the work of autonomous UAV landing.
7. The vision auxiliary method for the vision auxiliary device for autonomous UAV landing according to claim 6, characterized in that in step (1) the raw information refers to: the inertial navigation information, GPS information, and starlight information obtained by the UAV onboard sensors, together with the reference information obtained by the data comparator, the visual information obtained by the monocular fixed-focus camera, and the reference information obtained by the information filter.
8. The vision auxiliary method for the vision auxiliary device for autonomous UAV landing according to claim 6, characterized in that in step (5) the landing phases are divided into: perception and approach, glide alignment, flare and landing, and rollout braking.
CN201510204419.3A 2015-04-27 2015-04-27 Vision auxiliary device and method for automatic landing of unmanned aerial vehicle Pending CN104808685A (en)




Legal Events

Code: Description
C06, PB01: Publication
EXSB, SE01: Entry into force of request for substantive examination
WD01: Invention patent application deemed withdrawn after publication

Application publication date: 20150729