CN106598075A - System and method for tracking control of unmanned aerial vehicle based on luminescence object identification - Google Patents

System and method for tracking control of unmanned aerial vehicle based on luminescence object identification

Info

Publication number
CN106598075A
CN106598075A (application CN201610578483.2A)
Authority
CN
China
Prior art keywords
luminous target
unmanned plane
luminous
target
movable information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201610578483.2A
Other languages
Chinese (zh)
Inventor
王军
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Wingsland Technology Co Ltd
Original Assignee
Shenzhen Wingsland Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Wingsland Technology Co Ltd filed Critical Shenzhen Wingsland Technology Co Ltd
Priority to CN201610578483.2A priority Critical patent/CN106598075A/en
Priority to PCT/CN2016/097249 priority patent/WO2018014420A1/en
Publication of CN106598075A publication Critical patent/CN106598075A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/12Target-seeking control
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/57Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Studio Devices (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The present invention provides a system and method for UAV tracking control based on luminous target recognition. The system comprises a luminous target and an unmanned aerial vehicle (UAV). The UAV comprises: at least two cameras configured to capture real-time images of the luminous target; an arithmetic processing unit connected with the cameras and configured to recognize the luminous target in the images captured by the cameras and calculate the motion information of the luminous target; and a flight control unit connected with the arithmetic processing unit and configured to control the UAV, according to the motion information of the luminous target, to follow the luminous target in flight. The system and method solve the problems of instability, system complexity, high cost and low precision. Beyond accurate tracking, the scheme also serves as a novel control mode that allows the UAV to realize multiple intelligent functions by adding only one set of devices, and is therefore well suited to popular aerial-photography entertainment UAVs.

Description

A UAV tracking control system and method based on luminous target recognition
Technical field
The present invention relates to unmanned aerial vehicle (UAV) technology, and more particularly to a UAV tracking control system and method based on luminous target recognition.
Background technology
Automatic tracking has in the past two years become an extremely popular function on UAVs, especially on aerial-photography entertainment UAVs. With this function a UAV becomes more intelligent: it can automatically track a specified target and realize functions such as follow-shooting and interaction. It is at present an indispensable intelligent function of entertainment UAVs.
There are currently two main ways to realize this function. The first is based on GPS positioning: the tracked object must carry a module with GPS and a communication function with the UAV, such as a GPS-equipped remote controller, a smartphone, or a GPS smart wristband. The communication module sends the GPS position of the object to the UAV in real time, and the UAV realizes following by means of this position information. Its advantages are mature, simple technology and moderate cost.
The second way is target tracking based on binocular vision recognition. Its core is to capture the tracked target in real time with dual cameras on the UAV and then determine, through a complicated set of algorithms, the distance between the target and the UAV as well as the target's amount and direction of motion; the UAV control system obtains this information and adjusts heading and flight speed in real time to realize following. The advantages of this technique are that it is practical both indoors and outdoors, that the tracked target need not wear any device, and that it can also be used for obstacle avoidance.
Each of the two existing techniques has advantages, but also many shortcomings. The GPS-based mode can only be used in the open air with a good, unobstructed GPS signal and is easily disturbed by the environment; in addition, GPS positioning precision is limited, so its tracking accuracy is not high, with an error range generally above 1 meter. The binocular-recognition mode has a very large computational load, requires a powerful processor, and consumes more power. Moreover, the technology is not yet mature: it is easily disturbed by the environment and by light, its tracking accuracy is not very high, target loss easily occurs, and its cost is very high, making it unsuitable for low-cost entertainment aerial-photography UAVs.
For the increasingly popular entertainment aerial-photography UAV, a stable, simple, relatively low-cost and higher-precision tracking mode is therefore urgently needed to meet the ever higher demands of users.
Summary of the invention
The present invention aims to provide a UAV tracking control system and method based on luminous target recognition, intended to solve the problems of low tracking accuracy and poor stability of current UAV tracking control systems.
The invention provides a UAV tracking control system based on luminous target recognition, comprising a luminous target and a UAV, the UAV comprising:
at least two cameras, used to photograph the luminous target in real time;
an arithmetic processing unit, connected with the cameras, which recognizes the luminous target in the images captured by the cameras and calculates the motion information of the luminous target; and
a flight control unit, connected with the arithmetic processing unit, which controls the UAV, according to the motion information of the luminous target, to fly along the motion track of the luminous target.
The invention also provides a UAV tracking control method based on luminous target recognition, the method performing tracking control based on recognition of at least one luminous target and comprising the steps of:
photographing the luminous target in real time using at least two cameras;
recognizing the luminous target in the images captured by the cameras and calculating the motion information of the luminous target; and
controlling the UAV, according to the motion information of the luminous target, to fly along the motion track of the luminous target.
The above UAV tracking control system and method use two ordinary cameras to photograph the luminous target in real time; the distance and displacement of the luminous target in the images are calculated, its direction of motion is analyzed and judged, and the amount of motion and azimuth information are output to the flight control system, which makes the corresponding motion according to the variation, thereby realizing the following function. Because the real-time position coordinates of the luminous target can be calculated, its motion track can be judged; this track is compared with the tracks preset in the flight control system to find the run instruction corresponding to the track, achieving the purpose of controlling the flight of the UAV through the motion of the luminous target. For example, the luminous target moving vertically upward means the aircraft rises, and the luminous target moving vertically downward means the aircraft descends. The invention solves well the problems of instability, system complexity, high cost and low precision; besides tracking more accurately, it can also serve as a brand-new control mode, so that the UAV only needs to add one set of devices to realize multiple intelligent functions, which is especially suitable for popular aerial-photography entertainment UAVs.
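The follow-versus-command decision described above can be sketched as follows. This is a minimal illustrative assumption of how the comparison against preset tracks might work; the function name, data layout and tolerance are not taken from the patent.

```python
def control_step(positions, preset_commands, tolerance=0.05):
    """One hypothetical iteration of the tracking loop.

    positions: recent 3-D coordinates (metres) of the luminous target
        over the preset time window, oldest first.
    preset_commands: {name: (displacement_pattern, command)} table of
        preset motion tracks and the flight commands they represent.
    Returns a command when the accumulated displacement matches a preset
    track, otherwise a 'follow' action mirroring the displacement.
    """
    # Cumulative displacement over the window, as the text describes.
    dx = positions[-1][0] - positions[0][0]
    dy = positions[-1][1] - positions[0][1]
    dz = positions[-1][2] - positions[0][2]
    for name, (pattern, command) in preset_commands.items():
        if all(abs(a - b) <= tolerance for a, b in zip((dx, dy, dz), pattern)):
            return command
    return ("follow", dx, dy, dz)
```

When no preset track matches, the UAV simply mirrors the target's displacement, which corresponds to the plain following behaviour described above.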
Description of the drawings
Fig. 1 is a module schematic diagram of the UAV tracking control system based on luminous target recognition in a preferred embodiment of the present invention;
Fig. 2 is a module schematic diagram of the arithmetic processing unit in the UAV tracking control system based on luminous target recognition shown in Fig. 1;
Fig. 3 is a schematic diagram of the coordinate system of the luminous target;
Fig. 4 is a module schematic diagram of the flight control unit in the UAV tracking control system based on luminous target recognition shown in Fig. 1;
Fig. 5 is a flow chart of the UAV tracking control method based on luminous target recognition in a preferred embodiment of the present invention;
Fig. 6 is a flow chart of the method for calculating the motion information of the luminous target in a preferred embodiment of the present invention;
Fig. 7 is a flow chart of the method for controlling the UAV to track and fly along the track of the luminous target in a preferred embodiment of the present invention.
Specific embodiment
In order to make the technical problem to be solved by the present invention, the technical scheme and the beneficial effects clearer, the present invention is described in further detail below in conjunction with the drawings and embodiments. It should be understood that the specific embodiments described here are only used to explain the present invention and are not intended to limit it.
Referring to Fig. 1, the UAV tracking control system based on luminous target recognition in a preferred embodiment of the present invention includes at least one luminous target 10 and a UAV 20.
The UAV 20 includes a first camera 21, a second camera 22, an arithmetic processing unit 23 and a flight control unit 24.
The two cameras 21, 22 are used to photograph the luminous target 10 in real time. The arithmetic processing unit 23 is connected with the first camera 21 and the second camera 22, recognizes the luminous target 10 in the images captured by the first camera 21 and the second camera 22, and calculates the motion information of the luminous target 10. The flight control unit 24 is connected with the arithmetic processing unit 23 and, according to the motion information of the luminous target 10, controls the UAV 20 to fly along the movement track of the luminous target.
In this embodiment, the luminous target 10 is a luminous body at least one side (the visible section) of which emits light uniformly, such as a light ball. The reason for using a light ball is that it emits light highly uniformly, and it is difficult to find a similar uniform light source in nature, so it is not easily disturbed by ambient light. In addition, its features are clear and easy to capture during image analysis and processing, which greatly reduces the computational load and difficulty of the software. Furthermore, since the light ball is self-luminous, it can also be captured by the cameras at night or in poor light, so this tracking mode can be used both day and night, an advantage over binocular-recognition tracking. Thus the arithmetic processing unit 23 can use a general-performance DSP (Digital Signal Processor) with a simple image algorithm and still accurately calculate the direction and amount of motion of the light ball. The flight control unit 24 is the central processing unit of the UAV 20 and may use chips such as ARM, Intel or AMD.
In a more specific embodiment, the flight control unit 24 searches, according to the motion information of the luminous target 10, for a pre-stored corresponding preset flight track, and controls the UAV 20 to fly along the preset flight track. Here, the preset flight track found according to the motion information of the current luminous target 10 is a flight track that simulates the motion track of the current luminous target; the motion information of this flight track has a mutual correspondence with the motion information of the current luminous target. For example, the movement speed, movement acceleration and movement distance of the UAV may be n times those of the luminous target, and the direction of motion may be identical to, opposite to, or offset by a preset angle from that of the luminous target. Accordingly, the motion information of the luminous target 10 includes one or more of the movement speed, movement distance, movement direction and movement acceleration of the luminous target 10.
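The correspondence described above — UAV speed as a multiple of the target's, heading identical, opposite, or offset by a preset angle — can be encoded in a small helper. This is an illustrative sketch; the function and parameter names are assumptions, not the patent's interface.

```python
def map_motion_to_flight(target_speed, target_heading_deg,
                         speed_gain=1.0, heading_offset_deg=0.0):
    """Hypothetical mapping from luminous-target motion to UAV motion.

    speed_gain encodes the 'n times' relationship; heading_offset_deg
    encodes identical (0), opposite (180) or preset-angle-offset headings.
    """
    uav_speed = speed_gain * target_speed
    uav_heading = (target_heading_deg + heading_offset_deg) % 360.0
    return uav_speed, uav_heading
```

With the defaults the UAV simply mirrors the target; an offset of 180 degrees would make it move in the opposite direction.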
In a more specific embodiment, the UAV 20 first opens tracking or luminous target (light ball) control mode. The flight control unit 24 notifies the arithmetic processing unit 23 to start working; the arithmetic processing unit 23 opens the left camera module L (first camera 21) and the right camera module R (second camera 22), and the two camera modules start to search for the light ball. During this process the flight control unit 24 and the arithmetic processing unit 23 communicate with each other, and the flight control unit 24 constantly adjusts the height and direction of the UAV 20 until the light ball is captured at a definite angle (keeping the light ball between the two cameras is generally the best mode). The default distance between the light ball and the UAV 20 is 5 meters; of course, the user can also change this distance, height, orientation, etc., as long as both cameras can photograph the light ball at the same time.
In a specific embodiment, referring to Fig. 2, the arithmetic processing unit 23 includes an acquisition module 231, a recognition module 232 and a calculation module 233.
The acquisition module 231 is used to continuously acquire the images captured by the first camera 21 and the second camera 22. "Continuously" may mean acquiring at a preset time interval, or directly at the shooting interval of the cameras themselves, i.e. every frame is acquired; it may also mean taking every n-th frame as the required element.
The recognition module 232 is used to recognize the luminous target 10 in each image: the luminous target 10 in the captured image is recognized and marked using image recognition technology, so that the coordinates of the luminous target 10 can then be calculated.
The calculation module 233 is used to calculate, in each image, the coordinate information of the luminous target 10 relative to the corresponding camera, and to calculate, according to the coordinate information of the luminous target 10 in each image, the motion information of the luminous target 10 within a preset time. In particular, for recognition, the arithmetic processing unit 23 needs to continuously calculate the movement trace (motion information) of the luminous target 10 and output the cumulative displacement of the luminous target 10 every period of time, to which the flight control unit 24 of the UAV 20 then responds. For tracking control, it is limited to a preset time (i.e. every period of time): when the parameters of a preset flight track match the motion information of the luminous target 10 within the preset time, the tracking control is responded to; if there is no match, the tracking control is abandoned, or the UAV may fly along a default flight track, such as hovering or landing. The preset time may be 0.2 second, 0.5 second, etc.
In this embodiment, referring to Fig. 3, the coordinate information (X, Y, Z) of the luminous target 10 satisfies the following equations:
Z=(b*focal_length)/(x_camL-x_camR);
X=x_camL*Z/focal_length;
Y=y_camL*Z/focal_length;
where the coordinate system of the luminous target 10 takes the first camera 21 as its origin; in other embodiments the other camera 22, or some other position, may be chosen as the origin. The first camera 21 and the second camera 22 lie on the X axis, i.e. the straight line on which the first camera 21 and the second camera 22 are located is the X axis of the coordinate system. The Z axis of the coordinate system is the optical axis, i.e. the direction in which the cameras point (perpendicular to the camera face). The Y axis of the coordinate system is perpendicular to the plane in which the X and Z axes lie.
More specifically, the values of x_camL and y_camL are respectively the X-axis and Y-axis coordinates of the imaging point of the luminous target 10 (point P) in the first camera 21; the value of x_camR is the X-axis coordinate of the imaging point of the luminous target 10 in the second camera 22. Since point P is always changing, the values of x_camL, x_camR and y_camL are also always changing; they are expressed in pixel coordinates.
Method of acquiring the value of x_camL: the image acquired by the first camera 21 is processed under OpenCV (Open Source Computer Vision Library, a cross-platform computer vision library distributed under the BSD open-source license), and the three values XL, YL and ZL can be obtained through the function cvFindStereoCorrespondenceBM, where XL, YL and ZL are respectively the distances on the X, Y and Z axes between the imaging point of the luminous target 10 in the first camera 21 and the first camera 21 (the origin). Then x_camL = XL and y_camL = YL.
Method of acquiring the value of x_camR: the image acquired by the second camera 22 is processed under OpenCV, and the value XR is obtained through the cvFindStereoCorrespondenceBM function, where XR is the distance on the X axis between the imaging point of the luminous target 10 in the second camera 22 and the second camera 22. Therefore x_camR = b + XR, where b is the distance between the two cameras. In this embodiment the focal lengths of the first camera 21 and the second camera 22 are identical, and focal_length is the focal length of the cameras.
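The three triangulation formulas above can be collected into a small helper. Variable names follow the text; the numeric values in the test are illustrative, and this is a sketch of the stated formulas rather than the patent's implementation.

```python
def triangulate(x_camL, y_camL, x_camR, baseline, focal_length):
    """Stereo triangulation of the luminous target per the formulas above.

    x_camL, y_camL: pixel coordinates of the target's imaging point in
        the left (first) camera; x_camR: its X pixel coordinate in the
        right (second) camera, offset by the baseline b as the text
        describes. baseline is the distance b between the two cameras.
    Returns (X, Y, Z) in the left camera's coordinate frame.
    """
    disparity = x_camL - x_camR
    Z = (baseline * focal_length) / disparity   # Z = (b*focal_length)/(x_camL-x_camR)
    X = x_camL * Z / focal_length               # X = x_camL*Z/focal_length
    Y = y_camL * Z / focal_length               # Y = y_camL*Z/focal_length
    return X, Y, Z
```

For example, with a 0.1 m baseline, a 700-pixel focal length and a 70-pixel disparity, the target lies 1 m away along the optical axis.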
With the above formulas, the arithmetic processing unit 23 can calculate the coordinates of the light ball in each image and output the motion information of the light ball over a fixed period of time. This information is passed in real time to the flight control unit 24, and the flight control system can then accurately follow the light ball while, by judging the movement track of the light ball, realizing the purpose of controlling the UAV 20 through movement tracks.
In a specific embodiment, referring to Fig. 4, the flight control unit 24 includes a storage module 241, a search module 242 and a control module 243. The storage module 241 is used to pre-store table data of the motion information of the luminous target and the corresponding preset UAV flight tracks; the search module 242 is used to look up the table according to the motion information of the luminous target 10 and obtain the preset flight track of the UAV 20 corresponding to that motion information; the control module 243 is used to output, according to the found preset flight track, the corresponding flight control instruction and control the UAV 20 to fly with the preset flight information. For example, the following preset flight tracks are pre-stored in the flight control unit 24: the light ball moves upward 10 to 30 cm, correspondingly the UAV 20 rises 0.5 meter; the light ball moves downward 10 to 30 cm, correspondingly the UAV 20 descends 0.5 meter; the light ball moves left 10 to 30 cm, correspondingly the UAV 20 flies 1 meter to the left; the light ball moves right 10 to 30 cm, correspondingly the UAV 20 flies 1 meter to the right; the light ball is circled in place, correspondingly the UAV 20 sways from side to side 5 times. The above parameters are all default values and can be set through the APP of the UAV 20, and likewise below. It should be noted that this example includes these several control commands but is not limited to them.
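The preset gesture table described above can be sketched as a lookup. The 10-30 cm ranges and the 0.5 m / 1 m / 5-sway responses come from the text; the key names, command strings and the classification thresholds are illustrative assumptions (circle detection, which needs full track analysis, is omitted).

```python
# Illustrative encoding of the preset gesture table; command strings are assumed.
GESTURE_COMMANDS = {
    "up_10_30cm": "ascend_0.5m",
    "down_10_30cm": "descend_0.5m",
    "left_10_30cm": "fly_left_1m",
    "right_10_30cm": "fly_right_1m",
}

def classify_gesture(dx_cm, dy_cm):
    """Map a single-axis displacement of the light ball (cm) to a gesture key.

    Returns None when no preset gesture matches, in which case the UAV
    simply keeps following the light ball.
    """
    if 10 <= dy_cm <= 30 and abs(dx_cm) < 10:
        return "up_10_30cm"
    if -30 <= dy_cm <= -10 and abs(dx_cm) < 10:
        return "down_10_30cm"
    if -30 <= dx_cm <= -10 and abs(dy_cm) < 10:
        return "left_10_30cm"
    if 10 <= dx_cm <= 30 and abs(dy_cm) < 10:
        return "right_10_30cm"
    return None
```

Looking the returned key up in `GESTURE_COMMANDS` yields the flight control instruction for the control module to execute.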
The arithmetic processing unit 23 can calculate the position of the light ball in real time and, if it judges that there is motion, promptly sends information such as the amount, direction and speed of motion to the flight control unit 24. The pre-stored table data corresponding the motion information of each luminous target to the preset flight tracks of the UAV 20 is exemplified as follows.
For example, if the light ball is detected running ahead of the UAV at a speed of 0.5 meter per second, the flight control unit 24 receives this information and controls the aircraft to run at the same speed and in the same direction, so that the distance between the light ball and the UAV 20 remains relatively fixed, thereby realizing more accurate tracking. In general, if the light ball moves at less than 1 meter per second, the following precision can reach 10 cm.
Further, if the user holds the light ball in hand and moves it upward 10 to 30 cm, then downward 10 to 30 cm, then left 10 to 30 cm, then right 10 to 30 cm, and then circles it in place, the arithmetic processing unit 23 calculates these displacements and notifies the flight control unit 24, which compares them, finds the corresponding control modes and executes the control commands: the UAV 20 first moves up 0.5 meter, then runs down 0.5 meter, then runs 1 meter to the left, then runs 1 meter to the right, and then sways from side to side 5 times. The purpose of controlling the motion of the UAV 20 through the light ball is thus realized, bringing a brand-new experience to the user.
Further, in order to improve entertainment value and precision, the user can use two light balls. The actions and control modes that can be defined in this way are more numerous, and the two light balls also serve as references for each other, making the DSP calculation more accurate, so that the recognition rate is higher, more difficult movements can be recognized, and more enjoyment is brought to the user.
The UAV tracking control system realizes accurate tracking through binocular recognition of a uniform, specific luminous body; by combining light-ball positioning technology with the mode of presetting different motion tracks in the flight control system to represent different control commands, control of the UAV 20 by the light ball is realized, solving well the problems of instability, system complexity, high cost and low precision.
In addition, a UAV tracking control method based on luminous target recognition is also disclosed. The method performs tracking control based on recognition of at least one luminous target, the luminous target preferably being a uniformly emitting light ball.
In a preferred embodiment, the UAV first opens tracking or luminous target (light ball) control mode. The flight control unit notifies the DSP arithmetic processing system to start working; the DSP opens the left camera module L and the right camera module R, and the two camera modules start to search for the light ball. During this process the flight control unit and the arithmetic processing unit communicate with each other, and the flight control unit constantly adjusts the height and direction of the UAV until the light ball is captured at a definite angle (keeping the light ball between the two cameras is generally the best mode). The default distance between the light ball and the UAV is 5 meters; of course, the user can also change this distance, height, orientation, etc., as long as both cameras can photograph the light ball at the same time.
Referring to Fig. 5, in a preferred embodiment the method comprises the following steps:
Step S110: photograph the luminous target in real time using two cameras, the two cameras including a first camera and a second camera. The focal lengths of the two cameras are identical.
Step S120: recognize the luminous target in the images captured by the cameras and calculate the motion information of the luminous target. The motion information of the luminous target includes one or more of the movement speed, movement distance, movement direction, movement time and movement acceleration of the luminous target.
Step S130: control the UAV, according to the motion information of the luminous target, to fly along the movement track of the luminous target.
This tracking and simple control system, constituted by two cameras and a DSP arithmetic system plus a specific luminous body, can greatly enhance the entertainment value and intelligence of the UAV. Its key point is the light ball: a light ball emits light highly uniformly, and it is difficult to find a similar uniform light source in nature, so it is not easily disturbed by ambient light. In addition, its features are clear and easy to capture during image analysis and processing, which greatly reduces the computational load and difficulty of the software; with a general-performance DSP and a simple image algorithm, the direction and amount of motion of the light ball can be calculated accurately.
In a more specific embodiment, controlling the UAV, according to the motion information of the luminous target, to fly along the motion track of the luminous target means: searching, according to the motion information of the luminous target, for the pre-stored corresponding preset flight track, and controlling the UAV to fly along the preset flight track. Here, the preset flight track found according to the motion information of the current luminous target is a flight track that simulates the motion track of the current luminous target; the motion information of this flight track has a mutual correspondence with the motion information of the current luminous target. For example, the movement speed, movement acceleration and movement distance of the UAV may be n times those of the luminous target, and the direction of motion may be identical to, opposite to, or offset by a preset angle from that of the luminous target.
In a more detailed embodiment, referring to Fig. 6 and Fig. 3, step S120 specifically includes:
Step S121: continuously acquire the images captured by the first camera 21 and the second camera 22. "Continuously" may mean acquiring at a preset time interval, or directly at the shooting interval of the cameras themselves, i.e. every frame is acquired; it may also mean taking every n-th frame as the required element.
Step S122: recognize the luminous target in each image. The luminous target in the captured image is recognized and marked using image recognition technology, so that the coordinates of the luminous target can then be calculated.
Step S123: calculate, in each image, the coordinate information of the luminous target relative to the corresponding camera. In this embodiment, the coordinate information (X, Y, Z) of the luminous target satisfies the following equations:
Z=(b*focal_length)/(x_camL-x_camR);
X=x_camL*Z/focal_length;
Y=y_camL*Z/focal_length;
Wherein:The coordinate system of luminous target can be selected separately with the first photographic head 21 as zero point, in other embodiments One photographic head for coordinate system zero point, or other positions.First photographic head 21 and second camera 22 fall in X-axis, i.e., with The straight line that first photographic head 21 and second camera 22 are located is the X-axis of coordinate system;The Z axis of coordinate system are optical axises, are described taking the photograph As the direction (direction vertical with the minute surface of photographic head) that head is pointed to.The Y-axis of coordinate system is hung down with the X-axis and Z axis place plane Directly.
More specifically, the value of x_camL, y_camL is respectively imaging point (P of the luminous target in the first photographic head 21 Point) X-axis coordinate, Y-axis coordinate;The value of x_camR is X-axis coordinate of the luminous target in the imaging point of second camera 22;By In P points changing always, so, the value of x_camL, x_camR, y_camL is also to change always, x_camL, x_ The value of camR, y_camL is represented with pixel coordinate.
The method of the value of collection x_camL:The image of the collection of the first photographic head 21, in OpenCV, (full name of OpenCV is: Open Source Computer Vision Library.OpenCV is one and permits the cross-platform of (increasing income) distribution based on BSD Computer vision storehouse) under through process, by the function of cvFindStereoCorrespondenceBM can obtain XL, YL, Tri- values of ZL, XL, YL, ZL are respectively that luminous target is taken the photograph on X-axis, Y-axis, Z axis in the imaging point of the first photographic head 21 with first As the distance of 21 (zero point).Now x_camL=XL, y_camL=YL.
Acquiring the value of x_camR: the image captured by the second camera 22 is processed in OpenCV; the cvFindStereoCorrespondenceBM function yields the value XR, the distance along the X axis between the imaging point of the luminous target in the second camera 22 and camera 22. Therefore x_camR = b + XR, where b is the distance (baseline) between the two cameras. In this embodiment the first camera 21 and the second camera 22 have the same focal length, and focal_length is that focal length.
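The three triangulation formulas above can be sketched directly in code. The following is a minimal Python sketch, not taken from the patent: the function name and parameter order are hypothetical, and it assumes the pixel coordinates and focal_length share one unit (pixels) while the baseline b sets the unit of the returned coordinates (e.g. metres).

```python
def triangulate(x_camL, y_camL, x_camR, baseline_b, focal_length):
    """Recover (X, Y, Z) of the luminous target from its two image points.

    Implements the patent's three formulas; coordinate origin is the
    first camera, Z along the optical axis, X along the camera baseline.
    """
    disparity = x_camL - x_camR  # horizontal shift of the target between views
    if disparity == 0:
        raise ValueError("zero disparity: depth undefined (target at infinity)")
    Z = baseline_b * focal_length / disparity  # Z = (b*focal_length)/(x_camL - x_camR)
    X = x_camL * Z / focal_length              # X = x_camL*Z/focal_length
    Y = y_camL * Z / focal_length              # Y = y_camL*Z/focal_length
    return X, Y, Z
```

For example, with a 0.1 m baseline, a 700-pixel focal length, and a 70-pixel disparity, the target is computed to be 1 m away along the optical axis.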
Step S124: from the coordinates of the luminous target in each image, compute the motion information of the luminous target within a preset time. The above formulas yield the coordinates of the light ball in each image; accumulating these over a fixed period gives the ball's motion information, which is passed in real time to the flight control system so that it can follow the ball accurately, and, by analyzing the ball's movement track, the UAV can be controlled through movement patterns. In particular, for recognition the moving trace (motion information) of the luminous target must be computed continuously, and the accumulated displacement of the target is output at fixed intervals for the flight control system of the UAV to respond to. Tracking control is confined to a preset time window (i.e. the fixed interval): if the motion information of the luminous target within the preset time matches the parameters of a preset path, the corresponding tracking control is executed; if not, tracking control is abandoned and the UAV may fly a default path, such as hovering or landing. The preset time may be, for example, 0.2 s or 0.5 s.
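Accumulating per-frame coordinates into a displacement over the preset window can be sketched as follows. This is a hypothetical illustration, not the patent's implementation; the class name, sample format, and the 0.2 s default are assumptions drawn from the description above.

```python
from collections import deque


class MotionAccumulator:
    """Keeps (time, coordinate) samples within a sliding preset window
    and reports the accumulated displacement of the luminous target."""

    def __init__(self, window_s=0.2):
        self.window_s = window_s
        self.samples = deque()  # entries: (t_seconds, (x, y, z))

    def add(self, t, xyz):
        """Record a new triangulated position and drop samples older
        than the preset time window."""
        self.samples.append((t, xyz))
        while self.samples and t - self.samples[0][0] > self.window_s:
            self.samples.popleft()

    def displacement(self):
        """Displacement vector from the oldest to the newest sample
        inside the window; zero until two samples exist."""
        if len(self.samples) < 2:
            return (0.0, 0.0, 0.0)
        (_, p0), (_, p1) = self.samples[0], self.samples[-1]
        return tuple(b - a for a, b in zip(p0, p1))
```

The flight control system would then poll `displacement()` once per interval and match it against the preset paths.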
In a more detailed embodiment, referring to Fig. 7, looking up the pre-stored preset path corresponding to the motion information of the luminous target and controlling the UAV to fly that path specifically comprises:
Step S131: pre-store a data table of the motion information of the luminous target and the corresponding preset UAV paths.
Step S132: look up the table by the motion information of the luminous target to obtain the preset UAV path corresponding to that motion information.
Step S133: output, according to the path found, the corresponding flight control instructions to control the UAV to fly with the preset flight information.
An example of the pre-stored table mapping the motion information of the luminous target to preset UAV paths is as follows:
For instance, if the light ball is detected moving in front of the UAV at 0.5 m/s, the flight control system receives this information and steers the aircraft at the same speed and heading, so that the distance between the ball and the UAV stays roughly constant and tracking becomes more accurate. In general, when the ball moves slower than 1 m/s, the following accuracy can reach 10 cm.
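Matching the ball's speed and heading while holding a fixed standoff distance resembles a feed-forward velocity command plus a proportional correction. The sketch below is a hypothetical illustration of that idea only; the patent gives no controller equations, and the function name, gain, and vector convention are all assumptions.

```python
def follow_command(ball_velocity, ball_offset, desired_offset, gain=1.0):
    """Velocity command for the UAV: copy the ball's velocity (so the
    relative distance stays constant) and add a proportional term that
    nudges the actual ball offset toward the desired standoff.

    All arguments are 3-vectors (x, y, z); ball_offset is the ball's
    position relative to the UAV.
    """
    return tuple(v + gain * (off - des)
                 for v, off, des in zip(ball_velocity, ball_offset, desired_offset))
```

With the ball exactly at the desired standoff, the command simply equals the ball's velocity; any drift in the offset adds a corrective component.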
Further, several movement tracks are defined in the flight control system: moving the ball upward 10 to 30 cm means the UAV ascends 0.5 m; moving it downward 10 to 30 cm means the UAV descends 0.5 m; moving it left 10 to 30 cm means the UAV flies 1 m to the left; moving it right 10 to 30 cm means it flies 1 m to the right; drawing a circle with the ball in place means the UAV sways side to side five times. All of the above parameters are defaults and can be set through the UAV's app; the same applies below. It should be noted that this example includes these several control commands but is not limited to them.
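The default gesture definitions above (10-30 cm displacement bands mapped to discrete flight commands) can be sketched as a lookup table plus a simple classifier over the per-window displacement. This is a hypothetical illustration; the patent provides no code, and the axis convention (X to the right, Y up, metres) and all names are assumptions.

```python
# Hypothetical command table following the patent's defaults:
# gesture name -> (flight command, magnitude in metres)
GESTURES = {
    "up":    ("ascend", 0.5),
    "down":  ("descend", 0.5),
    "left":  ("fly_left", 1.0),
    "right": ("fly_right", 1.0),
}


def classify(dx, dy):
    """Map a displacement of the ball over one preset window to a
    gesture, or None when it falls outside the 10-30 cm band."""
    axis, mag = ("x", dx) if abs(dx) >= abs(dy) else ("y", dy)
    if not 0.10 <= abs(mag) <= 0.30:
        return None  # too small (noise) or too large (not a gesture)
    if axis == "x":
        return "right" if mag > 0 else "left"
    return "up" if mag > 0 else "down"
```

A recognized gesture is then resolved through `GESTURES` into the flight control instruction of Step S133.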
Further, if the user holds the ball in hand and moves it up 10 to 30 cm, then down 10 to 30 cm, then left 10 to 30 cm, then right 10 to 30 cm, then draws a circle in place, the DSP computes these displacements and notifies the flight control system, which matches them against the stored control modes and executes the commands: the UAV first moves up 0.5 m, then down 0.5 m, then 1 m to the left, then 1 m to the right, then sways side to side five times. The purpose of controlling the UAV's motion through the ball is thus achieved, bringing the user a brand-new experience.
Further, to improve both entertainment value and precision, the user may operate two light balls. More actions and control modes can then be defined, and since the two balls serve as references for each other, the DSP computation is more accurate, the recognition rate is higher, more difficult motions can be recognized, and the user gets more enjoyment.
The DSP system computes the ball's position in real time and, whenever motion is detected, sends information such as the amount, direction, and speed of the motion to the flight control system. By recognizing a uniformly luminous body with binocular (stereo) cameras, accurate tracking is achieved; by presetting different motion tracks in the flight control system to represent different control commands, combined with the ball-localization technique, the ball controls the UAV, which well solves the problems of instability, system complexity, high cost, and low precision.
Those skilled in the art will clearly understand that, for convenience and brevity of description, only the division into the functional units above is illustrated. In practical applications the above functions may be allocated to different functional units as needed, i.e. the internal structure of the device may be divided into different functional units or modules to accomplish all or part of the functions described above. The functional units of the embodiments may be integrated into one processing unit, each unit may exist physically on its own, or two or more units may be integrated into one unit; such an integrated unit may be implemented in hardware or as a software functional unit. The specific names of the functional units serve only to distinguish them from one another and do not limit the scope of protection of this application. For the specific working processes of the units in the above device, reference may be made to the corresponding processes in the foregoing method embodiments, which are not repeated here.
In summary, the control system and method of the embodiments of the present invention use two ordinary cameras to photograph the luminous target in real time, while computing the distance and displacement of the luminous target in the images, analyzing its direction of motion, and outputting the motion amount and bearing information to the flight control system, which moves accordingly, thereby realizing the follow function. Since the real-time position coordinates of the luminous target can be computed, its movement track can be determined and compared against the tracks preset in the flight control system to find the operating instruction of the matching track, achieving control of UAV flight through the movement of the luminous target: for example, moving the luminous target vertically upward makes the aircraft ascend, and moving it vertically downward makes it descend. The present invention well solves the problems of instability, system complexity, high cost, and low precision; besides tracking more accurately, it also serves as a brand-new control mode, so that the UAV needs only one additional set of devices to achieve various intelligent functions, making it especially suitable for consumer entertainment and aerial-photography UAVs.
Those of ordinary skill in the art will appreciate that the units and algorithm steps of the examples described in the embodiments herein can be implemented in electronic hardware, or in a combination of computer software and electronic hardware. Whether these functions are performed in hardware or software depends on the specific application and the design constraints of the technical solution. Skilled artisans may implement the described functions differently for each particular application, but such implementations should not be considered beyond the scope of the present invention.
In the embodiments provided by the present invention, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative; the division into modules or units is only a division of logical functions, and other divisions are possible in actual implementation, e.g. multiple units or components may be combined or integrated into another system, or some features may be ignored or not performed. Furthermore, the mutual couplings, direct couplings, or communication connections shown or discussed may be indirect couplings or communication connections through interfaces, devices, or units, and may be electrical, mechanical, or of other forms.
The units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units, i.e. they may be located in one place or distributed over multiple network elements. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, the functional units in the embodiments of the present invention may be integrated into one processing unit, each unit may exist physically on its own, or two or more units may be integrated into one unit. The integrated unit may be implemented in hardware or as a software functional unit.
If the integrated unit is implemented as a software functional unit and sold or used as an independent product, it may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the embodiments of the present invention, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) or a processor to perform all or part of the steps of the methods of the embodiments of the present invention. The aforementioned storage medium includes: a USB flash drive, a portable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disc, or any other medium capable of storing program code.
The above are merely preferred embodiments of the present invention and are not intended to limit the present invention; any modification, equivalent replacement, or improvement made within the spirit and principles of the present invention shall be included within the scope of protection of the present invention.

Claims (12)

1. An unmanned aerial vehicle tracking control system based on luminous target recognition, characterized in that it comprises an unmanned aerial vehicle and at least one luminous target, the unmanned aerial vehicle comprising:
a first camera and a second camera, for photographing the luminous target in real time;
an operation processing unit, connected with the first camera and the second camera, which recognizes the luminous target in the images captured by the first camera and the second camera and computes the motion information of the luminous target; and
a flight control unit, connected with the operation processing unit, which controls the unmanned aerial vehicle to follow the motion track of the luminous target according to the motion information of the luminous target.
2. The unmanned aerial vehicle tracking control system based on luminous target recognition according to claim 1, characterized in that the luminous target is a luminous body at least one face of which emits light uniformly.
3. The unmanned aerial vehicle tracking control system based on luminous target recognition according to claim 1, characterized in that the operation processing unit comprises:
an acquisition module, for continuously acquiring the images captured by the first camera and the second camera;
a recognition module, for recognizing the luminous target in each of the images; and
a computing module, for computing the coordinates of the luminous target relative to the corresponding camera in each of the images, and computing from the coordinates the motion information of the luminous target within a preset time.
4. The unmanned aerial vehicle tracking control system based on luminous target recognition according to claim 1, characterized in that the flight control unit controlling the unmanned aerial vehicle to follow the motion track of the luminous target according to the motion information of the luminous target is specifically: looking up the pre-stored preset path corresponding to the motion information of the luminous target, and controlling the unmanned aerial vehicle to fly the preset path.
5. The unmanned aerial vehicle tracking control system based on luminous target recognition according to claim 4, characterized in that the flight control unit comprises:
a storage module, for pre-storing a data table of the motion information of the luminous target and the corresponding preset unmanned aerial vehicle paths;
a lookup module, for looking up the table by the motion information of the luminous target to obtain the preset unmanned aerial vehicle path corresponding to the motion information; and
a control module, for outputting, according to the path found, the corresponding flight control instructions to control the unmanned aerial vehicle to fly with the preset flight information.
6. The unmanned aerial vehicle tracking control system based on luminous target recognition according to any one of claims 1 to 5, characterized in that the motion information of the luminous target comprises one or more of: the movement speed, travel distance, direction of motion, duration of motion, and acceleration of the luminous target.
7. An unmanned aerial vehicle tracking and control method based on luminous target recognition, characterized in that the method performs tracking control based on recognizing at least one luminous target, the method comprising the steps of:
photographing the luminous target in real time with two cameras, the two cameras comprising a first camera and a second camera;
recognizing the luminous target in the images captured by the cameras, and computing the motion information of the luminous target; and
controlling the unmanned aerial vehicle to follow the motion track of the luminous target according to the motion information of the luminous target.
8. The unmanned aerial vehicle tracking and control method based on luminous target recognition according to claim 7, characterized in that the luminous target is a luminous body at least one face of which emits light uniformly.
9. The unmanned aerial vehicle tracking and control method based on luminous target recognition according to claim 7, characterized in that recognizing the luminous target in the images captured by the cameras and computing the motion information of the luminous target specifically comprises:
continuously acquiring the images captured by the first camera and the second camera;
recognizing the luminous target in each of the images;
computing the coordinates of the luminous target relative to the corresponding camera in each of the images; and
computing, from the coordinates of the luminous target in each of the images, the motion information of the luminous target within a preset time.
10. The unmanned aerial vehicle tracking and control method based on luminous target recognition according to claim 7, characterized in that controlling the unmanned aerial vehicle to follow the motion track of the luminous target according to the motion information of the luminous target is specifically:
looking up the pre-stored preset path corresponding to the motion information of the luminous target, and controlling the unmanned aerial vehicle to fly the preset path.
11. The unmanned aerial vehicle tracking and control method based on luminous target recognition according to claim 10, characterized in that the step of looking up the pre-stored preset path corresponding to the motion information of the luminous target and controlling the unmanned aerial vehicle to fly the preset path is specifically:
pre-storing a data table of the motion information of the luminous target and the corresponding preset unmanned aerial vehicle paths;
looking up the table by the motion information of the luminous target to obtain the preset unmanned aerial vehicle path corresponding to the motion information; and
outputting, according to the path found, the corresponding flight control instructions to control the unmanned aerial vehicle to fly with the preset flight information.
12. The unmanned aerial vehicle tracking and control method based on luminous target recognition according to any one of claims 7 to 11, characterized in that the motion information of the luminous target comprises one or more of: the movement speed, travel distance, direction of motion, duration of motion, and acceleration of the luminous target.
CN201610578483.2A 2016-07-21 2016-07-21 System and method for tracking control of unmanned aerial vehicle based on luminescence object identification Pending CN106598075A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201610578483.2A CN106598075A (en) 2016-07-21 2016-07-21 System and method for tracking control of unmanned aerial vehicle based on luminescence object identification
PCT/CN2016/097249 WO2018014420A1 (en) 2016-07-21 2016-08-30 Light-emitting target recognition-based unmanned aerial vehicle tracking control system and method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201610578483.2A CN106598075A (en) 2016-07-21 2016-07-21 System and method for tracking control of unmanned aerial vehicle based on luminescence object identification

Publications (1)

Publication Number Publication Date
CN106598075A true CN106598075A (en) 2017-04-26

Family

ID=58556015

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610578483.2A Pending CN106598075A (en) 2016-07-21 2016-07-21 System and method for tracking control of unmanned aerial vehicle based on luminescence object identification

Country Status (2)

Country Link
CN (1) CN106598075A (en)
WO (1) WO2018014420A1 (en)

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108168522A (en) * 2017-12-11 2018-06-15 宁波亿拍客网络科技有限公司 A kind of unmanned plane observed object method for searching and correlation technique again
CN108496138A (en) * 2017-05-25 2018-09-04 深圳市大疆创新科技有限公司 A kind of tracking and device
CN109388151A (en) * 2017-08-04 2019-02-26 深圳曼塔智能科技有限公司 Method, apparatus, system and the terminal device of unmanned plane target tracking
CN110262565A (en) * 2019-05-28 2019-09-20 深圳市吉影科技有限公司 The target following motion control method and device for pushing away unmanned plane applied to underwater six
CN110956642A (en) * 2019-12-03 2020-04-03 深圳市未来感知科技有限公司 Multi-target tracking identification method, terminal and readable storage medium
CN111742348A (en) * 2018-02-20 2020-10-02 软银股份有限公司 Image processing device, flight object, and program
CN111833381A (en) * 2020-06-24 2020-10-27 鹏城实验室 Unmanned aerial vehicle target tracking trajectory generation method, unmanned aerial vehicle and storage medium
CN113721661A (en) * 2021-09-03 2021-11-30 中国人民解放军32802部队 Cooperative unmanned aerial vehicle cluster observation device
CN114979611A (en) * 2022-05-19 2022-08-30 国网智能科技股份有限公司 Binocular sensing system and method

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113759986A (en) * 2021-09-27 2021-12-07 深圳市道通智能航空技术股份有限公司 Unmanned aerial vehicle monitoring and tracking method, device, equipment and storage medium

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102012706A (en) * 2010-10-01 2011-04-13 苏州佳世达电通有限公司 Electronic device capable of automatically positioning and moving and method for automatically returning moving element thereof
CN102238986A (en) * 2008-12-04 2011-11-09 鹦鹉股份有限公司 Set of drones with recognition markers
CN102871784A (en) * 2012-09-21 2013-01-16 中国科学院深圳先进技术研究院 Positioning controlling apparatus and method
CN104820435A (en) * 2015-02-12 2015-08-05 武汉科技大学 Quadrotor moving target tracking system based on smart phone and method thereof
CN105000194A (en) * 2015-08-13 2015-10-28 史彩成 UAV (unmanned aerial vehicle) assisted landing visual guiding method and airborne system based on ground cooperative mark
CN105100728A (en) * 2015-08-18 2015-11-25 零度智控(北京)智能科技有限公司 Unmanned aerial vehicle video tracking shooting system and method
CN105108757A (en) * 2015-09-09 2015-12-02 三峡大学 Wheeled soccer robot based on smartphone, and operation method thereof
US20160023761A1 (en) * 2014-07-22 2016-01-28 Jonathan McNally Method for installing an object using an unmanned aerial vehicle
CN105676860A (en) * 2016-03-17 2016-06-15 歌尔声学股份有限公司 Wearable equipment, unmanned plane control device and control realization method
CN105677300A (en) * 2016-02-04 2016-06-15 普宙飞行器科技(深圳)有限公司 Gesture identification based unmanned aerial vehicle control method and system as well as unmanned aerial vehicle

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9342746B1 (en) * 2011-03-17 2016-05-17 UtopiaCompression Corporation Maneuverless passive range estimation using monocular image sequences
CN102902271A (en) * 2012-10-23 2013-01-30 上海大学 Binocular vision-based robot target identifying and gripping system and method
JP6062079B2 (en) * 2014-05-30 2017-01-18 エスゼット ディージェイアイ テクノロジー カンパニー リミテッドSz Dji Technology Co.,Ltd Controller and method and vehicle for controlling the operation of an unmanned air transport (UAV)
CN105550670B (en) * 2016-01-27 2019-07-12 兰州理工大学 A kind of target object dynamically track and measurement and positioning method
CN105739520B (en) * 2016-01-29 2019-10-08 余江 A kind of unmanned vehicle identifying system and its recognition methods
CN105678289A (en) * 2016-03-07 2016-06-15 谭圆圆 Control method and device of unmanned aerial vehicle

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102238986A (en) * 2008-12-04 2011-11-09 鹦鹉股份有限公司 Set of drones with recognition markers
CN102012706A (en) * 2010-10-01 2011-04-13 苏州佳世达电通有限公司 Electronic device capable of automatically positioning and moving and method for automatically returning moving element thereof
CN102871784A (en) * 2012-09-21 2013-01-16 中国科学院深圳先进技术研究院 Positioning controlling apparatus and method
US20160023761A1 (en) * 2014-07-22 2016-01-28 Jonathan McNally Method for installing an object using an unmanned aerial vehicle
CN104820435A (en) * 2015-02-12 2015-08-05 武汉科技大学 Quadrotor moving target tracking system based on smart phone and method thereof
CN105000194A (en) * 2015-08-13 2015-10-28 史彩成 UAV (unmanned aerial vehicle) assisted landing visual guiding method and airborne system based on ground cooperative mark
CN105100728A (en) * 2015-08-18 2015-11-25 零度智控(北京)智能科技有限公司 Unmanned aerial vehicle video tracking shooting system and method
CN105108757A (en) * 2015-09-09 2015-12-02 三峡大学 Wheeled soccer robot based on smartphone, and operation method thereof
CN105677300A (en) * 2016-02-04 2016-06-15 普宙飞行器科技(深圳)有限公司 Gesture identification based unmanned aerial vehicle control method and system as well as unmanned aerial vehicle
CN105676860A (en) * 2016-03-17 2016-06-15 歌尔声学股份有限公司 Wearable equipment, unmanned plane control device and control realization method

Cited By (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11216958B2 (en) 2017-05-25 2022-01-04 SZ DJI Technology Co., Ltd. Tracking method and device
CN108496138A (en) * 2017-05-25 2018-09-04 深圳市大疆创新科技有限公司 A kind of tracking and device
WO2018214093A1 (en) * 2017-05-25 2018-11-29 深圳市大疆创新科技有限公司 Tracking method and apparatus
US11776139B2 (en) 2017-05-25 2023-10-03 SZ DJI Technology Co., Ltd. Tracking method and device
CN108496138B (en) * 2017-05-25 2022-04-22 深圳市大疆创新科技有限公司 Tracking method and device
CN109388151A (en) * 2017-08-04 2019-02-26 深圳曼塔智能科技有限公司 Method, apparatus, system and the terminal device of unmanned plane target tracking
CN108168522A (en) * 2017-12-11 2018-06-15 宁波亿拍客网络科技有限公司 A kind of unmanned plane observed object method for searching and correlation technique again
CN111742348A (en) * 2018-02-20 2020-10-02 软银股份有限公司 Image processing device, flight object, and program
US11042740B2 (en) 2018-02-20 2021-06-22 Softbank Corp. Image processing device, flight vehicle, and computer-readable storage medium
CN111742348B (en) * 2018-02-20 2022-02-15 软银股份有限公司 Image processing device, flight object, and program
CN110262565A (en) * 2019-05-28 2019-09-20 深圳市吉影科技有限公司 The target following motion control method and device for pushing away unmanned plane applied to underwater six
CN110956642A (en) * 2019-12-03 2020-04-03 深圳市未来感知科技有限公司 Multi-target tracking identification method, terminal and readable storage medium
CN111833381A (en) * 2020-06-24 2020-10-27 鹏城实验室 Unmanned aerial vehicle target tracking trajectory generation method, unmanned aerial vehicle and storage medium
CN111833381B (en) * 2020-06-24 2024-07-23 鹏城实验室 Unmanned aerial vehicle target tracking track generation method, unmanned aerial vehicle and storage medium
CN113721661A (en) * 2021-09-03 2021-11-30 中国人民解放军32802部队 Cooperative unmanned aerial vehicle cluster observation device
CN113721661B (en) * 2021-09-03 2022-02-25 中国人民解放军32802部队 Cooperative unmanned aerial vehicle cluster observation device
CN114979611A (en) * 2022-05-19 2022-08-30 国网智能科技股份有限公司 Binocular sensing system and method

Also Published As

Publication number Publication date
WO2018014420A1 (en) 2018-01-25

Similar Documents

Publication Publication Date Title
CN106598075A (en) System and method for tracking control of unmanned aerial vehicle based on luminescence object identification
CN106197422B (en) A kind of unmanned plane positioning and method for tracking target based on two-dimensional tag
US10664993B1 (en) System for determining a pose of an object
US20230050566A1 (en) System and method for tracking a passive wand and actuating an effect based on a detected wand path
CN106054929B (en) A kind of unmanned plane based on light stream lands bootstrap technique automatically
CN105120146B (en) It is a kind of to lock filming apparatus and image pickup method automatically using unmanned plane progress moving object
CN104215239B (en) Guidance method using vision-based autonomous unmanned plane landing guidance device
CN109724603A (en) A kind of Indoor Robot air navigation aid based on environmental characteristic detection
CN109643127A (en) Construct map, positioning, navigation, control method and system, mobile robot
CN109298723A (en) A kind of accurate landing method of vehicle-mounted unmanned aerial vehicle and system
US11788845B2 (en) Systems and methods for robust self-relocalization in a visual map
CN106155092A (en) A kind of intelligent multi-control flight capture apparatus and flight control method thereof
US10347001B2 (en) Localizing and mapping platform
CN107257931A (en) Actuating optical components for beam scanning apparatus
CN108513649A (en) Flight control method, equipment, machine readable storage medium and system
CN105190703A (en) Using photometric stereo for 3D environment modeling
CN110427055A (en) A kind of stage follow spotlight automatic control system and method
CN109240496B (en) Acousto-optic interaction system based on virtual reality
CN106681510A (en) Posture identification device, virtual reality display device and virtual reality system
CN106973221A (en) Unmanned plane image capture method and system based on aesthetic evaluation
CN108700892A (en) A kind of path method of adjustment and unmanned plane
US20220362659A1 (en) Handle controller
JP6902142B2 (en) Charging device, control method, and program
CN106454103A (en) Automatic following system
Ghandeharizadeh Holodeck: Immersive 3D Displays Using Swarms of Flying Light Specks

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20170426