CN106184787A - Aircraft with a driver assistance system, and methods for landing and collision avoidance - Google Patents
- Publication number
- CN106184787A CN106184787A CN201610556749.3A CN201610556749A CN106184787A CN 106184787 A CN106184787 A CN 106184787A CN 201610556749 A CN201610556749 A CN 201610556749A CN 106184787 A CN106184787 A CN 106184787A
- Authority
- CN
- China
- Prior art keywords
- image
- ultra-low illumination
- infrared
- focal length
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64D—EQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENTS OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
- B64D45/00—Aircraft indicators or protectors not otherwise provided for
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64D—EQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENTS OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
- B64D47/00—Equipment not otherwise provided for
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64D—EQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENTS OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
- B64D45/00—Aircraft indicators or protectors not otherwise provided for
- B64D45/04—Landing aids; Safety measures to prevent collision with earth's surface
- B64D45/08—Landing aids; Safety measures to prevent collision with earth's surface optical
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64D—EQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENTS OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
- B64D47/00—Equipment not otherwise provided for
- B64D47/08—Arrangements of cameras
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G5/00—Traffic control systems for aircraft, e.g. air-traffic control [ATC]
- G08G5/0047—Navigation or guidance aids for a single aircraft
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G5/00—Traffic control systems for aircraft, e.g. air-traffic control [ATC]
- G08G5/04—Anti-collision systems
- G08G5/045—Navigation or guidance aids, e.g. determination of anti-collision manoeuvers
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/181—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources
Abstract
The present invention relates to an aircraft with a flight driver assistance system. Ultra-low-illumination cameras, infrared thermal imaging, image processing, and precision servo control technologies are combined on board the aircraft, so that the system can assist the pilot in completing takeoff, flight, and landing in dark night conditions and in adverse weather such as rain, snow, fog, and haze. An aircraft equipped with this assistance system uses a forward-looking unit and a downward-looking unit to capture video images ahead of and below the aircraft and sends them to a display unit for the pilot, so that the pilot can see the conditions ahead of and below the aircraft, improving the safety of flight and landing. A landing method for the aircraft and a method and device for avoiding collision with obstacles in flight are also provided. The International Classification belongs to class B64C or B64F.
Description
Technical field
The present invention relates to the field of vehicle technology, and in particular to an aircraft with a driver assistance system, a landing method for the aircraft, and a method for collision avoidance. The International Classification belongs to class B64C or B64F.
Background technology
With the emergence of aircraft and similar vehicles, transportation patterns have changed enormously. Air transport attracts large numbers of passengers with its convenience and efficiency. Distances around the world have "shortened" accordingly, and departing at dawn to arrive by dusk has become routine, yet aviation safety has always been a major problem troubling travellers. In the early days of air transport, because science and technology were still backward, accidents caused by mechanical failure of the aircraft itself remained high and were the main cause.
In recent decades, with the development of science and technology and the wide use of new materials, aircraft have become showcases of high technology and their intrinsic safety factor has steadily improved, so the proportion of accidents attributable to natural causes and human factors has greatly increased. According to statistics, nearly seventy percent of accidents are caused by natural causes and human factors, which has become the biggest obstacle restricting aviation safety.
During flight, and especially during landing, the pilot should know the conditions ahead of and below the aircraft in order to fly and land safely. In the prior art, however, aircraft lack equipment that lets the pilot observe the scene ahead of and below the aircraft. Even where some aircraft carry cameras for observing the scene ahead and below, restrictions on their operating conditions prevent effective use, and the lack of relevant theoretical guidance means that aircraft safety cannot be effectively improved.
Summary of the invention
In view of the above problems, the object of the present invention is to overcome the above deficiencies of the prior art by providing an aircraft with a driver assistance system, a landing method for the aircraft, and a method and device for avoiding collision with obstacles, so that the pilot can conveniently and clearly view, in good time, the scene ahead of and below the aircraft. The present invention combines ultra-low-illumination cameras, infrared thermal imaging, image processing, and precision servo control into one system that can effectively assist the pilot in completing takeoff, flight, and landing in dark night conditions and in adverse weather such as rain, snow, fog, and haze.
On the basis of this inventive concept, the embodiments can be summarized and illustrated from the following four aspects. It should be noted that these four aspects are both independent and interrelated; their structures and methods do not contradict one another, and those skilled in the art will, after a full reading, appreciate the spirit of the present invention.
(1) Aircraft flight driver assistance system
An aircraft flight driver assistance system includes a video acquisition unit and a control unit. The control unit includes an industrial computer and a display unit, where the industrial computer includes an image capture module and an image processing module. The video acquisition unit consists of two parts, a forward-looking unit and a downward-looking unit, which capture images ahead of and below the aircraft respectively and send the captured images to the image capture module; the image capture module passes the acquired image data to the image processing module for processing, and the result is sent to the display unit for display.
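The acquisition-capture-processing-display data flow described above can be sketched as follows. This is a minimal illustrative skeleton, not the patent's implementation: every class, function, and field name here is an assumption made for clarity.

```python
# Hypothetical sketch of the described data flow:
# camera units -> image capture module -> image processing module -> display unit.

class CameraUnit:
    """A forward-looking or downward-looking unit that yields raw frames."""
    def __init__(self, name):
        self.name = name

    def capture_frame(self, t):
        # Stand-in for reading a sensor: a labelled frame record.
        return {"source": self.name, "time": t, "pixels": [[0] * 4] * 3}

def image_capture(units, t):
    """Image capture module: gather one frame from every unit."""
    return [u.capture_frame(t) for u in units]

def image_process(frames):
    """Image processing module placeholder: tag each frame as processed."""
    return [dict(f, processed=True) for f in frames]

def send_to_display(frames):
    """Display unit: return what would be shown to the pilot."""
    return [(f["source"], f["processed"]) for f in frames]

units = [CameraUnit("forward"), CameraUnit("downward")]
shown = send_to_display(image_process(image_capture(units, t=0)))
print(shown)  # [('forward', True), ('downward', True)]
```

The point of the sketch is only the ordering of the stages: both views are captured first, then processed, then displayed together.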
Preferably, the forward-looking unit includes thermal imaging lenses and visible-light lenses for capturing, respectively, thermal images and visible-light images ahead of the aircraft; the downward-looking unit likewise includes thermal imaging lenses and visible-light lenses for capturing, respectively, thermal images and visible-light images below the aircraft.
Preferably, the forward-looking unit includes a first and a second forward-looking thermal imaging lens together with a first and a second forward-looking ultra-low-illumination lens, used respectively to capture thermal and visible-light images ahead of the aircraft; the downward-looking unit includes a first and a second downward-looking thermal imaging lens together with a first and a second downward-looking ultra-low-illumination lens, used respectively to capture thermal and visible-light images below the aircraft.
Preferably, the forward-looking unit and the downward-looking unit each consist of single or grouped thermal imaging lenses and single or grouped visible-light lenses, the number of thermal imaging lenses or visible-light lenses being 1, 2, or 4.
Preferably, the two thermal imaging lenses of the forward-looking unit have the same focal length, the two ultra-low-illumination lenses of the forward-looking unit have the same focal length, the two thermal imaging lenses of the downward-looking unit have the same focal length, and the two ultra-low-illumination lenses of the downward-looking unit have the same focal length.
The image processing module includes an image stitching submodule that stitches the images captured by the two thermal imaging lenses of the forward-looking unit, stitches the images captured by the two ultra-low-illumination lenses of the forward-looking unit, stitches the images captured by the two thermal imaging lenses of the downward-looking unit, and stitches the images captured by the two ultra-low-illumination lenses of the downward-looking unit.
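The stitching submodule joins two same-focal-length frames into one wider view. A real implementation would register the frames via matched features; the toy sketch below, which is an assumption and not the patent's algorithm, simply concatenates two frames that are presumed to overlap by a known number of columns.

```python
def stitch_pair(left, right, overlap):
    """Toy horizontal stitch of two same-focal-length frames.

    Assumes `left` and `right` are row-major 2D lists of pixel values and
    that the rightmost `overlap` columns of `left` show the same scene as
    the leftmost `overlap` columns of `right`, so those columns of `right`
    are dropped. Real stitching would estimate this alignment from features.
    """
    return [lrow + rrow[overlap:] for lrow, rrow in zip(left, right)]

a = [[1, 2, 3], [4, 5, 6]]
b = [[3, 7, 8], [6, 9, 10]]
print(stitch_pair(a, b, overlap=1))  # [[1, 2, 3, 7, 8], [4, 5, 6, 9, 10]]
```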
Preferably, the two thermal imaging lenses of the forward-looking unit have different focal lengths, the two ultra-low-illumination lenses of the forward-looking unit have different focal lengths, the two thermal imaging lenses of the downward-looking unit have different focal lengths, and the two ultra-low-illumination lenses of the downward-looking unit have different focal lengths.
The image processing module includes an image fusion submodule that fuses the images captured by the two thermal imaging lenses of the forward-looking unit, fuses the images captured by the two ultra-low-illumination lenses of the forward-looking unit, fuses the images captured by the two thermal imaging lenses of the downward-looking unit, and fuses the images captured by the two ultra-low-illumination lenses of the downward-looking unit.
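The fusion submodule merges two views of the same scene taken at different focal lengths into one image. The patent does not specify the fusion algorithm, so the sketch below uses a plain pixel-wise weighted average on frames assumed to be already co-registered; the function name and weight are illustrative assumptions.

```python
def fuse(wide, tele, alpha=0.5):
    """Toy pixel-wise weighted fusion of two co-registered frames taken at
    different focal lengths. Registration (scaling/aligning the telephoto
    frame onto the wide frame) is assumed to have been done already; a real
    system would also likely use a multi-scale fusion rather than a single
    global weight `alpha`."""
    return [[alpha * w + (1 - alpha) * t for w, t in zip(wr, tr)]
            for wr, tr in zip(wide, tele)]

print(fuse([[0, 100]], [[100, 100]]))  # [[50.0, 100.0]]
```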
Preferably, the forward-looking unit includes two 25mm thermal imaging lenses, two 12.3mm thermal imaging lenses, two 25mm ultra-low-illumination lenses, and two 8mm ultra-low-illumination lenses, and the downward-looking unit includes the same complement: two 25mm thermal imaging lenses, two 12.3mm thermal imaging lenses, two 25mm ultra-low-illumination lenses, and two 8mm ultra-low-illumination lenses. The thermal imaging lenses capture thermal images; the ultra-low-illumination lenses capture visible-light images. The image processing module includes an image stitching submodule and an image fusion submodule.
The image stitching submodule stitches, for each unit and each lens type, the pair of images captured at the same focal length: the two 25mm thermal images of the forward-looking unit, the two 12.3mm thermal images of the forward-looking unit, the two 25mm ultra-low-illumination images of the forward-looking unit, the two 8mm ultra-low-illumination images of the forward-looking unit, and likewise the two 25mm thermal images, the two 12.3mm thermal images, the two 25mm ultra-low-illumination images, and the two 8mm ultra-low-illumination images of the downward-looking unit.
The image fusion submodule then fuses, for each unit and each lens type, the two stitched images of different focal lengths: the stitched 25mm thermal image of the forward-looking unit with its stitched 12.3mm thermal image, the stitched 25mm ultra-low-illumination image of the forward-looking unit with its stitched 8mm ultra-low-illumination image, the stitched 25mm thermal image of the downward-looking unit with its stitched 12.3mm thermal image, and the stitched 25mm ultra-low-illumination image of the downward-looking unit with its stitched 8mm ultra-low-illumination image.
Preferably, the video acquisition unit further includes a gyroscope and a servo platform for keeping the shooting direction of the forward-looking unit horizontal and the shooting direction of the downward-looking unit vertical.
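The servo platform's job of holding the forward view level can be pictured as a feedback loop driving the measured camera pitch toward zero. The proportional controller below is an illustrative assumption; the patent does not describe the control law or the gain.

```python
def level_step(pitch_deg, gain=0.5):
    """One proportional servo correction driving the camera pitch (as read
    from the gyroscope) toward 0 degrees, i.e. a level forward view.
    `gain` is an illustrative loop gain, not a value from the patent."""
    return pitch_deg - gain * pitch_deg

# Repeated corrections shrink an initial 8-degree tilt geometrically.
pitch = 8.0
for _ in range(5):
    pitch = level_step(pitch)
print(round(pitch, 3))  # 0.25
```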
Preferably, the industrial computer further includes a storage module for real-time storage of the video images processed by the image processing module.
Preferably, the industrial computer further includes a playback module for replaying the stored video images on the display, and a video selection module for selecting whether thermal images or visible-light images are output to the display unit.
(2) Assisted-safety flying method for takeoff, flight, and landing, and a device applying the method
The assisted-safety flying method comprises the following steps:
detecting a brightness value and, according to the detection result, deciding whether to enter ultra-low-illumination mode or infrared mode;
when entering ultra-low-illumination mode, collecting several ultra-low-illumination images at a first focal length and several at a second focal length; stitching the images of the same focal length to obtain an ultra-low-illumination stitched image at the first focal length and one at the second focal length; fusing the two stitched images to obtain a fused ultra-low-illumination image; sending the fused ultra-low-illumination image to the pilot and displaying it; and returning;
when entering infrared mode, collecting several infrared images at a third focal length and several at a fourth focal length; stitching the images of the same focal length to obtain an infrared stitched image at the third focal length and one at the fourth focal length; fusing the two stitched images to obtain a fused infrared image; sending the fused infrared image to the pilot and displaying it; and returning.
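One cycle of the method above (mode selection, then capture at two focal lengths, stitch per focal length, fuse) can be sketched as follows. The threshold value, function names, and the stand-in capture/stitch/fuse stages are all illustrative assumptions; the patent only says a "first preset condition" decides the mode.

```python
LOW_LIGHT_THRESHOLD = 0.05  # illustrative threshold, not from the patent

def select_mode(brightness):
    """Enter ultra-low-illumination mode while usable visible light remains;
    otherwise fall back to thermal infrared."""
    return "ultra_low_illumination" if brightness >= LOW_LIGHT_THRESHOLD else "infrared"

def assist_cycle(brightness, capture, stitch, fuse):
    """One cycle of the described method: pick a mode, capture frames at two
    focal lengths, stitch each same-focal-length group, then fuse the two
    stitched results into the image shown to the pilot."""
    mode = select_mode(brightness)
    f1_frames, f2_frames = capture(mode)
    fused = fuse(stitch(f1_frames), stitch(f2_frames))
    return mode, fused

# Toy stand-ins for the capture/stitch/fuse stages, to show the control flow.
mode, fused = assist_cycle(
    brightness=0.5,
    capture=lambda m: (["f1a", "f1b"], ["f2a", "f2b"]),
    stitch=lambda frames: "+".join(frames),
    fuse=lambda a, b: f"fuse({a},{b})",
)
print(mode, fused)  # ultra_low_illumination fuse(f1a+f1b,f2a+f2b)
```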
Preferably, collecting the ultra-low-illumination images at the first and second focal lengths includes: collecting first-focal-length and second-focal-length ultra-low-illumination images in the forward-looking direction, and collecting first-focal-length and second-focal-length ultra-low-illumination images in the downward-looking direction.
Stitching the images of the same focal length to obtain the stitched images at the first and second focal lengths includes: stitching the ultra-low-illumination images of the same focal length in the forward-looking direction to obtain forward-looking ultra-low-illumination stitched images at the first and second focal lengths, and stitching the ultra-low-illumination images of the same focal length in the downward-looking direction to obtain downward-looking ultra-low-illumination stitched images at the first and second focal lengths.
Fusing the stitched image at the first focal length with the stitched image at the second focal length to obtain the fused ultra-low-illumination image includes: fusing the forward-looking stitched images at the first and second focal lengths to obtain the fused forward-looking ultra-low-illumination image, and fusing the downward-looking stitched images at the first and second focal lengths to obtain the fused downward-looking ultra-low-illumination image.
Preferably, collecting the infrared images at the third and fourth focal lengths includes: collecting third-focal-length and fourth-focal-length infrared images in the forward-looking direction, and collecting third-focal-length and fourth-focal-length infrared images in the downward-looking direction.
Stitching the images of the same focal length to obtain the infrared stitched images at the third and fourth focal lengths includes: stitching the infrared images of the same focal length in the forward-looking direction to obtain forward-looking infrared stitched images at the third and fourth focal lengths, and stitching the infrared images of the same focal length in the downward-looking direction to obtain downward-looking infrared stitched images at the third and fourth focal lengths.
Fusing the infrared stitched image at the third focal length with that at the fourth focal length to obtain the fused infrared image includes: fusing the forward-looking infrared stitched images at the third and fourth focal lengths to obtain the fused forward-looking infrared image, and fusing the downward-looking infrared stitched images at the third and fourth focal lengths to obtain the fused downward-looking infrared image.
Preferably, detecting the brightness value and selecting ultra-low-illumination mode or infrared mode according to the detection result includes: detecting the current brightness value; if the detection result meets a first preset condition, entering ultra-low-illumination mode; otherwise entering infrared mode.
Preferably, the method further includes: detecting whether the forward-looking shooting direction is horizontal and, when it is not, adjusting the orientation to horizontal.
This part also discloses an assisted-safety flying device, characterized in that the device is installed on board the aircraft and collects images during takeoff, flight, and landing, and includes:
a mode selection module for detecting the brightness value and, according to the detection result, selecting ultra-low-illumination mode or infrared mode;
an ultra-low-illumination image collection module for collecting, in ultra-low-illumination mode, several ultra-low-illumination images at the first focal length and several at the second focal length;
an ultra-low-illumination image stitching module for stitching the ultra-low-illumination images of the same focal length to obtain stitched images at the first and second focal lengths;
an ultra-low-illumination image fusion module for fusing the stitched images at the first and second focal lengths to obtain a fused ultra-low-illumination image, sending the fused image to the pilot, and displaying it;
an infrared image collection module for collecting, in infrared mode, several infrared images at the third focal length and several at the fourth focal length;
an infrared image stitching module for stitching the infrared images of the same focal length to obtain stitched images at the third and fourth focal lengths; and
an infrared image fusion module for fusing the stitched images at the third and fourth focal lengths to obtain a fused infrared image, sending the fused image to the pilot, and displaying it.
(3) Method for improving aircraft landing safety based on obstacle detection, and a device applying the method
A method for improving landing safety based on obstacle detection, characterized by the following steps: the capture equipment adjusts its own orientation so that the forward-looking shooting direction is horizontal;
a current illumination value is detected, and it is judged whether the detection result meets the ultra-low-illumination processing condition;
when the detection result meets the ultra-low-illumination processing condition, forward-view and downward-view images of the current position are collected by the ultra-low-illumination cameras, and after stitching and fusion an ultra-low-illumination forward view and an ultra-low-illumination downward view are obtained; obstacle detection is performed on the stitched and fused ultra-low-illumination forward and downward views, and when an obstacle is detected in either view, warning information is generated and sent, and the procedure returns;
when the detection result does not meet the ultra-low-illumination processing condition, forward-view and downward-view images of the current position are collected by the infrared thermal imaging cameras, and after stitching and fusion an infrared forward view and an infrared downward view are obtained; obstacle detection is performed on the stitched and fused infrared forward and downward views, and when an obstacle is detected in either view, warning information is generated and sent, and the procedure returns.
Preferably, adjusting the orientation of the capture equipment so that the forward-looking shooting direction is horizontal includes: the equipment detects in real time whether the forward-looking shooting direction is horizontal and, when it is not, adjusts its own position until it is.
Preferably, collecting the forward-view and downward-view images of the current position by the ultra-low-illumination cameras includes:
collecting several forward-view images and several downward-view images with the ultra-low-illumination cameras;
grouping the forward-view images by focal length, stitching each group into one image to obtain the first stitched forward view for that focal length, and fusing the first stitched forward views of the different focal lengths into a single image as the ultra-low-illumination forward view;
grouping the downward-view images by focal length, stitching each group into one image to obtain the first stitched downward view for that focal length, and fusing the first stitched downward views of the different focal lengths into a single image as the ultra-low-illumination downward view.
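The group-by-focal-length, stitch-per-group, fuse-across-groups procedure above can be sketched with toy string operations standing in for the real image operations. The data layout (focal length, image) pairs and all names are assumptions for illustration.

```python
from collections import defaultdict

def build_view(frames):
    """Group captured frames by focal length, stitch each group into one
    image, then fuse the per-focal-length results into a single view.

    `frames` is a list of (focal_length_mm, image_label) pairs; here
    'stitching' joins labels with '+' and 'fusing' wraps them in fuse(...),
    standing in for the real image stitching and fusion steps.
    """
    groups = defaultdict(list)
    for focal, img in frames:
        groups[focal].append(img)
    stitched = {f: "+".join(imgs) for f, imgs in groups.items()}
    return "fuse(" + ",".join(stitched[f] for f in sorted(stitched)) + ")"

frames = [(8, "a"), (25, "b"), (8, "c"), (25, "d")]
print(build_view(frames))  # fuse(a+c,b+d)
```

The same procedure runs once on the forward-view frames and once on the downward-view frames.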
Preferably, collecting the forward-view and downward-view images of the current position by the infrared thermal imaging cameras includes:
collecting several forward-view images and several downward-view images with the infrared thermal imaging cameras;
grouping the forward-view images by focal length, stitching each group into one image to obtain the second stitched forward view for that focal length, and fusing the second stitched forward views of the different focal lengths into a single image as the infrared forward view;
grouping the downward-view images by focal length, stitching each group into one image to obtain the second stitched downward view for that focal length, and fusing the second stitched downward views of the different focal lengths into a single image as the infrared downward view.
Preferably, the forward-view image specifically consists of an image covering a first preset angular range in the forward-looking direction, and the downward-view image specifically consists of an image covering a second preset angular range in the downward-looking direction.
Preferably, performing obstacle detection on the ultra-low-illumination forward view and downward view specifically includes: judging whether image feature points exist in the ultra-low-illumination forward view and/or downward view; if so, extracting the image feature points, drawing the obstacle image from the feature points, and marking the obstacle image in the corresponding ultra-low-illumination forward view and/or downward view, thereby determining that an obstacle exists; otherwise determining that no obstacle exists.
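The feature-point-based detection step can be pictured as below. The patent does not say how feature points are computed, so this toy version treats bright pixels as "feature points" and returns a bounding box to mark in the view; a real system would use a proper feature detector. All names and the threshold are illustrative assumptions.

```python
def detect_obstacle(image, threshold=128):
    """Toy version of the described detection: find 'feature points' (here,
    pixels brighter than `threshold`) and, if any exist, return a bounding
    box (min_row, min_col, max_row, max_col) to mark as the obstacle image
    in the corresponding view; otherwise return None (no obstacle)."""
    points = [(r, c) for r, row in enumerate(image)
              for c, v in enumerate(row) if v > threshold]
    if not points:
        return None  # no obstacle detected
    rows = [r for r, _ in points]
    cols = [c for _, c in points]
    return (min(rows), min(cols), max(rows), max(cols))

img = [[0, 0, 0],
       [0, 200, 210],
       [0, 0, 0]]
print(detect_obstacle(img))  # (1, 1, 1, 2)
```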
Preferably, generating and sending warning information when an obstacle is detected in the ultra-low-illumination forward view and/or downward view includes: obtaining the position of the obstacle by satellite positioning, generating the warning information from the obstacle's position information together with the ultra-low-illumination forward view and/or downward view marked with the obstacle image, and sending the warning information.
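A warning record combining the satellite-fix position with the marked view might be assembled as follows. The field names and message format are purely illustrative assumptions; the patent does not define the warning-information format.

```python
def make_alarm(position, view_name, bbox):
    """Assemble a hypothetical warning record: the obstacle position from
    satellite positioning plus the view in which the obstacle was marked."""
    return {
        "position": position,      # (lat, lon) from satellite positioning
        "view": view_name,         # which view the obstacle was marked in
        "obstacle_bbox": bbox,     # marked obstacle region in that view
        "message": f"obstacle near {position} in {view_name} view",
    }

alarm = make_alarm((30.0, 120.0), "forward", (1, 1, 1, 2))
print(alarm["message"])  # obstacle near (30.0, 120.0) in forward view
```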
Preferably, collecting the ultra-low-illumination forward and downward views of the current position when the detection result meets the ultra-low-illumination processing condition specifically includes: when the detection result meets the ultra-low-illumination processing condition, detecting the flight state; if the aircraft is in the takeoff, landing, or flight state, collecting the ultra-low-illumination forward and downward views of the current position with the ultra-low-illumination cameras.
Preferably, collecting the infrared forward and downward views of the current position when the detection result does not meet the ultra-low-illumination processing condition specifically includes: when the detection result does not meet the ultra-low-illumination processing condition, detecting the flight state; if the aircraft is in the takeoff, landing, or flight state, collecting the infrared forward and downward views of the current position with the infrared thermal imaging cameras.
A device for improving landing safety based on obstacle detection, characterized by comprising:
an orientation adjustment module, configured to adjust the device's own orientation so that the front-view shooting direction is horizontal;
a luminance detection module, configured to detect the current illumination value and judge whether the obtained detection result meets the ultra-low-illumination processing condition;
a starlight acquisition module, configured to, when the detection result meets the ultra-low-illumination processing condition, collect the ultra-low-illumination front-view image and ultra-low-illumination overhead-view image of the current location by the ultra-low-illumination camera and perform stitching and fusion processing on them;
a starlight obstacle detection module, configured to perform obstacle detection on the stitched and fused ultra-low-illumination front-view image and ultra-low-illumination overhead-view image respectively, and to generate and send alarm information when an obstacle is detected in the ultra-low-illumination front-view image and/or the ultra-low-illumination overhead-view image;
an infrared acquisition module, configured to, when the detection result does not meet the ultra-low-illumination processing condition, collect the infrared-level front-view image and infrared-level overhead-view image of the current location by the infrared thermal imaging camera and perform stitching and fusion processing on them;
an infrared obstacle detection module, configured to perform obstacle detection on the stitched and fused infrared-level front-view image and infrared-level overhead-view image respectively, and to generate and send alarm information when an obstacle is detected in the infrared-level front-view image and/or the infrared-level overhead-view image.
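The decision flow of these modules can be sketched as a short routine. The mode names follow the claim, while the 0.0001 Lux threshold is borrowed from part (two) of the description below as an illustrative assumption, not a mandated value.

```python
def meets_ultra_low_condition(illuminance_lux):
    """Luminance detection module: assume (per part (two) below) that the
    ultra-low-illumination processing condition is illuminance > 0.0001 Lux."""
    return illuminance_lux > 0.0001


def landing_safety_step(illuminance_lux, front_has_obstacle, overhead_has_obstacle):
    """One pass of the device: choose the acquisition chain from the
    illumination reading, then raise an alarm if either the front-view or
    the overhead-view image contains an obstacle."""
    if meets_ultra_low_condition(illuminance_lux):
        chain = "ultra_low_illumination"    # starlight acquisition + detection
    else:
        chain = "infrared_thermal_imaging"  # infrared acquisition + detection
    alarm = front_has_obstacle or overhead_has_obstacle
    return chain, alarm
```

Either chain feeds the same stitch-fuse-detect pipeline; only the sensor differs.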
(4) A method for avoiding obstacle collision based on data synchronization, and an aircraft applying the method
A method for avoiding obstacle collision based on data synchronization, characterized by comprising:
the aircraft obtains its current position information, organizes an environment information request according to the position information, and sends the environment information request to a data center;
when receiving the environment information response returned by the data center, the aircraft obtains obstacle information from the environment information response as historical obstacle information, and obtains flight parameters from the environment information response as historical flight parameters;
the aircraft collects ambient images according to its own flight state, obtains current obstacle information from the ambient images, marks the obstacle in the ambient image according to the current obstacle information, the historical obstacle information, and the historical flight parameters, and displays the ambient image with the marked obstacle;
when avoiding the obstacle, the aircraft obtains the flight parameters used during obstacle avoidance as current flight parameters, and sends the current position information, the current flight parameters, and the current obstacle information to the data center.
Preferably, the aircraft obtaining its current position information includes:
the aircraft uses a satellite positioning system to obtain longitude, latitude, and altitude, and takes the longitude, latitude, and altitude as the current position information.
Preferably, obtaining obstacle information from the environment information response as historical obstacle information includes:
obtaining the position information, size information, and mobility attribute information of the obstacle from the environment information response, and taking the position information, size information, and mobility attribute information of the obstacle as the historical obstacle information.
Preferably, collecting ambient images according to the aircraft's own flight state specifically includes:
obtaining the aircraft's own flight state; if it is in the takeoff state, the landing state, or the flying state, collecting a front-view image and an overhead-view image, and taking the collected images as the ambient images.
Preferably, obtaining the current obstacle information from the ambient images specifically includes:
for ambient images collected in the same direction, stitching the ambient images of identical focal length into a single image; fusing the stitched images of different focal lengths in that direction to obtain a fused image; extracting feature points from the fused image; when feature points are extracted, delineating the obstacle according to the feature points and marking the obstacle; obtaining the position information of the obstacle by the satellite positioning system; calculating the size information and mobility attribute information of the obstacle according to the position information of the obstacle and the image of the marked obstacle; and taking the position information, size information, and mobility attribute information of the obstacle as the current obstacle information.
Preferably, marking the obstacle in the ambient image according to the current obstacle information, the historical obstacle information, and the historical flight parameters includes:
judging whether the current obstacle information is identical to the historical obstacle information; if so, marking the obstacle in the ambient image and prompting the historical flight parameters; otherwise, marking the obstacle in the ambient image according to the current obstacle information.
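A minimal sketch of this comparison step, under stated assumptions: the claim only requires judging whether current and historical obstacle information are identical, so the record layout (position in metres, size, mobility flag) and the matching tolerances below are illustrative, not part of the claimed method.

```python
import math


def same_obstacle(current, history, pos_tol=5.0, size_tol=2.0):
    """Match a currently detected obstacle against a historical record.
    Records are assumed dicts: {'position': (x, y, z) in metres,
    'size': metres, 'mobile': bool}. Tolerances stand in for the
    unspecified notion of 'identical' obstacle information."""
    dist = math.dist(current['position'], history['position'])
    return (dist <= pos_tol
            and abs(current['size'] - history['size']) <= size_tol
            and current['mobile'] == history['mobile'])


def mark_obstacle(current, history_list):
    """Mark the obstacle; if it matches a historical record, also prompt
    the historical flight parameters used to avoid it previously."""
    for record in history_list:
        if same_obstacle(current, record['obstacle']):
            return {'marked': current, 'prompt': record['flight_parameters']}
    return {'marked': current, 'prompt': None}
```

A known obstacle thus carries the proven avoidance parameters with it; a new one is marked from current detection alone.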
Preferably, the method further includes: drawing the motion track of the obstacle on the image with the marked obstacle;
drawing the motion track of the obstacle includes:
positioning the obstacle once every preset time interval; each time the obstacle is positioned, obtaining the altitude, pitch angle, and azimuth of the obstacle, and obtaining the displacement of the obstacle between two successive positionings;
calculating the track information of the obstacle's movement, together with its direction of motion and movement speed, according to the altitude, pitch angle, azimuth, and displacement of the obstacle;
simulating the track along which the obstacle will move in three-dimensional space according to the track information, the direction of motion, and the movement speed of the obstacle;
drawing the track information of the obstacle's past movement and the track along which it will move in three-dimensional space.
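The extrapolation step above can be sketched as follows. The claim derives positions from altitude, pitch angle, and azimuth; that conversion is omitted here, and the two fixes are assumed to already be Cartesian (x, y, z) positions in metres. Straight-line extrapolation is an illustrative stand-in for the unspecified simulation.

```python
import math


def predict_track(fix_a, fix_b, dt, horizon, steps=5):
    """From two successive positionings taken dt seconds apart, compute the
    obstacle's speed (m/s) and unit direction vector, then simulate its
    future 3-D positions at `steps` even intervals up to `horizon` seconds."""
    disp = tuple(b - a for a, b in zip(fix_a, fix_b))   # displacement vector
    distance = math.hypot(*disp)
    speed = distance / dt
    if distance:
        direction = tuple(d / distance for d in disp)
    else:
        direction = (0.0, 0.0, 0.0)                     # stationary obstacle
    track = [tuple(b + d * speed * (horizon * k / steps)
                   for b, d in zip(fix_b, direction))
             for k in range(1, steps + 1)]
    return speed, direction, track
```

Both the measured segment (fix_a to fix_b) and the simulated points would then be drawn onto the marked image.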
Preferably, collecting the ambient images includes:
detecting the current illumination value and selecting the corresponding capture mode to collect the ambient images according to the obtained detection result:
when the detection result meets the ultra-low-illumination processing condition, collecting the ambient images by the ultra-low-illumination camera;
when the detection result does not meet the ultra-low-illumination processing condition, collecting the ambient images by the infrared thermal imaging camera.
Preferably, the method further includes:
when the data center receives the current position information, the current flight parameters, and the current obstacle information, it obtains the flight parameters and obstacle information corresponding to the current position information, replaces the original flight parameters with the current flight parameters, and replaces the original obstacle information with the current obstacle information.
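The data-center side of the synchronization reduces to a record replacement. Keying the store by position is an assumption for illustration; the claim only requires that the old record be replaced by the newly uploaded one.

```python
def synchronize(store, position, flight_parameters, obstacle_info):
    """Data-center update: the record corresponding to the reported position
    is replaced wholesale with the aircraft's newly uploaded flight
    parameters and obstacle information, so later aircraft querying this
    position receive the freshest avoidance data."""
    store[position] = {'flight_parameters': flight_parameters,
                       'obstacle_info': obstacle_info}
    return store
```
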
An aircraft for avoiding obstacle collision based on data synchronization, characterized by comprising:
an information request module, configured to obtain the current position information, organize an environment information request according to the position information, and send the environment information request to a data center;
an information receiving module, configured to, when the environment information response returned by the data center is received, obtain obstacle information from the environment information response as historical obstacle information and obtain flight parameters from the environment information response as historical flight parameters;
an obstacle detection module, configured to collect ambient images according to the aircraft's own flight state, obtain current obstacle information from the ambient images, mark the obstacle in the ambient image according to the current obstacle information, the historical obstacle information, and the historical flight parameters, and display the ambient image with the marked obstacle;
a data synchronization module, configured to, when the obstacle is avoided, obtain the flight parameters used during obstacle avoidance as current flight parameters, and send the current position information, the current flight parameters, and the current obstacle information to the data center.
The beneficial effects of the invention are as follows: the aircraft flight driving assistance system of the present invention uses a front-view unit and an overhead-view unit, which can capture video images of the areas ahead of and below the aircraft and send them to the display unit to be shown to the pilot, so that the pilot can grasp the surrounding conditions ahead of and below the aircraft, improving the safety of flight and landing.
Brief Description of the Drawings
Preferred but non-limiting embodiments of the present invention will now be described; these and other features, aspects, and advantages of the present invention will become clear when the following detailed description is read with reference to the accompanying drawings, in which:
Fig. 1 is a structural diagram of the driving assistance system in part (one) of the present invention;
Fig. 2 is another structural diagram of the driving assistance system in part (one) of the present invention;
Fig. 3 is another structural diagram of the driving assistance system in part (one) of the present invention;
Fig. 4 is another structural diagram of the driving assistance system in part (one) of the present invention;
Fig. 5 is a flowchart of the assisted safe flight method in part (two) of the present invention;
Fig. 6 is a block diagram of the assisted safe flight device in part (two) of the present invention;
Fig. 7 is a flowchart of the method for improving landing safety based on obstacle detection in part (three) of the present invention;
Fig. 8 is a block diagram of the device for improving landing safety based on obstacle detection in part (three) of the present invention;
Fig. 9 is a flowchart of the method for avoiding obstacle collision based on data synchronization in part (four) of the present invention;
Fig. 10 is a block diagram of the aircraft for avoiding obstacle collision based on data synchronization in part (four) of the present invention;
Fig. 11 is a structural diagram of the obstacle detection module in part (four) of the present invention;
Fig. 12 is a diagram of the installation positions of the image acquisition units in part (one) of the present invention;
Fig. 13 is a diagram of the camera arrangement of an image acquisition unit in part (one) of the present invention;
Fig. 14 is a diagram of the camera arrangement of an image acquisition unit in part (one) of the present invention.
Detailed Description of the Invention
The following description is merely exemplary in nature and is not intended to limit the disclosure, its application, or its uses. It should be understood that, throughout the drawings, corresponding reference numerals indicate identical or corresponding parts and features.
The present invention explains its substance comprehensively through the following four aspects. It should be noted that these four aspects are independent of one another yet interconnected, their structures and method principles in no way conflict, and those skilled in the art will appreciate the spirit of the present invention after reading them in full.
One. Flight driving assistance system
According to an aspect of the present invention, a flight driving assistance system applied to an aircraft is disclosed, specifically as follows:
Referring to Fig. 1 and Fig. 12, in one embodiment of the invention, the flight driving assistance system of the present invention includes a video acquisition unit 100 and a control unit 200. The control unit 200 includes an industrial computer 210 and a display unit 240, wherein the industrial computer 210 includes an image capture module 212 and an image processing module 211. The video acquisition unit 100 is installed at the lower part of the aircraft and comprises two parts, a front-view unit 110 and an overhead-view unit 140. The front-view unit 110 and the overhead-view unit 140 respectively capture images of the areas ahead of and below the aircraft and send the captured images to the image capture module 212; the image capture module 212 sends the acquired image data to the image processing module 211 for processing and then to the display unit 240 for display. With its two major image acquisition units, front-view and overhead-view, this system lets the aircraft pilot clearly observe, through the display screen driven by the display unit, the surrounding conditions ahead of and below the aircraft, ensuring flight safety. The front-view unit 110 and the overhead-view unit 140 may each be composed of individual or grouped thermal imaging lenses and individual or grouped visible-light lenses; the number of thermal imaging lenses or visible-light lenses may be 1, 2, 4, or a larger number in pairs.
Further, as shown in Fig. 2, the front-view unit 110 includes a front-view thermal imaging lens 111 and a front-view visible-light lens 125, used respectively to capture the thermal image and the visible-light image of the area ahead of the aircraft; the overhead-view unit 140 includes an overhead-view thermal imaging lens 141 and an overhead-view visible-light lens 155, used respectively to capture the thermal image and the visible-light image of the area below the aircraft. This realizes the combined use of a thermal imaging camera and a visible-light camera and lets the two complement each other: the visible-light camera collects video images by day, while the thermal imaging camera is used at night and under severe weather conditions such as rain, snow, fog, and haze, so that the conditions ahead of and below the aircraft can be viewed clearly, the equipment is unaffected by environmental factors, and video images can be collected around the clock. To better maintain the shooting angles, so that the cameras always shoot video images horizontally ahead of and vertically below the aircraft and the pilot can grasp the conditions ahead of and below the aircraft at any time for safe flight and landing, the video acquisition unit 100 also includes a gyroscope 170 and a servo platform 180; working together, they keep the shooting direction of the front-view unit 110 horizontal and the shooting direction of the overhead-view unit 140 vertical.
Further, as shown in Fig. 3, the front-view unit 110 includes a first front-view thermal imaging lens 112 and a second front-view thermal imaging lens 113, together with a first front-view ultra-low-illumination lens 126 and a second front-view ultra-low-illumination lens 127, used respectively to capture the thermal images and visible-light images of the area ahead of the aircraft.
Likewise, the overhead-view unit 140 includes a first overhead-view thermal imaging lens 142 and a second overhead-view thermal imaging lens 143, together with a first overhead-view ultra-low-illumination lens 156 and a second overhead-view ultra-low-illumination lens 157, used respectively to capture the thermal images and visible-light images of the area below the aircraft. Using ultra-low-illumination video cameras enables the cameras to capture clear images even in very dark conditions, broadening the range of application of this system.
Here, the two thermal imaging lenses 112 and 113 of the front-view unit 110 may have the same focal length, the two ultra-low-illumination lenses 126 and 127 of the front-view unit 110 may have the same focal length, the two thermal imaging lenses 142 and 143 of the overhead-view unit 140 may have the same focal length, and the two ultra-low-illumination lenses 156 and 157 of the overhead-view unit 140 may have the same focal length.
It should be noted that the larger the focal length, the smaller the field of view and the farther the shooting distance; the smaller the focal length, the larger the field of view and the nearer the shooting distance. The purpose of collecting images at different focal lengths is to fuse the images of different focal lengths so as to obtain both a wide viewing angle and long range, thereby enriching the image content and making the image clearer.
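The focal-length/field-of-view trade-off noted above follows the standard lens relation fov = 2·atan(w / 2f), where w is the sensor width and f the focal length. The sensor width used in the check below is an illustrative assumption; the patent does not state a sensor size.

```python
import math


def field_of_view_deg(focal_length_mm, sensor_width_mm):
    """Horizontal field of view of a lens: 2 * atan(sensor_width / (2 * f)).
    Larger focal length -> narrower angle but longer effective reach."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))
```

For an assumed 5 mm-wide sensor, the 25 mm, 12.3 mm, and 8 mm lenses used later in this part give progressively wider angles, which is exactly why their images are fused.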
On this basis, the image processing module 211 includes an image stitching submodule 213, which stitches the images captured by the two thermal imaging lenses 112 and 113 of the front-view unit 110 (the first front-view thermal imaging lens 112 and the second front-view thermal imaging lens 113), stitches the images captured by the two ultra-low-illumination lenses 126 and 127 of the front-view unit 110 (the first front-view ultra-low-illumination lens 126 and the second front-view ultra-low-illumination lens 127), stitches the images captured by the two thermal imaging lenses 142 and 143 of the overhead-view unit 140 (the first overhead-view thermal imaging lens 142 and the second overhead-view thermal imaging lens 143), and stitches the images captured by the two ultra-low-illumination lenses 156 and 157 of the overhead-view unit 140 (the first overhead-view ultra-low-illumination lens 156 and the second overhead-view ultra-low-illumination lens 157).
Alternatively, the two thermal imaging lenses 112 and 113 of the front-view unit 110 may have different focal lengths, the two ultra-low-illumination lenses 126 and 127 of the front-view unit 110 may have different focal lengths, the two thermal imaging lenses 142 and 143 of the overhead-view unit 140 may have different focal lengths, and the two ultra-low-illumination lenses 156 and 157 of the overhead-view unit 140 may have different focal lengths.
The image processing module 211 also includes an image fusion submodule 214, which fuses the images captured by the two thermal imaging lenses 112 and 113 of the front-view unit 110 (the first front-view thermal imaging lens 112 and the second front-view thermal imaging lens 113), fuses the images captured by the two ultra-low-illumination lenses 126 and 127 of the front-view unit 110 (the first front-view ultra-low-illumination lens 126 and the second front-view ultra-low-illumination lens 127), fuses the images captured by the two thermal imaging lenses 142 and 143 of the overhead-view unit 140 (the first overhead-view thermal imaging lens 142 and the second overhead-view thermal imaging lens 143), and fuses the images captured by the two ultra-low-illumination lenses 156 and 157 of the overhead-view unit 140 (the first overhead-view ultra-low-illumination lens 156 and the second overhead-view ultra-low-illumination lens 157).
The industrial computer 210 also includes a storage module 215, a playback module 216, an electronic magnification module 217, and a video selection module 218. The storage module 215 stores in real time the video images processed by the image processing module 211; the playback module 216 replays and displays the stored video images; the electronic magnification module 217, when a distant target in the video image is unfavorable for naked-eye observation, lets the operator perform electronic magnification of a target area at the control unit 200, enlarging and brightening a distant region that would otherwise be barely visible, which helps the aircraft pilot observe distant targets; the video selection module 218 selects either the thermal image or the visible-light image for output to the display unit 240. Alternatively, the image fusion submodule 214 fuses the final thermal image of the front-view unit 110 with its final visible-light image and fuses the final thermal image of the overhead-view unit 140 with its final visible-light image, and the display unit 240 displays the fused video images.
As shown in Fig. 4, to better obtain images for assisted piloting, in another embodiment the front-view unit 110 includes four thermal imaging lenses and four ultra-low-illumination lenses: two 25 mm thermal imaging lenses (the first front-view 25 mm thermal imaging lens 114 and the second front-view 25 mm thermal imaging lens 115), two 12.3 mm thermal imaging lenses (the first front-view 12.3 mm thermal imaging lens 116 and the second front-view 12.3 mm thermal imaging lens 117), two 25 mm ultra-low-illumination lenses (the first front-view 25 mm ultra-low-illumination lens 128 and the second front-view 25 mm ultra-low-illumination lens 129), and two 8 mm ultra-low-illumination lenses (the first front-view 8 mm ultra-low-illumination lens 130 and the second front-view 8 mm ultra-low-illumination lens 131).
The specific orientation and layout of the cameras of the front-view unit 110 is shown in Fig. 13: the eight cameras are arranged in two rows of four. The two 25 mm thermal imaging lenses sit in the middle of the first row, with the two 25 mm ultra-low-illumination lenses on either side of them; the two 12.3 mm thermal imaging lenses sit in the middle of the second row, with the two 8 mm ultra-low-illumination lenses on either side of them. This arrangement maximizes the respective functions of the different cameras. The arrangement of the eight cameras of the overhead-view unit 140 described below is identical; see Fig. 14.
The overhead-view unit 140 likewise includes four thermal imaging lenses and four ultra-low-illumination lenses: two 25 mm thermal imaging lenses (the first overhead-view 25 mm thermal imaging lens 144 and the second overhead-view 25 mm thermal imaging lens 145), two 12.3 mm thermal imaging lenses (the first overhead-view 12.3 mm thermal imaging lens 146 and the second overhead-view 12.3 mm thermal imaging lens 147), two 25 mm ultra-low-illumination lenses (the first overhead-view 25 mm ultra-low-illumination lens 158 and the second overhead-view 25 mm ultra-low-illumination lens 159), and two 8 mm ultra-low-illumination lenses (the first overhead-view 8 mm ultra-low-illumination lens 160 and the second overhead-view 8 mm ultra-low-illumination lens 161). The thermal imaging lenses capture thermal images, and the ultra-low-illumination lenses capture visible-light images.
The image processing module 211 includes an image stitching submodule 213 and an image fusion submodule 214. The image stitching submodule 213 stitches the images captured by each pair of identical-focal-length thermal imaging lenses of the front-view unit 110 (thermal imaging lenses 114 and 115; thermal imaging lenses 116 and 117), stitches the images captured by each pair of identical-focal-length ultra-low-illumination lenses of the front-view unit 110 (ultra-low-illumination lenses 128 and 129; ultra-low-illumination lenses 130 and 131), stitches the images captured by each pair of identical-focal-length thermal imaging lenses of the overhead-view unit 140 (thermal imaging lenses 144 and 145; thermal imaging lenses 146 and 147), and stitches the images captured by each pair of identical-focal-length ultra-low-illumination lenses of the overhead-view unit 140 (ultra-low-illumination lenses 158 and 159; ultra-low-illumination lenses 160 and 161);
the image fusion submodule 214 fuses the image stitched from the first front-view 25 mm thermal imaging lens 114 and the second front-view 25 mm thermal imaging lens 115 of the front-view unit 110 with the image stitched from the first front-view 12.3 mm thermal imaging lens 116 and the second front-view 12.3 mm thermal imaging lens 117 of the front-view unit 110; fuses the image stitched from the first front-view 25 mm ultra-low-illumination lens 128 and the second front-view 25 mm ultra-low-illumination lens 129 of the front-view unit 110 with the image stitched from the first front-view 8 mm ultra-low-illumination lens 130 and the second front-view 8 mm ultra-low-illumination lens 131 of the front-view unit 110; fuses the image stitched from the first overhead-view 25 mm thermal imaging lens 144 and the second overhead-view 25 mm thermal imaging lens 145 of the overhead-view unit 140 with the image stitched from the first overhead-view 12.3 mm thermal imaging lens 146 and the second overhead-view 12.3 mm thermal imaging lens 147 of the overhead-view unit 140; and fuses the image stitched from the first overhead-view 25 mm ultra-low-illumination lens 158 and the second overhead-view 25 mm ultra-low-illumination lens 159 of the overhead-view unit 140 with the image stitched from the first overhead-view 8 mm ultra-low-illumination lens 160 and the second overhead-view 8 mm ultra-low-illumination lens 161 of the overhead-view unit 140.
In the image processing described above, stitching the video images captured by lenses of the same focal length forms a wide-angle video image, and fusing the stitched video images of different focal lengths both meets the requirement of a wide field of view and preserves high definition at the center of the image, solving the problem of distant parts of the image being unclear. The remainder of this embodiment is identical to the previous embodiment and is not repeated here. Of course, from a cost-saving standpoint, it is also possible to use ultra-low-illumination cameras throughout (thermal cameras being more expensive), with infrared lamps providing supplementary lighting so that the ultra-low-illumination cameras can still collect clear images. When ultra-low-illumination cameras are used throughout, the focal lengths of the cameras in each of the front-view and overhead-view directions may be adjusted to two of 4 mm and two of 8 mm.
The technical effect obtained by this part of the technical scheme: an aircraft using the above assisted-flight system uses a front-view unit and an overhead-view unit, which can capture video images of the areas ahead of and below the aircraft and send them to the display unit to be shown to the pilot, so that the pilot can grasp the surrounding conditions ahead of and below the aircraft, improving the safety of flight and landing.
Two. Assisted safe flight method for aircraft takeoff, flight, and landing, and device applying the method
The present invention also provides an assisted safe flight method, applied during aircraft takeoff, flight, and landing. The method mainly detects the illumination value with an optical sensor and, according to the detection result, selects either the ultra-low-illumination mode or the infrared mode. On entering the corresponding mode, images are collected at different focal lengths; the images of identical focal length are stitched, the stitched images are fused, and the fused image is displayed. The scheme involves two modes, the ultra-low-illumination mode and the infrared mode; by switching between the two modes to collect images, it effectively solves the safety problem of poor visibility when flying under severe weather conditions such as rain, snow, fog, and haze. Combining the stitching of identical-focal-length images with the subsequent fusion of the stitched images yields a wide-angle, clear image that assists flight and further improves flight safety.
As shown in Fig. 5, the method specifically includes the following steps:
Step 101: detect the illumination value and judge, according to the obtained detection result, whether to enter the ultra-low-illumination mode or the infrared mode; if the ultra-low-illumination mode, perform step 102; if the infrared mode, perform step 105.
In the present invention, the illumination value of the current environment is detected and the obtained detection result is judged: if the detection result meets a first preset condition, it is determined to enter the ultra-low-illumination mode and step 102 is performed; otherwise it is determined to enter the infrared mode and step 105 is performed. The first preset condition may be an illumination value greater than 0.0001 Lux. The technical scheme provided by the present invention comprises the ultra-low-illumination mode and the infrared mode, but may of course comprise other modes as well. For example: when the detection result is greater than 0.1 Lux, the normal mode is entered, and an ordinary optical camera may be used to collect images; when the detection result is greater than 0.01 Lux and less than or equal to 0.1 Lux, the low-illumination mode is entered, and a low-illumination camera may be used to collect images; when the detection result is greater than 0.001 Lux and less than or equal to 0.01 Lux, the moonlight-level mode is entered, and a moonlight-level camera is used to collect images; when the detection result is greater than 0.0001 Lux and less than or equal to 0.001 Lux, the ultra-low-illumination mode is entered, and an ultra-low-illumination camera is used to collect images; when the detection result is less than or equal to 0.0001 Lux, the infrared mode is entered, and an infrared thermal imaging camera is used to collect images. The image collection method is the same in each mode and is not repeated here.
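The extended mode ladder just described maps directly to a threshold cascade. This is a minimal sketch of step 101's judgment, using exactly the Lux boundaries from the description; the mode labels are illustrative names.

```python
def illumination_mode(lux):
    """Map a measured illuminance (Lux) to a capture mode.
    Thresholds per the description: > 0.1 normal; (0.01, 0.1] low-illumination;
    (0.001, 0.01] moonlight-level; (0.0001, 0.001] ultra-low-illumination;
    <= 0.0001 infrared thermal imaging."""
    if lux > 0.1:
        return "normal"
    if lux > 0.01:
        return "low_illumination"
    if lux > 0.001:
        return "moonlight_level"
    if lux > 0.0001:
        return "ultra_low_illumination"
    return "infrared"
```

In the two-mode scheme of Fig. 5, every result except "infrared" collapses to the ultra-low-illumination branch of step 102.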
Step 102: collect several ultra-low-illumination images at a first focal length and several ultra-low-illumination images at a second focal length, then perform step 103.
In the present invention, on entering the ultra-low-illumination mode, ultra-low-illumination cameras can collect images under different focal-length conditions. As a rule, the larger the focal length, the smaller the field of view and the farther the shooting distance. In this method, ultra-low-illumination cameras collect ultra-low-illumination images in the front-view direction using the first focal length and the second focal length as focal values, where the value of the first focal length may differ from that of the second; the first focal length may be 25 mm and the second 8 mm. The images so collected are taken as the first-focal-length ultra-low-illumination images and the second-focal-length ultra-low-illumination images in the front-view direction. Likewise, ultra-low-illumination cameras collect ultra-low-illumination images in the overhead-view direction using the first focal length and the second focal length as focal values, and the collected images are taken as the first-focal-length ultra-low-illumination images and the second-focal-length ultra-low-illumination images in the overhead-view direction. During image collection, to improve the definition of the collected images and widen their viewing angle, several images of identical focal length but different shooting angles may be collected in the front-view direction; likewise, several images of identical focal length but different shooting angles may be collected in the overhead-view direction.
Step 103: stitch the ultra-low-illumination images of identical focal length to obtain a first-focal-length ultra-low-illumination stitched image and a second-focal-length ultra-low-illumination stitched image.
In the present invention, the ultra-low-illumination images of identical focal length in the front-view direction are stitched to obtain the first-focal-length ultra-low-illumination stitched image and the second-focal-length ultra-low-illumination stitched image in the front-view direction. That is to say, the several first-focal-length ultra-low-illumination images in the front-view direction are stitched into one image, giving the first-focal-length ultra-low-illumination stitched image in the front-view direction; the several second-focal-length ultra-low-illumination images in the front-view direction are stitched into one image, giving the second-focal-length ultra-low-illumination stitched image in the front-view direction. The same method yields the first-focal-length ultra-low-illumination stitched image and the second-focal-length ultra-low-illumination stitched image in the overhead-view direction.
In the present invention, by stitching the images captured at each angle under the same focal length condition, the first focal length ultra-low illumination stitched image and the second focal length ultra-low illumination stitched image in the forward-looking direction are obtained; the first focal length stitched image is stitched from images captured under the first focal length condition, and the second focal length stitched image from images captured under the second focal length condition. Stitching the images from each angle into a single image increases the image content and improves the definition of image features. In addition, compared with the second focal length ultra-low illumination stitched image, the first focal length ultra-low illumination stitched image has a narrower field of view but can capture distant targets; correspondingly, the second focal length ultra-low illumination stitched image has a wider field of view, but distant image features are blurrier.
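As a minimal illustration of the stitching idea (a sketch, not the patented stitching algorithm), two horizontally overlapping views taken at the same focal length can be feather-blended across their shared columns; the array shapes, the overlap width, and the linear blending weights below are illustrative assumptions:

```python
import numpy as np

def stitch_pair(left, right, overlap):
    """Join two horizontally adjacent views that share `overlap` columns,
    feather-blending the shared region to hide the seam."""
    alpha = np.linspace(1.0, 0.0, overlap)                     # blend weights
    blended = left[:, -overlap:] * alpha + right[:, :overlap] * (1.0 - alpha)
    return np.hstack([left[:, :-overlap], blended, right[:, overlap:]])
```

In practice each pair of neighbouring shooting angles would be registered first; the same pairwise blend is then applied across all views of one focal length group.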
Step 104: perform image fusion on the first focal length ultra-low illumination stitched image and the second focal length ultra-low illumination stitched image to obtain a fused ultra-low illumination image, send the fused ultra-low illumination image to the aircraft pilot, display the fused ultra-low illumination image, and the flow ends;
In the present invention, the first focal length ultra-low illumination stitched image in the forward-looking direction and the second focal length ultra-low illumination stitched image in the forward-looking direction are fused, obtaining the fused ultra-low illumination image in the forward-looking direction; the first focal length ultra-low illumination stitched image in the downward-looking direction and the second focal length ultra-low illumination stitched image in the downward-looking direction are fused, obtaining the fused ultra-low illumination image in the downward-looking direction. By fusing the narrower-field first focal length stitched image into the wider-field second focal length stitched image, the method both achieves a wide viewing angle and improves image definition, so that distant image features can be seen clearly.
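One simple way to picture this fusion (a sketch under assumed geometry, not the patent's fusion algorithm) is to overlay the sharper telephoto stitched image onto the centre of the wide-angle stitched image, assuming the telephoto view has already been scaled and registered to the centre region of the wide view:

```python
import numpy as np

def fuse_center(wide, tele):
    """Overlay the sharper telephoto view onto the centre of the wide view,
    keeping the wide field of view while sharpening the central region."""
    H, W = wide.shape
    h, w = tele.shape
    out = wide.copy()
    y0, x0 = (H - h) // 2, (W - w) // 2   # centre the telephoto patch
    out[y0:y0 + h, x0:x0 + w] = tele
    return out
```

A real implementation would blend the patch boundary rather than paste it hard, but the effect is the one the text describes: wide angle from the short focal length, distant detail from the long one.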
Step 105: capture several third focal length infrared images and several fourth focal length infrared images respectively, and perform step 106;
In the present invention, on entering the infrared mode, an infrared thermal imaging camera may be used to capture images under different focal length conditions. As a rule, the longer the focal length, the smaller the field of view and the farther the shooting distance. In this method, the infrared thermal imaging camera captures infrared images in the forward-looking direction (and the downward-looking direction) with the third focal length and the fourth focal length as focal length values; the third and fourth focal lengths may differ, for example the third focal length may be 25 mm and the fourth focal length 12.3 mm. The captured infrared images serve as the third focal length infrared image and the fourth focal length infrared image in the forward-looking (or downward-looking) direction. During image capture, to improve the definition of the captured images and widen the viewing angle, several images with the same focal length but different shooting angles may be captured in the forward-looking direction; likewise, several such images may be captured in the downward-looking direction.
Step 106: perform image stitching on the infrared images of the same focal length respectively, obtaining a third focal length infrared stitched image and a fourth focal length infrared stitched image;
In the present invention, the several infrared images of identical focal length in the forward-looking direction are stitched, obtaining the third focal length infrared stitched image and the fourth focal length infrared stitched image in the forward-looking direction. The same method yields the third focal length infrared stitched image and the fourth focal length infrared stitched image in the downward-looking direction.
Step 107: perform image fusion on the third focal length infrared stitched image and the fourth focal length infrared stitched image to obtain a fused infrared image, send the fused infrared image to the aircraft pilot, display the fused infrared image, and the flow ends.
In the present invention, the third focal length infrared stitched image in the forward-looking direction and the fourth focal length infrared stitched image in the forward-looking direction are fused, obtaining the fused infrared image in the forward-looking direction. Likewise, the third focal length infrared stitched image in the downward-looking direction and the fourth focal length infrared stitched image in the downward-looking direction are fused, obtaining the fused infrared image in the downward-looking direction.
In the present invention, this processing fuses the narrower-field third focal length infrared stitched image into the wider-field fourth focal length infrared stitched image, which both achieves a wide viewing angle and improves image definition, so that distant image features can be seen clearly.
In the present invention, images are captured in the infrared mode or the ultra-low illumination mode according to the brightness value. When illumination conditions are good, the ultra-low illumination mode is used. When illumination conditions are poor, for example under adverse conditions such as thick fog, haze, cloud cover, rain, snow, or dark night, the infrared mode is used, so that images can still be captured and the environment understood, thereby assisting flight. In this method, image stitching yields a wide-field image and thus captures more image content; fusing the first focal length ultra-low illumination stitched image with the second focal length ultra-low illumination stitched image yields the fused ultra-low illumination image, which is then displayed. As a rule, the larger the focal length value, the smaller the viewing angle and the farther the shooting distance; by fusing images of different focal lengths, the present invention obtains images that see farther with a wider viewing angle, further increasing the image content. In addition, operating in the infrared mode consumes more power and costs more; therefore, in the present invention, combining the infrared mode with the ultra-low illumination mode effectively captures clear, wide-angle images with low power consumption and reduced cost.
It should be noted that the method may also include: detecting whether the forward shooting direction is horizontal, and adjusting the orientation to horizontal when it is not. A gyroscope may be used to detect whether the orientation is horizontal; when it is horizontal, the forward view and top view can be captured, and when it is not, the orientation is first adjusted to horizontal and the forward view and top view are captured afterwards.
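A minimal sketch of such a level check, assuming the gyroscope reports pitch and roll in degrees and assuming a hypothetical one-degree tolerance (neither is specified in the text):

```python
def is_level(pitch_deg, roll_deg, tol_deg=1.0):
    """True when the forward shooting axis is within tolerance of horizontal."""
    return abs(pitch_deg) <= tol_deg and abs(roll_deg) <= tol_deg
```

When the check fails, the servo platform described later in this document would be commanded to re-orient the camera before capture proceeds.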
In addition, in the present invention, the user may also select the image capture mode independently. When the aircraft takes off, flies, or lands, the next operation may be carried out in the image capture mode indicated by the user. In the present invention, a voice command entered by the user may be obtained and parsed, and the corresponding image capture mode selected according to the parsing result. The position the user touches on the screen may also be obtained, and the corresponding image capture mode selected according to this position information. Alternatively, a physical button triggered by the user may be detected: when the triggered button corresponds to the ultra-low illumination mode, step 102 is performed; when the button triggered by the user corresponds to the infrared mode, step 105 is performed. In the embodiment of the present invention, the infrared cameras used to capture infrared images and the ultra-low illumination cameras used to capture ultra-low illumination images may be arranged alternately in a radial pattern, broadening the field of view so that the captured images cover a wider viewing angle.
Correspondingly, to implement the above method, a corresponding device for assisting safe flight is also disclosed. As shown in Fig. 6, the device is applied to capturing images when the aircraft takes off, flies, and lands, and comprises: a mode selection module 201, configured to detect the brightness value and select entry into the ultra-low illumination mode or the infrared mode according to the obtained detection result;
In the present invention, the mode selection module 201 is specifically configured to detect the brightness value of the current environment and judge the obtained detection result: if the detection result meets a first precondition, the ultra-low illumination image capture module 202 is triggered; otherwise the infrared image capture module 205 is triggered, where the first precondition may be a brightness greater than 0.0001 Lux. Furthermore, finer modes may be divided according to the brightness value, so that cameras of different types are used in concert to obtain clearer images with richer content. Correspondingly, the mode selection module 201 is specifically configured to detect the brightness value of the current environment and judge the obtained detection result: when the detection result is greater than 0.1 Lux, the normal mode is entered, and an optical camera lens may be used to capture images; when the detection result is greater than 0.01 Lux and at most 0.1 Lux, the low-illumination mode is entered, and a low-illumination camera may be used to capture images; when the detection result is greater than 0.001 Lux and at most 0.01 Lux, the moonlight-level mode is entered, and a moonlight-level camera is used to capture images; when the detection result is greater than 0.0001 Lux and at most 0.001 Lux, the ultra-low illumination mode is entered, and an ultra-low illumination camera is used to capture images; when the detection result is at most 0.0001 Lux, the infrared mode is entered, and an infrared thermal imaging camera is used to capture images. The method of capturing images in each mode is the same and is not repeated here.
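The threshold cascade above can be sketched as follows; the Lux boundaries are those stated in this section, while the mode-name strings are illustrative labels:

```python
def select_mode(lux):
    """Map a measured brightness value (in Lux) to a capture mode,
    following the thresholds given for mode selection module 201."""
    if lux > 0.1:
        return "normal"                  # optical camera lens
    if lux > 0.01:
        return "low_illumination"        # low-illumination camera
    if lux > 0.001:
        return "moonlight"               # moonlight-level camera
    if lux > 0.0001:
        return "ultra_low_illumination"  # ultra-low illumination camera
    return "infrared"                    # infrared thermal imaging camera
```

Note that each boundary value falls into the darker mode (e.g. exactly 0.1 Lux selects the low-illumination mode), matching the "greater than / at most" wording of the text.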
An ultra-low illumination image capture module 202, configured, when the ultra-low illumination mode is entered, to capture the first focal length ultra-low illumination image and the second focal length ultra-low illumination image respectively;
In the present invention, the ultra-low illumination image capture module 202 uses the ultra-low illumination camera in the forward-looking direction with the first focal length and the second focal length as focal length values to capture ultra-low illumination images, where the first focal length value and the second focal length value differ; the first focal length value may be 25 mm and the second focal length value may be 8 mm. The captured images serve as the first focal length ultra-low illumination image and the second focal length ultra-low illumination image in the forward-looking direction. The ultra-low illumination camera is likewise used in the downward-looking direction with the first focal length and the second focal length as focal length values, and the captured images serve as the first focal length and second focal length ultra-low illumination images in the downward-looking direction. During image capture, to improve the definition of the captured images and widen the viewing angle, several images with the same focal length but different shooting angles may be captured in the forward-looking direction; likewise, several such images may be captured in the downward-looking direction.
An ultra-low illumination image stitching module 203, configured to perform image stitching on the ultra-low illumination images of the same focal length respectively, obtaining the first focal length ultra-low illumination stitched image and the second focal length ultra-low illumination stitched image;
In the present invention, the ultra-low illumination image stitching module 203 is specifically configured to stitch the ultra-low illumination images of identical focal length in the forward-looking direction, obtaining the first focal length and second focal length ultra-low illumination stitched images in the forward-looking direction; that is, the several first focal length ultra-low illumination images in the forward-looking direction are stitched into a single image, yielding the first focal length ultra-low illumination stitched image in the forward-looking direction, and the several second focal length ultra-low illumination images in the forward-looking direction are stitched into a single image, yielding the second focal length ultra-low illumination stitched image in the forward-looking direction.
The ultra-low illumination image stitching module 203 is further configured to stitch the ultra-low illumination images of identical focal length in the downward-looking direction, obtaining the first focal length and second focal length ultra-low illumination stitched images in the downward-looking direction; that is, the several first focal length ultra-low illumination images in the downward-looking direction are stitched into a single image, yielding the first focal length ultra-low illumination stitched image in the downward-looking direction, and the several second focal length ultra-low illumination images in the downward-looking direction are stitched into a single image, yielding the second focal length ultra-low illumination stitched image in the downward-looking direction.
An ultra-low illumination image fusion module 204, configured to perform image fusion on the first focal length ultra-low illumination stitched image and the second focal length ultra-low illumination stitched image to obtain the fused ultra-low illumination image, send the fused ultra-low illumination image to the aircraft pilot, and display the fused ultra-low illumination image.
In the present invention, the ultra-low illumination image fusion module 204 is specifically configured to fuse the first focal length ultra-low illumination stitched image in the forward-looking direction with the second focal length ultra-low illumination stitched image in the forward-looking direction, obtaining the fused ultra-low illumination image in the forward-looking direction;
the ultra-low illumination image fusion module 204 is further configured to fuse the first focal length ultra-low illumination stitched image in the downward-looking direction with the second focal length ultra-low illumination stitched image in the downward-looking direction, obtaining the fused ultra-low illumination image in the downward-looking direction.
An infrared image capture module 205, configured, when the infrared mode is entered, to capture the third focal length infrared image and the fourth focal length infrared image respectively, and to trigger the infrared image stitching module 206;
In the present invention, the infrared image capture module 205 is specifically configured to use the infrared thermal imaging camera in the forward-looking direction with the third focal length and the fourth focal length as focal length values to capture infrared images, where the third focal length may be 25 mm and the fourth focal length may be 12.3 mm. The captured infrared images serve as the third focal length infrared image and the fourth focal length infrared image in the forward-looking direction. The infrared thermal imaging camera is likewise used in the downward-looking direction with the third and fourth focal lengths as focal length values, and the captured images serve as the third focal length and fourth focal length infrared images in the downward-looking direction. During image capture, to improve the definition of the captured images and widen the viewing angle, several images with the same focal length but different shooting angles may be captured in the forward-looking direction; likewise, several such images may be captured in the downward-looking direction.
An infrared image stitching module 206, configured to perform image stitching on the infrared images of the same focal length respectively, obtaining the third focal length infrared stitched image and the fourth focal length infrared stitched image;
In the present invention, the infrared image stitching module 206 is specifically configured to stitch the infrared images of identical focal length in the forward-looking direction, obtaining the third focal length and fourth focal length infrared stitched images in the forward-looking direction; that is, the several third focal length infrared images in the forward-looking direction are stitched into a single image, yielding the third focal length infrared stitched image in the forward-looking direction, and the several fourth focal length infrared images in the forward-looking direction are stitched into a single image, yielding the fourth focal length infrared stitched image in the forward-looking direction.
The module is further configured to stitch the infrared images of identical focal length in the downward-looking direction, obtaining the third focal length and fourth focal length infrared stitched images in the downward-looking direction; that is, the several third focal length infrared images in the downward-looking direction are stitched into a single image, yielding the third focal length infrared stitched image in the downward-looking direction, and the several fourth focal length infrared images in the downward-looking direction are stitched into a single image, yielding the fourth focal length infrared stitched image in the downward-looking direction.
An infrared image fusion module 207, configured to perform image fusion on the third focal length infrared stitched image and the fourth focal length infrared stitched image to obtain the fused infrared image, send the fused infrared image to the aircraft pilot, and display the fused infrared image.
In the present invention, the infrared image fusion module 207 is specifically configured to fuse the third focal length infrared stitched image in the forward-looking direction with the fourth focal length infrared stitched image in the forward-looking direction, obtaining the fused infrared image in the forward-looking direction;
it is further configured to fuse the third focal length infrared stitched image in the downward-looking direction with the fourth focal length infrared stitched image in the downward-looking direction, obtaining the fused infrared image in the downward-looking direction.
In the present invention, by fusing the third focal length infrared stitched image with the fourth focal length infrared stitched image in each direction, the narrower-field third focal length infrared stitched image is fused into the wider-field fourth focal length infrared stitched image, which both achieves a wide viewing angle and improves image definition, so that distant image features can be seen clearly. According to the brightness value, images are captured in the infrared mode or the ultra-low illumination mode: when illumination conditions are good, the ultra-low illumination mode is used; when illumination conditions are poor, for example under adverse conditions such as thick fog, haze, cloud cover, rain, snow, or dark night, the infrared mode is used, so that images can still be captured and the environment understood, thereby assisting flight. In addition, operating in the infrared mode consumes more power and costs more; therefore, in the present invention, combining the infrared mode with the ultra-low illumination mode effectively captures clear, wide-angle images with low power consumption and reduced cost.
The device may also include an orientation detection module, configured to detect whether the forward shooting direction is horizontal and to adjust the orientation to horizontal when it is not. Specifically, the orientation detection module may be a gyroscope: the gyroscope detects whether the orientation is horizontal; when it is horizontal, the forward view and top view are captured, and when it is not, the orientation is first adjusted to horizontal and the forward view and top view are captured afterwards. By capturing both the forward view image and the top view image and combining the two, pictures of the relevant angles in the flight course are displayed to the operator, facilitating safe flight.
Three. Method for improving aircraft landing safety based on obstacle detection, and device applying the method
To further assist flight, on the basis of the foregoing, obstacle detection may be performed on the fused forward view and top view, and warning information may be generated after an obstacle is detected to warn the aircraft operator. Specifically, a method for improving landing safety based on obstacle detection, as shown in Fig. 7, comprises the following steps:
Step 1010: adjust the orientation of the capture equipment itself so that the forward shooting direction is horizontal;
In the embodiment of the present invention, the capture equipment detects in real time whether the forward shooting direction is horizontal; when it is not, the equipment adjusts its own position through a servo platform so that the forward shooting direction is horizontal. Specifically, a gyroscope may be fixed in the equipment, and whether the equipment is horizontal is detected by the gyroscope.
Step 1020: detect the current illumination value and judge whether the detection result meets the ultra-low illumination processing condition; if so, perform step 1030, otherwise perform step 1050;
In this embodiment, the current illumination value is detected and the obtained detection result is judged: if the detection result meets the ultra-low illumination processing condition, it is determined that the ultra-low illumination mode is entered and step 1030 is performed; otherwise it is determined that the infrared mode is entered and step 1050 is performed. The ultra-low illumination processing condition may be an illumination greater than 0.0001 Lux.
Step 1030: use the ultra-low illumination cameras to capture several forward view images and several top view images of the current location, obtain the ultra-low illumination forward view image and the ultra-low illumination top view image after stitching and fusion, and then perform step 1040;
In an embodiment, 8 cameras (4 infrared thermal imaging cameras plus 4 ultra-low illumination cameras) may capture the forward view images, and 8 cameras (4 infrared thermal imaging cameras plus 4 ultra-low illumination cameras) may capture the top view images. The focal lengths of cameras of the same type may be the same or different; for example, among the 8 cameras capturing forward view images, the focal length of two of the ultra-low illumination cameras is 25 mm and the focal length of the other two ultra-low illumination cameras is 8 mm.
The ultra-low illumination cameras capture several forward view images and several top view images, where a forward view image is an image covering a first preset angle range in the forward-looking direction, and a top view image is an image covering a second preset angle range in the downward-looking direction.
Optionally, the forward view images of identical focal length are divided into one group, each group of forward view images of identical focal length is stitched into a single image, and the first stitched forward view image corresponding to that focal length is obtained. Then, the first stitched forward view images of different focal lengths are fused into a single image, which serves as the ultra-low illumination forward view image.
Optionally, the top view images of identical focal length are divided into one group, each group of top view images of identical focal length is stitched into a single image, and the first stitched top view image corresponding to that focal length is obtained. The first stitched top view images of different focal lengths are fused into a single image, which serves as the ultra-low illumination top view image.
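The group-then-stitch-then-fuse flow described above can be sketched as follows; the frame representation (a focal length paired with an image handle) is an illustrative assumption, with the per-group stitching and cross-group fusion steps left abstract:

```python
from collections import defaultdict

def group_by_focal_length(frames):
    """frames: iterable of (focal_length_mm, image) pairs.
    Returns a dict mapping each focal length to its group of images,
    ready to be stitched per group and then fused across groups."""
    groups = defaultdict(list)
    for focal, image in frames:
        groups[focal].append(image)
    return dict(groups)
```

Each group would then be stitched into one panorama per focal length (the "first stitched" images), and the per-focal-length panoramas fused into the single ultra-low illumination view.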
In the technical scheme provided, different images may be captured for different flight states, thereby assisting safe flight. The flight state may be detected when the detection result meets the ultra-low illumination processing condition; if the aircraft is in the takeoff state, the landing state, or the flight state, the ultra-low illumination forward view image and ultra-low illumination top view image of the current location are captured by the ultra-low illumination cameras.
Step 1040: perform obstacle detection on the stitched and fused ultra-low illumination forward view image and ultra-low illumination top view image respectively; when an obstacle is detected in the ultra-low illumination forward view image and/or the ultra-low illumination top view image, generate and send warning information, and the flow ends.
In this embodiment, obstacle detection may be performed by judging whether image feature points exist in the ultra-low illumination forward view image and/or the ultra-low illumination top view image. When it is determined that image feature points exist, the image feature points are extracted, the obstacle image is drawn according to the image feature points, and the obstacle image is marked in the corresponding ultra-low illumination forward view image and ultra-low illumination top view image; when it is determined that no image feature points exist, it is determined that no obstacle exists.
The position information of the obstacle is located by satellite positioning, and the warning information is generated and sent according to the position information of the obstacle together with the ultra-low illumination forward view image and ultra-low illumination top view image marked with the obstacle image.
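A greatly simplified sketch of feature-point-based obstacle screening; the patent does not name a particular feature detector, so strong image gradients stand in for feature points here, and the gradient threshold and minimum point count are invented parameters:

```python
import numpy as np

def has_obstacle(image, grad_thresh=0.5, min_points=10):
    """Flag an obstacle when enough strong-gradient points (a stand-in for
    feature points) are present in the view."""
    gy, gx = np.gradient(image.astype(float))
    magnitude = np.hypot(gx, gy)
    return int((magnitude > grad_thresh).sum()) >= min_points
```

A featureless view (empty sky or uniform ground) yields no strong gradients and is reported clear, while any object edge produces a cluster of candidate points that would then be localised and marked in the displayed image.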
Step 1050: use the infrared thermal imaging cameras to capture several forward view images and several top view images of the current location, obtain the infrared forward view image and the infrared top view image after stitching and fusion, and perform step 1060;
In an embodiment, the infrared thermal imaging cameras capture several forward view images and several top view images, where a forward view image is an image covering the first preset angle range in the forward-looking direction and a top view image is an image covering the second preset angle range in the downward-looking direction.
Optionally, the forward view images of identical focal length are divided into one group, each group of forward view images of identical focal length is stitched into a single image, and the second stitched forward view image corresponding to that focal length is obtained. The second stitched forward view images of different focal lengths are fused into a single image, which serves as the infrared forward view image.
Optionally, the top view images of identical focal length are divided into one group, each group of top view images of identical focal length is stitched into a single image, and the second stitched top view image corresponding to that focal length is obtained. The second stitched top view images of different focal lengths are fused into a single image, which serves as the infrared top view image.
In an embodiment, different images may be captured for different flight states, thereby assisting safe flight. When the detection result does not meet the ultra-low illumination processing condition, the flight state may be detected; if the aircraft is in the takeoff state, the landing state, or the flight state, the infrared forward view image and infrared top view image of the current location are captured by the infrared cameras.
Step 1060: perform obstacle detection on the stitched and fused infrared forward view image and infrared top view image respectively; when an obstacle is detected in the infrared forward view image and/or the infrared top view image, generate and send warning information.
In an embodiment, the obstacle detection method is the same as above: whether an obstacle exists may be judged by whether image feature points exist in the infrared forward view image and/or the infrared top view image. When it is determined that image feature points exist, the image feature points are extracted, the obstacle image is drawn according to the image feature points, and the obstacle image is marked in the corresponding infrared forward view image and infrared top view image; when it is determined that no image feature points exist, it is determined that no obstacle exists.
The position information of the obstacle is located by satellite positioning, and the warning information is generated and sent according to the position information of the obstacle together with the infrared forward view image and infrared top view image marked with the obstacle image. In an embodiment, the warning information may also be displayed, and the obstacle image may be enlarged when a touch-screen event is detected.
To implement the above method, a corresponding device for improving landing safety based on obstacle detection is also disclosed. As shown in Fig. 8, the device includes:
An orientation adjustment module 2010, configured to adjust the device's own orientation so that the forward shooting direction is horizontal;
In the embodiment of the present invention, the orientation adjustment module 2010 detects in real time whether the forward shooting direction is horizontal and, when it is not, adjusts the device's position through a servo platform so that the forward shooting direction is horizontal. Specifically, the orientation adjustment module 2010 may be a gyroscope fixed in the device and used to detect whether the device is horizontal.
A luminance detection module 2020, configured to detect the current illumination value and judge whether the detection result meets the ultra-low-illumination processing condition;
In an embodiment of the present invention, the luminance detection module 2020 detects the current illumination value and judges the obtained detection result: if the detection result meets the ultra-low-illumination processing condition, the starlight acquisition module 2030 is triggered; otherwise the infrared acquisition module 2050 is triggered. The ultra-low-illumination processing condition may be an illumination value greater than 0.0001 Lux.
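The threshold check above can be sketched as follows (a minimal illustration; the string return values are assumptions, and the comments merely echo the module reference numerals used in the text):

```python
ULTRA_LOW_LUX_THRESHOLD = 0.0001  # Lux; value taken from the description

def select_camera(measured_lux):
    """Pick the capture chain from the measured illumination value.

    Above the threshold the starlight (ultra-low-illumination) camera is used;
    at or below it (down to 0 Lux) the infrared thermal imaging camera is used."""
    if measured_lux > ULTRA_LOW_LUX_THRESHOLD:
        return "ultra_low_illumination"  # triggers starlight acquisition module 2030
    return "infrared_thermal"            # triggers infrared acquisition module 2050
```

For example, a moonlit scene around 0.1 Lux would still select the ultra-low-illumination camera, while total darkness falls through to infrared thermal imaging.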
A starlight acquisition module 2030, configured to, when the detection result meets the ultra-low-illumination processing condition, collect an ultra-low-illumination front-view image and an ultra-low-illumination top-view image of the current location through the ultra-low-illumination camera and perform stitching and fusion processing; alternatively, the stitching and fusion functions may be completed by another separate module;
In an embodiment of the present invention, the starlight acquisition module 2030 collects several front-view images and several top-view images, where a front-view image covers a first predetermined angle range in the forward-view direction and a top-view image covers a second predetermined angle range in the downward-view direction.
Optionally, the starlight acquisition module 2030 or another separate module may group the front-view images with the same focal length, stitch each group of same-focal-length front-view images into one image, and obtain the first stitched front-view image corresponding to that focal length. The first stitched front-view images of different focal lengths are then fused into a single image, which serves as the ultra-low-illumination front-view image.
Likewise, the top-view images with the same focal length are grouped, each group is stitched into one image to obtain the first stitched top-view image corresponding to that focal length, and the first stitched top-view images of different focal lengths are fused into a single image, which serves as the ultra-low-illumination top-view image.
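The group-by-focal-length, stitch-then-fuse pipeline above can be outlined as follows. This is an illustrative sketch only: the patent does not specify the stitching or fusion algorithms, so side-by-side concatenation and per-pixel averaging stand in for them, and all names are hypothetical.

```python
from collections import defaultdict

def stitch(images):
    """Placeholder stitch: concatenate same-focal-length frames side by side."""
    return [sum((img[r] for img in images), []) for r in range(len(images[0]))]

def fuse(stitched):
    """Placeholder fusion: per-pixel average across the stitched images."""
    return [[sum(vals) / len(vals) for vals in zip(*rows)]
            for rows in zip(*stitched)]

def build_view(frames):
    """frames: list of (focal_length_mm, image) pairs.

    Group frames by focal length, stitch each group into one image, then fuse
    the per-focal-length stitched images into the final view."""
    groups = defaultdict(list)
    for focal, img in frames:
        groups[focal].append(img)
    return fuse([stitch(group) for group in groups.values()])

# Two 25 mm frames and two 8 mm frames, each a 1x2 toy image.
panorama = build_view([(25, [[1, 2]]), (25, [[3, 4]]),
                       (8,  [[5, 6]]), (8,  [[7, 8]])])
```

A production implementation would use feature-based registration for the stitch and a multi-scale method for the fusion; only the grouping structure here follows the text.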
In the technical solution provided, different images may be collected for different flight states to assist safe flight. A state detection module may be triggered when the detection result meets the ultra-low-illumination processing condition and is used to detect the flight state: if the aircraft is in the take-off state, the ultra-low-illumination front-view image of the current location is collected through the ultra-low-illumination camera; if it is in the landing state, the ultra-low-illumination top-view image of the current location is collected; if it is in the flying state, both the ultra-low-illumination front-view image and the ultra-low-illumination top-view image of the current location are collected.
A starlight obstacle detection module 2040, configured to perform obstacle detection on the stitched and fused ultra-low-illumination front-view image and ultra-low-illumination top-view image respectively, and to generate and send warning information when an obstacle is detected in the ultra-low-illumination front-view image and/or the ultra-low-illumination top-view image;
In an embodiment, the starlight obstacle detection module 2040 may judge whether an obstacle exists by determining whether image feature points exist in the ultra-low-illumination front-view image and/or the ultra-low-illumination top-view image. When image feature points exist, they are extracted, an obstacle image is drawn according to them, and the obstacle image is marked in the corresponding ultra-low-illumination front-view image and ultra-low-illumination top-view image; when no image feature points exist, it is determined that no obstacle exists.
The starlight obstacle detection module 2040 is further configured to obtain the position information of the obstacle through satellite positioning, generate warning information according to the position information of the obstacle and the ultra-low-illumination front-view image and ultra-low-illumination top-view image marked with the obstacle image, and send the warning information.
An infrared acquisition module 2050, configured to, when the detection result does not meet the ultra-low-illumination processing condition, collect an infrared thermal front-view image and an infrared thermal top-view image of the current location through the infrared thermal imaging camera and perform stitching and fusion processing; alternatively, the stitching and fusion functions may be completed by another separate module;
In an embodiment of the present invention, the infrared acquisition module 2050 uses the infrared thermal imaging camera to collect several front-view images and several top-view images, where a front-view image covers a first predetermined angle range in the forward-view direction and a top-view image covers a second predetermined angle range in the downward-view direction.
Optionally, the infrared acquisition module 2050 or another separate module may group the front-view images with the same focal length, stitch each group into one image to obtain the second stitched front-view image corresponding to that focal length, and fuse the second stitched front-view images of different focal lengths into a single image, which serves as the infrared thermal front-view image.
Similarly, the top-view images with the same focal length may be grouped, each group stitched into one image to obtain the second stitched top-view image corresponding to that focal length, and the second stitched top-view images of different focal lengths fused into a single image, which serves as the infrared thermal top-view image.
In an embodiment of the present invention, different images may be collected for different flight states to assist safe flight. The device may include a state detection module configured to detect the flight state when the detection result does not meet the ultra-low-illumination processing condition: if the aircraft is in the take-off state, the infrared thermal front-view image of the current location is collected through the infrared thermal imaging camera; if it is in the landing state, the infrared thermal top-view image is collected; if it is in the flying state, both the infrared thermal front-view image and the infrared thermal top-view image of the current location are collected.
An infrared obstacle detection module 2060, configured to perform obstacle detection on the stitched and fused infrared thermal front-view image and infrared thermal top-view image respectively, and to generate and send warning information when an obstacle is detected in the infrared thermal front-view image and/or the infrared thermal top-view image.
In an embodiment, the infrared obstacle detection module 2060 judges whether an obstacle exists by determining whether image feature points exist in the infrared thermal front-view image and/or the infrared thermal top-view image. When image feature points exist, they are extracted, an obstacle image is drawn according to them, and the obstacle image is marked in the corresponding infrared thermal front-view image and infrared thermal top-view image; when no image feature points exist, it is determined that no obstacle exists.
The module is further configured to obtain the position information of the obstacle through satellite positioning, generate warning information according to the position information of the obstacle and the infrared thermal front-view image and infrared thermal top-view image marked with the obstacle image, and send the warning information. It should be noted that, in the embodiment of the present invention, a touch-screen module may also be included, configured to display the warning information and to enlarge the obstacle image when a touch-screen event is detected. The ultra-low-illumination camera mentioned in the embodiment of the present invention is specifically a low-light ultra-low-illumination camera.
The technical effect obtained by this part of the technical solution is as follows: the device adjusts its own orientation so that the forward-view shooting direction is horizontal, detects the current illumination value, selects the ultra-low-illumination camera or the infrared thermal imaging camera according to the obtained detection result to collect the front-view image and top-view image of the current location respectively, performs obstacle detection on the front-view image and top-view image respectively, and generates and sends warning information when an obstacle is detected. By using the infrared thermal imaging camera and the ultra-low-illumination camera in combination, obstacles can be detected even under extremely poor light conditions, down to 0 Lux, by collecting black-and-white images with the infrared thermal imaging camera, enabling effective obstacle early warning. Furthermore, when the light condition meets the ultra-low-illumination condition, the ultra-low-illumination camera collects color images that are clearer and more lifelike, which is advantageous for obstacle alarming and thus improves flight safety.
Part Four: method for avoiding obstacle collision based on data synchronization, and aircraft applying the method
As shown in Fig. 9, the present invention also provides a method for avoiding obstacle collision based on data synchronization, including:
Step 401: the aircraft obtains its current position information, organizes an environmental information acquisition request according to the position information, and sends the request to a data center;
In an embodiment, the aircraft obtains longitude, latitude and altitude through a satellite positioning system and uses them as the current position information. Specifically, a positioning request may be sent to the GPS or BeiDou satellite positioning system, and the returned longitude, latitude and altitude are received.
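The way step 401 turns a satellite fix into a request might be sketched as follows. The message shape and all field names are assumptions made purely for illustration; the patent does not define a wire format.

```python
def organize_request(longitude, latitude, altitude):
    """Build the 'obtain environmental information' request from the current
    satellite fix (longitude, latitude and altitude serve as the position
    information, as described in step 401)."""
    return {
        "type": "environment_info_request",  # hypothetical message type
        "position": {"lon": longitude, "lat": latitude, "alt": altitude},
    }

# Example fix (Beijing area) used purely as illustrative input.
req = organize_request(116.39, 39.91, 120.0)
```

The data center would look up the stored environmental information response keyed by this position and return it to the aircraft.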
Step 402: when an environmental information response returned by the data center is received, obstacle information is obtained from the environmental information response as historical obstacle information, and flight parameters are obtained from the environmental information response as historical flight parameters;
The data center stores environmental information responses corresponding to position information. An environmental response contains the position information, size information and movement attribute information of an obstacle, together with the flight parameters used when flying around that obstacle. In this solution, the aircraft obtains the position information, size information and movement attribute information of the obstacle from the received environmental information response and uses them as the historical obstacle information, and obtains the flight parameters from the environmental response as the historical flight parameters.
Step 403: an environmental image is collected according to the aircraft's own flight state, current obstacle information is obtained from the environmental image, the obstacle is marked in the environmental image according to the current obstacle information, the historical obstacle information and the historical flight parameters, and the environmental image with the marked obstacle is displayed;
An aircraft has four flight states, namely take-off, flying, landing and stationary, and different environmental images are collected for different flight states to assist flight. The aircraft collects images as follows: it obtains its own flight state; if it is in the take-off state, the landing state or the flying state, it collects front-view images and top-view images (see Parts One and Two above) and uses the collected images as environmental images. Of course, for omnidirectional obstacle detection, bottom-view, left-view, right-view and rear-view images may also be collected, so that the surrounding environment can be observed in all directions and obstacles moving in from other directions are prevented from endangering flight safety.
As stated above, in order to improve the definition of the collected images and to save cost, the current illumination value may be detected when collecting environmental images, and the corresponding shooting mode is selected according to the obtained detection result: when the detection result meets the ultra-low-illumination processing condition, environmental images are collected through the ultra-low-illumination camera; when it does not, environmental images are collected through the infrared thermal imaging camera. The ultra-low-illumination processing condition may be an illumination value greater than 0.0001 Lux.
After the environmental images are obtained, the current obstacle information can be extracted from them. For the environmental images collected in the same direction, the images of the same focal length are stitched into one image, and the stitched images of different focal lengths in that direction are fused to obtain a fused image. Feature points are extracted from the fused image; when feature points are extracted, the obstacle is drawn according to the feature points (this process is the same as or similar to Part Three above) and marked. The position information of the obstacle is obtained through the satellite positioning system, and the size information and movement attribute information of the obstacle are calculated according to the position information of the obstacle and the image marked with the obstacle; the position information, size information and movement attribute information of the obstacle serve as the current obstacle information. Then it is judged whether the current obstacle information is identical to the historical obstacle information: if so, the obstacle is marked in the environmental image and the historical flight parameters are prompted; otherwise, the obstacle is marked in the environmental image according to the current obstacle information.
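The compare-and-mark decision at the end of this step can be sketched as follows. The record fields and return structure are assumptions for illustration; the text only specifies that an exact match surfaces the historical flight parameters.

```python
def choose_marking(current, history, history_params):
    """Mark the freshly detected obstacle; when it matches the historical
    record exactly, also prompt the historical flight parameters that were
    previously used to fly around it."""
    if current == history:
        return {"marked": current, "suggested_params": history_params}
    # No match: mark only according to the current obstacle information.
    return {"marked": current, "suggested_params": None}

# Hypothetical obstacle records: position, size and movement attribute.
cur = {"position": (1, 2), "size": 3, "moving": False}
matched = choose_marking(cur, cur, {"speed": 5})
```

Whether "identical" should mean exact equality or tolerance-based matching is not specified in the text; exact equality is used here for simplicity.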
In this aspect of the invention, the motion track of the obstacle may also be drawn on the image marked with the obstacle. The obstacle may be located once every preset time interval; at each location, the altitude, pitch angle and azimuth of the obstacle are obtained, together with the displacement of the obstacle between two successive locations.
According to the altitude, pitch angle, azimuth and displacement of the obstacle, the track information of the obstacle's movement, together with its direction of motion and movement speed, is calculated.
According to the track information and the direction and speed of motion, the track the obstacle will follow in three-dimensional space is simulated.
The track information of the obstacle's movement and the track it will follow in three-dimensional space are drawn.
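One way to realize the calculation above can be illustrated as follows. The east-north-up decomposition of the displacement using azimuth and pitch, and the linear extrapolation of the future track, are assumptions made for illustration; the text does not give the exact formulas.

```python
import math

def velocity_from_fixes(displacement, azimuth_deg, pitch_deg, dt):
    """Resolve the displacement between two successive obstacle locations into
    an east-north-up velocity vector, using the azimuth (measured from north)
    and the pitch (elevation) angle, over the preset interval dt."""
    az, el = math.radians(azimuth_deg), math.radians(pitch_deg)
    horizontal = displacement * math.cos(el)
    return (horizontal * math.sin(az) / dt,      # east component
            horizontal * math.cos(az) / dt,      # north component
            displacement * math.sin(el) / dt)    # up component

def extrapolate(position, velocity, t):
    """Linearly simulate the track the obstacle will follow in 3-D space."""
    return tuple(p + v * t for p, v in zip(position, velocity))
```

For instance, a 10 m displacement due east over 2 s with zero pitch yields an eastward velocity of 5 m/s, and extrapolating 4 s ahead moves the obstacle 20 m east at constant altitude.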
Step 404: when the obstacle is avoided, the flight parameters used in the process of avoiding the obstacle are obtained as the current flight parameters, and the current position information, the current flight parameters and the current obstacle information are sent to the data center.
In this aspect of the invention, when the data center receives the current position information, current flight parameters and current obstacle information, it obtains the flight parameters and obstacle information corresponding to the current position information, replaces the original flight parameters with the current flight parameters, and replaces the original obstacle information with the current obstacle information.
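The data center's replace-on-sync behaviour can be sketched as follows (a minimal illustration; the class, method names and record layout are assumptions, with records keyed by the reported position):

```python
class DataCenter:
    """Sketch of the data center: the record keyed by position is overwritten
    with the latest flight parameters and obstacle information reported by an
    aircraft after it avoids an obstacle."""

    def __init__(self):
        self.records = {}

    def sync(self, position, flight_params, obstacle_info):
        # Replace the original flight parameters and obstacle information
        # with the current ones, as described in step 404.
        self.records[position] = {"flight_params": flight_params,
                                  "obstacle_info": obstacle_info}

    def environment_response(self, position):
        """Return the stored environmental information for a position,
        or None when nothing has been reported there yet."""
        return self.records.get(position)
```

Each sync therefore keeps only the most recent avoidance record per position, which is what later aircraft receive as historical obstacle information and historical flight parameters.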
It should be noted that, in the embodiment of the present invention, when the aircraft does not receive an environmental information response returned by the data center, it may still collect environmental images according to its own flight state, obtain current obstacle information from the environmental images, mark the obstacle in the environmental image according to the current obstacle information, and display the environmental image with the marked obstacle. When the obstacle is avoided, the flight parameters used in the avoidance process are obtained as the current flight parameters, and the current position information, current flight parameters and current obstacle information are sent to the data center. In the embodiment of the present invention, the cameras for environmental images may be arranged radially to broaden the field of view, so that the collected images cover a wider field-of-view angle.
On the other hand, in the technical solution provided by the present invention, for the same purpose, steps 401 to 403 may also be replaced with steps a1 to a3.
Step a1: the aircraft collects environmental images according to its own flight state, obtains current obstacle information from the environmental images, and sends the environmental images and/or the current obstacle information to the data center;
The detailed implementation of this step is identical to the manner described in step 403 above and is not repeated here.
Step a2: when an environmental information response returned by the data center is received, obstacle information is obtained from the environmental information response as historical obstacle information, and flight parameters are obtained from the environmental information response as historical flight parameters;
The detailed implementation of obtaining obstacle information and flight parameters from the environmental information response in this step is identical to the manner described in step 402 above and is not repeated here.
Step a3: the obstacle is marked in the environmental image according to the current obstacle information, the historical obstacle information and the historical flight parameters, the environmental image with the marked obstacle is displayed, and step 404 is performed.
The detailed implementation of marking the obstacle and displaying the environmental image with the marked obstacle in this step is identical to the manner described in step 403 above and is not repeated here.
Corresponding to the above method, as shown in Figure 10, the present invention also provides an aircraft for avoiding obstacle collision based on data synchronization, including:
An information request module 4010, configured to obtain the current position information, organize an environmental information acquisition request according to the position information, and send the request to the data center;
In this aircraft, the information request module 4010 obtains longitude, latitude and altitude through the satellite positioning system and uses them as the current position information. A positioning request may be sent to the GPS or BeiDou satellite positioning system, and the returned longitude, latitude and altitude are received. The environmental information acquisition request is organized according to the position information and sent to the data center.
An information receiving module 4020, configured to, when an environmental information response returned by the data center is received, obtain obstacle information from the environmental information response as historical obstacle information, and obtain flight parameters from the environmental information response as historical flight parameters;
In this aircraft, the information receiving module 4020 obtains the position information, size information and movement attribute information of the obstacle from the received environmental information response, uses them as the historical obstacle information, and obtains the flight parameters from the environmental response as the historical flight parameters.
An obstacle detection module 4030, configured to collect environmental images according to the aircraft's own flight state, obtain current obstacle information from the environmental images, mark the obstacle in the environmental image according to the current obstacle information, the historical obstacle information and the historical flight parameters, and display the environmental image with the marked obstacle;
A data synchronization module 4040, configured to, when the obstacle is avoided, obtain the flight parameters used in the process of avoiding the obstacle as the current flight parameters, and send the current position information, the current flight parameters and the current obstacle information to the data center.
Here, as shown in Figure 11, the obstacle detection module 4030 may include:
A state detection unit 2031, configured to obtain the aircraft's own flight state;
An environmental image collection unit 2032, configured to collect front-view images and top-view images as environmental images when the aircraft is in the take-off state, the landing state or the flying state;
An obstacle detection unit 2033, configured to obtain the current obstacle information from the environmental images after they are collected: for the environmental images collected in the same direction, the images of the same focal length are stitched into one image; the stitched images of different focal lengths in that direction are fused to obtain a fused image; feature points are extracted from the fused image; when feature points are extracted, the obstacle is drawn and marked according to the feature points; the position information of the obstacle is obtained through the satellite positioning system; the size information and movement attribute information of the obstacle are calculated according to the position information of the obstacle and the image marked with the obstacle; and the position information, size information and movement attribute information of the obstacle serve as the current obstacle information. The image collection, stitching and fusion here are the same as or similar to Parts One, Two and Three above and are not repeated here.
An obstacle marking unit 2034, configured to judge whether the current obstacle information is identical to the historical obstacle information: if so, the obstacle is marked in the environmental image and the historical flight parameters are prompted; otherwise, the obstacle is marked in the environmental image according to the current obstacle information.
In addition, the obstacle detection module 4030 may also draw the motion track of the obstacle on the image marked with the obstacle, and may include: an obstacle positioning unit, configured to locate the obstacle once every preset time interval and, at each location, obtain the altitude, pitch angle and azimuth of the obstacle together with the displacement of the obstacle between two successive locations; a computing unit, configured to calculate, according to the altitude, pitch angle, azimuth and displacement of the obstacle, the track information of the obstacle's movement together with its direction of motion and movement speed; a track simulation unit, configured to simulate, according to the track information and the direction and speed of motion, the track the obstacle will follow in three-dimensional space; and a track drawing unit, configured to draw the track information of the obstacle's movement and the track it will follow in three-dimensional space.
In this device, the obstacle detection module 4030 may also include a mode detection unit, configured to detect the current illumination value when collecting environmental images and to select the corresponding shooting mode according to the obtained detection result: when the detection result meets the ultra-low-illumination processing condition, environmental images are collected through the ultra-low-illumination camera, the ultra-low-illumination processing condition being, for example, an illumination value greater than 0.0001 Lux; when the detection result does not meet the ultra-low-illumination processing condition, environmental images are collected through the infrared thermal imaging camera.
In the technical solution of the present invention, when the data center receives the current position information, the current flight parameters and the current obstacle information, it obtains the flight parameters and obstacle information corresponding to the current position information, replaces the original flight parameters with the current flight parameters, and replaces the original obstacle information with the current obstacle information.
The beneficial effects obtained by the technical solution of this section are as follows: the aircraft organizes an environmental information request according to its own position information and sends it to the data center; when an environmental information response returned by the data center is received, historical obstacle information and historical flight parameters are obtained from the environmental information response; environmental images are collected according to the aircraft's own flight state; current obstacle information is obtained from the environmental images; the obstacle is marked in the environmental image according to the current obstacle information, the historical obstacle information and the historical flight parameters; and the environmental image with the marked obstacle is displayed to assist the aircraft in flying. In addition, when an obstacle is avoided, the flight parameters used in the avoidance process are obtained as the current flight parameters, and the current position information, current flight parameters and current obstacle information are sent to the data center for backup, for reference in subsequent flights. Furthermore, in the technical solution provided by the present invention, the infrared camera and the ultra-low-illumination camera are used in combination to collect environmental images, ensuring that the obtained images are clear and are not affected by factors such as illumination, fog, haze, rain or snow.
It should be noted that the four parts of the embodiments of the present invention are all described in a progressive or partially repetitive manner; each embodiment emphasizes its differences from the other embodiments, and for identical or similar parts the embodiments may refer to one another. As for the system embodiments, since they are basically similar to the method embodiments, their description is relatively simple, and relevant parts may refer to the description of the method embodiments.
It should also be noted that, in the present invention, relational terms such as "first" and "second" are used merely to distinguish one entity or operation from another entity or operation, and do not necessarily require or imply any such actual relation or order between these entities or operations. Moreover, the terms "include", "comprise" or any other variants thereof are intended to cover non-exclusive inclusion, so that a process, method, article or device including a series of elements includes not only those elements but also other elements not expressly listed, or elements inherent to such a process, method, article or device. Without further limitation, an element defined by the statement "including a ..." does not exclude the existence of other identical elements in the process, method, article or device including that element.
The embodiments disclosed herein enable those skilled in the art to implement or use the present invention. Various modifications to these embodiments will be apparent to those skilled in the art. The general principles described herein may be implemented in other embodiments without departing from the spirit or scope of the present invention. Therefore, the present invention is not intended to be limited to the embodiments described herein, but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
Claims (36)
1. An aircraft having a flight driver assistance system, including a video collection unit and a control unit, the control unit including an industrial computer and a display unit, the industrial computer including an image capture module and an image processing module, characterized in that the video collection unit is located below the aircraft and includes a forward-view unit and a top-view unit; the forward-view unit and the top-view unit respectively capture images in front of the aircraft and below the aircraft and respectively send the captured images to the image capture module; and the image capture module sends the obtained image data to the image processing module for processing, after which the data is sent to the display unit for display.
2. The aircraft as claimed in claim 1, characterized in that the forward-view unit includes a thermal imaging lens and a visible-light lens, respectively used to capture a thermal image and a visible-light image in front of the aircraft; and the top-view unit includes a thermal imaging lens and a visible-light lens, respectively used to capture a thermal image and a visible-light image below the aircraft.
3. The aircraft as claimed in claim 1, characterized in that the forward-view unit includes a first forward-view thermal imaging lens, a second forward-view thermal imaging lens, a first forward-view ultra-low-illumination lens and a second forward-view ultra-low-illumination lens, respectively used to capture thermal images and visible-light images in front of the aircraft; and the top-view unit includes a first top-view thermal imaging lens, a second top-view thermal imaging lens, a first top-view ultra-low-illumination lens and a second top-view ultra-low-illumination lens, respectively used to capture thermal images and visible-light images below the aircraft.
4. The aircraft as claimed in claim 1, characterized in that the forward-view unit and the top-view unit are composed of individual or grouped thermal imaging lenses and individual or grouped visible-light lenses, and the number of thermal imaging lenses or visible-light lenses is 1, 2 or 4.
5. The aircraft as claimed in claim 3, characterized in that the two thermal imaging lenses of the forward-view unit are lenses of the same focal length, the two ultra-low-illumination lenses of the forward-view unit are lenses of the same focal length, the two thermal imaging lenses of the top-view unit are lenses of the same focal length, and the two ultra-low-illumination lenses of the top-view unit are lenses of the same focal length;
the image processing module includes an image stitching submodule, configured to stitch the images captured by the two thermal imaging lenses of the forward-view unit, stitch the images captured by the two ultra-low-illumination lenses of the forward-view unit, stitch the images captured by the two thermal imaging lenses of the top-view unit, and stitch the images captured by the two ultra-low-illumination lenses of the top-view unit.
6. The aircraft as claimed in claim 3, characterized in that the two thermal imaging lenses of the forward-view unit are lenses of different focal lengths, the two ultra-low-illumination lenses of the forward-view unit are lenses of different focal lengths, the two thermal imaging lenses of the top-view unit are lenses of different focal lengths, and the two ultra-low-illumination lenses of the top-view unit are lenses of different focal lengths;
the image processing module includes an image fusion submodule, configured to fuse the images captured by the two thermal imaging lenses of the forward-view unit, fuse the images captured by the two ultra-low-illumination lenses of the forward-view unit, fuse the images captured by the two thermal imaging lenses of the top-view unit, and fuse the images captured by the two ultra-low-illumination lenses of the top-view unit.
7. The aircraft of claim 1, characterized in that the forward-looking unit comprises two 25 mm thermal imaging lenses, two 12.3 mm thermal imaging lenses, two 25 mm ultra-low-illumination lenses and two 8 mm ultra-low-illumination lenses, and the downward-looking unit comprises two 25 mm thermal imaging lenses, two 12.3 mm thermal imaging lenses, two 25 mm ultra-low-illumination lenses and two 8 mm ultra-low-illumination lenses; the thermal imaging lenses capture thermal images and the ultra-low-illumination lenses capture visible-light images;
the image processing module comprises an image stitching submodule and an image fusion submodule; the image stitching submodule stitches the images captured by the two 25 mm thermal imaging lenses of the forward-looking unit, stitches the images captured by the two 12.3 mm thermal imaging lenses of the forward-looking unit, stitches the images captured by the two 25 mm ultra-low-illumination lenses of the forward-looking unit, stitches the images captured by the two 8 mm ultra-low-illumination lenses of the forward-looking unit, stitches the images captured by the two 25 mm thermal imaging lenses of the downward-looking unit, stitches the images captured by the two 12.3 mm thermal imaging lenses of the downward-looking unit, stitches the images captured by the two 25 mm ultra-low-illumination lenses of the downward-looking unit, and stitches the images captured by the two 8 mm ultra-low-illumination lenses of the downward-looking unit; the image fusion submodule fuses the stitched image from the two 25 mm thermal imaging lenses of the forward-looking unit with the stitched image from the two 12.3 mm thermal imaging lenses of the forward-looking unit, fuses the stitched image from the two 25 mm ultra-low-illumination lenses of the forward-looking unit with the stitched image from the two 8 mm ultra-low-illumination lenses of the forward-looking unit, fuses the stitched image from the two 25 mm thermal imaging lenses of the downward-looking unit with the stitched image from the two 12.3 mm thermal imaging lenses of the downward-looking unit, and fuses the stitched image from the two 25 mm ultra-low-illumination lenses of the downward-looking unit with the stitched image from the two 8 mm ultra-low-illumination lenses of the downward-looking unit.
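The stitch-then-fuse pipeline of claims 5–7 (stitch the two frames of the same focal length, then fuse the stitched results of the two focal lengths) can be sketched as follows. This is a minimal illustration under stated assumptions, not the patent's implementation: it assumes the same-focal-length frames are already registered with a known column overlap, blends the seam with a linear ramp, and fuses the two focal lengths with a fixed-weight average (a real system would register features and might use pyramid fusion); the function names are illustrative.

```python
import numpy as np

def stitch_pair(left, right, overlap):
    """Stitch two same-focal-length frames (claims 5/7): blend a
    pre-registered overlap of `overlap` columns with a linear ramp."""
    ramp = np.linspace(1.0, 0.0, overlap)            # weight of the left frame
    seam = left[:, -overlap:] * ramp + right[:, :overlap] * (1.0 - ramp)
    return np.hstack([left[:, :-overlap], seam, right[:, overlap:]])

def fuse(wide, tele, w=0.5):
    """Fuse stitched images of two focal lengths (claims 6/7) by a
    weighted average; `tele` is assumed resampled to `wide`'s grid."""
    return w * np.asarray(wide, float) + (1.0 - w) * np.asarray(tele, float)

# Example: two 4x8 frames stitched with a 2-column overlap give a 4x14 panorama.
a = np.full((4, 8), 100.0)
b = np.full((4, 8), 200.0)
panorama = stitch_pair(a, b, overlap=2)
fused = fuse(panorama, panorama * 0.5)
```

The linear seam ramp is the simplest feathering choice; it keeps the stitched width at `w_left + w_right - overlap` columns.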
8. The aircraft of any one of claims 1-7, characterized in that the video acquisition unit further comprises a gyroscope and a servo platform for keeping the shooting direction of the forward-looking unit horizontal and the shooting direction of the downward-looking unit vertical.
9. The aircraft of any one of claims 1-7, characterized in that the industrial computer further comprises a storage module for storing, in real time, the video images processed by the image processing module.
10. The aircraft of any one of claims 1-7, characterized in that the industrial computer further comprises a playback module for replaying the stored video images, and a video selection module for selecting thermal images or visible-light images for output to the display unit.
11. A method of assisting the safe flight of an aircraft, comprising the following steps:
detecting a brightness value, and deciding from the detection result whether to enter an ultra-low-illumination mode or an infrared mode;
when entering the ultra-low-illumination mode, acquiring several ultra-low-illumination images at a first focal length and several ultra-low-illumination images at a second focal length; stitching the ultra-low-illumination images of the same focal length to obtain a first-focal-length ultra-low-illumination stitched image and a second-focal-length ultra-low-illumination stitched image; fusing the first-focal-length and second-focal-length ultra-low-illumination stitched images to obtain a fused ultra-low-illumination image; sending the fused ultra-low-illumination image to the pilot and displaying it; then returning and ending;
when entering the infrared mode, acquiring several infrared images at a third focal length and several infrared images at a fourth focal length; stitching the infrared images of the same focal length to obtain a third-focal-length infrared stitched image and a fourth-focal-length infrared stitched image; fusing the third-focal-length and fourth-focal-length infrared stitched images to obtain a fused infrared image; sending the fused infrared image to the pilot and displaying it; then returning and ending.
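The brightness-driven branch of claim 11 (and the "first preset condition" of claim 14) amounts to a threshold dispatch followed by a stitch-then-fuse pipeline. A minimal sketch, assuming the preset condition is "measured illuminance is at or above a cutoff at which a starlight sensor still works" — the cutoff value, function names, and toy stitch/fuse plumbing are illustrative assumptions, not from the patent:

```python
LUX_CUTOFF = 0.05  # assumed: below this even an ultra-low-illumination sensor fails

def select_mode(lux):
    """Claim 14: first preset condition met -> ultra-low-illumination
    (starlight) mode, otherwise infrared (thermal) mode."""
    return "ultra_low_illumination" if lux >= LUX_CUTOFF else "infrared"

def process(lux, acquire, stitch, fuse_pair):
    """Claim 11 pipeline: acquire per focal length, stitch within each
    focal length, then fuse the two stitched results."""
    mode = select_mode(lux)
    frames = acquire(mode)                    # {focal_length_mm: [frames]}
    stitched = {f: stitch(imgs) for f, imgs in frames.items()}
    f_short, f_long = sorted(stitched)
    return mode, fuse_pair(stitched[f_short], stitched[f_long])

# Toy plumbing: frames are flat lists, stitch = concatenate, fuse = pairwise mean.
acquire = lambda mode: {8: [[1, 2], [3, 4]], 25: [[5, 6], [7, 8]]}
stitch = lambda imgs: [x for img in imgs for x in img]
fuse_pair = lambda a, b: [(x + y) / 2 for x, y in zip(a, b)]
mode, fused = process(0.2, acquire, stitch, fuse_pair)
```

Passing `acquire`, `stitch` and `fuse_pair` as callables keeps the branch logic identical for the starlight and thermal paths, which mirrors how claims 11–13 repeat the same pipeline for both modes.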
12. The method of assisting the safe flight of an aircraft according to claim 11, wherein
acquiring the first-focal-length and second-focal-length ultra-low-illumination images comprises: acquiring a first-focal-length ultra-low-illumination image and a second-focal-length ultra-low-illumination image in the forward-looking direction, and acquiring a first-focal-length ultra-low-illumination image and a second-focal-length ultra-low-illumination image in the downward-looking direction;
stitching the ultra-low-illumination images of the same focal length to obtain the first-focal-length and second-focal-length ultra-low-illumination stitched images comprises: stitching the ultra-low-illumination images of the same focal length in the forward-looking direction to obtain a first-focal-length ultra-low-illumination stitched image and a second-focal-length ultra-low-illumination stitched image for the forward-looking direction, and stitching the ultra-low-illumination images of the same focal length in the downward-looking direction to obtain a first-focal-length ultra-low-illumination stitched image and a second-focal-length ultra-low-illumination stitched image for the downward-looking direction;
fusing the first-focal-length and second-focal-length ultra-low-illumination stitched images to obtain the fused ultra-low-illumination image comprises: fusing the first-focal-length and second-focal-length ultra-low-illumination stitched images of the forward-looking direction to obtain a fused ultra-low-illumination image for the forward-looking direction, and fusing the first-focal-length and second-focal-length ultra-low-illumination stitched images of the downward-looking direction to obtain a fused ultra-low-illumination image for the downward-looking direction.
13. The method of assisting the safe flight of an aircraft according to claim 11, wherein
acquiring the third-focal-length and fourth-focal-length infrared images comprises: acquiring a third-focal-length infrared image and a fourth-focal-length infrared image in the forward-looking direction, and acquiring a third-focal-length infrared image and a fourth-focal-length infrared image in the downward-looking direction;
stitching the infrared images of the same focal length to obtain the third-focal-length and fourth-focal-length infrared stitched images comprises: stitching the infrared images of the same focal length in the forward-looking direction to obtain a third-focal-length infrared stitched image and a fourth-focal-length infrared stitched image for the forward-looking direction, and stitching the infrared images of the same focal length in the downward-looking direction to obtain a third-focal-length infrared stitched image and a fourth-focal-length infrared stitched image for the downward-looking direction;
fusing the third-focal-length and fourth-focal-length infrared stitched images to obtain the fused infrared image comprises: fusing the third-focal-length and fourth-focal-length infrared stitched images of the forward-looking direction to obtain a fused infrared image for the forward-looking direction, and fusing the third-focal-length and fourth-focal-length infrared stitched images of the downward-looking direction to obtain a fused infrared image for the downward-looking direction.
14. The method of assisting the safe flight of an aircraft according to claim 11, wherein detecting the brightness value and deciding from the detection result whether to enter the ultra-low-illumination mode or the infrared mode comprises: detecting the current brightness value; if the detection result satisfies a first preset condition, entering the ultra-low-illumination mode; otherwise entering the infrared mode.
15. The method of assisting the safe flight of an aircraft according to claim 11, further comprising: detecting whether the forward-looking shooting direction is horizontal and, when it is not, adjusting the orientation until it is horizontal.
16. A device for assisting the safe flight of an aircraft, characterized in that the device is installed on board the aircraft and acquires images while the aircraft takes off, flies and lands, the device comprising:
a mode selection module for detecting a brightness value and deciding from the detection result whether to enter an ultra-low-illumination mode or an infrared mode;
an ultra-low-illumination image acquisition module for acquiring, when the ultra-low-illumination mode is entered, several ultra-low-illumination images at a first focal length and several ultra-low-illumination images at a second focal length;
an ultra-low-illumination image stitching module for stitching the ultra-low-illumination images of the same focal length to obtain a first-focal-length ultra-low-illumination stitched image and a second-focal-length ultra-low-illumination stitched image;
an ultra-low-illumination image fusion module for fusing the first-focal-length and second-focal-length ultra-low-illumination stitched images to obtain a fused ultra-low-illumination image, sending the fused ultra-low-illumination image to the pilot, and displaying it;
an infrared image acquisition module for acquiring, when the infrared mode is entered, several infrared images at a third focal length and several infrared images at a fourth focal length;
an infrared image stitching module for stitching the infrared images of the same focal length to obtain a third-focal-length infrared stitched image and a fourth-focal-length infrared stitched image;
an infrared image fusion module for fusing the third-focal-length and fourth-focal-length infrared stitched images to obtain a fused infrared image, sending the fused infrared image to the pilot, and displaying it.
17. A method of improving takeoff and landing safety based on obstacle detection, characterized by comprising the following steps:
the capture equipment adjusts its own orientation so that the forward-looking shooting direction is horizontal;
detecting the current illumination value and judging whether the detection result satisfies the ultra-low-illumination processing conditions;
when the detection result satisfies the ultra-low-illumination processing conditions, acquiring forward-view and overhead-view images of the current position with an ultra-low-illumination camera, and obtaining an ultra-low-illumination forward-view image and an ultra-low-illumination overhead-view image after stitching and fusion; performing obstacle detection on the stitched and fused ultra-low-illumination forward-view and overhead-view images; when an obstacle is detected in the ultra-low-illumination forward-view image and/or the ultra-low-illumination overhead-view image, generating and sending an alarm; then returning and ending;
when the detection result does not satisfy the ultra-low-illumination processing conditions, acquiring forward-view and overhead-view images of the current position with an infrared thermal imaging camera, and obtaining an infrared thermal forward-view image and an infrared thermal overhead-view image after stitching and fusion; performing obstacle detection on the stitched and fused infrared thermal forward-view and overhead-view images; when an obstacle is detected in the infrared thermal forward-view image and/or the infrared thermal overhead-view image, generating and sending an alarm; then returning and ending.
18. The method of claim 17, wherein the capture equipment adjusting its own orientation so that the forward-looking shooting direction is horizontal comprises: the equipment detecting in real time whether the forward-looking shooting direction is horizontal and, when it is not, adjusting its own position until the forward-looking shooting direction is horizontal.
19. The method of claim 17, wherein acquiring the forward-view and overhead-view images of the current position with the ultra-low-illumination camera comprises:
acquiring several forward-view images and several overhead-view images with the ultra-low-illumination camera;
grouping the forward-view images by focal length and stitching each group of same-focal-length forward-view images into one image to obtain the first stitched forward-view image for that focal length;
fusing the first stitched forward-view images of the different focal lengths into one image, which serves as the ultra-low-illumination forward-view image;
grouping the overhead-view images by focal length and stitching each group of same-focal-length overhead-view images into one image to obtain the first stitched overhead-view image for that focal length;
fusing the first stitched overhead-view images of the different focal lengths into one image, which serves as the ultra-low-illumination overhead-view image.
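The group-by-focal-length step of claims 19 and 20 can be sketched with a plain dictionary. Assumed shape (not specified by the patent): each captured frame is tagged with its lens focal length, and the stitch and final fusion steps are reduced to runnable placeholders (concatenation and a per-pixel mean) just to show the control flow:

```python
from collections import defaultdict

def build_view(tagged_frames, stitch, fuse_all):
    """Claims 19/20: group frames by focal length, stitch each group into
    one image, then fuse the per-focal-length results into the final view."""
    groups = defaultdict(list)
    for focal_mm, frame in tagged_frames:
        groups[focal_mm].append(frame)
    # One stitched image per focal length, in increasing focal-length order.
    stitched = [stitch(frames) for _, frames in sorted(groups.items())]
    return fuse_all(stitched)

# Placeholder stitch/fuse so the flow runs end to end on toy 1-D "images".
stitch = lambda frames: sum(frames, [])                           # concatenate
fuse_all = lambda imgs: [sum(px) / len(px) for px in zip(*imgs)]  # per-pixel mean

frames = [(8, [10, 20]), (25, [30, 40]), (8, [50, 60]), (25, [70, 80])]
view = build_view(frames, stitch, fuse_all)
```

The same `build_view` serves both the starlight path (claim 19, first stitched images) and the thermal path (claim 20, second stitched images); only the camera feeding `tagged_frames` differs.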
20. The method of claim 17, wherein acquiring the forward-view and overhead-view images of the current position with the infrared thermal imaging camera comprises:
acquiring several forward-view images and several overhead-view images with the infrared thermal imaging camera;
grouping the forward-view images by focal length and stitching each group of same-focal-length forward-view images into one image to obtain the second stitched forward-view image for that focal length;
fusing the second stitched forward-view images of the different focal lengths into one image, which serves as the infrared thermal forward-view image;
grouping the overhead-view images by focal length and stitching each group of same-focal-length overhead-view images into one image to obtain the second stitched overhead-view image for that focal length;
fusing the second stitched overhead-view images of the different focal lengths into one image, which serves as the infrared thermal overhead-view image.
21. The method of claim 17, wherein the forward-view image is an image covering a first predetermined angular range in the forward-looking direction, and the overhead-view image is an image covering a second predetermined angular range in the downward-looking direction.
22. The method of claim 17, wherein performing obstacle detection on the ultra-low-illumination forward-view image and the ultra-low-illumination overhead-view image comprises:
judging whether image feature points exist in the ultra-low-illumination forward-view image and/or the ultra-low-illumination overhead-view image; if so, extracting the image feature points, drawing the obstacle image from the image feature points, marking the obstacle image in the corresponding ultra-low-illumination forward-view image and/or overhead-view image, and thereby determining that an obstacle exists; otherwise determining that no obstacle exists.
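Claim 22's test ("do feature points exist?") can be sketched with a toy local-contrast detector on a grayscale grid. This is a stand-in for a real corner detector such as Harris or ORB, under assumptions the patent does not state; the threshold value and function names are illustrative:

```python
def feature_points(img, threshold=50):
    """Collect pixels whose contrast against any 4-neighbour exceeds
    `threshold` — a toy stand-in for claim 22's feature extraction."""
    pts = []
    for y in range(1, len(img) - 1):
        for x in range(1, len(img[0]) - 1):
            c = img[y][x]
            neighbours = (img[y][x - 1], img[y][x + 1],
                          img[y - 1][x], img[y + 1][x])
            if max(abs(c - n) for n in neighbours) > threshold:
                pts.append((x, y))
    return pts

def detect_obstacle(img, threshold=50):
    """Claim 22: feature points present -> an obstacle exists; the point
    list is what the obstacle outline would be drawn and marked from."""
    pts = feature_points(img, threshold)
    return bool(pts), pts

flat = [[10] * 6 for _ in range(6)]   # featureless scene -> no obstacle
scene = [row[:] for row in flat]
scene[3][3] = 255                     # one high-contrast object pixel
```

A featureless frame yields an empty point list, matching the claim's "otherwise determining that no obstacle exists" branch.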
23. The method of claim 17, wherein generating and sending the alarm when an obstacle is detected in the ultra-low-illumination forward-view image and/or the ultra-low-illumination overhead-view image comprises: locating the obstacle by satellite positioning; generating the alarm from the position information of the obstacle and from the ultra-low-illumination forward-view image and/or overhead-view image in which the obstacle image is marked; and sending the alarm.
24. The method of claim 17, wherein acquiring the ultra-low-illumination forward-view and overhead-view images of the current position with the ultra-low-illumination camera when the detection result satisfies the ultra-low-illumination processing conditions comprises: when the detection result satisfies the ultra-low-illumination processing conditions, detecting the flight state; if the aircraft is in a takeoff, landing or flight state, acquiring the ultra-low-illumination forward-view and overhead-view images of the current position with the ultra-low-illumination camera.
25. The method of claim 17, wherein acquiring the infrared thermal forward-view and overhead-view images of the current position with the infrared thermal imaging camera when the detection result does not satisfy the ultra-low-illumination processing conditions comprises: when the detection result does not satisfy the ultra-low-illumination processing conditions, detecting the flight state; if the aircraft is in a takeoff, landing or flight state, acquiring the infrared thermal forward-view and overhead-view images of the current position with the infrared thermal imaging camera.
26. Equipment for improving takeoff and landing safety based on obstacle detection, characterized by comprising:
an orientation adjustment module for adjusting its own orientation so that the forward-looking shooting direction is horizontal;
an illumination detection module for detecting the current illumination value and judging whether the detection result satisfies the ultra-low-illumination processing conditions;
a starlight acquisition module for acquiring, when the detection result satisfies the ultra-low-illumination processing conditions, the ultra-low-illumination forward-view and overhead-view images of the current position with an ultra-low-illumination camera and performing stitching and fusion;
a starlight obstacle detection module for performing obstacle detection on the stitched and fused ultra-low-illumination forward-view and overhead-view images, and generating and sending an alarm when an obstacle is detected in the ultra-low-illumination forward-view image and/or overhead-view image;
an infrared acquisition module for acquiring, when the detection result does not satisfy the ultra-low-illumination processing conditions, the infrared thermal forward-view and overhead-view images of the current position with an infrared thermal imaging camera and performing stitching and fusion;
an infrared obstacle detection module for performing obstacle detection on the stitched and fused infrared thermal forward-view and overhead-view images, and generating and sending an alarm when an obstacle is detected in the infrared thermal forward-view image and/or overhead-view image.
27. A method of avoiding collisions with obstacles based on data synchronization, characterized by comprising:
the aircraft obtaining its current position information, assembling an environment information request from the position information, and sending the environment information request to a data center;
on receiving the environment information response returned by the data center, extracting the obstacle information from the response as historical obstacle information, and extracting the flight parameters from the response as historical flight parameters;
acquiring environment images according to the aircraft's flight state, obtaining current obstacle information from the environment images, marking the obstacle in the environment image according to the current obstacle information, the historical obstacle information and the historical flight parameters, and displaying the environment image with the obstacle marked;
when avoiding the obstacle, recording the flight parameters used during the avoidance as current flight parameters, and sending the current position information, the current flight parameters and the current obstacle information to the data center.
28. The method of claim 27, wherein the aircraft obtaining its current position information comprises: the aircraft obtaining longitude, latitude and altitude with a satellite positioning system, and taking the longitude, latitude and altitude as the current position information.
29. The method of claim 27, wherein extracting the obstacle information from the environment information response as historical obstacle information comprises: obtaining the position information, size information and movement attribute information of the obstacle from the environment information response, and taking the obstacle's position information, size information and movement attribute information as the historical obstacle information.
30. The method of claim 27, wherein acquiring environment images according to the aircraft's flight state comprises: obtaining the aircraft's flight state; if it is a takeoff, landing or flight state, acquiring forward-view and overhead-view images and taking the acquired images as the environment images.
31. The method of claim 27, wherein obtaining the current obstacle information from the environment images comprises: for the environment images acquired in the same direction, stitching the images of the same focal length into one image and fusing the stitched images of the different focal lengths in that direction into one fused image; extracting feature points from the fused image; when feature points are extracted, drawing the obstacle from the feature points and marking it; obtaining the position information of the obstacle with the satellite positioning system; calculating the size information and movement attribute information of the obstacle from the obstacle's position information and the image in which it is marked; and taking the obstacle's position information, size information and movement attribute information as the current obstacle information.
32. The method of claim 27, wherein marking the obstacle in the environment image according to the current obstacle information, the historical obstacle information and the historical flight parameters comprises: judging whether the current obstacle information is identical to the historical obstacle information; if so, marking the obstacle in the environment image and prompting the historical flight parameters; otherwise marking the obstacle in the environment image according to the current obstacle information.
33. The method of claim 27, further comprising: drawing the motion track of the obstacle on the image in which the obstacle is marked;
wherein drawing the motion track of the obstacle comprises:
locating the obstacle at preset time intervals; at each fix, obtaining the altitude, pitch angle and azimuth of the obstacle, and obtaining the displacement of the obstacle between two successive fixes;
calculating, from the obstacle's altitude, pitch angle, azimuth and displacement, the track along which the obstacle has moved as well as its direction of motion and speed;
simulating, from that track, direction of motion and speed, the track the obstacle will follow in three-dimensional space;
drawing the track along which the obstacle has moved and the track it will follow in three-dimensional space.
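The geometry in claim 33 — turning per-fix pitch angle, azimuth and inter-fix displacement into a velocity vector and an extrapolated future track — can be sketched as follows. The conventions are assumptions, not stated in the patent: azimuth measured clockwise from north, the pitch angle taken as the elevation of the displacement vector, and a straight-line extrapolation of the future track.

```python
import math

def velocity_enu(azimuth_deg, pitch_deg, displacement_m, dt_s):
    """Resolve the displacement between two fixes into an east/north/up
    velocity vector (claim 33's direction of motion and speed)."""
    az, el = math.radians(azimuth_deg), math.radians(pitch_deg)
    speed = displacement_m / dt_s
    horizontal = speed * math.cos(el)
    return (horizontal * math.sin(az),   # east component
            horizontal * math.cos(az),   # north component
            speed * math.sin(el))        # vertical component

def extrapolate(position, velocity, t_s):
    """Predict the obstacle's position after t_s seconds on a straight track."""
    return tuple(p + v * t_s for p, v in zip(position, velocity))

# An obstacle that moved 10 m due east (azimuth 90 deg, level) in 2 s:
v = velocity_enu(90.0, 0.0, 10.0, 2.0)
future = extrapolate((0.0, 0.0, 100.0), v, 3.0)
```

A real predictor would fit more than two fixes (e.g. a Kalman filter) rather than extrapolate a single displacement, but the decomposition into speed, horizontal heading and climb rate is the same.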
34. The method of claim 27, wherein acquiring the environment images comprises: detecting the current illumination value and selecting the corresponding image pickup mode from the detection result: when the detection result satisfies the ultra-low-illumination processing conditions, acquiring the environment images with an ultra-low-illumination camera; when it does not, acquiring the environment images with an infrared thermal imaging camera.
35. The method of claim 27, further comprising: the data center, on receiving the current position information, the current flight parameters and the current obstacle information, retrieving the flight parameters and obstacle information corresponding to the current position information, replacing the stored flight parameters with the current flight parameters, and replacing the stored obstacle information with the current obstacle information.
36. An aircraft that avoids collisions with obstacles based on data synchronization, characterized by comprising:
an information request module for obtaining the current position information, assembling an environment information request from the position information, and sending the environment information request to a data center;
an information receiving module for extracting, on receiving the environment information response returned by the data center, the obstacle information from the response as historical obstacle information and the flight parameters from the response as historical flight parameters;
an obstacle detection module for acquiring environment images according to the aircraft's flight state, obtaining current obstacle information from the environment images, marking the obstacle in the environment image according to the current obstacle information, the historical obstacle information and the historical flight parameters, and displaying the environment image with the obstacle marked;
a data synchronization module for recording, when the obstacle is avoided, the flight parameters used during the avoidance as current flight parameters, and sending the current position information, the current flight parameters and the current obstacle information to the data center.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710134868.4A CN106828952B (en) | 2016-07-14 | 2016-07-14 | A kind of method and device of assisting in flying device safe flight |
CN201610556749.3A CN106184787B (en) | 2016-07-14 | 2016-07-14 | Aircraft and its landing with DAS (Driver Assistant System) and the method avoided collision |
CN201710136148.1A CN106965945B (en) | 2016-07-14 | 2016-07-14 | A kind of method and aircraft for avoiding collision obstacle synchronous based on data |
CN201710136026.2A CN106965946B (en) | 2016-07-14 | 2016-07-14 | A kind of method and apparatus improving landing safety based on detection obstacle |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201610556749.3A CN106184787B (en) | 2016-07-14 | 2016-07-14 | Aircraft and its landing with DAS (Driver Assistant System) and the method avoided collision |
Related Child Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710136148.1A Division CN106965945B (en) | 2016-07-14 | 2016-07-14 | A kind of method and aircraft for avoiding collision obstacle synchronous based on data |
CN201710136026.2A Division CN106965946B (en) | 2016-07-14 | 2016-07-14 | A kind of method and apparatus improving landing safety based on detection obstacle |
CN201710134868.4A Division CN106828952B (en) | 2016-07-14 | 2016-07-14 | A kind of method and device of assisting in flying device safe flight |
Publications (2)
Publication Number | Publication Date |
---|---|
CN106184787A true CN106184787A (en) | 2016-12-07 |
CN106184787B CN106184787B (en) | 2018-10-26 |
Family
ID=57475481
Family Applications (4)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710136148.1A Active CN106965945B (en) | 2016-07-14 | 2016-07-14 | A kind of method and aircraft for avoiding collision obstacle synchronous based on data |
CN201710134868.4A Active CN106828952B (en) | 2016-07-14 | 2016-07-14 | A kind of method and device of assisting in flying device safe flight |
CN201610556749.3A Active CN106184787B (en) | 2016-07-14 | 2016-07-14 | Aircraft and its landing with DAS (Driver Assistant System) and the method avoided collision |
CN201710136026.2A Active CN106965946B (en) | 2016-07-14 | 2016-07-14 | A kind of method and apparatus improving landing safety based on detection obstacle |
Family Applications Before (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710136148.1A Active CN106965945B (en) | 2016-07-14 | 2016-07-14 | A kind of method and aircraft for avoiding collision obstacle synchronous based on data |
CN201710134868.4A Active CN106828952B (en) | 2016-07-14 | 2016-07-14 | A kind of method and device of assisting in flying device safe flight |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710136026.2A Active CN106965946B (en) | 2016-07-14 | 2016-07-14 | A kind of method and apparatus improving landing safety based on detection obstacle |
Country Status (1)
Country | Link |
---|---|
CN (4) | CN106965945B (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109963070A (en) * | 2017-12-26 | 2019-07-02 | 富泰华工业(深圳)有限公司 | Picture sewing method and system |
CN108225277A (en) * | 2018-03-09 | 2018-06-29 | 深圳臻迪信息技术有限公司 | Image acquiring method, vision positioning method, device, the unmanned plane of unmanned plane |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101604830A (en) * | 2009-07-14 | 2009-12-16 | 山东电力研究院 | Patrolling trolly wire route and shaft tower unmanned helicopter system and method thereof |
CN204264449U (en) * | 2014-11-13 | 2015-04-15 | 国家电网公司 | A kind of line walking unmanned plane with infrared thermal imaging and aerial photography function |
CN104539905A (en) * | 2015-01-06 | 2015-04-22 | 山东鲁能智能技术有限公司 | Visible light accurate detection system for electric unmanned helicopter |
CN204956947U (en) * | 2015-09-11 | 2016-01-13 | 周艺哲 | Can multidirectional model aeroplane and model ship of gathering real -time image |
CN205131695U (en) * | 2015-11-03 | 2016-04-06 | 天津艾思科尔科技有限公司 | Unmanned aerial vehicle with hot image device |
Family Cites Families (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6898331B2 (en) * | 2002-08-28 | 2005-05-24 | Bae Systems Aircraft Controls, Inc. | Image fusion system and method |
US7148861B2 (en) * | 2003-03-01 | 2006-12-12 | The Boeing Company | Systems and methods for providing enhanced vision imaging with decreased latency |
CN101064835A (en) * | 2007-04-29 | 2007-10-31 | 戴宏 | Night viewing apparatus for motor vehicle, vessel |
US8155806B2 (en) * | 2008-07-23 | 2012-04-10 | Honeywell International Inc. | Aircraft display systems and methods for enhanced display of landing information |
US8144937B2 (en) * | 2008-10-15 | 2012-03-27 | The Boeing Company | System and method for airport mapping database automatic change detection |
FR2992917A1 (en) * | 2012-07-05 | 2014-01-10 | Bouchaib Hoummady | Method for auditing and dynamic controlling of behavior of driver of e.g. land transport vehicle, involves performing image analysis to control certain qualities of control behaviors with respect to factor such as pedestrians on lane |
CN103224026B (en) * | 2012-12-05 | 2016-01-20 | 福建省电力有限公司 | A kind ofly be applicable to dedicated unmanned helicopter obstacle avoidance system that mountain area electrical network patrols and examines and workflow thereof |
CN103217855B (en) * | 2013-04-02 | 2015-07-15 | 金三立视频科技(深圳)有限公司 | Automatic focusing method of camera |
CN103679674B (en) * | 2013-11-29 | 2017-01-11 | 航天恒星科技有限公司 | Method and system for splicing images of unmanned aircrafts in real time |
CN204303178U (en) * | 2014-12-09 | 2015-04-29 | 西安航空电子科技有限公司 | A kind of airborne obstacle display based on video image and alarm device |
CN204846371U (en) * | 2015-05-22 | 2015-12-09 | 刘道满 | Can keep away unmanned aerial vehicle system of barrier |
CN105120240B (en) * | 2015-09-22 | 2018-08-28 | 成都时代星光科技有限公司 | The aerial high definition multidimensional of high power zoom unmanned plane investigates transmitting, monitoring device in real time |
CN105314122B (en) * | 2015-12-01 | 2017-08-15 | 浙江宇视科技有限公司 | A kind of unmanned plane collected evidence for emergency command and road occupying |
CN205263980U (en) * | 2015-12-09 | 2016-05-25 | 中国民用航空总局第二研究所 | Monitoring system of airport runway foreign matter |
CN105676861A (en) * | 2016-02-29 | 2016-06-15 | 北方民族大学 | Unmanned aerial vehicle-based straw burning monitoring system and measurement method |
2016
- 2016-07-14 CN CN201710136148.1A patent/CN106965945B/en active Active
- 2016-07-14 CN CN201710134868.4A patent/CN106828952B/en active Active
- 2016-07-14 CN CN201610556749.3A patent/CN106184787B/en active Active
- 2016-07-14 CN CN201710136026.2A patent/CN106965946B/en active Active
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2018133589A1 (en) * | 2017-01-17 | 2018-07-26 | 亿航智能设备(广州)有限公司 | Aerial photography method, device, and unmanned aerial vehicle |
CN109756685A (en) * | 2017-11-07 | 2019-05-14 | 科盾科技股份有限公司 | A kind of Vehicular night vision system based on image mosaic and image co-registration |
CN108318007A (en) * | 2018-01-26 | 2018-07-24 | 广州市红鹏直升机遥感科技有限公司 | A kind of image pickup method of spliced aviation oblique photograph |
CN110874921A (en) * | 2018-08-31 | 2020-03-10 | 百度在线网络技术(北京)有限公司 | Intelligent road side unit and information processing method thereof |
US11217091B2 (en) | 2018-08-31 | 2022-01-04 | Apollo Intelligent Driving Technology (Beijing) Co., Ltd. | Smart roadside unit and method for processing information by smart roadside unit |
CN109700439A (en) * | 2019-02-19 | 2019-05-03 | 酷黑科技(北京)有限公司 | A kind of data processing method, device and aircraft |
CN109700439B (en) * | 2019-02-19 | 2023-11-21 | 酷黑科技(北京)有限公司 | Data processing method and device and aircraft |
WO2021037286A1 (en) * | 2019-08-29 | 2021-03-04 | 深圳市道通智能航空技术有限公司 | Image processing method, apparatus, and device, and storage medium |
CN112884807A (en) * | 2021-01-18 | 2021-06-01 | 珠海翔翼航空技术有限公司 | Flight control action monitoring method and system based on infrared thermal imaging |
CN112884807B (en) * | 2021-01-18 | 2024-02-27 | 珠海翔翼航空技术有限公司 | Flight control action monitoring method and system based on infrared thermal imaging |
CN114927024A (en) * | 2022-05-18 | 2022-08-19 | 安胜(天津)飞行模拟系统有限公司 | Visual scene generation system, method and equipment of flight simulator |
Also Published As
Publication number | Publication date |
---|---|
CN106828952B (en) | 2019-03-15 |
CN106828952A (en) | 2017-06-13 |
CN106965946B (en) | 2019-06-18 |
CN106965945A (en) | 2017-07-21 |
CN106965945B (en) | 2019-09-03 |
CN106965946A (en) | 2017-07-21 |
CN106184787B (en) | 2018-10-26 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN106184787B (en) | Aircraft and its landing with DAS (Driver Assistant System) and the method avoided collision | |
CN104601953B (en) | A kind of video image fusion processing system | |
JP5055516B2 (en) | System and method for displaying device maintenance and operation instructions using augmented reality | |
CN107250728A (en) | The visually-perceptible that shown colour symbolism is represented is strengthened | |
CN105869340B (en) | A kind of fiery point monitoring system of exception based on unmanned plane and monitoring method | |
CN108496129A (en) | A kind of facility detection method and control device based on aircraft | |
CN106878687A (en) | A kind of vehicle environment identifying system and omni-directional visual module based on multisensor | |
CN104380369B (en) | Image display and method for displaying image | |
CN103400463B (en) | A kind of forest fires localization method based on two dimensional image and device | |
CN109562844A (en) | The assessment of automatic Landing topographical surface and relevant system and method | |
CN106204457A (en) | A kind of method for capture target and catching device | |
CN206611521U (en) | A kind of vehicle environment identifying system and omni-directional visual module based on multisensor | |
CN107728633A (en) | Obtain object positional information method and device, mobile device and its control method | |
CN106096207A (en) | A kind of rotor wing unmanned aerial vehicle wind resistance appraisal procedure based on multi-vision visual and system | |
CN106919186A (en) | Unmanned vehicle flight control operation method and device | |
CN108024070A (en) | The method and relevant display system of sensor image are covered on the composite image | |
CN105676861A (en) | Unmanned aerial vehicle-based straw burning monitoring system and measurement method | |
EP0399670A2 (en) | Airborne computer generated image display systems | |
CN111210464A (en) | System and method for alarming people falling into water based on convolutional neural network and image fusion | |
Schleiss et al. | VPAIR--Aerial Visual Place Recognition and Localization in Large-scale Outdoor Environments | |
CN108596914B (en) | Unmanned aerial vehicle coal inventory method | |
CN111176316A (en) | Unmanned aerial vehicle oblique photography flight system suitable for ancient building | |
CN103175526B (en) | A kind of high dynamically lower fixed star star image restoration methods | |
KR100940118B1 (en) | Aerial photographing system for making digital map | |
CN114463164A (en) | Stereo video fusion method for vehicle fleet |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
TA01 | Transfer of patent application right |
Effective date of registration: 20180906
Address after: 265200, No. 2 Yellow Sea Road 4, Laiyang Food Industry Park, Yantai, Shandong
Applicant after: KEDUN SCIENCE & TECHNOLOGY CO., LTD.
Address before: Room 3, Floor 3, Building 3, 29 Northeast Road, Haidian District, Beijing, 100193
Applicant before: Shield Polytron Technologies Inc Beijing branch
|
TA01 | Transfer of patent application right | ||
GR01 | Patent grant | ||
GR01 | Patent grant |