CN105120136A - Shooting device based on unmanned aerial vehicle and shooting processing method thereof - Google Patents


Info

Publication number
CN105120136A
CN105120136A (application CN201510553021.0A)
Authority
CN
China
Prior art keywords
shooting
unmanned aerial vehicle
flight
processing module
position parameter
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201510553021.0A
Other languages
Chinese (zh)
Inventor
杨珊珊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
High Domain (beijing) Intelligent Technology Research Institute Co Ltd
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Priority to CN201510553021.0A
Publication of CN105120136A
Legal status: Pending

Landscapes

  • Studio Devices (AREA)

Abstract

The invention relates to a shooting device based on an unmanned aerial vehicle and a shooting processing method of the shooting device. The device comprises the unmanned aerial vehicle (1) and a flight controller (2). The unmanned aerial vehicle (1) comprises a measuring module (3) and shooting equipment (5) arranged on a gimbal (4). The measuring module (3) measures environmental information of the unmanned aerial vehicle (1) and transmits it to the flight controller (2); the gimbal (4) is mounted on the unmanned aerial vehicle (1) and adjustably carries the shooting equipment (5). The flight controller (2) comprises a flight control module (6), a shooting control module (7) and a processing module (8). The shooting control module (7) controls the gimbal (4) to adjust a position parameter of the shooting equipment (5) and adjusts shooting parameters of the shooting equipment (5). The processing module (8) processes the shooting materials captured at each shooting moment based on the environmental information, the position parameter and/or the shooting parameters received at that moment.

Description

Shooting device based on an unmanned aerial vehicle and shooting processing method thereof
Technical field
The invention belongs to the field of aerial photography, and particularly relates to a shooting device based on an unmanned aerial vehicle (UAV) and a shooting processing method thereof.
Background technology
UAV aerial photography is a high-technology product that integrates single-chip microcontroller technology, sensor technology, GPS navigation, communication services, flight control, task control and programming, relying on dedicated hardware; its images offer high definition, large scale, small footprint and high timeliness. A UAV provides an easy-to-operate, easily relocated remote-sensing platform for aerial photography. Takeoff and landing are little restricted by site, since a playground, a highway or any other open ground can serve as a landing field; its stability and safety are good, and relocation is very convenient. Small size and portability, low noise and energy consumption, efficient maneuverability, image clarity, lightness, miniaturization and intelligence are outstanding features of UAV aerial photography. Using a multi-rotor UAV to monitor a specific area removes worries about personal safety and about the physical limits of a pilot; a UAV can perform the most dangerous tasks, for example real-time monitoring in harsh environments or in environments people cannot approach. Aerial photography with UAVs has therefore come into wide use.
On the one hand, in the professional aerial-photography field, a precisely piloted UAV can shoot any target flexibly from any angle, and is widely used on many occasions such as celebration scenes, film shooting and live relay of ball games. On the other hand, in the consumer field, ordinary users are also very interested in shooting targets from the air to obtain aerial photographs or films; in fact, social media carries a large number of aerial photographs and videos that ordinary users have shot themselves and shared with friends. With the progress of technology and the falling cost of equipment, the demand for aerial photography has expanded enormously, but current aerial-photography means remain very simple: most directly, a UAV carries imaging equipment and then shoots. Although many people continue to invest effort in improving flight control, flight performance, stabilization, image processing and so on, aerial photography ultimately serves the user; how to let users operate aerial-photography equipment more simply, and how to let users share their aerial results more easily, is where the demand lies and should be the focus of technical attention. The inventor has noticed a nuisance in current UAV aerial photography: the UAV shoots a target object continuously in the air, and a single flight often yields multiple photographs or videos, that is, multiple pictures and multiple video segments. When the user later organizes these photos or videos, a difficulty arises: once their number grows too large, the user can hardly tell when and where each was taken. Even though some shooting devices carry a time-stamp function, for content shot while moving there is no scheme that presents shooting time, shooting place and shot content well together; this fails to meet users' needs and makes the work of organizing and using the shot content troublesome.
Patent document CN204515536 discloses an autonomous cruise camera system based on a quadrotor, comprising: an aircraft, a ground-station supervisory system, a GPS locator, a real-time aerial-photography system and a data memory. The aircraft carries a flight controller whose input is connected to a command receiver and a data transceiver; the GPS locator is connected to the flight controller by signal wire; the real-time aerial-photography system, placed on the aircraft, comprises an aerial camera, a brushless gimbal and a wireless image transmitter, the aerial camera being mounted on the brushless gimbal; the brushless gimbal is installed on the aircraft, and the wireless image transmitter is connected with the aerial camera; the data memory is connected with the flight controller and stores data; the ground-station supervisory system comprises a wireless image receiver and a wireless data transceiver connected with a microcomputer, the wireless image receiver accepting the image data transmitted by the wireless image transmitter and the wireless data transceiver accepting the data sent by the on-board data transceiver. This system can locate and photograph a designated area in real time, thus monitoring the area continuously and effectively reducing manpower. However, it is complex and costly, cannot record the relevant information of the shooting moment, and cannot relieve the trouble of organizing and editing shooting materials by processing them together with the corresponding shooting-moment information.
Patent document CN201604796 discloses an intelligent aerial-photography UAV comprising a body with a cabin in its middle; the cabin connects four cantilevers, a propeller is mounted at the tail end of each cantilever, each propeller connects a power module, the power modules are connected with a flight control module, and the flight control module connects a navigation module and a real-time video return module. This UAV returns high-definition real-time video, letting the operator easily grasp the flight state of the aircraft from a monitor screen on the ground; but it cannot record the environmental information of the shooting moment or adjust the shooting parameters, and cannot relieve the trouble of organizing and editing shooting materials by processing them together with the corresponding shooting-moment information.
A traditional shooting device only shoots, stores and/or transmits, and cannot well retain the physical information associated with a shot. Some traditional shooting devices can preserve technical information related to the shooting result, such as aperture time, exposure intensity and light sensitivity. This field therefore urgently needs a way to select, edit and organize shooting results according to certain rules, for example by shooting time, shooting place or features of the shot subject, so that shooting materials worth sharing can be formed quickly and automatically; in other words, to turn a mass of shooting results into materials that reflect certain content along a time axis or an azimuth axis.
The above information disclosed in this background section is only for enhancing the understanding of the background of the present invention, and therefore may contain information that does not constitute prior art known to a person of ordinary skill in the art in this country.
Summary of the invention
The object of the invention is achieved by the following technical schemes.
According to a first aspect of the invention, a shooting device based on an unmanned aerial vehicle is disclosed, comprising the unmanned aerial vehicle and a flight controller.
The unmanned aerial vehicle comprises a measurement module and shooting equipment arranged on a gimbal.
The measurement module measures environmental information of the unmanned aerial vehicle and sends it to the flight controller.
The gimbal is mounted on the unmanned aerial vehicle and adjustably carries the shooting equipment mounted thereon.
The flight controller comprises a flight control module, a shooting control module and a processing module, wherein the flight control module controls the flight state of the unmanned aerial vehicle, and the shooting control module controls the gimbal to adjust the position parameter of the shooting equipment, adjusts the shooting parameters of the shooting equipment, and at each shooting moment sends the position parameter and shooting parameters of that moment to the processing module.
The processing module processes the corresponding shooting materials captured at each shooting moment based on the environmental information, position parameter and/or shooting parameters received at that moment.
Preferably, the measurement module consists of one or more selected from the group consisting of an electronic compass, an altimeter, a GPS unit, a light sensor, a humidity sensor and a temperature sensor, and the environmental information it measures is one or more selected from the group consisting of azimuth information, altitude information, position information, light-intensity information, humidity information and temperature information of the unmanned aerial vehicle.
Preferably, the gimbal comprises an adjusting mechanism for zoom operation, pitch motion and azimuth motion.
Preferably, the shooting equipment is a visible-light high-definition camera, a high-resolution camera, an infrared thermal imager or ultraviolet imaging equipment.
Preferably, the flight control module controls the flight state of the unmanned aerial vehicle based on flight information provided by a GIS module on the flight controller, and/or adjusts the flight state of the unmanned aerial vehicle based on flight instructions input by the user.
Preferably, the position parameter is the shooting angle of the shooting equipment, and the shooting parameters are one or more of aperture time, exposure intensity and light sensitivity.
Preferably, the processing module can sort the corresponding shooting materials, or call them up by a predetermined condition, according to the shooting moment, the environmental information, the position parameter or the shooting parameters.
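The patent specifies no data format or algorithm for this sorting and calling. As a minimal illustrative sketch only (all names, fields and the in-memory record layout are hypothetical assumptions, not the patent's implementation), each shooting material can be paired with the metadata recorded at its shooting moment and then sorted or retrieved by a condition:

```python
from dataclasses import dataclass, field

@dataclass
class ShotRecord:
    """One shooting material plus the metadata recorded at its shooting moment."""
    name: str                     # identifier of the photo/video material
    moment: float                 # shooting moment, e.g. seconds since takeoff
    environment: dict = field(default_factory=dict)   # e.g. {"altitude": 80.0}
    position_param: float = 0.0   # gimbal shooting angle, degrees
    shooting_params: dict = field(default_factory=dict)  # aperture time, ISO, ...

def sort_materials(records, attr):
    """Sort materials by one recorded dimension, e.g. 'moment' or 'position_param'."""
    return sorted(records, key=lambda r: getattr(r, attr))

def call_materials(records, condition):
    """Call up (retrieve) only the materials satisfying a predetermined condition."""
    return [r for r in records if condition(r)]
```

For example, `call_materials(shots, lambda s: s.environment["altitude"] > 50)` would retrieve the materials shot above 50 m.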
Preferably, the flight control module controls the unmanned aerial vehicle to hover at a predetermined position.
Preferably, the flight controller comprises a display module for showing the shooting materials, and/or the processing module comprises a memory unit for storage.
Preferably, the processing module processes the corresponding shooting materials based on any two or more of the environmental information, position parameter, shooting parameters and shooting moment received at each shooting moment.
Preferably, the processing module processes the corresponding shooting materials based on the combination of the environmental information and the position parameter received at each shooting moment.
Preferably, the processing module first processes the corresponding shooting materials based on any one of the environmental information, position parameter, shooting parameters and shooting moment received at each shooting moment, obtaining a first shooting material group; then processes the first group based on another of them, obtaining a second shooting material group; and then processes the second group based on either of the two remaining, obtaining a third shooting material group.
Preferably, the processing module first processes the corresponding shooting materials based on the environmental information received at each shooting moment, obtaining a first shooting material group; then processes the first group based on the position parameter, obtaining a second shooting material group; and then processes the second group based on the shooting moment, obtaining a third shooting material group.
Similarly, schemes in which the processing module first processes based on position information, then on environmental information, and finally on shooting parameters, and the like, also fall within the protection of the present invention.
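The staged narrowing described above (first, second and third shooting material groups) can be sketched as a generic filter cascade. This is an illustrative sketch under assumed data (plain dictionaries with hypothetical field names), not the patent's implementation:

```python
def cascade(materials, *stages):
    """Apply each stage filter in order; the output of stage k feeds stage k+1,
    yielding the first, second, third ... shooting material groups."""
    groups = []
    current = list(materials)
    for keep in stages:
        current = [m for m in current if keep(m)]
        groups.append(list(current))
    return groups

# Example: narrow by environmental information, then position parameter, then moment.
materials = [
    {"name": "p1", "light": 900, "angle": 30, "t": 5},
    {"name": "p2", "light": 200, "angle": 30, "t": 9},
    {"name": "p3", "light": 950, "angle": 60, "t": 2},
    {"name": "p4", "light": 880, "angle": 30, "t": 1},
]
first, second, third = cascade(
    materials,
    lambda m: m["light"] > 500,   # environmental information (light intensity)
    lambda m: m["angle"] == 30,   # position parameter (shooting angle)
    lambda m: m["t"] < 3,         # shooting moment
)
```

Swapping the order of the lambdas gives the other orderings the text contemplates.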
According to a second aspect of the invention, a shooting processing method using the above shooting device based on an unmanned aerial vehicle is disclosed, comprising the following steps.
In the first step, the flight control module controls the flight state of the unmanned aerial vehicle to meet the required shooting conditions, and the measurement module measures the environmental information of the unmanned aerial vehicle and sends it to the flight controller.
In the second step, the shooting control module controls the gimbal to adjust the position parameter of the shooting equipment, adjusts the shooting parameters of the shooting equipment, and at each shooting moment sends the position parameter and shooting parameters of that moment to the processing module.
In the third step, the processing module processes the corresponding shooting materials based on the environmental information, position parameter and shooting parameters received at each shooting moment.
The scheme proposed by the present invention can map shot content to its shooting time, shooting place and other shooting conditions, and can also automatically generate the required shooting materials from the shooting time, shooting place and other shooting-condition information of the shot content.
Accompanying drawing explanation
Fig. 1 is a structural schematic diagram of a shooting device based on an unmanned aerial vehicle according to an embodiment of the invention.
Fig. 2 is a structural schematic diagram of a shooting device based on an unmanned aerial vehicle according to another embodiment of the invention.
Fig. 3 is a schematic diagram of the steps of a shooting processing method using the shooting device based on an unmanned aerial vehicle according to an embodiment of the invention.
The present invention is further explained below in conjunction with the drawings and embodiments.
Embodiment
The following detailed description is merely exemplary in nature and is not intended to limit application and use, nor is it bound by any expressed or implied theory presented in the preceding technical field, background, brief summary or the following detailed description. As used herein, the term "module" or "unit" refers to any hardware, software, firmware, electronic control component, processing logic and/or processor device, individually or in any combination, including without limitation: an application-specific integrated circuit (ASIC), an electronic circuit, a processor (shared, dedicated or grouped) with memory executing one or more software or firmware programs, a combinational logic circuit, and/or other suitable components providing the described functionality. In addition, unless expressly stated to the contrary, the word "comprise" and its variants should be understood to include the stated components without excluding any others. "Processing" in the present invention includes but is not limited to selection, sorting, editing and combination of shooting materials.
This embodiment describes a shooting device based on an unmanned aerial vehicle. As shown in the schematic diagram of Fig. 1, the shooting device comprises an unmanned aerial vehicle 1 and a flight controller 2.
In the art, the unmanned aerial vehicle 1 refers to an unmanned aircraft that is automatically controlled and capable of autonomous navigation; it can be a multi-rotor UAV. The flight controller 2 is the equipment by which the UAV 1 automatically controls its configuration, flight attitude and motion parameters during flight, and through which the user controls the flight process of the UAV 1.
The unmanned aerial vehicle 1 comprises a measurement module 3 and shooting equipment 5 arranged on a gimbal 4. The measurement module 3 measures the environmental information of the UAV 1 and sends it to the flight controller 2. The measurement module 3 configured in the UAV 1 can consist of one or more selected from the group consisting of an electronic compass, an altimeter, a GPS unit, a light sensor, a humidity sensor and a temperature sensor, and the environmental information it measures is one or more of azimuth information, altitude information, position information, light-intensity information, humidity information and temperature information of the UAV.
In one embodiment, the measurement module 3 can determine position information from the GPS unit, and can further determine it from the preset coordinate positions provided by the GPS unit and the GIS module 9 on the flight controller 2; the measurement module 3 measures environmental information such as light intensity, humidity, temperature and altitude with the light sensor, humidity sensor, temperature sensor, altimeter and so on.
The gimbal 4 carries the shooting equipment 5 and can provide position adjustment and stabilization; the shooting equipment 5 is generally a motion camera that can complete shooting tasks effectively under various conditions. The gimbal 4 is an omnidirectional mount that can rotate both left-right and up-down. It can be an electric gimbal suited to wide-range shooting, whose high-speed attitude changes are realized by two drive motors that accept signals from the flight controller 2 to position themselves accurately.
The gimbal 4 is mounted on the UAV 1 and adjustably carries the shooting equipment 5 mounted thereon. In one embodiment, the gimbal 4 comprises an adjusting mechanism for zoom operation, pitch motion and azimuth motion.
In one embodiment, the shooting equipment 5 is a visible-light high-definition camera, a high-resolution camera, an infrared thermal imager or ultraviolet imaging equipment.
Described flight controller 2 comprises flight control modules 6, shooting control module 7 and processing module 8.
The flight control module 6 controls the flight state of the UAV 1, adjusting it according to the shooting task; the flight state includes the UAV's landing point, hover points, waypoints, track points and return point, and the flying height and speed along the flight path. In one embodiment, the flight control module 6 controls the UAV 1 to hover at a predetermined position. The user sends flight instructions through the flight control module 6 to make the UAV 1 take off and complete its mission in a predetermined airspace, normally within visual range: it either completes a specified flight path, hovers at the selected shooting spot, and then shoots, or completes the shooting during flight along a pre-planned path.
The shooting control module 7 controls the gimbal 4 to adjust the position parameter of the shooting equipment 5, adjusts the shooting parameters of the shooting equipment 5, and at each shooting moment sends the position parameter and shooting parameters of that moment to the processing module 8.
In one embodiment, the position parameter is the shooting angle of the shooting equipment 5, and the shooting parameters are one or more of aperture time, exposure intensity and light sensitivity.
The processing module 8 processes the corresponding shooting materials captured at each shooting moment based on the environmental information, position parameter and/or shooting parameters received at that moment.
In one embodiment, the processing module 8 can sort the corresponding shooting materials, or call them up by a predetermined condition, according to the shooting moment, the environmental information, the position parameter or the shooting parameters.
In one embodiment, a user completes four consecutive flights with the UAV shooting device, each lasting 20 minutes and each yielding 30 shooting materials. Within a single 20-minute flight the course of the UAV is often irregular; in other words, the UAV almost never flies a uniform circuit from start to finish while shooting, so picking out, from all the materials, those shot at distinct geographic locations takes time, all the more so when the content is plentiful. Under the present scheme, each shooting material has the position information measured by the measurement module 3 attached to it, so the processing module 8 classifies the materials by position and calls up one material per position for the user. Simply put, suppose the user shoots in a place like a World Park, where a different landscape appears every 20 to 30 meters: screening by geographic position effectively yields an essentially non-repeating set for display, without tedious previewing and selecting by the user. Moreover, it no longer matters how erratically the user flies the UAV while shooting: for example, shooting at point A for a while, then at point B, then returning to A, then back to B again. Even if such materials are hard to tell apart on a timeline, they can be effectively divided by position information.
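The position-based division of materials described above could be sketched as grouping shots into coarse geographic cells on the order of the 20–30 m landscape spacing mentioned. This is a minimal sketch under stated assumptions (a flat-earth approximation of about 111 km per degree of latitude, hypothetical dictionary fields), not the patent's algorithm:

```python
def grid_cell(lat, lon, cell_m=25.0):
    """Map a GPS fix to a coarse grid cell of roughly cell_m metres.
    Assumes ~111,000 m per degree (crude, but enough for a sketch)."""
    deg = cell_m / 111_000.0
    return (round(lat / deg), round(lon / deg))

def group_by_position(shots, cell_m=25.0):
    """Classify shooting materials by the position recorded at the shooting moment."""
    groups = {}
    for s in shots:
        groups.setdefault(grid_cell(s["lat"], s["lon"], cell_m), []).append(s)
    return groups

def one_per_position(shots, cell_m=25.0):
    """Call up one material per location, regardless of shooting order."""
    return [group[0] for group in group_by_position(shots, cell_m).values()]
```

An A-B-A-B flight thus collapses to one representative shot per location, exactly the effect the text describes.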
In another embodiment, on a similar principle, as long as the shooting materials are combined with the corresponding shooting moment, environmental information, position parameter or shooting parameters, the processing module 8 can process them into valuable display content. Based on environmental information, it might pick out the content shot when sunshine was strongest and outdoor lighting best; based on chromaticity analysis of the illumination, it might pick out the materials shot at the seaside and on the green respectively; based on temperature, humidity or altitude, it can likewise produce targeted shooting materials for display, and such displays are often very valuable. For example, based on the readings of the altimeter, all materials shot within the same altitude range can be screened out and presented to the user together; because the shooting conditions are close, the user readily forms an association, and the collective display of these materials can produce an unexpected exhibition effect.
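The environment-driven screening just described could look like the following minimal sketch (hypothetical field names; the patent specifies no concrete scheme): one helper screens materials whose recorded altitude falls in a band, another picks the shot taken under the strongest recorded illumination.

```python
def same_altitude_band(shots, low_m, high_m):
    """Select materials whose altitude recorded at the shooting moment
    falls within [low_m, high_m] metres, for collective display."""
    return [s for s in shots if low_m <= s["altitude"] <= high_m]

def strongest_light(shots):
    """Pick the material shot under the strongest recorded light intensity."""
    return max(shots, key=lambda s: s["light"])
```

The same pattern extends directly to temperature or humidity bands.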
In one embodiment, the processing module 8 can also mark the shooting moment, environmental information, position parameter or shooting parameters on the shooting materials; the marked information can be intuitive time or position information, or information combined with the materials according to the relevant parameters. The processing module 8 can also use templates to generate various combinations of shooting materials, increasing the interest and effect of their display.
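Such marking could be as simple as composing a caption from the metadata recorded at the shooting moment. A minimal sketch, with hypothetical field names (the patent does not fix a marking format):

```python
def mark_material(shot, fields=("moment", "position", "altitude")):
    """Build a caption marking the chosen metadata onto a shooting material,
    skipping fields the record does not carry."""
    tags = [f"{k}={shot[k]}" for k in fields if k in shot]
    return f'{shot["name"]} [{", ".join(tags)}]'
```

A real device might instead burn the tags into the image or write them as EXIF-style sidecar metadata; the string form is just the simplest illustration.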
Fig. 2 shows the structural schematic diagram of a shooting device based on an unmanned aerial vehicle according to another embodiment of the invention; it comprises an unmanned aerial vehicle 1 and a flight controller 2.
The unmanned aerial vehicle 1 comprises a measurement module 3 and shooting equipment 5 arranged on a gimbal 4.
The measurement module 3 measures the environmental information of the UAV 1 and sends it to the flight controller 2.
The gimbal 4 is mounted on the UAV 1 and adjustably carries the shooting equipment 5 mounted thereon.
Described flight controller 2 comprises flight control modules 6, shooting control module 7 and processing module 8.
The flight control module 6 controls the flight state of the UAV 1 based on the flight information provided by the GIS module 9 on the flight controller 2, and/or adjusts the flight state of the UAV 1 based on flight instructions input by the user. The GIS module 9 inputs and outputs the flight information of the UAV 1, displays flight information on the GIS map, provides operations such as planning, selection and box selection of flight information, and is used to browse, edit and delete the corresponding flight information on the GIS map.
In one embodiment, the GIS module 9 comprises a GIS information import-export unit, an information display unit and an information maintenance unit. The import-export unit imports and exports the flight information of the UAV 1 on the GIS map, in formats including but not limited to text, XML, CSV, EXCEL, WORD and PDF; the information display unit displays flight information on the GIS map; the information maintenance unit browses, edits and deletes the corresponding flight information on the GIS map. The GIS module 9 provides geographic information queries for flight-route planning, including GPS information, altitude information of places and routes, and building, road and river information.
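CSV is one of the import-export formats named above. As a minimal sketch of such a round trip (the waypoint fields `lat`, `lon`, `alt` are illustrative assumptions; the patent defines no schema):

```python
import csv
import io

def export_flight_info(waypoints, stream):
    """Write flight-path waypoints to a CSV stream, one row per waypoint."""
    writer = csv.writer(stream)
    writer.writerow(["lat", "lon", "alt"])
    for p in waypoints:
        writer.writerow([p["lat"], p["lon"], p["alt"]])

def import_flight_info(stream):
    """Read waypoints back from a CSV stream produced by export_flight_info."""
    return [{"lat": float(r["lat"]), "lon": float(r["lon"]), "alt": float(r["alt"])}
            for r in csv.DictReader(stream)]
```

The other named formats (XML, PDF, ...) would follow the same export/import pairing with different serializers.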
The shooting control module 7 controls the gimbal 4 to adjust the position parameter of the shooting equipment 5, adjusts the shooting parameters of the shooting equipment 5, and at each shooting moment sends the position parameter and shooting parameters of that moment to the processing module 8.
The processing module 8 processes the corresponding shooting materials captured at each shooting moment based on the environmental information, position parameter and/or shooting parameters received at that moment.
The flight controller 2 comprises a display module 10 for displaying the shooting material, and/or the processing module comprises a memory unit 11 for storage.
The processing module 8 may compile, organize or analyze the sensor data in the storage format of the memory in order to perform statistical analysis on the data. The processing module 8 may comprise a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA), analog circuits, digital circuits, combinations thereof, or other processors known now or developed later. The memory unit 11 may be volatile or non-volatile memory, and may comprise one or more of read-only memory (ROM), random-access memory (RAM), flash memory, electrically erasable programmable read-only memory (EEPROM) or other types of memory.
In one embodiment, the processing module 8 processes the corresponding shooting material based on any two or more of the environmental information, the position parameter, the shooting parameters and the shooting moment received at each shooting moment.
In one embodiment, the processing module 8 processes the corresponding shooting material based on the combination of the environmental information and the position parameter received at each shooting moment.
In one embodiment, the processing module 8 first processes the corresponding shooting material based on any one of the environmental information, the position parameter, the shooting parameters and the shooting moment received at each shooting moment to obtain a first shooting material group; the processing module 8 then processes the first shooting material group based on another one of the environmental information, the position parameter, the shooting parameters and the shooting moment to obtain a second shooting material group; and the processing module 8 processes the second shooting material group based on either of the two remaining items to obtain a third shooting material group.
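A cascade of this kind, in which each stage consumes the shooting material group produced by the previous stage, can be sketched as follows; the concrete stage functions and material fields (`altitude_m`, `gimbal_angle_deg`, `shot_at`) are illustrative assumptions, not taken from the patent.

```python
def cascade_process(materials, stages):
    """Run each processing stage on the group produced by the
    previous one, yielding the first, second and third shooting
    material groups of the cascaded embodiments."""
    group = list(materials)
    for stage in stages:
        group = stage(group)
    return group

# Illustrative stages (one per parameter kind; fields are assumptions):
def by_environment(group):
    # first group: keep material shot above a threshold altitude
    return [m for m in group if m["altitude_m"] > 50]

def by_position(group):
    # second group: order by the gimbal's shooting angle
    return sorted(group, key=lambda m: m["gimbal_angle_deg"])

def by_moment(group):
    # third group: keep the three earliest shooting moments
    return sorted(group, key=lambda m: m["shot_at"])[:3]
```

Passing the stages in a different order to `cascade_process` reproduces the "any one, then another, then a remaining one" variants of the embodiment.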
In one embodiment, the processing module 8 processes the corresponding shooting material based on the environmental information received at each shooting moment to obtain a first shooting material group; the processing module 8 then processes the first shooting material group based on the position parameter to obtain a second shooting material group; and the processing module 8 processes the second shooting material group based on the shooting moment to obtain a third shooting material group.

Figure 3 shows a photographing processing method according to an embodiment of the invention that uses the shooting device based on an unmanned aerial vehicle; it comprises the following steps.
In a first step S1, the flight control module 6 controls the flight state of the unmanned aerial vehicle 1 to satisfy the required shooting conditions, and the measurement module 3 measures the environmental information of the unmanned aerial vehicle 1 and sends it to the flight controller 2.
In a second step S2, the shooting control module 7 controls the gimbal 4 to adjust the position parameter of the shooting equipment 5, adjusts the shooting parameters of the shooting equipment 5, and, at each shooting moment, sends the position parameter and shooting parameters at that moment to the processing module 8.
In a third step S3, the processing module 8 processes the corresponding shooting material based on the environmental information, the position parameter and/or the shooting parameters received at each shooting moment.
In this method, suppose for example that the unmanned aerial vehicle 1 completes four flight shoots without interruption, each flight lasting 20 minutes and each producing 30 shooting materials. The user wishes to share a combination of the shooting materials promptly, but it is inconvenient to edit the large number of stored materials again. The processing module 8 therefore screens and automatically combines the shooting materials according to the time parameter, generating a valuable shooting result. For example, take the time of the first shot as the starting point and the time of the last shot as the end point, and regard this span as the total shooting period. Within this period, six time points are selected at equal intervals: if the starting point is point No. 1 and the end point is point No. 8, the six intermediate points are points No. 2 through No. 7, eight points in all. For each of the six intermediate time points, the shooting material closest in time to that point is selected. The six selected materials are combined into one group, each annotated with its shooting time. This shooting material group can be displayed directly and replays the general course of the user's outing that day. Because this selection is both regular and somewhat random, it reflects the day's outing in an interesting, convenient and quick way. Alternatively, under the same conditions, further content for display and sharing can be produced automatically simply by making suitable use of the parameters corresponding to these shooting materials. For example, one photo may be selected from each of the four flight shoots as the material to display; it suffices to pick, from each flight shoot, the shooting material at the median time point of that flight. Because the timing of the four flight shoots depends on the characteristics of the tourist attractions and is usually not evenly distributed along the time axis, the result of this selection mode generally differs considerably from the previous example, yet it too reflects the user's outing from a different perspective and creates value that can be demonstrated and shared directly.
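The equal-interval selection described above (point No. 1 at the first shot, point No. 8 at the last, six intermediate target points, nearest material to each) can be sketched as follows; representing a shooting material as a dict with a `shot_at` timestamp is an assumption for illustration.

```python
from datetime import datetime, timedelta

def select_evenly_spaced(materials, n_points=6):
    """For each of n_points equally spaced times strictly between
    the first and last shooting moments (points No. 2..7 when the
    endpoints are No. 1 and No. 8), pick the shooting material
    whose shooting moment is closest to that time."""
    times = [m["shot_at"] for m in materials]
    start, end = min(times), max(times)
    # 8 points bound 7 equal intervals over the total shooting period
    step = (end - start) / (n_points + 1)
    targets = [start + step * (i + 1) for i in range(n_points)]
    return [min(materials, key=lambda m: abs(m["shot_at"] - t))
            for t in targets]
```

Applied to the example of four 20-minute flights with 30 materials each, the function returns the six materials to be combined, in chronological order, ready to be annotated with their shooting times.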
For example, in actual operation, depending on when the shooting materials were actually generated, it may be necessary to select the shooting material closest to a calculated time point. The processing module 8 screens the shooting materials using the shooting moment as a parameter.
In one embodiment, the described shooting moment can be shooting absolute time, the all right corresponding temporal information producing lunar calendar division of day and night when being necessary, the relative time can also be time of starting with this work of taking photo by plane being starting point, such as the 5th minute after this work of taking photo by plane starts, also can be the relative time parameter of taking photo by plane continuously in process with single, be such as the 7th photo etc. of certain ten continuous shooting shooting action.
Although embodiments of the present invention have been described above with reference to the accompanying drawings, the present invention is not limited to the above specific embodiments and application fields; the above specific embodiments are merely illustrative and instructive, not restrictive. Those of ordinary skill in the art, under the teaching of this specification and without departing from the scope protected by the claims of the present invention, may devise many further forms, all of which fall within the protection of the present invention.

Claims (10)

1. A shooting device based on an unmanned aerial vehicle, comprising an unmanned aerial vehicle (1) and a flight controller (2), wherein
the unmanned aerial vehicle (1) comprises a measurement module (3) and shooting equipment (5) arranged on a gimbal (4), wherein
the measurement module (3) measures environmental information of the unmanned aerial vehicle (1) and sends it to the flight controller (2);
the gimbal (4) is arranged on the unmanned aerial vehicle (1) and adjustably mounts the shooting equipment (5) thereon;
the flight controller (2) comprises a flight control module (6), a shooting control module (7) and a processing module (8), wherein
the flight control module (6) controls the flight state of the unmanned aerial vehicle (1); the shooting control module (7) controls the gimbal (4) to adjust the position parameter of the shooting equipment (5), adjusts the shooting parameters of the shooting equipment (5), and, at each shooting moment, sends the position parameter and shooting parameters at that moment to the processing module (8);
the processing module (8) processes the corresponding shooting material captured at each shooting moment based on the environmental information, the position parameter and/or the shooting parameters received at that shooting moment.
2. The shooting device based on an unmanned aerial vehicle according to claim 1, characterized in that: the measurement module (3) consists of one or more selected from the group consisting of an electronic compass, an altimeter, a GPS unit, a light sensor, a humidity sensor and a temperature sensor, and the environmental information measured by the measurement module (3) is one or more selected from the group consisting of azimuth information, altitude information, position information, light intensity information, humidity information and temperature information of the unmanned aerial vehicle.
3. The shooting device based on an unmanned aerial vehicle according to claim 1, characterized in that:
the processing module (8) processes the corresponding shooting material based on any two or more of the environmental information, the position parameter, the shooting parameters and the shooting moment received at each shooting moment.
4. The shooting device based on an unmanned aerial vehicle according to claim 3, characterized in that:
the processing module (8) processes the corresponding shooting material based on the combination of the environmental information and the position parameter received at each shooting moment.
5. The shooting device based on an unmanned aerial vehicle according to claim 1, characterized in that: the flight control module (6) controls the flight state of the unmanned aerial vehicle (1) based on flight information provided by a GIS module (9) on the flight controller (2) and/or adjusts the flight state of the unmanned aerial vehicle (1) based on flight instructions input by a user; and the flight controller (2) comprises a display module (10) for displaying the shooting material and/or the processing module comprises a memory unit (11) for storage.
6. The shooting device based on an unmanned aerial vehicle according to claim 1, characterized in that: the position parameter is the shooting angle of the shooting equipment (5), and the shooting parameters are one or more of shutter time, exposure intensity and light sensitivity.
7. The shooting device based on an unmanned aerial vehicle according to claim 1, characterized in that: the processing module (8) can sort or retrieve the corresponding shooting material according to a predetermined condition based on the shooting moment, the environmental information, the position parameter or the shooting parameters.
8. The shooting device based on an unmanned aerial vehicle according to claim 1, characterized in that:
the processing module (8) first processes the corresponding shooting material based on any one of the environmental information, the position parameter, the shooting parameters and the shooting moment received at each shooting moment to obtain a first shooting material group; the processing module (8) then processes the first shooting material group based on another one of the environmental information, the position parameter, the shooting parameters and the shooting moment to obtain a second shooting material group; and the processing module (8) processes the second shooting material group based on either of the two remaining items to obtain a third shooting material group.
9. The shooting device based on an unmanned aerial vehicle according to claim 1, characterized in that: the processing module (8) processes the corresponding shooting material based on the environmental information received at each shooting moment to obtain a first shooting material group; the processing module (8) then processes the first shooting material group based on the position parameter to obtain a second shooting material group; and the processing module (8) processes the second shooting material group based on the shooting moment to obtain a third shooting material group.
10. A photographing processing method using the shooting device based on an unmanned aerial vehicle according to any one of claims 1-9, comprising the following steps:
in a first step (S1), the flight control module (6) controls the flight state of the unmanned aerial vehicle (1) to satisfy the required shooting conditions, and the measurement module (3) measures the environmental information of the unmanned aerial vehicle (1) and sends it to the flight controller (2);
in a second step (S2), the shooting control module (7) controls the gimbal (4) to adjust the position parameter of the shooting equipment (5), adjusts the shooting parameters of the shooting equipment (5), and, at each shooting moment, sends the position parameter and shooting parameters at that moment to the processing module (8);
in a third step (S3), the processing module (8) processes the corresponding shooting material based on the environmental information, the position parameter and/or the shooting parameters received at each shooting moment.
CN201510553021.0A 2015-09-01 2015-09-01 Shooting device based on unmanned aerial vehicle and shooting processing method thereof Pending CN105120136A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510553021.0A CN105120136A (en) 2015-09-01 2015-09-01 Shooting device based on unmanned aerial vehicle and shooting processing method thereof


Publications (1)

Publication Number Publication Date
CN105120136A true CN105120136A (en) 2015-12-02

Family

ID=54668013

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510553021.0A Pending CN105120136A (en) 2015-09-01 2015-09-01 Shooting device based on unmanned aerial vehicle and shooting processing method thereof

Country Status (1)

Country Link
CN (1) CN105120136A (en)


Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003110981A (en) * 2001-09-28 2003-04-11 Hiroboo Kk Aerial video image processing system and wireless small- sized unmanned flying body
CN101540800A (en) * 2008-03-19 2009-09-23 索尼爱立信移动通信日本株式会社 Mobile terminal device and computer program
CN201604796U (en) * 2010-03-23 2010-10-13 贵阳帝三数字技术有限公司 Intelligent aerial photography unmanned aerial vehicle
CN201774596U (en) * 2010-09-16 2011-03-23 天津三星光电子有限公司 Digital camera having function of photograph sort management
CN102314515A (en) * 2011-09-22 2012-01-11 宇龙计算机通信科技(深圳)有限公司 Photo classification method and device
CN102436738A (en) * 2011-09-26 2012-05-02 同济大学 Traffic monitoring device based on unmanned aerial vehicle (UAV)
US20120133778A1 (en) * 2010-11-30 2012-05-31 Industrial Technology Research Institute Tracking system and method for image object region and computer program product thereof
CN102880871A (en) * 2012-02-17 2013-01-16 邰锋 System and method for classifying images according to time axis and places
CN103188431A (en) * 2011-12-27 2013-07-03 鸿富锦精密工业(深圳)有限公司 System and method for controlling unmanned aerial vehicle to conduct image acquisition
CN103763473A (en) * 2014-01-23 2014-04-30 徐鹏 Control device for adjusting parameters of aerial camera in real time
CN103886189A (en) * 2014-03-07 2014-06-25 国家电网公司 Patrolling result data processing system and method used for unmanned aerial vehicle patrolling
CN103941745A (en) * 2014-03-07 2014-07-23 国家电网公司 Movable substation and working method for unmanned aerial vehicle electric transmission line inspection
CN104331509A (en) * 2014-11-21 2015-02-04 深圳市中兴移动通信有限公司 Picture managing method and device
CN104679873A (en) * 2015-03-09 2015-06-03 深圳市道通智能航空技术有限公司 Aircraft tracing method and aircraft tracing system
CN204462869U (en) * 2015-04-08 2015-07-08 王全超 Unmanned plane zoom The Cloud Terrace
CN104792313A (en) * 2015-03-31 2015-07-22 深圳一电科技有限公司 Surveying and mapping control method, device and system of unmanned reconnaissance system


Cited By (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105373629A (en) * 2015-12-17 2016-03-02 谭圆圆 Unmanned aerial vehicle-based flight condition data processing device and method
WO2017107075A1 (en) * 2015-12-22 2017-06-29 SZ DJI Technology Co., Ltd. System, method, and mobile platform for supporting bracketing imaging
US11336837B2 (en) 2015-12-22 2022-05-17 SZ DJI Technology Co., Ltd. System, method, and mobile platform for supporting bracketing imaging
CN105468789A (en) * 2015-12-30 2016-04-06 谭圆圆 Image processing apparatus based on photographing of unmanned aerial vehicle and image processing method thereof
WO2017173734A1 (en) * 2016-04-06 2017-10-12 高鹏 Method and device for adjusting photographic angle and unmanned aerial vehicle
US11197058B2 (en) 2016-05-13 2021-12-07 SZ DJI Technology Co., Ltd. System and method for presenting a video via transcode
WO2017193374A1 (en) * 2016-05-13 2017-11-16 SZ DJI Technology Co., Ltd. System and method for presenting a video via transcode
CN106060469A (en) * 2016-06-23 2016-10-26 杨珊珊 Image processing system based on photographing of unmanned aerial vehicle and image processing method thereof
CN113148163A (en) * 2016-09-22 2021-07-23 深圳市大疆创新科技有限公司 Function control method and device based on aircraft
CN106534775A (en) * 2016-11-03 2017-03-22 北京小米移动软件有限公司 Video editing method and device as well as video shooting method and device
CN106791404A (en) * 2016-12-26 2017-05-31 上海青橙实业有限公司 Unmanned plane and its photographic method
CN108496349A (en) * 2017-04-22 2018-09-04 深圳市大疆灵眸科技有限公司 Shooting control method and device
CN108496349B (en) * 2017-04-22 2022-05-13 深圳市大疆灵眸科技有限公司 Shooting control method and device
CN108965689A (en) * 2017-05-27 2018-12-07 昊翔电能运动科技(昆山)有限公司 Unmanned plane image pickup method and device, unmanned plane and ground control unit
CN109871027A (en) * 2017-12-05 2019-06-11 深圳市九天创新科技有限责任公司 A kind of oblique photograph method and system
CN109871027B (en) * 2017-12-05 2022-07-01 深圳市九天创新科技有限责任公司 Oblique photography method and system
CN110278717A (en) * 2018-01-22 2019-09-24 深圳市大疆创新科技有限公司 Control the method and apparatus of aircraft flight
CN113467499A (en) * 2018-05-30 2021-10-01 深圳市大疆创新科技有限公司 Flight control method and aircraft
WO2021036947A1 (en) * 2019-08-23 2021-03-04 深圳市道通智能航空技术有限公司 Auxiliary focusing method and apparatus, and unmanned aerial vehicle
CN112106341A (en) * 2019-08-30 2020-12-18 深圳市大疆创新科技有限公司 Shooting method and device and shooting equipment


Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20190903

Address after: 100020 Beijing City West Road No. 12 Chaoyang District Building No. 2 (national advertising Industrial Park incubator No. 25978)

Applicant after: High domain (Beijing) Intelligent Technology Research Institute Co., Ltd.

Address before: 100052 Beijing City, Xicheng District Caishikou Street No. 2 CITIC Qinyuan 3-3-701

Applicant before: Yang Shanshan

TA01 Transfer of patent application right
RJ01 Rejection of invention patent application after publication

Application publication date: 20151202

RJ01 Rejection of invention patent application after publication