CN109436355B - Method and device for controlling exposure, vision measuring equipment and unmanned aerial vehicle - Google Patents

Method and device for controlling exposure, vision measuring equipment and unmanned aerial vehicle

Info

Publication number
CN109436355B
Authority
CN
China
Prior art keywords
exposure
propeller
period
time
exposure period
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201811214117.4A
Other languages
Chinese (zh)
Other versions
CN109436355A (en)
Inventor
梁宇恒
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Xaircraft Technology Co Ltd
Original Assignee
Guangzhou Xaircraft Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Xaircraft Technology Co Ltd filed Critical Guangzhou Xaircraft Technology Co Ltd
Priority to CN201811214117.4A priority Critical patent/CN109436355B/en
Publication of CN109436355A publication Critical patent/CN109436355A/en
Application granted granted Critical
Publication of CN109436355B publication Critical patent/CN109436355B/en

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B64: AIRCRAFT; AVIATION; COSMONAUTICS
    • B64D: EQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENT OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
    • B64D47/00: Equipment not otherwise provided for
    • B64D47/08: Arrangements of cameras
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B64: AIRCRAFT; AVIATION; COSMONAUTICS
    • B64C: AEROPLANES; HELICOPTERS
    • B64C27/00: Rotorcraft; Rotors peculiar thereto
    • B64C27/04: Helicopters
    • B64C27/08: Helicopters with two or more rotors
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B64: AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U: UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U10/00: Type of UAV
    • B64U10/10: Rotorcrafts
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60: Control of cameras or camera modules
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B64: AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U: UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2101/00: UAVs specially adapted for particular uses or applications
    • B64U2101/30: UAVs specially adapted for particular uses or applications for imaging, photography or videography

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Mechanical Engineering (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Remote Sensing (AREA)
  • Studio Devices (AREA)
  • Exposure Control For Cameras (AREA)

Abstract

Embodiments of the present application provide a method for controlling exposure of a vision measuring device, a processing apparatus, a vision measuring device, an unmanned aerial vehicle and a machine-readable storage medium, belonging to the field of unmanned aerial vehicles. The method comprises the following steps: acquiring the position of a propeller of the unmanned aerial vehicle while the propeller rotates; and controlling the camera device to expose according to at least the position, so that the camera device is exposed while the degree to which its field of view is blocked by the propeller is within an acceptable range. Through this technical scheme, the exposure timing of the camera device of the unmanned aerial vehicle can be adapted to the position of the propeller, reducing the influence of the propeller on the camera device and improving the imaging quality of the camera device.

Description

Method and device for controlling exposure, vision measuring equipment and unmanned aerial vehicle
Technical Field
The present application relates to the field of unmanned aerial vehicles, and in particular to a method for controlling exposure of a vision measuring device, a processing apparatus, a vision measuring device, an unmanned aerial vehicle and a machine-readable storage medium.
Background
Unmanned aerial vehicles are now used ever more widely, in fields such as aerial photography, agriculture, plant protection, express transportation and disaster rescue. To provide richer functions, many unmanned aerial vehicles are fitted with a camera device for visual measurement; however, the propeller and other structures of the unmanned aerial vehicle may block the field of view of the camera device, affecting its viewing angle and imaging quality.
Disclosure of Invention
An object of the embodiments of the present application is to provide a method for controlling exposure of a vision measuring device, a processing apparatus, a vision measuring device, an unmanned aerial vehicle and a machine-readable storage medium, so as to at least solve the problem of the propeller of the unmanned aerial vehicle blocking the camera device while the unmanned aerial vehicle is in operation.
To achieve the above object, a first aspect of the present application provides a method for controlling exposure of a vision measuring device applied to an unmanned aerial vehicle, the vision measuring device including a camera device, the method comprising: acquiring the position of a propeller of the unmanned aerial vehicle while the propeller rotates; and controlling the camera device to expose according to at least the position, so that the camera device is exposed while the degree to which its field of view is blocked by the propeller is within an acceptable range.
Optionally, the method further includes acquiring the rotational speed of the propeller while it rotates, and the controlling the camera device to expose according to at least the position includes: determining the position of the propeller within a subsequent predetermined time period from the current position and the rotational speed of the propeller; determining an exposure period from the position of the propeller within the predetermined time period, wherein the degree to which the field of view is blocked by the propeller during the exposure period is within the acceptable range; and controlling the camera device to expose during the exposure period.
Optionally, the controlling the camera device to expose during the exposure period includes: controlling the camera device to start exposure at the start of the exposure period.
Optionally, the method further comprises: acquiring a rotational speed control instruction for controlling the rotational speed of the propeller, wherein the position of the propeller within the predetermined time period is further determined according to the rotational speed control instruction.
Optionally, the method further comprises: if no exposure period exists within the predetermined time period, reducing the rotational speed of the propeller to re-determine the exposure period.
Optionally, the propeller comprises a first propeller and a second propeller, and the determining an exposure period from the position of the propeller within the predetermined time period comprises: determining a first exposure period from the position of the first propeller within the predetermined time period, wherein the degree to which the field of view is blocked by the first propeller during the first exposure period is within a first acceptable range; determining a second exposure period from the position of the second propeller within the predetermined time period, wherein the degree to which the field of view is blocked by the second propeller during the second exposure period is within a second acceptable range; and determining a desired exposure period from the first exposure period and the second exposure period, wherein the desired exposure period lies within both the first exposure period and the second exposure period and is not shorter than the exposure time of the camera device. The controlling the camera device to expose during the exposure period includes: controlling the camera device to expose during the desired exposure period.
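Selecting a desired exposure period that lies inside both per-propeller exposure periods amounts to an interval intersection. A minimal sketch, with illustrative function name and interval representation (not taken from the patent):

```python
def desired_exposure_period(first_period, second_period, exposure_time):
    """Intersect the exposure periods of the first and second propellers.

    Each period is a (start, end) pair in seconds.  The desired exposure
    period must lie inside both periods and be no shorter than the camera
    device's exposure time; otherwise None is returned.
    """
    start = max(first_period[0], second_period[0])
    end = min(first_period[1], second_period[1])
    if end - start >= exposure_time:
        return (start, end)
    return None

# The camera device would then be controlled to start exposing at the
# start time of the returned period.
print(desired_exposure_period((0.0, 0.5), (0.2, 0.8), 0.1))   # (0.2, 0.5)
print(desired_exposure_period((0.0, 0.3), (0.25, 0.8), 0.1))  # None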
Optionally, the controlling the camera device to expose during the desired exposure period includes: controlling the camera device to start exposure at the start time of the desired exposure period.
Optionally, the method further comprises: if no desired exposure period exists within the predetermined time period, reducing the rotational speed of the first propeller and/or the second propeller to re-determine the desired exposure period.
Optionally, the camera device comprises a plurality of camera devices exposed simultaneously, and the field of view of the camera device is the superimposed field of view of the plurality of simultaneously exposed camera devices.
Optionally, the method further comprises: if no exposure period or desired exposure period exists within the predetermined time period, reducing the rotational speed of the propeller to re-determine the exposure period or desired exposure period.
Optionally, the camera device comprises a first camera device and a second camera device, and the propeller comprises a first propeller and a second propeller; the determining an exposure period from the position of the propeller within the predetermined time period comprises: determining a first exposure period from the position of the first propeller within the predetermined time period, wherein the degree to which the field of view of the first camera device is blocked by the first propeller during the first exposure period is within a first acceptable range; determining a second exposure period from the position of the second propeller within the predetermined time period, wherein the degree to which the field of view of the second camera device is blocked by the second propeller during the second exposure period is within a second acceptable range; and determining a desired exposure period from the first exposure period and the second exposure period, wherein the desired exposure period lies within both the first exposure period and the second exposure period and is not shorter than the exposure times of the first camera device and the second camera device. The controlling the camera device to expose during the exposure period includes: controlling the first camera device and the second camera device to expose during the desired exposure period.
Optionally, the controlling the first and second camera devices to expose during the desired exposure period comprises: controlling the first camera device and the second camera device to start exposure simultaneously at the start time of the desired exposure period.
Optionally, in the absence of the desired exposure period within the predetermined time period, the method further comprises: determining an exposure timing for the first and second camera devices from the first and second exposure periods, wherein a first portion and a second portion of the exposure time at that timing overlap respectively with the first exposure period and the second exposure period, the two periods overlapping or being continuous in time; and controlling the first camera device and the second camera device to expose at the exposure timing.
Optionally, the first portion is half of the exposure time.
Optionally, the method further comprises: after the previous exposure, starting the determination of the desired exposure period at a time point that precedes, by no less than the first portion, the earliest of the start and end times of the first exposure period and of the second exposure period.
Optionally, in the absence of the desired exposure period within the predetermined time period, the method further comprises: determining, from the first exposure period and the second exposure period, a first exposure period and a second exposure period that are continuous in time; and controlling the first camera device and the second camera device to expose, respectively, during the end portion of the temporally earlier period and the start portion of the temporally later period.
Optionally, the exposure times of the first and second camera devices for the end-portion and start-portion exposures are continuous in time.
Optionally, the method further comprises: after the previous exposure, starting the determination of the desired exposure period at a time point that precedes, by no less than the exposure time, the earliest of the start and end times of the first exposure period and of the second exposure period.
Optionally, the method further comprises: in the case of exposure priority, if no desired exposure period exists within the predetermined time period, reducing the rotational speed of the first propeller and/or the second propeller to re-determine the desired exposure period.
In a second aspect of the present application, a processing apparatus is provided that is configured to perform the above-described method for controlling exposure of a vision measurement device.
In a third aspect of the present application, there is provided a vision measuring apparatus applied to an unmanned aerial vehicle, the vision measuring apparatus including: a camera device; a position detection device configured to detect a position of at least one propeller of the unmanned aerial vehicle; and the processing apparatus described above.
In a fourth aspect of the application, an unmanned aerial vehicle is provided, comprising a vision measuring device according to the above.
In a fifth aspect of the present application, a machine-readable storage medium is provided having instructions stored thereon which, when executed by a processor, cause the processor to perform the above-described method for controlling exposure of a vision measuring device.
Through the above technical solutions, the exposure timing of the camera device of the unmanned aerial vehicle can be adapted to the position of the propeller, reducing the influence of the propeller on the camera device and improving the imaging quality of the camera device.
Additional features and advantages of embodiments of the present application will be described in detail in the detailed description which follows.
Drawings
The accompanying drawings, which are included to provide a further understanding of the embodiments of the disclosure and are incorporated in and constitute a part of this specification, illustrate embodiments of the disclosure and together with the detailed description serve to explain the embodiments of the disclosure, but are not intended to limit the embodiments of the disclosure. In the drawings:
FIG. 1 illustrates a schematic view of an unmanned aerial vehicle, which may be a quad-rotor drone, provided by an embodiment of the present application;
FIG. 2 illustrates a right side view of an unmanned aerial vehicle, which may be a single-rotor helicopter, provided by an embodiment of the present application;
FIG. 3 illustrates a top view of an unmanned aerial vehicle, which may be a single-rotor helicopter, provided by an embodiment of the present application;
FIG. 4 illustrates a flow chart of a method for controlling exposure of a vision measuring device provided by an embodiment of the present application;
FIG. 5 illustrates a transverse histogram representing the propeller position provided by an embodiment of the present application;
FIG. 6 illustrates a flow chart of a method for controlling exposure of a vision measuring device provided by an alternative embodiment of the present application;
FIG. 7 illustrates a top view of an unmanned aerial vehicle, which may be a single-rotor helicopter, provided by an alternative embodiment of the present application;
FIG. 8 illustrates a schematic view of an unmanned aerial vehicle, which may be a quad-rotor drone, provided by an alternative embodiment of the present application;
FIG. 9 is a schematic diagram illustrating an area of the field of view of the camera device of the quad-rotor drone of FIG. 8 that is occluded by the propeller;
FIG. 10 illustrates a right side view of an unmanned aerial vehicle, which may be a twin-bladed helicopter, provided by an alternative embodiment of the present application;
FIG. 11 illustrates a top view of an unmanned aerial vehicle, which may be a twin-bladed helicopter, provided by an embodiment of the present application;
FIG. 12 illustrates a transverse histogram representing the blockage of the field of view of the camera device by the propeller, provided by an alternative embodiment of the present application;
FIG. 13 is a schematic view illustrating an area of the field of view of the camera device of a quad-rotor drone that is blocked by the propeller, provided by an alternative embodiment of the present application;
FIG. 14 illustrates a top view of an unmanned aerial vehicle, which may be a twin-bladed helicopter, provided by an embodiment of the present application;
FIG. 15 illustrates a schematic view of an unmanned aerial vehicle, which may be a quad-rotor drone, provided by an alternative embodiment of the present application;
FIG. 16 illustrates a transverse histogram representing the blockage of the field of view of the camera device by the propeller, provided by an alternative embodiment of the present application;
FIG. 17 illustrates a flow chart of a method for controlling exposure of a vision measuring device provided by an alternative embodiment of the present application;
FIG. 18 illustrates a transverse histogram representing the blockage of the field of view of the camera device by the propeller, provided by an alternative embodiment of the present application;
FIG. 19 illustrates a flow chart of a method for controlling exposure of a vision measuring device provided by an alternative embodiment of the present application; and
FIG. 20 illustrates a transverse histogram representing the blockage of the field of view of the camera device by the propeller, provided by an alternative embodiment of the present application.
Description of the reference numerals
1 fuselage
2 horn
3 camera device
4 propeller
5 motor
21 first arm
22 second arm
31 first camera device
32 second camera device
41 first propeller
42 second propeller
51 first motor
52 second motor
Detailed Description
The following detailed description of embodiments of the present application will be made with reference to the accompanying drawings. It should be understood that the detailed description and specific examples, while indicating embodiments of the application, are given by way of illustration and explanation only, not limitation.
In the embodiments of the present application, unless otherwise specified, directional terms such as "upper", "lower", "top" and "bottom" are generally used with respect to the orientation shown in the drawings or to the positional relationship of the components in the vertical or gravitational direction.
In addition, where "first", "second", etc. appear in the embodiments of the present application, they are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features referred to; a feature qualified by "first" or "second" may thus explicitly or implicitly include at least one such feature. Furthermore, the technical solutions of the various embodiments may be combined with each other, provided the combination can be realized by a person skilled in the art; where technical solutions are contradictory or cannot be realized, the combination should be considered not to exist and falls outside the protection scope of the present application.
FIG. 1 illustrates a schematic view of an unmanned aerial vehicle, which may be a quad-rotor drone, provided by an embodiment of the present application. FIGS. 2 and 3 are schematic views of an unmanned aerial vehicle, which may be a single-rotor helicopter, provided by an embodiment of the present application. As shown in FIGS. 1 to 3, in an embodiment of the present application, an unmanned aerial vehicle, which may be, for example, a quad-rotor drone or a single-rotor helicopter, may include a fuselage 1, a horn 2, a propeller 4, and a motor 5 (not shown in FIGS. 2 and 3) for driving the propeller. The unmanned aerial vehicle may further include a vision measuring device, which may include a camera device 3, a position detection device (not shown) and a processing device (not shown). The position detection device may be mounted on the motor 5 and may be used to detect (e.g., in real time) the position of the propeller 4 of the unmanned aerial vehicle while the propeller 4 rotates. The processing device may be mounted within the fuselage 1. The position detection device may include, for example but without limitation, an encoder, an electro-optic trigger or an electronic speed controller. The camera device 3 may include a camera, a video camera or the like. The processing device may include, but is not limited to, a general-purpose processor, a special-purpose processor, a conventional processor, a digital signal processor (DSP), a plurality of microprocessors, one or more microprocessors in association with a DSP core, a controller, a microcontroller, application-specific integrated circuits (ASICs), field-programmable gate array (FPGA) circuits, any other type of integrated circuit (IC), a state machine, and the like.
When the unmanned aerial vehicle operates, the motor 5 drives the propeller 4 to rotate so that the vehicle flies, while the camera device 3 acquires image data through continuous exposure. As shown in FIGS. 1 to 3, the propeller 4 may block the field of view of the camera device 3 while rotating; to reduce this influence of the propeller 4 on the camera device 3, the present embodiment provides a method for controlling the exposure of the vision measuring device, described below through embodiments.
As shown in fig. 4, the present application provides a method for controlling exposure of a vision measuring apparatus, which may include the following steps.
Step S11, acquiring the position of the propeller of the unmanned aerial vehicle while the propeller rotates; and
Step S12, controlling the camera device to expose according to the position of the propeller, so that the camera device is exposed while the degree to which its field of view is blocked by the propeller is within an acceptable range.
Specifically, as shown in FIG. 5, the propeller may block the camera device while rotating, but during each rotation there are time periods in which the position of the propeller does not block the camera device. The degree to which the field of view of the camera device is blocked when the propeller is at different positions can therefore be determined in advance from the positional relationship between the propeller and the camera device, and the positions of the propeller can then be divided into non-blocked positions, where the blocking degree is within an acceptable range, and blocked positions, where it is not. Thus, once the position of the rotating propeller is acquired, whether the degree to which the propeller blocks the field of view of the camera device is within the acceptable range can be determined from that position, and the camera device can be exposed while the blocking degree is within the acceptable range. For example, as shown in FIG. 5, the camera device may be controlled to expose during period A or period B in FIG. 5, i.e., period A or period B serves as the exposure time of the camera device. The position of the propeller can be determined, for example, by a position detection device mounted on the motor of the propeller.
It should be noted that the case where the degree to which the field of view of the camera device is blocked by the propeller is within the acceptable range may include the case where the propeller does not block the field of view at all and the case where the blocked degree is less than a preset threshold. In one case, the positions at which the propeller does not block the field of view at all may be classified as non-blocked positions, where the blocking degree is within the acceptable range, and the remaining positions as blocked positions, where it is not. For example, as shown in FIG. 3, when the propeller is in the X region the camera device is blocked, so the X region can be set as a blocked position; when the propeller is in the Y region the camera device is not blocked, so the Y region can be set as a non-blocked position. In another case, the positions at which the propeller blocks the field of view to a degree less than a preset threshold (for example, less than 5% of the field of view is blocked) may be classified as non-blocked positions, and the remaining positions as blocked positions.
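The position classification described above can be sketched as follows; the blade arcs and the 5% threshold are illustrative assumptions, not values from the patent:

```python
# Arcs of blade angle (degrees) in which the propeller intrudes into the
# camera device's field of view, mapped out in advance from the geometry
# of the propeller and the camera device.  Illustrative values only.
OCCLUDED_ARCS = [(0.0, 40.0), (180.0, 220.0)]  # e.g. a two-bladed propeller

def occlusion_fraction(angle_deg):
    """Toy model: fraction of the field of view blocked at this blade angle
    (1.0 inside an occluded arc, 0.0 outside)."""
    a = angle_deg % 360.0
    return 1.0 if any(lo <= a < hi for lo, hi in OCCLUDED_ARCS) else 0.0

def is_acceptable(angle_deg, threshold=0.05):
    """A position is a non-blocked position when the blocked fraction of
    the field of view is below the preset threshold (here 5%)."""
    return occlusion_fraction(angle_deg) < threshold

# Exposure is permitted only while the blade sits at an acceptable position.
print(is_acceptable(90.0))   # True: blade in the Y (unblocked) region
print(is_acceptable(200.0))  # False: blade in the X (blocked) region
```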
Therefore, according to the technical scheme of the embodiment, the exposure time of the camera device can be adapted to the position of the propeller, so that the influence of the propeller on the camera device can be reduced, and the imaging quality of the camera device is improved.
In an alternative embodiment of the present application, as shown in fig. 6, a method for controlling exposure of a vision measuring device includes the steps of:
Step S21, acquiring the position and the rotational speed of the propeller of the unmanned aerial vehicle while the propeller rotates.
Step S22, determining the position of the propeller within a subsequent predetermined time period from the current position and rotational speed of the propeller.
Step S23, determining an exposure period from the position of the propeller within the predetermined time period, wherein the degree to which the field of view of the camera device is blocked by the propeller during the exposure period is within an acceptable range.
Step S24, controlling the camera device to expose during the exposure period.
Specifically, the unmanned aerial vehicle may also determine the rotational speed of the propeller through the position detection device mounted on the motor, and may then predict the position of the propeller within a subsequent predetermined time period from the current position and rotational speed, so that the exposure period of the camera device can be determined from the predicted positions. It will be understood that the distribution of blocked and non-blocked positions within the predetermined time period can be determined from the current position of the propeller, and the duration for which the propeller stays at each can be determined from its rotational speed. For example, a continuous time period within the predetermined time period, during which the degree to which the field of view of the camera device is blocked by the propeller is within the acceptable range, may be selected as the exposure period; this continuous time period may, for example, be longer than the exposure time of the camera device, so that the camera device can complete the exposure process within the exposure period. Preferably, the camera device may be controlled to start exposure at the start of the exposure period, to ensure that the propeller has minimal effect on the field of view during exposure. The predetermined time period may be determined, for example, according to the exposure frequency required by the camera device.
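Steps S22 and S23 can be sketched as a scan of the predicted blade angle over the predetermined time period. The occlusion model, time step and numbers below are illustrative assumptions, not values from the patent:

```python
def find_exposure_period(angle0_deg, speed_deg_per_s, horizon_s, exposure_s,
                         is_occluded, dt=1e-4):
    """Predict the blade angle over the next `horizon_s` seconds from its
    current angle and rotational speed, and return the start time of the
    first contiguous unoccluded stretch at least `exposure_s` long, or
    None if no such exposure period exists in the horizon."""
    window_start = None
    steps = int(horizon_s / dt)
    for i in range(steps + 1):
        t = i * dt
        angle = angle0_deg + speed_deg_per_s * t  # predicted position (S22)
        if not is_occluded(angle):
            if window_start is None:
                window_start = t
            if t - window_start >= exposure_s:
                return window_start  # exposure starts at the window start
        else:
            window_start = None
    return None

# Toy occlusion model: the blade blocks the view for angles 0-40 and 180-220.
def is_occluded(angle_deg):
    a = angle_deg % 360.0
    return 0.0 <= a < 40.0 or 180.0 <= a < 220.0

# One revolution per second starting at angle 0: the first unoccluded
# stretch begins once the blade passes 40 degrees (t of about 0.111 s).
print(find_exposure_period(0.0, 360.0, 1.0, 0.1, is_occluded))
```

A real implementation would work from the position detection device's measurements rather than a fixed-step simulation, but the window search is the same.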
In this way, by acquiring the rotational speed of the propeller, its position at later times can be predicted from its current position, so that the camera device can be controlled to expose in a more suitable time period.
Further, the method for controlling exposure of the vision measuring device may also include obtaining a rotational speed control instruction for controlling the rotational speed of the propeller. Since this instruction may affect the rotational speed of the propeller within the predetermined time period, the position of the propeller within the subsequent predetermined time period can be determined from its current position, its rotational speed and the rotational speed control instruction, allowing the position within the predetermined time period to be determined more accurately.
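Folding the commanded speed change into the prediction can be sketched as follows, assuming, purely for illustration, that the rotational speed control instruction is modeled as a constant angular acceleration:

```python
def predict_angle(angle0_deg, speed_deg_per_s, commanded_accel_deg_per_s2, t_s):
    """Predict the blade angle t_s seconds ahead.  Without the speed control
    instruction the prediction would use only the current speed; folding in
    the commanded angular acceleration refines the estimate."""
    return (angle0_deg
            + speed_deg_per_s * t_s
            + 0.5 * commanded_accel_deg_per_s2 * t_s ** 2) % 360.0

# Constant speed: a quarter revolution after 0.25 s at 360 deg/s.
print(predict_angle(0.0, 360.0, 0.0, 0.25))   # 90.0
# A commanded spin-up shifts the predicted position further ahead.
print(predict_angle(0.0, 360.0, 720.0, 0.5))  # 270.0
```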
The camera device of the above unmanned aerial vehicle is not limited to a single camera device; it may comprise a plurality of camera devices that must be exposed simultaneously. For example, as shown in FIG. 7, the unmanned aerial vehicle includes a first camera device 31 and a second camera device 32 that need to be exposed simultaneously, and the propeller 4 blocks both the first camera device 31 and the second camera device 32 while rotating. In this case, the first camera device 31 and the second camera device 32 can be regarded as one overall camera device whose field of view is the superimposition of their two fields of view: in FIG. 7, when the propeller 4 rotates into the X region the field of view of the overall camera device is blocked, and when it rotates into the Y region it is not. Since the two camera devices can thus be regarded as one overall camera device, the methods for controlling exposure of the vision measuring device shown in FIGS. 4 and 6 also apply.
In an alternative embodiment of the invention, if it is determined from the current position, the rotational speed and/or the rotational speed control command of the propeller that no exposure period exists in the subsequent predetermined time period, the rotational speed of the propeller may be reduced, for example by a rotational speed control command, and the exposure period re-determined after the reduction. In particular, at night or in a dark scene, the image pickup apparatus often requires a sufficiently long exposure time to achieve acceptable exposure quality. When the propeller rotates too fast, every interval during which the occlusion of the field of view of the image pickup device by the propeller remains within the acceptable range may be shorter than the required exposure time, so that no qualifying exposure period exists within the predetermined time period. In that case, if the camera device determines that exposure is required, for example according to a user instruction, the rotational speed of the propeller may be reduced to a predetermined rotational speed value by the rotational speed control command, and an exposure period in the subsequent predetermined time period may then be determined according to the current position, rotational speed and/or rotational speed control command of the propeller. The predetermined rotational speed value may be set in advance according to the exposure time required by the image pickup device. Alternatively, the rotational speed of the propeller may be reduced by a predetermined value each time, after which it is determined, according to the current position, rotational speed and/or rotational speed control command of the propeller, whether an exposure period exists in the subsequent predetermined time period.
If an exposure period exists, the camera device is controlled to expose within it; if not, the rotational speed of the propeller is reduced by the predetermined value again until an exposure period exists in the subsequent predetermined time period.
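The stepwise speed-reduction loop described above can be sketched as follows. This is a hypothetical Python illustration: `find_period`, `get_rpm`, and `set_rpm` are assumed callbacks standing in for the exposure-period search and the motor speed interface, neither of which is specified by the patent.

```python
def ensure_exposure_period(find_period, get_rpm, set_rpm,
                           min_rpm, step_rpm=100):
    """Lower the propeller speed stepwise until an exposure period
    exists in the upcoming predetermined time window.

    find_period(rpm) -- returns an exposure period (start, end) for the
                        given speed, or None if none exists
    min_rpm          -- lowest speed at which flight remains safe
    step_rpm         -- predetermined reduction applied per iteration
    """
    period = find_period(get_rpm())
    while period is None and get_rpm() - step_rpm >= min_rpm:
        set_rpm(get_rpm() - step_rpm)       # reduce by the preset value
        period = find_period(get_rpm())     # re-check after reduction
    return period                           # None if still not found
```

In practice `min_rpm` would be chosen so that the reduction never compromises flight stability; the patent leaves that constraint to the rotational speed control command.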
In some cases, more than one propeller may obscure the camera. For example, as shown in figs. 8 to 9, the unmanned aerial vehicle may be a quad-rotor unmanned aerial vehicle having four arms, in which a first propeller 41 and a second propeller 42 are mounted on the first arm 21 and the second arm 22, respectively. When the quad-rotor unmanned aerial vehicle is in operation, the first propeller 41 and the second propeller 42 are driven to rotate by a first motor 51 and a second motor 52, respectively, and both propellers obstruct the field of view of the image capturing device 3 during rotation. Specifically, as shown in fig. 9, the Z region is the intersection of the rotation ranges of the two propellers with the field of view of the imaging device 3; whenever either propeller rotates into the Z region, the imaging device 3 is shielded. In addition, as shown in figs. 10 to 11, the unmanned aerial vehicle may also be a twin-propeller helicopter whose arm 2 is arranged vertically, with a first propeller 41 and a second propeller 42 mounted on the arm 2 at different vertical heights. When the first propeller 41 and the second propeller 42 rotate to the Y region, the field of view of the imaging device 3 is not blocked, but when they rotate to the X region, the field of view of the imaging device 3 is blocked.
As an optional embodiment of the present application, when both of the two propellers can block the image capturing device, the step S23 may specifically include:
determining a first exposure period according to the position of the first propeller within the predetermined time period, wherein the degree to which the field of view of the camera device is shielded by the first propeller during the first exposure period is within a first acceptable range; and determining a second exposure period according to the position of the second propeller within the predetermined time period, wherein the degree to which the field of view of the camera device is shielded by the second propeller during the second exposure period is within a second acceptable range;
determining a desired exposure period from the first exposure period and the second exposure period, wherein the desired exposure period lies within both the first exposure period and the second exposure period, and the duration of the desired exposure period is not less than the exposure time of the image pickup device.
The step S24 may specifically include: the imaging device is controlled to expose within a desired exposure period.
Wherein the first acceptable range and the second acceptable range may be the same or different.
Specifically, as shown in fig. 12, the horizontal axis T is the time axis, each vertical dotted line marks a state boundary at which the first propeller and/or the second propeller switches between the shielding position and the non-shielding position, t0 is the current time (i.e., the time at which determination of the desired exposure period starts), and the period between t0 and tn is the predetermined time period. The three horizontal bars, from top to bottom, represent the position of the first propeller within the predetermined time period, the position of the second propeller within the predetermined time period, and the resulting shielding of the image pickup device by the two propellers. When both propellers block the field of view of the camera device during rotation, at least one exposure period of each propeller in the subsequent predetermined time period can be determined from that propeller's position. A portion in which the exposure periods of the two propellers coincide, and whose duration is not less than the exposure time of the camera device, is then selected as the desired exposure period of the camera device (i.e., in fig. 12, a time period in which neither the first propeller nor the second propeller causes blocking), and the camera device is controlled so that its exposure is completed within the desired exposure period; for example, time period C or time period D in fig. 12 may be selected as the exposure time. In this way, the degree to which the two propellers shield the field of view of the camera device remains within the acceptable range throughout the exposure, improving the imaging quality of the camera device.
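Selecting the desired exposure period amounts to intersecting the two propellers' lists of exposure periods and keeping only intersections no shorter than the exposure time. A minimal Python sketch (interval representation and names are assumptions; periods are `(start, end)` tuples in seconds, each list sorted by start time):

```python
def desired_exposure_periods(first_periods, second_periods, exposure_time):
    """Return the intersections of the two propellers' exposure periods
    whose duration is at least exposure_time (the candidate desired
    exposure periods)."""
    result = []
    i = j = 0
    while i < len(first_periods) and j < len(second_periods):
        a_start, a_end = first_periods[i]
        b_start, b_end = second_periods[j]
        start, end = max(a_start, b_start), min(a_end, b_end)
        if end - start >= exposure_time:
            result.append((start, end))
        # advance whichever period ends first
        if a_end < b_end:
            i += 1
        else:
            j += 1
    return result
```

Controlling the camera to start at the beginning of the first returned interval corresponds to the "start exposure at the initial time of the desired exposure period" preference described below.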
More preferably, after the desired exposure period is determined, the imaging device may be controlled to start exposure at the initial time of the desired exposure period, i.e., using period C shown in fig. 12 as the exposure time, so that the degree to which the two propellers shield the field of view of the imaging device remains within the acceptable range throughout the exposure.
In addition, it should be noted that the image pickup device of the unmanned aerial vehicle may comprise a plurality of image pickup devices that need to be exposed simultaneously, each of which may be blocked by the first propeller 41 and the second propeller 42 during rotation. In this case, the plurality of image capturing devices may be regarded as one total image capturing device, and their fields of view may be superimposed to form the field of view of the total image capturing device. For example, as shown in figs. 13 and 14, the unmanned aerial vehicle includes two cameras that need to be exposed simultaneously; the two cameras can be regarded as one total camera, whose field of view is the superposition of their individual fields of view. In fig. 13, the field of view of the total imaging device is affected whenever any propeller of the unmanned aerial vehicle rotates into the Z region, and in fig. 14 whenever any propeller rotates into the X region. Since the two cameras can be regarded as one total camera, the exposure time of the total camera can be determined by the method described above for the case in which the first propeller 41 and the second propeller 42 block a single camera.
With small probability, there may be no desired exposure period within the predetermined time period. In that case, if at least one exposure must be performed within the predetermined time period according to the actual measurement requirements of the camera device (e.g., the image measurement has a high real-time requirement), an exposure time may be selected within the predetermined time period according to the exposure-frequency requirement of the camera device, and the positions of the two propellers at the time of exposure may be sent to a processing device, which may for example comprise an image processing unit, so that the processing device can apply masking to the image obtained by the exposure according to the propeller positions.
Alternatively, when there is no desired exposure period within the predetermined time period, the rotational speed of the first propeller and/or the second propeller may be reduced, for example by a rotational speed control command, and the desired exposure period re-determined after the reduction. Specifically, when the first propeller and/or the second propeller rotates too fast, a desired exposure period may not be found within the predetermined time period; the rotational speed of at least one of the two propellers may then be reduced, and whether a desired exposure period exists may be determined again afterwards. For example, when the rotational speeds of the first propeller and the second propeller are the same and both are too fast (e.g., higher than a predetermined rotational speed value), the two speeds may be reduced simultaneously to the predetermined rotational speed value, and the exposure period in the subsequent predetermined time period may then be determined according to the current positions, rotational speeds and/or rotational speed control commands of the two propellers. Alternatively, the speeds of the first propeller and the second propeller may be lowered together by a predetermined value each time, after which it is determined whether an exposure period exists in the subsequent predetermined time period: if so, the imaging device is controlled to expose within it; if not, the speeds are lowered by the predetermined value again until an exposure period exists.
Further, when the rotational speeds of the two propellers differ and one of them is too fast, the speed of the faster propeller may be reduced so that a desired exposure period exists within the predetermined time period after the reduction. The speed of the faster propeller may be reduced directly to the predetermined rotational speed value, or lowered stepwise; the specific process is similar to that of reducing the speeds of both propellers simultaneously and is not repeated here. In addition, even when the rotational speeds of the two propellers are the same, their relative positions may be such that no desired exposure period exists in the subsequent predetermined time period; in that case, the speed of one propeller may be reduced to adjust the relative positions of the two propellers, so that a desired exposure period exists after the adjustment.
In some cases, the unmanned aerial vehicle may include a plurality of image pickup devices. For example, as shown in fig. 15, the unmanned aerial vehicle may be a quad-rotor unmanned aerial vehicle with four propellers, in which a first propeller 41 is mounted on the first arm 21, a second propeller 42 is mounted on the second arm 22, and the front end of the fuselage 1 carries a binocular camera composed of a first camera device 31 and a second camera device 32. When the unmanned aerial vehicle is in operation, the first propeller 41 and the second propeller 42 are driven to rotate by the first motor 51 and the second motor 52, respectively; the first propeller 41 may shield the first camera device 31 during rotation, and the second propeller 42 may shield the second camera device 32. The first camera device 31 and the second camera device 32 have the same exposure time and cooperate to jointly complete image acquisition.
For the above unmanned aerial vehicle, as an alternative embodiment of the present application, step S23 may specifically include:
determining a first exposure period according to the position of the first propeller within the predetermined time period, and determining a second exposure period according to the position of the second propeller within the predetermined time period, wherein the degree to which the field of view of the first camera device is shielded by the first propeller during the first exposure period is within a first acceptable range, and the degree to which the field of view of the second camera device is shielded by the second propeller during the second exposure period is within a second acceptable range; and
determining a desired exposure period from the first exposure period and the second exposure period, wherein the desired exposure period lies within both the first exposure period and the second exposure period, and its duration is not less than the exposure time of the first and second camera devices.
Step S24 may specifically include: the first and second image pickup devices are controlled to expose in a desired exposure period.
Specifically, the positions and speeds of the first propeller and the second propeller during rotation can be obtained from the position detection devices mounted on the first motor and the second motor, respectively. The positions of the two propellers within the subsequent predetermined time period are then determined in combination with their respective rotational speed control commands, and each position is classified, according to the degree of shielding of the camera devices, as either a non-shielding position (shielding within the acceptable range) or a shielding position (shielding outside the acceptable range). As shown in fig. 16, the horizontal axis T is the time axis, each vertical dotted line marks a state boundary at which the first propeller and/or the second propeller switches between the shielding position and the non-shielding position, t0 is the current time (i.e., the time at which determination of the desired exposure period starts), and the period between t0 and tn is the predetermined time period. The three horizontal bars, from top to bottom, represent the position of the first propeller within the predetermined time period, the position of the second propeller within the predetermined time period, and the resulting shielding of the first and second image pickup devices by the two propellers.
A time period in which the first propeller and the second propeller are both in the non-shielding position, and whose duration is not less than the exposure time, is a desired exposure period (i.e., in fig. 16, a time period in which both propellers remain in the non-shielding position for longer than the exposure time of the image pickup devices); the first and second image pickup devices can then complete their exposures simultaneously within it. For example, the E period, the F period, or the G period in fig. 16 may be used as the exposure time for the common exposure of the first and second image pickup devices, reducing the influence of the propellers on the image pickup devices; the duration of each of the E, F, and G periods equals the exposure time of the first and second image pickup devices. Preferably, the first and second image pickup devices may be controlled to start exposure simultaneously at the initial time of a desired exposure period, i.e., using the E period or the G period as the exposure time for the common exposure.
As shown in fig. 17, for the unmanned aerial vehicle in the embodiment shown in fig. 15, there may be no desired exposure period within a predetermined time period with a small probability, in which case the method for controlling the exposure of the vision measuring device may further include the steps of:
step S31 of determining an exposure timing of the first and second image pickup devices according to a first exposure period and a second exposure period, wherein a first portion and a second portion of the exposure time, during which the first and second image pickup devices expose at that timing, overlap in time with the first exposure period and the second exposure period, respectively, the first and second exposure periods themselves overlapping or being continuous in time;
in step S32, the first image pickup device and the second image pickup device are controlled to perform exposure at exposure timing.
Specifically, with low probability, no time period meeting the conditions of the desired exposure period may occur within the predetermined time period, while at least one exposure must still be performed within that period according to the actual measurement requirements of the image capturing devices (for example, a high real-time requirement on image measurement). In that case, the exposure may be made when the blocked areas of the fields of view of the two image capturing devices are small. As shown in fig. 18, the horizontal axis T is the time axis, each vertical dotted line marks a state boundary at which the first propeller and/or the second propeller switches between the shielding position and the non-shielding position, t0 is the current time (i.e., the time at which determination of the desired exposure period starts), and the period between t0 and tn is the predetermined time period. The three horizontal bars, from top to bottom, represent the position of the first propeller, the position of the second propeller, and the resulting shielding of the first and second image pickup devices within the predetermined time period. As can be seen from fig. 18, to reduce the blocked area of the fields of view during exposure, a time period near a moment at which the propellers switch between the shielding and non-shielding positions may be selected for simultaneous exposure of the two imaging devices: at such a moment a propeller is just entering or just leaving the shielding position, so the blocking of the field of view of the imaging devices is reduced.
Specifically, a time period near a state boundary in fig. 18 (for example, a time period containing the moment of the state boundary) may be selected for simultaneous exposure of the two imaging devices, provided that at no point within that period are both propellers in the shielding position. For example, the H period, the I period, and the J period in fig. 18 may be selected as the exposure times of the two image capturing devices. In fig. 18, the durations for which the first propeller 41 and the second propeller 42 stay in the non-shielding position exceed the exposure time, so the period in which the first propeller 41 is in the non-shielding position is the first exposure period, and the period in which the second propeller 42 is in the non-shielding position is the second exposure period. The duration of each of the H, I, and J periods equals the exposure time of the two imaging devices, and each may be divided at a propeller state boundary into a first portion before the boundary and a second portion after it. As can be seen from fig. 18, the first portion of the H period overlaps both the first exposure period and the second exposure period, so both propellers are in the non-shielding position during it, while the second portion of the H period overlaps the first exposure period, so the first propeller 41 is in the non-shielding position during it. This ensures that, at any moment of an exposure using the H period as the exposure time of the two image capturing devices, the degree to which the field of view of at least one device is shielded by a propeller is within the acceptable range.
Likewise, the first portion of the I period overlaps the second exposure period, so the second propeller 42 is in the non-shielding position during it, and the second portion of the I period overlaps the first exposure period, so the first propeller 41 is in the non-shielding position during it; the first portion of the J period overlaps the first exposure period, so the first propeller 41 is in the non-shielding position during it, and the second portion of the J period overlaps both the first and second exposure periods, so both propellers are in the non-shielding position during it. Therefore, when the I period or the J period is used as the exposure time of the two image pickup devices, the degree to which the field of view of at least one device is shielded is within the acceptable range at any moment during the exposure. Because a propeller at a state boundary blocks only a small part of the field of view, exposing near a state boundary reduces the area of the field of view that is blocked during exposure.
Preferably, the first portion of each of the H, I, and J periods may be half of the exposure time. For example, as shown in fig. 18, the durations of the H, I, and J periods equal the exposure time t, each period is centered on a propeller state boundary, and the durations on either side of the boundary equal t/2. When such a period is selected as the exposure time, the fields of view of the first and second camera devices are affected by their corresponding propellers roughly equally, so neither camera device is shielded by its corresponding propeller for too long during the exposure.
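Centering the exposure window on a state boundary, with half the exposure time on each side, can be expressed directly. A trivial Python sketch under the same assumption (times in seconds; the function name is illustrative):

```python
def boundary_centered_window(boundary_time, exposure_time):
    """Return an exposure window centered on a propeller state
    boundary, so that each camera spends at most half the exposure
    time partially occluded."""
    half = exposure_time / 2.0
    return (boundary_time - half, boundary_time + half)
```

For a boundary at t = 10 s and an exposure time of 2 s, this yields the window (9 s, 11 s), splitting the occlusion evenly between the two cameras.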
Further, after the previous exposure, determination of the desired exposure period may start at a time point that precedes the earliest among the start time and end time of the first exposure period and the start time and end time of the second exposure period by not less than the first portion of any of the H, I, and J periods. Specifically, after the previous exposure, the determination of whether a desired exposure period exists in the subsequent predetermined time period may begin at a time dt before a state boundary of the first propeller and/or the second propeller, where dt may be a duration greater than the first portion of any of the H, I, and J periods; these first portions may be the same or different. If a desired exposure period exists in the subsequent predetermined time period, the first and second camera devices are controlled to complete their exposure within it; if not, they are controlled to expose within an H, I, or J period in the subsequent predetermined time period according to the exposure-frequency requirement of the camera devices. With small probability, no H, I, or J period may exist within the predetermined time period either; in that case an exposure time is selected within the predetermined time period according to the exposure-frequency requirement of the imaging devices, and the positions of the two propellers at the time of exposure are sent to the processing device, which may for example comprise an image processing unit, so that the processing device can apply masking to the image obtained by the exposure according to the propeller positions.
As shown in fig. 19, for the unmanned aerial vehicle in the embodiment shown in fig. 15, there may be no expected exposure period within a predetermined time period with a small probability, and if the moving speed of the unmanned aerial vehicle is low (i.e., the unmanned aerial vehicle is in a low dynamic state) at this time, the method for controlling the exposure of the vision measuring apparatus may further include the steps of:
step S41 of determining a first exposure period and a second exposure period that are consecutive in time from the first exposure period and the second exposure period;
in step S42, the first and second image pickup devices are controlled to expose, respectively, during the end portion of the temporally earlier one and the start portion of the temporally later one of the corresponding first and second exposure periods.
Specifically, with low probability, no time period meeting the conditions of the desired exposure period may occur within the predetermined time period. In this case, if at least one exposure is required within the predetermined time period according to the actual measurement requirements of the cameras (for example, a high real-time requirement on image measurement) and the unmanned aerial vehicle is in a low dynamic state, the two cameras may be exposed separately, each when its own field of view is less obstructed. To maintain the synchronism of the two image pickup devices, however, the interval between their exposure times must not be too large when they are exposed separately. As shown in fig. 20, the horizontal axis T is the time axis, each vertical dotted line marks a state boundary at which the first propeller and/or the second propeller switches between the shielding position and the non-shielding position, t0 is the current time (i.e., the time at which determination of the desired exposure period starts), and the period between t0 and tn is the predetermined time period. The three horizontal bars, from top to bottom, represent the position of the first propeller, the position of the second propeller, and the resulting shielding of the first and second image pickup devices within the predetermined time period. As can be seen from fig. 20, to keep the blocked area of each field of view small and the interval between the two exposures short, the two image capturing devices are controlled to expose on either side of a state boundary, each with its corresponding propeller in the non-shielding position during its exposure time. In fig. 20, the durations for which the first propeller 41 and the second propeller 42 stay in the non-shielding position exceed the exposure time, so the period in which the first propeller 41 is in the non-shielding position is the first exposure period, and the period in which the second propeller 42 is in the non-shielding position is the second exposure period. A first exposure period and a second exposure period that are consecutive in time can therefore be selected, and the first and second image pickup devices controlled to expose, respectively, during the end portion of the temporally earlier period and the start portion of the temporally later period. For example, in fig. 20, the K1 period may be used as the exposure time of the second image pickup device and the K2 period as that of the first; or the L1 period as that of the second and the L2 period as that of the first; or the M1 period as that of the first and the M2 period as that of the second.
There may be a small time interval between the K1 and K2 periods, between the L1 and L2 periods, and between the M1 and M2 periods, but more preferably each pair is consecutive. In this way, when the two cameras expose separately, each has its corresponding propeller in the non-shielding position, and the two exposure times are consecutive or only briefly separated, so the influence on imaging quality is low.
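The pairing of consecutive first and second exposure periods, with one camera exposing at the tail of the earlier period and the other at the head of the later one, can be sketched as follows (Python; the interval representation, `max_gap` tolerance, and function name are assumptions not found in the patent):

```python
def staggered_exposures(first_periods, second_periods, exposure_time,
                        max_gap=0.0):
    """Find a pair of exposure periods, one per propeller, that are
    consecutive in time (separated by at most max_gap seconds), and
    schedule one camera at the end of the earlier period and the other
    at the start of the later one. Periods are (start, end) tuples in
    seconds. Returns the two exposure slots in time order, or None."""
    for a in first_periods:
        for b in second_periods:
            earlier, later = (a, b) if a[1] <= b[0] else (b, a)
            gap = later[0] - earlier[1]
            if (0.0 <= gap <= max_gap
                    and earlier[1] - earlier[0] >= exposure_time
                    and later[1] - later[0] >= exposure_time):
                # tail of the earlier period, head of the later period
                tail_slot = (earlier[1] - exposure_time, earlier[1])
                head_slot = (later[0], later[0] + exposure_time)
                return tail_slot, head_slot
    return None
```

Which camera receives which slot follows from which propeller's period is the earlier one, matching the K1/K2, L1/L2, and M1/M2 pairings in fig. 20.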
Further, after the previous exposure, determination of the desired exposure period may start at a time point that precedes the earliest among the start time and end time of the first exposure period and the start time and end time of the second exposure period by not less than the exposure time. Specifically, after the previous exposure, whether a desired exposure period exists in the subsequent predetermined time period may be determined starting at a time dt before a state boundary of the first propeller and/or the second propeller, where dt may be a duration greater than t. If a desired exposure period exists in the subsequent predetermined time period, the first and second image pickup devices are controlled to complete their exposure within it; if not, they are controlled to expose, according to the exposure-frequency requirement of the image pickup devices, in the K1 and K2 periods, the L1 and L2 periods, or the M1 and M2 periods, respectively, in the subsequent predetermined time period.
In the unlikely case that no K1 and K2 periods, L1 and L2 periods, or M1 and M2 periods corresponding to the first image pickup device and the second image pickup device exist within the predetermined time period, an exposure time is selected within the predetermined time period according to the exposure-frequency requirement of the image pickup devices, and the positions of the two propellers at the time of exposure are transmitted to a processing device, which may include, for example, an image processing unit, so that the processing device can perform masking processing on the image obtained by the image pickup devices according to the propeller positions.
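The masking step mentioned above can be sketched minimally as follows. How a reported propeller position maps to occluded image columns is an assumption of this sketch (the disclosure does not specify the geometry), as are the function name and data layout.

```python
def mask_propeller(image, blocked_cols):
    """Mark the image region occluded by a propeller blade as invalid.

    image: 2-D list of pixel values; blocked_cols: range of column
    indices covered by the blade at the exposure time, derived from the
    propeller position reported to the processing device.
    Returns a new image in which occluded pixels are None.
    """
    return [
        [None if c in blocked_cols else px for c, px in enumerate(row)]
        for row in image
    ]
```

Downstream vision-measurement code would then simply skip the `None` pixels rather than treat the blade as scene content.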
In an alternative embodiment of the present invention, in the case of exposure priority, if no desired exposure period exists within the predetermined time period, the rotation speed of the first propeller and/or the second propeller may further be reduced by a rotation speed control command so that the desired exposure period can be re-determined. In particular, when the rotation speed of the first propeller and/or the second propeller is too high, a desired exposure period may not be found within the predetermined time period. At this time, if it is necessary to ensure, according to the user's setting, that the image pickup devices can perform exposure (i.e., exposure priority), the rotation speed of at least one of the first propeller and the second propeller may be reduced, and the desired exposure period re-determined after the reduction. For example, when the rotation speeds of the first propeller and the second propeller are the same and both too high (e.g., above a predetermined rotation speed value), the rotation speeds of both propellers may be reduced simultaneously to the predetermined rotation speed value, and the exposure period in the subsequent predetermined time period may then be determined from the current positions, rotation speeds and/or rotation speed control commands of the first propeller and the second propeller.
Alternatively, the rotation speeds of the first propeller and the second propeller may be lowered simultaneously by a predetermined decrement at a time: it is then determined whether an exposure period exists in the subsequent predetermined time period; if so, the first image pickup device and the second image pickup device are controlled to expose in that period; if not, the rotation speeds are lowered by the predetermined decrement again, until an exposure period exists in the subsequent predetermined time period. When the rotation speeds of the two propellers differ and one of them is too high, the rotation speed of the faster propeller may be reduced so that a desired exposure period exists within the predetermined time period after the reduction; the faster propeller may be slowed directly to a predetermined rotation speed value or slowed gradually, the specific process being similar to that of reducing both rotation speeds simultaneously and therefore not repeated here. In addition, even when the rotation speeds of the two propellers are the same, no desired exposure period may exist in the subsequent predetermined time period because of the relative positions of the two propellers; in that case, the rotation speed of one propeller may be reduced to adjust the relative positions so that a desired exposure period exists within the predetermined time period after the adjustment.
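The stepwise speed-reduction loop described above can be sketched as follows, with the search over the subsequent predetermined time period abstracted into a callback. The function name, step size, and floor value are assumptions introduced for illustration; the disclosure does not fix them.

```python
def reduce_speed_until_window(find_period, rpm_a, rpm_b, step, rpm_min):
    """Exposure-priority fallback: lower both propeller speeds by a fixed
    decrement until a desired exposure period exists, or a floor is hit.

    find_period(rpm_a, rpm_b) stands in for the search over the subsequent
    predetermined time period and returns a (start, end) period or None.
    Returns (period_or_None, final_rpm_a, final_rpm_b).
    """
    period = find_period(rpm_a, rpm_b)
    while period is None:
        if rpm_a <= rpm_min and rpm_b <= rpm_min:
            return None, rpm_a, rpm_b  # cannot slow down further; give up
        rpm_a = max(rpm_min, rpm_a - step)
        rpm_b = max(rpm_min, rpm_b - step)
        period = find_period(rpm_a, rpm_b)
    return period, rpm_a, rpm_b
```

Slowing only the faster propeller, or one propeller to shift the relative blade positions, follows the same loop with the decrement applied to a single speed.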
It should be noted that, in the above-described embodiments, the method for controlling the exposure of the vision measuring apparatus may be performed by a processing device located in the main body 1. The present application may also provide a machine-readable storage medium, for example located inside the main body 1, storing instructions that, when executed by a processor, enable the processor to execute the above method for controlling the exposure of the vision measuring apparatus. The machine-readable storage medium may be part of the processing device or a separate storage unit within the main body 1.
Through the above technical solution of the present application, the exposure time of the image pickup devices can be determined according to the position of the propellers, so that the shielding of the field of view of the image pickup devices by the propellers of the unmanned aerial vehicle during rotation is reduced and the imaging quality of the image pickup devices is improved. Since the embodiments of the present application require no structural adjustment of the unmanned aerial vehicle, they add no hardware cost and are economical.
While the embodiments of the present application have been described in detail with reference to the accompanying drawings, the embodiments of the present application are not limited to the details of the above embodiments, and various simple modifications can be made to the technical solutions of the embodiments of the present application within the technical concept of the embodiments of the present application, and the simple modifications belong to the protection scope of the embodiments of the present application.
It should be noted that the various features described in the above embodiments may be combined in any suitable manner without departing from the scope of the invention. In order to avoid unnecessary repetition, the embodiments of the present application do not separately describe various possible combinations.
Those skilled in the art will understand that all or part of the steps in the methods of the above embodiments may be implemented by a program that is stored in a storage medium and includes several instructions enabling a single-chip microcomputer, a chip, or a processor to execute all or part of the steps of the methods of the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
In addition, any combination may be made between various embodiments of the present application, and the same should be considered as disclosed in the embodiments of the present application as long as it does not depart from the spirit of the embodiments of the present application.

Claims (11)

1. A method for controlling exposure of a vision measuring apparatus applied to an unmanned aerial vehicle, the vision measuring apparatus including an image pickup device including a first image pickup device and a second image pickup device, the method comprising:
acquiring the position of a propeller of the unmanned aerial vehicle when the propeller rotates, wherein the propeller comprises a first propeller and a second propeller;
determining a first exposure period from the position of the first propeller over a subsequent predetermined period of time, wherein the extent to which the field of view of the first camera is obscured by the first propeller during the first exposure period is within a first acceptable range;
determining a second exposure period according to the position of the second propeller in the predetermined time period, wherein the degree of the field of view of the second camera device which is shielded by the second propeller in the second exposure period is within a second acceptable range;
determining a desired exposure period according to the first exposure period and the second exposure period, wherein the desired exposure period is located in the first exposure period and the second exposure period, and the desired exposure period is not less than the exposure time of the first camera device and the second camera device; and
controlling the first camera device and the second camera device to simultaneously start exposure at the initial time of the expected exposure period;
wherein, in the absence of the desired exposure period within the predetermined period of time, determining exposure timings of the first and second image pickup devices according to the first and second exposure periods, and controlling the first and second image pickup devices to perform exposure at the exposure timings, wherein a first portion and a second portion of an exposure time during which the first and second image pickup devices perform exposure at the exposure timings temporally overlap with the first and second exposure periods that temporally overlap or continue, respectively; the first portion is half of the exposure time;
and after the last exposure, the determination of the desired exposure period is started at a time point that precedes, by not less than the first portion, whichever occurs first among the start time and the end time of the first exposure period and the start time and the end time of the second exposure period.
2. The method of claim 1, further comprising acquiring a rotational speed of the propeller while rotating, the method further comprising:
determining a position of the propeller within the predetermined time period based on the current position of the propeller and the rotational speed.
3. The method of claim 2, further comprising:
acquiring the rotating speed control instruction, wherein the rotating speed control instruction is used for controlling the rotating speed of the propeller, and the position of the propeller in the preset time period is further determined according to the rotating speed control instruction.
4. The method according to any one of claims 1 to 3, further comprising:
in the case of exposure priority, if the desired exposure period does not exist within the predetermined period of time, the rotational speed of the first propeller and/or the second propeller is reduced to re-determine the desired exposure period.
5. The method of claim 1, wherein in the absence of the desired exposure period within the predetermined period of time, the method comprises:
determining a first exposure period and a second exposure period which are continuous in time according to the first exposure period and the second exposure period;
the first and second image pickup devices are controlled to expose at the end portion of the earlier period and the start portion of the later period in the corresponding first exposure period and second exposure period, respectively.
6. The method of claim 5, wherein the exposure times of the first and second cameras for the end and beginning partial exposures are consecutive in time.
7. The method of claim 5, further comprising:
and after the last exposure, the determination of the desired exposure period is started at a time point that precedes, by not less than the exposure time, whichever occurs first among the start time and the end time of the first exposure period and the start time and the end time of the second exposure period.
8. A processing apparatus, characterized in that it is configured to carry out a method for controlling the exposure of a vision measuring device according to any one of claims 1 to 7.
9. A vision measuring device applied to an unmanned aerial vehicle, characterized by comprising:
a camera device;
a position detection device configured to detect a position of at least one propeller of the unmanned aerial vehicle; and
the processing apparatus of claim 8.
10. An unmanned aerial vehicle comprising the vision measuring device of claim 9.
11. A machine-readable storage medium having stored thereon instructions for enabling a processor to execute the method for controlling exposure of a vision measurement apparatus according to any one of claims 1 to 7 when executed by the processor.
CN201811214117.4A 2018-10-18 2018-10-18 Method and device for controlling exposure, vision measuring equipment and unmanned aerial vehicle Active CN109436355B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811214117.4A CN109436355B (en) 2018-10-18 2018-10-18 Method and device for controlling exposure, vision measuring equipment and unmanned aerial vehicle


Publications (2)

Publication Number Publication Date
CN109436355A CN109436355A (en) 2019-03-08
CN109436355B true CN109436355B (en) 2020-12-18

Family

ID=65547147

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811214117.4A Active CN109436355B (en) 2018-10-18 2018-10-18 Method and device for controlling exposure, vision measuring equipment and unmanned aerial vehicle

Country Status (1)

Country Link
CN (1) CN109436355B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113630554B (en) * 2021-08-12 2022-06-28 北京航空航天大学 Image acquisition device, method and system based on blade rotation angle

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016095094A1 (en) * 2014-12-15 2016-06-23 深圳市大疆创新科技有限公司 Image processing system, remote control shooting module, and exposal information prompting method
WO2016168976A1 (en) * 2015-04-20 2016-10-27 SZ DJI Technology Co., Ltd. Imaging system
CN109415126B (en) * 2016-07-08 2021-12-07 深圳市大疆创新科技有限公司 System and method for improved mobile platform imaging
CN206476124U (en) * 2017-02-24 2017-09-08 深圳市大疆创新科技有限公司 Head assembly and frame
WO2018152770A1 (en) * 2017-02-24 2018-08-30 深圳市大疆创新科技有限公司 Cradle head assembly and rack
CN107352037B (en) * 2017-07-11 2020-01-03 成都纵横自动化技术股份有限公司 Device and method for acquiring camera exposure position information and unmanned aerial vehicle

Also Published As

Publication number Publication date
CN109436355A (en) 2019-03-08

Similar Documents

Publication Publication Date Title
CN105759839A (en) Unmanned aerial vehicle (UAV) visual tracking method, apparatus, and UAV
CN106375666B (en) A kind of Atomatic focusing method and device based on license plate
US10621456B2 (en) Distance measurement method and apparatus, and unmanned aerial vehicle
WO2018086050A1 (en) Depth map generation method and unmanned aerial vehicle based on this method
US11967228B2 (en) Peccancy monitoring system and peccancy monitoring method
CN108750129B (en) Manned unmanned aerial vehicle positioning landing method and manned unmanned aerial vehicle
US11654823B2 (en) Light irradiation control apparatus and method of light irradiation control
WO2020037604A1 (en) Automobile blind area monitoring and alarming method and apparatus, device and storage medium
CN110706282A (en) Automatic calibration method and device for panoramic system, readable storage medium and electronic equipment
JP6818820B2 (en) Intelligent roadside unit and its information processing method
EP3618033A1 (en) System and method for controlling traffic lights
CN109436355B (en) Method and device for controlling exposure, vision measuring equipment and unmanned aerial vehicle
CN105023429B (en) Automobile-used wireless vehicle tracking and device
EP3618035B1 (en) Intelligent roadside unit, control method and storage medium
US20200221005A1 (en) Method and device for tracking photographing
US20190258255A1 (en) Control device, imaging system, movable object, control method, and program
CN113612969A (en) Method and device for transmitting video data for remote control of unmanned equipment
CN115174861B (en) Method and device for automatically tracking moving target by holder camera
US9967438B2 (en) Image processing apparatus
CN109765931A (en) A kind of near-infrared video automatic navigation method suitable for the patrol unmanned machine of breakwater
JP7077282B2 (en) Intelligent roadside unit and its information processing method
US20220297721A1 (en) Multi-sensor synchronization method and system
JP6174884B2 (en) Outside environment recognition device and outside environment recognition method
CN113252008B (en) Shooting control method for aerial remote sensing narrow-view-field camera
WO2022052508A1 (en) Distance measurement method and apparatus, and terminal device and vehicle

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CP03 Change of name, title or address

Address after: 510000 Block C, 115 Gaopu Road, Tianhe District, Guangzhou City, Guangdong Province

Patentee after: Guangzhou Jifei Technology Co.,Ltd.

Address before: 510000, No. 1, Cheng Cheng Road, Gaotang Software Park, Guangzhou, Guangdong, Tianhe District, 3A01

Patentee before: Guangzhou Xaircraft Technology Co.,Ltd.