CN109076206B - Three-dimensional imaging method and device based on unmanned aerial vehicle - Google Patents


Info

Publication number
CN109076206B
Authority
CN
China
Prior art keywords
unmanned aerial
aerial vehicle
positions
target object
target
Prior art date
Legal status
Expired - Fee Related
Application number
CN201780018614.4A
Other languages
Chinese (zh)
Other versions
CN109076206A (en)
Inventor
周震昊
李昊南
Current Assignee
SZ DJI Technology Co Ltd
Original Assignee
SZ DJI Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by SZ DJI Technology Co Ltd filed Critical SZ DJI Technology Co Ltd
Priority to CN202110004601.XA (published as CN112672133A)
Publication of CN109076206A
Application granted
Publication of CN109076206B

Classifications

    • H — ELECTRICITY
        • H04 — ELECTRIC COMMUNICATION TECHNIQUE
            • H04N — PICTORIAL COMMUNICATION, e.g. TELEVISION
                • H04N13/00 — Stereoscopic video systems; Multi-view video systems; Details thereof
                    • H04N13/20 — Image signal generators
                        • H04N13/204 — Image signal generators using stereoscopic image cameras
                            • H04N13/207 — Image signal generators using stereoscopic image cameras using a single 2D image sensor
                                • H04N13/211 — using temporal multiplexing
                                • H04N13/218 — using spatial multiplexing

Abstract

The invention provides an unmanned-aerial-vehicle-based stereoscopic imaging method and device, wherein the method comprises the following steps: acquiring at least two target images of a target object from different positions through an unmanned aerial vehicle, wherein the target object in the at least two target images at least partially coincides; and generating a stereoscopic image of the target object from the at least two target images. Unmanned aerial vehicles take the place of the two cameras of a binocular camera and acquire at least two target images of the target object from different positions, thereby obtaining a stereoscopic image of the target object; the distance between the two acquisition points is increased at low cost, the imaging-resolution requirement is reduced, and the precision of three-dimensional reconstruction is improved.

Description

Three-dimensional imaging method and device based on unmanned aerial vehicle
Technical Field
The invention relates to the field of imaging, in particular to a three-dimensional imaging method and device based on an unmanned aerial vehicle.
Background
Stereoscopic vision is produced by image acquisition points at different positions: for example, a person's left eye and right eye are separated by a certain distance, so the two views differ slightly, which allows the distance of an object to be judged. The distance between the image acquisition points is called the baseline; the longer the baseline, the more easily and the more distinctly stereoscopic vision is obtained.
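As an illustrative aside (not part of the original disclosure), the effect of the baseline can be sketched with the standard pinhole-stereo relation Z = f·B/d, where f is the focal length in pixels, B the baseline, and d the disparity; the function names below are illustrative only. A hundredfold-longer baseline yields a hundredfold the disparity at the same depth, so the same depth precision can be reached with far less image resolution.

```python
def depth_from_disparity(f_px, baseline_m, disparity_px):
    """Pinhole-stereo depth: Z = f * B / d."""
    return f_px * baseline_m / disparity_px

def disparity_for_depth(f_px, baseline_m, depth_m):
    """Disparity observed for a point at the given depth."""
    return f_px * baseline_m / depth_m

# Same 50 m target, f = 1000 px: a 0.1 m binocular baseline gives only
# 2 px of disparity, while a 10 m inter-drone baseline gives 200 px.
d_small = disparity_for_depth(1000, 0.1, 50.0)   # 2.0
d_large = disparity_for_depth(1000, 10.0, 50.0)  # 200.0
```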
In the prior art, a binocular camera is used to obtain a stereoscopic image; balancing image resolution against object-distance precision necessarily requires a large separation between the two cameras. On a small unmanned aerial vehicle, positional constraints therefore make binocular cameras of very limited use. In the industry, large unmanned aerial vehicles are mostly used for the related surveying and mapping, but their high cost hinders wider adoption.
Disclosure of Invention
The invention provides a three-dimensional imaging method and device based on an unmanned aerial vehicle.
According to a first aspect of the present invention, there is provided an unmanned-aerial-vehicle-based stereoscopic imaging method, comprising: acquiring at least two target images of a target object from different positions through an unmanned aerial vehicle, wherein the target object in the at least two target images at least partially coincides; and generating a stereoscopic image of the target object from the at least two target images.
According to a second aspect of the present invention, there is provided an unmanned-aerial-vehicle-based stereoscopic imaging apparatus, comprising one or more processors, working individually or collectively, the processors being in communicative connection with the drone; the processor is configured to: acquire at least two target images of a target object from different positions through an unmanned aerial vehicle, wherein the target object in the at least two target images at least partially coincides; and generate a stereoscopic image of the target object from the at least two target images.
According to a third aspect of the present invention, there is provided a computer-readable storage medium having a computer program stored thereon which, when executed by a processor, performs the steps of: acquiring at least two target images of a target object from different positions through an unmanned aerial vehicle, wherein the target object in the at least two target images at least partially coincides; and generating a stereoscopic image of the target object from the at least two target images.
According to the technical solution provided by the embodiments of the invention, unmanned aerial vehicles take the place of the two cameras of a binocular camera and acquire at least two target images of the target object from different positions, thereby obtaining a stereoscopic image of the target object; the distance between the image acquisition points is increased at low cost, the imaging-resolution requirement is reduced, and the precision of three-dimensional reconstruction is improved.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings needed to be used in the description of the embodiments will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without inventive labor.
Fig. 1 is a schematic structural diagram of an unmanned aerial vehicle according to an embodiment of the present invention;
fig. 2 is a flow chart of a method of stereoscopic imaging based on unmanned aerial vehicles in an embodiment of the invention;
fig. 3 is a schematic structural diagram of a stereoscopic imaging device based on a drone in an embodiment of the present invention;
fig. 4 is a schematic structural diagram of a stereo imaging apparatus based on unmanned aerial vehicles in an embodiment of the present invention, which reveals flight trajectories of two unmanned aerial vehicles;
fig. 5 is a schematic structural diagram of a stereoscopic imaging device based on a drone in another embodiment of the present invention, which reveals a flight trajectory of the drone;
fig. 6 is a block diagram of a stereoscopic imaging apparatus based on a drone in an embodiment of the present invention;
fig. 7 is a block diagram of a stereoscopic imaging apparatus based on a drone in another embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The following describes the stereo imaging method and device based on the unmanned aerial vehicle in detail with reference to the accompanying drawings. The features of the following examples and embodiments may be combined with each other without conflict.
Fig. 1 is a schematic diagram of a drone 100 according to an embodiment of the present invention. The drone 100 may include a carrier 102 and a load 104. In some embodiments, the load 104 may be located directly on the drone 100 without the carrier 102. In this embodiment, the carrier 102 is a gimbal, for example a two-axis or three-axis gimbal. The load 104 may be an image capture device or camera device (e.g., a camera, a camcorder, an infrared camera, or an ultraviolet camera), an audio capture device (e.g., a parabolic reflector microphone), or the like; the load 104 may provide static sensing data (e.g., pictures) or dynamic sensing data (e.g., videos). The load 104 is mounted on the carrier 102, so that rotation of the load 104 is controlled by the carrier 102.
Further, the drone 100 may include a power mechanism 106, a sensing system 108, and a communication system 110. The power mechanism 106 may include one or more rotating bodies, propellers, blades, motors, electronic governors, and the like. For example, the rotator of the power mechanism may be a self-fastening rotator, a rotator assembly, or other rotator power unit. The drone 100 may have one or more powered mechanisms. All power mechanisms may be of the same type. Alternatively, one or more of the power mechanisms may be of a different type. The power mechanism 106 may be mounted on the drone by suitable means, such as by a support element (e.g., a drive shaft). The power mechanism 106 may be mounted at any suitable location on the drone 100, such as the top, bottom, front, back, sides, or any combination thereof. By controlling one or more power mechanisms 106 to control the flight of the drone 100.
The sensing system 108 may include one or more sensors to sense the spatial orientation, velocity, and/or acceleration (e.g., rotation and translation with respect to up to three degrees of freedom) of the drone 100. The one or more sensors may include a GPS sensor, a motion sensor, an inertial sensor, a proximity sensor, or an image sensor. The sensing data provided by the sensing system 108 may be used to track the spatial orientation, velocity, and/or acceleration of the drone 100 (using a suitable processing unit and/or control unit, as described below). Optionally, the sensing system 108 may be used to collect environmental data of the drone, such as climate conditions, potential obstacles nearby, locations of geographic features, locations of man-made structures, and the like.
The communication system 110 is capable of communicating with a terminal 112 having a communication system 114 via wireless signals 116. The communication systems 110, 114 may include any number of transmitters, receivers, and/or transceivers for wireless communication. The communication may be one-way, so that data is transmitted in only one direction; for example, one-way communication may involve only the drone 100 transmitting data to the terminal 112, or vice versa. Alternatively, the communication may be two-way, so that data is transmitted in both directions between the drone 100 and the terminal 112: one or more transmitters of communication system 110 may transmit data to one or more receivers of communication system 114, and vice versa.
In some embodiments, the terminal 112 may provide control data to one or more of the drone 100, the carrier 102, and the load 104, and receive information (e.g., position and/or motion information of the drone, the carrier, or the load, load-sensed data, such as image data captured by a camera) from one or more of the drone 100, the carrier 102, and the load 104.
In some embodiments, the drone 100 may communicate with other remote devices than the terminal 112, and the terminal 112 may also communicate with other remote devices than the drone 100. For example, the drone and/or the terminal 112 may communicate with another drone or a bearer or load of another drone. The additional remote device may be a second terminal or other computing device (such as a computer, desktop, tablet, smartphone, or other mobile device) when desired. The remote device may transmit data to the drone 100, receive data from the drone 100, transmit data to the terminal 112, and/or receive data from the terminal 112. Alternatively, the remote device may be connected to the internet or other telecommunications network to enable data received from the drone 100 and/or the terminal 112 to be uploaded to a website or server.
In some embodiments, the movement of the drone 100, the movement of the carrier 102, and the movement of the load 104 relative to a fixed reference (e.g., an external environment), and/or each other may be controlled by the terminal 112. The terminal 112 may be a remote control terminal located remotely from the drone, carrier and/or load. The terminals 112 may be located on or affixed to a support platform. Alternatively, the terminal 112 may be hand-held or wearable. For example, the terminal 112 may include a smartphone, a tablet, a desktop, a computer, glasses, gloves, a helmet, a microphone, or any combination thereof. The terminal 112 may comprise a user interface such as a keyboard, mouse, joystick, touch screen or display. Any suitable user input may interact with the terminal 112, such as manual input commands, voice control, gesture control, or position control (e.g., through movement, position, or tilt of the terminal 112).
It should be noted that, in the embodiment of the present invention, the stereoscopic image may include a single stereoscopic image, or may include a continuous stereoscopic video.
Example one
The embodiment of the invention provides an unmanned aerial vehicle-based stereoscopic imaging method. Fig. 2 is a flowchart of a stereoscopic imaging method based on an unmanned aerial vehicle according to an embodiment of the present invention. As shown in fig. 2, the unmanned aerial vehicle-based stereoscopic imaging method may include the following steps:
Step S201: acquiring at least two target images of a target object from different positions by the drone 100, wherein the target object in the at least two target images at least partially coincides;
optionally, the at least two target images may include two or more, and preferably, the two target images of the target object are acquired by the drone 100 from different locations.
In this embodiment, two target images of a target object are obtained from different positions by the unmanned aerial vehicle 100, and specific implementation manners may include the following two types:
First implementation
With reference to figs. 3 and 4, the drone 100 includes two units (drone 110 and drone 120 shown in figs. 3 and 4), and the different positions include a first position and a second position. Step S201 may include: controlling the two drones 100 to be located at the first position and the second position respectively, and acquiring the target images of the target object P (with three-dimensional coordinates (X, Y, Z)) captured by the two drones 100 at their respective positions, so that the two acquired target images have good real-time performance.
Second implementation
Referring to fig. 5, the drone 100 includes one unit (i.e., drone 130 of fig. 5), and the different positions include a first position and a second position. Step S201 may include: controlling the drone 100 to be located at the first position and the second position in turn, and acquiring the target image of the target object P captured by the drone 100 at each position. Compared with the first implementation, reconstructing the stereoscopic image from time-shared shots of a single drone 100 reduces the cost of three-dimensional reconstruction, although the real-time performance of the two acquired target images is inferior to that of the first implementation.
The following describes specific implementation processes of the above two ways of acquiring the target image respectively.
(1) Mode based on two unmanned aerial vehicles 100
In this embodiment, controlling the two drones 100 to be located at the first position and the second position respectively may include: controlling the two drones 100 to fly synchronously, the first position and the second position being the respective real-time positions of the two drones 100, so as to ensure the synchronism of the two acquired target images.
When the two drones 100 are controlled to fly synchronously, the relative relationship between them is kept unchanged, so that the calibration relationship between the two drones 100 does not change; this keeps the degree of coincidence of the two target images as high as possible and facilitates three-dimensional reconstruction. The relative relationship may comprise at least one of: the shooting attitudes of the two drones 100, and the positional relationship between the two drones 100. The shooting attitudes of the two drones 100 may include the included angle between their shooting directions (the angle between the lines from the two drones 100 to the target object P, respectively). In this embodiment, the included angle between the shooting directions of the two drones 100 is greater than 0° and less than 180°. For example, during the synchronous flight of the two drones 100, the included angle between their shooting directions may be maintained at 45° or 60° throughout.
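The included angle between the two shooting directions can be computed from the drone positions and the target position; the following is a minimal sketch (the function name and (x, y, z)-tuple convention are assumptions, not from the patent).

```python
import math

def shooting_angle(pos_a, pos_b, target):
    """Included angle, in degrees, between the lines from two drone
    positions to the target object P."""
    va = [t - a for a, t in zip(pos_a, target)]
    vb = [t - b for b, t in zip(pos_b, target)]
    dot = sum(x * y for x, y in zip(va, vb))
    norm = lambda v: math.sqrt(sum(x * x for x in v))
    return math.degrees(math.acos(dot / (norm(va) * norm(vb))))
```

A flight controller could hold this angle at, say, 45° or 60° by adjusting the waypoints of the synchronous flight.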
Referring to fig. 4, the positional relationship between the two drones 100 may include the distance w between them. Compared with the distance between the two cameras of a binocular camera, the distance w between the two drones 100 can be greatly increased, for example from several meters to several tens of meters. In this way, the resolution required for three-dimensional reconstruction is greatly reduced.
Further, referring to fig. 4, when the two drones 100 are controlled to fly synchronously, they are also controlled to have the same real-time height h and the same real-time distance s to the target object P, so as to ensure the degree of coincidence of the two target images and thereby facilitate three-dimensional reconstruction. In this embodiment, controlling the real-time heights h of the two drones 100 to be equal means that at any given moment the heights h of the two drones 100 are equal, for example both 5 meters; the degree of coincidence of the two target images is then high, which facilitates three-dimensional reconstruction. Across different moments, the height h of the drones 100 may stay the same or may vary, and can be chosen according to the imaging requirements.
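The two constraints of this paragraph (equal real-time heights and equal real-time distances to P) can be checked on each control tick; this is a hypothetical sketch with positions as (x, y, z) tuples, z being the height h.

```python
def synchronized(pos_a, pos_b, target, tol=1e-6):
    """True if both synchronous-flight constraints hold: equal
    real-time heights h and equal real-time distances s to P."""
    dist = lambda p: sum((pi - ti) ** 2 for pi, ti in zip(p, target)) ** 0.5
    return (abs(pos_a[2] - pos_b[2]) <= tol
            and abs(dist(pos_a) - dist(pos_b)) <= tol)
```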
Further, acquiring the target images of the target object P captured by the two drones 100 at their respective positions may include: synchronously acquiring the two target images of the target object P captured by the drones 100 at their respective positions, further ensuring the synchronism of the two acquired target images and facilitating the generation of smooth stereoscopic video.
(2) Mode based on one unmanned aerial vehicle 100
In this embodiment, the relative relationship between the first position and the second position of the drone 100 is kept unchanged, so that the degree of coincidence of the two target images is ensured as far as possible, facilitating three-dimensional reconstruction. The relative relationship comprises at least one of: the shooting attitudes of the drone 100 at the first position and the second position, and the positional relationship of the drone 100 between the first position and the second position. The shooting attitudes of the drone 100 at the first and second positions may include the included angle between the shooting directions at the two positions (the angle between the line from the drone 100 at the first position to the target object P and the line from the drone 100 at the second position to the target object P). In this embodiment, this included angle is greater than 0° and less than 180°; for example, it may be 45° or 60°. To reconstruct stereoscopic video, the included angle between the shooting directions at the first and second positions needs to be maintained at the same angle throughout.
Referring to fig. 5, the positional relationship of the drone 100 between the first and second positions may include the distance w between the first position and the second position. Because the drone 100 occupies the first and second positions at different times, this distance w, unlike the fixed separation between the two cameras of a binocular camera, can be greatly increased, for example from several meters to several tens of meters. In this way, the resolution required for three-dimensional reconstruction is greatly reduced.
The distance from the first position to the target object P and the distance from the second position to the target object P are equal (both s in fig. 5), and the height of the first position and the height of the second position are also equal (both h in fig. 5), ensuring the degree of coincidence of the two target images and facilitating three-dimensional reconstruction. In this embodiment, the distance from each of the two positions to the target object P may be 5 meters or some other distance.
Further, the first position and the second position may belong to the same flight trajectory or to different flight trajectories, so as to meet different requirements. For example, in one embodiment, the first position and the second position belong to the same flight trajectory. Optionally, the drone 100 flies to the first position and stays there for a first preset duration, during which the target object P is photographed by an image capture device or camera device on the drone 100 to obtain one of the target images. The drone 100 then flies to the second position and stays there for a second preset duration, during which the image capture device or camera device on the drone 100 photographs the target object P to obtain the other target image, ensuring the sharpness of both target images. The first and second preset durations can be set as needed and may be equal or unequal. The flight trajectory of the drone 100 is not limited in this embodiment; preferably, it is a circle at constant height, which improves the efficiency of three-dimensional reconstruction, and any two points on the circular trajectory can serve as the first position and the second position respectively. Further, the drone 100 flies transversely along the circular trajectory, facilitating the reconstruction of continuous stereoscopic video.
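The preferred circular, constant-height trajectory can be sketched as evenly spaced waypoints around the target; the names and parameters below are illustrative, not from the patent.

```python
import math

def circular_waypoints(center_xy, radius, height, n):
    """n evenly spaced (x, y, z) waypoints on a circle of the given
    radius around center_xy, all at the same height -- any two of
    them can serve as the first and second positions."""
    cx, cy = center_xy
    return [(cx + radius * math.cos(2 * math.pi * k / n),
             cy + radius * math.sin(2 * math.pi * k / n),
             height)
            for k in range(n)]

wps = circular_waypoints((0.0, 0.0), 5.0, 5.0, 12)
first_position, second_position = wps[0], wps[3]
```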
In another embodiment, the first and second positions are located on different flight trajectories. In this embodiment, the different flight trajectories include a first flight trajectory and a second flight trajectory, and controlling the drone 100 to be located at the first position and the second position respectively may include: first controlling the drone 100 to fly along the first flight trajectory and then along the second flight trajectory, then selecting the first position from the first flight trajectory and the second position from the second flight trajectory. The stereoscopic image is thus reconstructed from time-shared shots of a single drone, reducing the cost of three-dimensional reconstruction.
Further, the controlling the drone 100 to fly along the first flight trajectory and the second flight trajectory in sequence may include: and controlling the unmanned aerial vehicle 100 to transversely fly along the first flight track and the second flight track in sequence, so as to reconstruct a continuous three-dimensional video.
Selecting the first position from the first flight trajectory and the second position from the second flight trajectory may include: selecting a plurality of first positions from the first flight trajectory and a plurality of second positions from the second flight trajectory, the first positions and the second positions being in one-to-one correspondence. Acquiring the target image of the target object P captured by the drone 100 at the corresponding position may then include: acquiring the target images of the target object P captured by the drone 100 at each corresponding pair of first and second positions, so as to generate continuous stereoscopic video of the target object P.
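The one-to-one correspondence between the selected first and second positions can be expressed as a simple pairing; a minimal sketch whose names are assumptions:

```python
def paired_positions(first_positions, second_positions):
    """Pair each first position with its corresponding second
    position; each pair yields one stereoscopic video frame."""
    if len(first_positions) != len(second_positions):
        raise ValueError("position lists must be the same length")
    return list(zip(first_positions, second_positions))
```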
Step S202: and generating a three-dimensional image of the target object P according to at least two target images.
Step S202 specifically includes: fusing the at least two target images to generate a stereoscopic image of the target object P. In this embodiment, any fusion method known in the prior art may be used to fuse the at least two target images.
When there are more than two target images, two of them can first be fused to generate a stereoscopic image of the target object P; the currently generated stereoscopic image is then fused with one of the remaining target images, and so on, until all the images have been fused into a single stereoscopic image.
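The cascaded fusion just described is a left fold over the image list; `fuse_pair` below is a hypothetical placeholder for whichever prior-art fusion method is chosen.

```python
from functools import reduce

def fuse_all(images, fuse_pair):
    """Fuse two images first, then repeatedly fuse the current
    stereoscopic image with the next remaining target image."""
    if len(images) < 2:
        raise ValueError("need at least two target images")
    return reduce(fuse_pair, images)

# Toy stand-in: images as feature sets, fusion as set union.
fused = fuse_all([{1}, {2}, {3}, {4}], lambda a, b: a | b)
```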
In the embodiment of the invention, the unmanned aerial vehicle 100 takes the place of the two cameras of a binocular camera and acquires at least two target images of the target object P from different positions, thereby obtaining a stereoscopic image of the target object P; the distance between the acquisition points is increased at low cost, the imaging-resolution requirement is reduced, and the precision of three-dimensional reconstruction is improved.
Further, after step S202, the drone-based stereoscopic imaging method may further include: generating a stereoscopic video of the target object from the stereoscopic images generated at different moments, improving the user experience.
Still further, after step S202, the stereoscopic imaging method based on the drone 100 may further include: the stereoscopic image is transmitted to the display apparatus 400 so that the stereoscopic image can be displayed through the display apparatus 400 to be visually presented to the user. The display device 400 may be a smart phone, a tablet computer, a desktop computer, a computer, or video glasses.
Example two
With reference to figs. 6 and 7, a second embodiment of the present invention provides an unmanned aerial vehicle-based stereoscopic imaging device, which may include a drone 100 and a processor 200 (e.g., a single-core or multi-core processor), where the processor 200 is communicatively connected to the drone 100.
The processor 200 may be a Central Processing Unit (CPU). The processor 200 may further include a hardware chip. The hardware chip may be an application-specific integrated circuit (ASIC), a Programmable Logic Device (PLD), or a combination thereof. The PLD may be a Complex Programmable Logic Device (CPLD), a field-programmable gate array (FPGA), a General Array Logic (GAL), or any combination thereof.
The processor 200 may comprise one or more processors, working individually or collectively. The processor 200 is configured to: acquire at least two target images of a target object P from different positions by the drone 100, wherein the target object P in the at least two target images at least partially coincides; and generate a stereoscopic image of the target object P from the at least two target images.
In the embodiment of the invention, the unmanned aerial vehicle 100 takes the place of the two cameras of a binocular camera and acquires at least two target images of the target object P from different positions, thereby obtaining a stereoscopic image of the target object P; the distance between the acquisition points is increased at low cost, the imaging-resolution requirement is reduced, and the precision of three-dimensional reconstruction is improved.
In one embodiment, the processor 200 is configured to fuse at least two target images to generate a stereo image of the target object P.
In one embodiment, the drone 100 includes two units, both in communicative connection with the processor 200; the different positions include a first position and a second position; the processor 200 is configured to control the two drones 100 to be located at the first position and the second position respectively, and to acquire the target images of the target object P captured by the two drones 100 at their respective positions.
In one embodiment, the processor 200 is configured to control the two drones 100 to fly synchronously, the first position and the second position being the real-time positions of the two drones 100 respectively.
In one embodiment, the processor 200 is further configured to keep the relative relationship between the two drones 100 unchanged while controlling the two drones 100 to fly synchronously.
In one embodiment, the relative relationship comprises at least one of: the shooting postures of the two drones 100, and the positional relationship between the two drones 100.
In one embodiment, the shooting postures of the two drones 100 include: the included angle between the shooting directions of the two drones 100.
In one embodiment, the positional relationship between the two drones 100 includes: the distance (w in fig. 4) between the two drones 100.
In one embodiment, the processor 200 is further configured to, while controlling the two drones 100 to fly synchronously, control the real-time heights (h in fig. 4) of the two drones 100 to be equal and control the real-time distances (s in fig. 4) from the two drones 100 to the target object P to be equal.
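The constraints maintained during synchronous flight (equal heights h, equal distances s to the target, unchanged baseline w) can be checked with plain geometry. A minimal sketch follows; the function name, tuple-based positions, and tolerance are illustrative assumptions, not part of the patent:

```python
import math

def synchronized_constraints_ok(pos_a, pos_b, target, w, tol=0.5):
    """Check the relative relationship during synchronous flight:
    equal real-time heights, equal real-time distances to the target
    object P, and an unchanged baseline w between the two drones.
    Positions are (x, y, z) tuples in meters; illustrative sketch."""
    same_height = abs(pos_a[2] - pos_b[2]) <= tol                        # h equal
    same_range = abs(math.dist(pos_a, target) - math.dist(pos_b, target)) <= tol  # s equal
    baseline_kept = abs(math.dist(pos_a, pos_b) - w) <= tol              # w unchanged
    return same_height and same_range and baseline_kept

# Two drones placed symmetrically about the target at the same height.
ok = synchronized_constraints_ok(
    (-5.0, 0.0, 10.0), (5.0, 0.0, 10.0), (0.0, 20.0, 0.0), w=10.0)
```

A flight controller would evaluate such a predicate each control cycle and correct the trailing drone when it drifts outside tolerance.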
In one embodiment, the processor 200 is configured to acquire the target images of the target object P captured synchronously by the two drones 100 at their respective positions.
In one embodiment, there is a single drone 100, and the different positions include a first position and a second position. The processor 200 is configured to control the drone 100 to be located at the first position and the second position in turn, and to acquire the target image of the target object P captured by the drone 100 at each position.
In one embodiment, the first position and the second position belong to the same flight trajectory.
In one embodiment, the first location and the second location are located on different flight trajectories.
In one embodiment, the different flight trajectories include a first flight trajectory and a second flight trajectory, and the processor 200 is configured to control the drone 100 to fly along the first flight trajectory and the second flight trajectory sequentially; and selecting the first position from the first flight path, and selecting the second position from the second flight path.
In one embodiment, the processor 200 is configured to control the drone 100 to fly transversely along the first flight trajectory and the second flight trajectory sequentially.
In one embodiment, the processor 200 is configured to select a plurality of first positions from the first flight trajectory and a plurality of second positions from the second flight trajectory, the first positions and the second positions corresponding one-to-one; and to acquire the target images of the target object P captured by the drone 100 at each corresponding pair of first and second positions.
In one embodiment, the relative relationship of the drone 100 in the first position and the second position is unchanged.
In one embodiment, the relative relationship comprises at least one of: shooting postures of the unmanned aerial vehicle 100 at the first position and the second position, and a positional relationship of the unmanned aerial vehicle 100 at the first position and the second position.
In one embodiment, the shooting postures of the drone 100 at the first and second positions include: the included angle between the shooting directions of the drone 100 at the first position and at the second position.
In one embodiment, the positional relationship of the drone 100 at the first and second locations includes: a distance (w in fig. 5) between the first position and the second position of the drone 100.
In one embodiment, the distance (s in fig. 5) from the first position to the target object P and the distance (s in fig. 5) from the second position to the target object P are equal, and the height (h in fig. 5) of the first position and the height (h in fig. 5) of the second position are also equal.
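Under the equal-height, equal-distance geometry above, the two capture positions behave like a binocular rig with baseline w, and depth follows from classic two-view triangulation. The sketch below is not the patent's algorithm, just the standard pinhole relation Z = f·w/d with illustrative parameter names:

```python
def depth_from_disparity(focal_px, baseline_w, disparity_px):
    """Two-view triangulation for rectified views: depth Z = f * w / d,
    where f is the focal length in pixels, w the baseline between the
    two capture positions, and d the pixel disparity. A larger baseline
    w produces larger disparity for the same depth, which is why widening
    the distance between acquisition points improves reconstruction
    accuracy at a given imaging resolution."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_w / disparity_px

# f = 1000 px, baseline 2 m, measured disparity 40 px.
z = depth_from_disparity(focal_px=1000.0, baseline_w=2.0, disparity_px=40.0)
```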
In one embodiment, the processor 200 is further configured to, after generating the stereoscopic image of the target object P from the two target images, generate a stereoscopic video of the target object P according to stereoscopic images of the target object P generated at different times.
In one embodiment, the processor 200 is configured to be communicatively connected to a display device 400, and the processor 200 sends the stereoscopic image of the target object P to the display device 400 after generating the stereoscopic image.
When there are two drones 100, the processor 200 may be the flight controller of one of the drones 100, a combination of the flight controllers of the two drones 100, or an independently provided controller.
When there is a single drone 100, the processor 200 may be the flight controller of the drone 100, or an independently provided controller.
Further, the unmanned aerial vehicle-based stereoscopic imaging device may also include a storage device. The storage device may include a volatile memory, such as a random-access memory (RAM); it may also include a non-volatile memory, such as a flash memory, a hard disk drive (HDD), or a solid-state drive (SSD); it may also comprise a combination of the above kinds of memory. Optionally, the storage device is used to store program instructions, and the processor 200 may call the program instructions to implement the corresponding method of the first embodiment.
It should be noted that, for the specific implementation of the processor 200 according to the embodiment of the present invention, reference may be made to the description of corresponding contents in the foregoing embodiments, which is not repeated herein.
EXAMPLE III
A third embodiment of the present invention provides a computer-readable storage medium, on which a computer program is stored, where the computer program is executed by a processor to perform the steps of the stereoscopic imaging method based on an unmanned aerial vehicle described in the first embodiment.
For the device embodiments, since they substantially correspond to the method embodiments, reference may be made to the partial description of the method embodiments for relevant points. The above-described embodiments of the apparatus are merely illustrative, and the units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment. One of ordinary skill in the art can understand and implement it without inventive effort.
The description of "particular examples," or "some examples," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, the schematic representations of the terms used above do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
Any process or method descriptions in flow charts, or otherwise described herein, may be understood as representing modules, segments, or portions of code that include one or more executable instructions for implementing specific logical functions or steps of the process. Alternate implementations are included within the scope of the preferred embodiments of the present invention, in which functions may be executed out of the order shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those skilled in the art.
The logic and/or steps represented in the flowcharts or otherwise described herein, e.g., an ordered listing of executable instructions that can be considered to implement logical functions, can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. For the purposes of this description, a "computer-readable medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. More specific examples (a non-exhaustive list) of the computer-readable medium would include the following: an electrical connection (electronic device) having one or more wires, a portable computer diskette (magnetic device), a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CDROM). Additionally, the computer-readable medium could even be paper or another suitable medium upon which the program is printed, as the program can be electronically captured, via for instance optical scanning of the paper or other medium, then compiled, interpreted or otherwise processed in a suitable manner if necessary, and then stored in a computer memory.
It should be understood that portions of the present invention may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, the various steps or methods may be implemented in software or firmware stored in memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, any one or combination of the following techniques, which are well known in the art, may be used: a discrete logic circuit having a logic gate circuit for implementing a logic function on a data signal, an application specific integrated circuit having an appropriate combinational logic gate circuit, a Programmable Gate Array (PGA), a Field Programmable Gate Array (FPGA), or the like.
It will be understood by those skilled in the art that all or part of the steps carried out to implement the above-described implementation method can be implemented by hardware related to instructions of a program, which can be stored in a computer-readable storage medium, and the program, when executed, includes one or a combination of the steps of the method embodiments.
In addition, functional units in the embodiments of the present invention may be integrated into one processing module, or each unit may exist alone physically, or two or more units are integrated into one module. The integrated module can be realized in a hardware mode, and can also be realized in a software functional module mode. The integrated module, if implemented in the form of a software functional module and sold or used as a stand-alone product, may also be stored in a computer readable storage medium.
The storage medium mentioned above may be a read-only memory, a magnetic or optical disk, etc. Although embodiments of the present invention have been shown and described above, it is understood that the above embodiments are exemplary and should not be construed as limiting the present invention, and that variations, modifications, substitutions and alterations can be made to the above embodiments by those of ordinary skill in the art within the scope of the present invention.

Claims (23)

1. A stereoscopic imaging method based on an unmanned aerial vehicle is characterized by comprising the following steps:
acquiring at least two target images of a target object from different positions by an unmanned aerial vehicle, wherein the target object in the at least two target images is at least partially overlapped;
generating a three-dimensional image of the target object according to at least two target images;
the different positions are located on different flight trajectories, each target image corresponds to one of the different positions, the different positions comprise a first position and a second position, the different flight trajectories comprise a first flight trajectory and a second flight trajectory, the first flight trajectory comprises a plurality of first positions, the second flight trajectory comprises a plurality of second positions, the first positions and the second positions correspond to one another, and the three-dimensional image is generated according to the target images respectively obtained at the corresponding first positions and the corresponding second positions;
the at least two target images of the target object are obtained from different positions by an unmanned aerial vehicle, including:
controlling the unmanned aerial vehicle to fly along the first flight trajectory and the second flight trajectory sequentially;
and acquiring target images of the target object respectively shot by the unmanned aerial vehicle at the corresponding first position and the second position.
2. The method of claim 1, wherein generating the stereoscopic image of the target object from at least two of the target images comprises:
and fusing at least two target images to generate a three-dimensional image of the target object.
3. The method of claim 1, wherein the controlling the drone to fly along the first flight trajectory and the second flight trajectory sequentially further comprises:
and selecting a plurality of first positions from the first flight path and selecting a plurality of second positions from the second flight path.
4. The method of claim 1 or 3, wherein said controlling said drone to fly along said first flight trajectory and said second flight trajectory sequentially comprises:
and controlling the unmanned aerial vehicle to transversely fly along the first flight track and the second flight track.
5. The method of claim 1, wherein the relative relationship of the drone in the first and second positions is unchanged.
6. The method of claim 5, wherein the relative relationship comprises at least one of: the shooting postures of the unmanned aerial vehicle at the first position and the second position and the position relation of the unmanned aerial vehicle at the first position and the second position.
7. The method of claim 6, wherein the shooting postures of the drone at the first and second positions comprise: an included angle between the shooting directions of the drone at the first position and at the second position.
8. The method of claim 6, wherein the positional relationship of the drone at the first and second positions comprises: a distance of the drone between the first location and the second location.
9. The method of claim 1, wherein the distance from the first location to the target object and the distance from the second location to the target object are equal, and the height of the first location and the height of the second location are equal.
10. The method of claim 1, wherein after generating the stereoscopic image of the target object from the at least two target images, further comprising:
and generating a stereoscopic video of the target image according to the generated stereoscopic image of the target image at different moments.
11. The method of claim 1, wherein after generating the stereoscopic image of the target object, further comprising:
and sending the stereoscopic image to a display device.
12. A stereoscopic imaging device based on an unmanned aerial vehicle comprises the unmanned aerial vehicle and is characterized by further comprising one or more processors which work individually or jointly, wherein the processors are in communication connection with the unmanned aerial vehicle;
the processor is configured to:
acquiring at least two target images of a target object from different positions by an unmanned aerial vehicle, wherein the target object in the at least two target images is at least partially overlapped;
generating a three-dimensional image of the target object according to at least two target images;
the different positions are located on different flight trajectories, each target image corresponds to one of the different positions, the different positions comprise a first position and a second position, the different flight trajectories comprise a first flight trajectory and a second flight trajectory, the first flight trajectory comprises a plurality of first positions, the second flight trajectory comprises a plurality of second positions, the first positions and the second positions correspond to one another, and the three-dimensional image is generated according to the target images respectively obtained at the corresponding first positions and the corresponding second positions;
the processor is specifically configured to, when acquiring at least two target images of a target object from different positions by one unmanned aerial vehicle:
controlling the unmanned aerial vehicle to fly along the first flight trajectory and the second flight trajectory sequentially;
and acquiring target images of the target object respectively shot by the unmanned aerial vehicle at the corresponding first position and the second position.
13. The stereoscopic imaging apparatus as claimed in claim 12, wherein the processor is configured to fuse at least two of the target images to generate a stereoscopic image of the target object.
14. The stereoscopic imaging apparatus of claim 12, wherein the processor is configured to
select, in the process of controlling the unmanned aerial vehicle to fly along the first flight trajectory and the second flight trajectory sequentially, a plurality of first positions from the first flight trajectory and a plurality of second positions from the second flight trajectory.
15. The stereoscopic imaging apparatus of claim 12, wherein the processor is configured to control the drone to fly laterally along the first flight trajectory and the second flight trajectory sequentially.
16. The stereoscopic imaging apparatus of claim 12, wherein the relative relationship of the drone in the first and second positions is unchanged.
17. The stereoscopic imaging apparatus of claim 16, wherein the relative relationship comprises at least one of: the shooting postures of the unmanned aerial vehicle at the first position and the second position and the position relation of the unmanned aerial vehicle at the first position and the second position.
18. The stereoscopic imaging apparatus of claim 17, wherein the shooting postures of the drone at the first and second positions comprise: an included angle between the shooting directions of the drone at the first position and at the second position.
19. The stereoscopic imaging apparatus of claim 17, wherein the positional relationship of the drone in the first and second positions comprises: a distance of the drone between the first location and the second location.
20. The stereoscopic imaging apparatus according to claim 12, wherein the distance from the first position to the target object and the distance from the second position to the target object are equal, and the height of the first position and the height of the second position are also equal.
21. The stereoscopic imaging apparatus of claim 12, wherein the processor is further configured to, after generating the stereoscopic image of the target object from at least two of the target images,
generate a stereoscopic video of the target object according to stereoscopic images of the target object generated at different moments.
22. The stereoscopic imaging apparatus of claim 12, wherein the processor is communicatively coupled to a display device, and wherein the processor transmits the stereoscopic image of the target object to the display device after generating the stereoscopic image.
23. A computer-readable storage medium, on which a computer program is stored, characterized in that the program is executed by a processor to implement the steps of the drone-based stereoscopic imaging method of any one of claims 1 to 11.
CN201780018614.4A 2017-12-22 2017-12-22 Three-dimensional imaging method and device based on unmanned aerial vehicle Expired - Fee Related CN109076206B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110004601.XA CN112672133A (en) 2017-12-22 2017-12-22 Three-dimensional imaging method and device based on unmanned aerial vehicle and computer readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2017/118034 WO2019119426A1 (en) 2017-12-22 2017-12-22 Stereoscopic imaging method and apparatus based on unmanned aerial vehicle

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN202110004601.XA Division CN112672133A (en) 2017-12-22 2017-12-22 Three-dimensional imaging method and device based on unmanned aerial vehicle and computer readable storage medium

Publications (2)

Publication Number Publication Date
CN109076206A CN109076206A (en) 2018-12-21
CN109076206B true CN109076206B (en) 2021-01-26

Family

ID=64812363

Family Applications (2)

Application Number Title Priority Date Filing Date
CN201780018614.4A Expired - Fee Related CN109076206B (en) 2017-12-22 2017-12-22 Three-dimensional imaging method and device based on unmanned aerial vehicle
CN202110004601.XA Withdrawn CN112672133A (en) 2017-12-22 2017-12-22 Three-dimensional imaging method and device based on unmanned aerial vehicle and computer readable storage medium

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN202110004601.XA Withdrawn CN112672133A (en) 2017-12-22 2017-12-22 Three-dimensional imaging method and device based on unmanned aerial vehicle and computer readable storage medium

Country Status (2)

Country Link
CN (2) CN109076206B (en)
WO (1) WO2019119426A1 (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110068306A (en) * 2019-04-19 2019-07-30 弈酷高科技(深圳)有限公司 A kind of unmanned plane inspection photometry system and method
CN111757084A (en) * 2020-07-30 2020-10-09 北京博清科技有限公司 Acquisition method and acquisition device for three-dimensional image and readable storage medium
CN113542718A (en) * 2021-07-20 2021-10-22 翁均明 Unmanned aerial vehicle stereo photography method
CN113608550A (en) * 2021-08-06 2021-11-05 寰宇鹏翔航空科技(深圳)有限公司 Unmanned aerial vehicle data acquisition control method, unmanned aerial vehicle and storage medium

Citations (6)

Publication number Priority date Publication date Assignee Title
CN105225241A (en) * 2015-09-25 2016-01-06 广州极飞电子科技有限公司 The acquisition methods of unmanned plane depth image and unmanned plane
CN105245846A (en) * 2015-10-12 2016-01-13 西安斯凯智能科技有限公司 Multi-unmanned aerial vehicle cooperative tracking type shooting system and shooting method
CN106162145A (en) * 2016-07-26 2016-11-23 北京奇虎科技有限公司 Stereoscopic image generation method based on unmanned plane, device
CN106296821A (en) * 2016-08-19 2017-01-04 刘建国 Multi-view angle three-dimensional method for reconstructing based on unmanned plane and system
CN106331684A (en) * 2016-08-30 2017-01-11 长江三峡勘测研究院有限公司(武汉) Three-dimensional image obtaining method based on small unmanned aerial vehicle video recording in engineering geological survey
CN107124606A (en) * 2017-05-31 2017-09-01 东莞市妙音广告传媒有限公司 The long-range image pickup method of digital video advertisement based on unmanned plane

Family Cites Families (5)

Publication number Priority date Publication date Assignee Title
US8350894B2 (en) * 2009-04-17 2013-01-08 The Boeing Company System and method for stereoscopic imaging
IL208910A0 (en) * 2010-10-24 2011-02-28 Rafael Advanced Defense Sys Tracking and identification of a moving object from a moving sensor using a 3d model
CN105391939B (en) * 2015-11-04 2017-09-29 腾讯科技(深圳)有限公司 Unmanned plane filming control method and device, unmanned plane image pickup method and unmanned plane
CN205507553U (en) * 2016-04-07 2016-08-24 吉林禾熙科技开发有限公司 Three -dimensional scene data acquisition control device of unmanned aerial vehicle
CN106485736B (en) * 2016-10-27 2022-04-12 深圳市道通智能航空技术股份有限公司 Panoramic visual tracking method for unmanned aerial vehicle, unmanned aerial vehicle and control terminal

Also Published As

Publication number Publication date
CN109076206A (en) 2018-12-21
WO2019119426A1 (en) 2019-06-27
CN112672133A (en) 2021-04-16

Similar Documents

Publication Publication Date Title
US11039086B2 (en) Dual lens system having a light splitter
US10466695B2 (en) User interaction paradigms for a flying digital assistant
CN109076206B (en) Three-dimensional imaging method and device based on unmanned aerial vehicle
JP6596745B2 (en) System for imaging a target object
CN107278262B (en) Flight trajectory generation method, control device and unmanned aerial vehicle
CN108139799B (en) System and method for processing image data based on a region of interest (ROI) of a user
US10419690B2 (en) Imaging system
JP6182266B2 (en) Panorama image shooting method by UAV
US11353891B2 (en) Target tracking method and apparatus
JP2020506443A (en) Drone control method, head mounted display glasses and system
US20230239575A1 (en) Unmanned aerial vehicle with virtual un-zoomed imaging
WO2020014987A1 (en) Mobile robot control method and apparatus, device, and storage medium
CN110799801A (en) Unmanned aerial vehicle-based distance measurement method and device and unmanned aerial vehicle
US20220345607A1 (en) Image exposure method and device, unmanned aerial vehicle
CN107636592B (en) Channel planning method, control end, aircraft and channel planning system
CN205356525U (en) Unmanned aerial vehicle
WO2022109860A1 (en) Target object tracking method and gimbal
EP3919374B1 (en) Image capturing method
WO2021093578A1 (en) High dynamic range image exposure control method, aerial camera, and unmanned aerial vehicle
CN113824885B (en) System, method and related unmanned aerial vehicle for optical path adjustment
JPWO2021064982A1 (en) Information processing device and information processing method
KR20190053018A (en) Method for controlling unmanned aerial vehicle comprising camera and electronic device
EP3348055B1 (en) System and method for supporting three-dimensional display in first person view (fpv)
JP2021064951A (en) System, method, device, and non-temporary computer-readable medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20210126
