EP3889544A1 - Method and device for processing depth images, and unmanned aerial vehicle - Google Patents
Method and device for processing depth images, and unmanned aerial vehicle
- Publication number: EP3889544A1 (application number EP19905206.9A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- thread
- execution time
- hardware acceleration
- images
- depth map
- Prior art date: 2018-12-29
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/10—Simultaneous control of position or course in three dimensions
- G05D1/101—Simultaneous control of position or course in three dimensions specially adapted for aircraft
- G05D1/106—Change initiated in response to external conditions, e.g. avoidance of elevated terrain or of no-fly zones
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/50—Depth or shape recovery
- G06T7/55—Depth or shape recovery from multiple images
- G06T7/593—Depth or shape recovery from multiple images from stereo images
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U20/00—Constructional aspects of UAVs
- B64U20/80—Arrangement of on-board electronics, e.g. avionics systems or wiring
- B64U20/87—Mounting of imaging devices, e.g. mounting of gimbals
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C11/00—Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
- G01C11/04—Interpretation of pictures
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T1/00—General purpose image data processing
- G06T1/20—Processor architectures; Processor configuration, e.g. pipelining
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/50—Depth or shape recovery
- G06T7/55—Depth or shape recovery from multiple images
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/239—Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U2101/00—UAVs specially adapted for particular uses or applications
- B64U2101/30—UAVs specially adapted for particular uses or applications for imaging, photography or videography
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
- G06T2207/10021—Stereoscopic video; Stereoscopic image sequence
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10028—Range image; Depth image; 3D point clouds
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10032—Satellite or aerial image; Remote sensing
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30248—Vehicle exterior or interior
- G06T2207/30252—Vehicle exterior; Vicinity of vehicle
- G06T2207/30261—Obstacle
Definitions
- Embodiments of the present invention relate to the technical field of unmanned aerial vehicles (UAV), and in particular, to a method, apparatus and UAV for processing a depth map.
- During an autonomous flight, the UAV needs to avoid obstacles. Therefore, it is necessary to detect the positions of obstacles so that the UAV can take obstacle avoidance measures according to those positions.
- Currently, the UAV mostly adopts a vision system (such as a monocular vision system or a binocular vision system) for obstacle position detection: a camera apparatus takes images of the area surrounding the UAV, and the images are processed to determine the location information of surrounding obstacles. The UAV then takes obstacle avoidance measures, such as detouring, decelerating or pausing, according to its own speed and attitude and the location information of the obstacles.
- An objective of embodiments of the present invention is to provide a method, apparatus and UAV for processing a depth map, which can alleviate the image processing blockage and large delay that occur when a vision system is used on a UAV.
- An embodiment of the present invention provides a method for processing a depth map, applicable to a controller of the UAV.
- The UAV further includes an image collection apparatus, the image collection apparatus being communicatively connected to the controller.
- The method includes the following steps: S1: correcting an image collected by the image collection apparatus; S2: performing binocular matching on the image to obtain a depth map of the target area; S3: performing data processing according to the depth map to obtain a distribution of obstacles around the UAV; S4: acquiring an execution time of step S1, an execution time of step S2 and an execution time of step S3 before executing the steps; and S5: establishing at least two threads and at least one ring queue according to the execution times, and executing step S1, step S2 and step S3 by the at least two threads to reduce a total execution time.
- The establishing at least two threads and at least one ring queue according to the execution time of step S1, the execution time of step S2 and the execution time of step S3, and executing step S1, step S2 and step S3 by the at least two threads to reduce a total execution time includes: establishing a first thread, a second thread and a first ring queue, and executing step S1, step S2 and step S3 by the first thread and the second thread, if the sum of the execution time of step S1, the execution time of step S2 and the execution time of step S3 meets a first preset condition.
- The first preset condition is that the sum of the execution time of step S1, the execution time of step S2 and the execution time of step S3 is greater than a preset value.
- The preset value is 1000/P, where P is an image frame rate.
- The first thread executes two of step S1, step S2 and step S3, and the second thread executes the remaining one; and the establishing a first thread, a second thread and a first ring queue further includes: establishing a third thread and a second ring queue, and executing the two steps of the first thread separately by two threads, if the execution times of the two steps executed by the first thread meet a second preset condition.
- The second preset condition is that the sum of the execution times of the two steps executed by the first thread is greater than a preset value.
- The preset value is 1000/P, where P is an image frame rate.
- The controller includes a hardware acceleration channel, and the image collection apparatus includes at least two sets of binocular units.
- The performing binocular matching on the image to obtain a depth map of the target area includes: performing binocular matching on the sets of images through the hardware acceleration channel.
- The hardware acceleration channels may include four channels: a first hardware acceleration channel, a second hardware acceleration channel, a third hardware acceleration channel and a fourth hardware acceleration channel.
- An embodiment of the present invention further provides an apparatus for processing a depth map, applicable to a controller of the UAV.
- The UAV further includes an image collection apparatus, the image collection apparatus being communicatively connected to the controller.
- The apparatus includes: an image correction module, a depth map acquisition module and an obstacle distribution acquisition module.
- The apparatus further includes: a time acquisition module and a thread and ring queue establishing module.
- The thread and ring queue establishing module includes a thread and ring queue establishing submodule.
- The first preset condition is that the sum of the execution time of step S1, the execution time of step S2 and the execution time of step S3 is greater than a preset value.
- The preset value is 1000/P, where P is an image frame rate.
- The first thread executes two of step S1, step S2 and step S3, and the second thread executes the remaining one; and the thread and ring queue establishing submodule is further configured to establish a third thread and a second ring queue if the execution times of the two steps executed by the first thread meet a second preset condition.
- The second preset condition is that the sum of the execution times of the two steps executed by the first thread is greater than a preset value.
- The preset value is 1000/P, where P is an image frame rate.
- The controller includes a hardware acceleration channel, and the image collection apparatus includes at least two sets of binocular units.
- The depth map acquisition module is further configured to perform binocular matching on the sets of images through the hardware acceleration channel.
- An embodiment of the present invention further provides a UAV.
- The UAV includes:
- An embodiment of the present invention further provides a non-volatile computer-readable storage medium, the computer-readable storage medium storing computer-executable instructions which, when executed by a UAV, cause the UAV to perform the foregoing method.
- At least two threads and at least one ring queue are established according to the execution times of all steps during depth map processing by the controller of the UAV, and the at least two threads execute all of the steps for processing the depth map.
- Each of the threads can obtain processing results from other threads through the at least one ring queue.
- A method, apparatus and UAV for processing a depth map provided in embodiments of the present invention are applicable to the application scenario shown in FIG. 1.
- The application scenario includes a UAV 100 and an obstacle 200.
- The UAV 100 may be any suitable UAV, including a fixed-wing UAV and a rotary-wing UAV such as a helicopter, a quadrotor, or an aircraft with another number of rotors and/or another rotor configuration.
- The UAV 100 may alternatively be another movable object, such as a manned aircraft, a model airplane, an unmanned airship or an unmanned hot air balloon.
- The obstacle 200 may be, for example, a building, a mountain, a tree, a forest, a signal tower or another movable or non-movable object (only one obstacle is shown in FIG. 1; in actual application there may be more obstacles or no obstacle).
- The UAV 100 includes a fuselage 10, a wing connected to the fuselage 10, and a power apparatus and a control system disposed on the fuselage 10.
- The power apparatus is configured to provide thrust or lift for the UAV 100 to fly.
- The control system is the central nervous system of the UAV 100 and may include a plurality of functional units, such as a flight control system, a vision system and other systems with specific functions.
- The vision system includes an image collection apparatus 30 and a vision chip 20, and the flight control system includes various sensors (such as a gyroscope and an accelerometer) and a flight control chip.
- During an autonomous flight, the UAV 100 needs to recognize and avoid the obstacle 200 ahead by itself.
- The UAV 100 may detect the location information of obstacles around it through the image collection apparatus 30 and the vision chip 20, and take obstacle avoidance measures according to the location information.
- The image collection apparatus 30 is configured to acquire a target image of a target area and may be, for example, a high-definition camera or a sports camera.
- The vision chip 20 is communicatively connected to the image collection apparatus 30; it may acquire the target image collected by the image collection apparatus 30 and perform image processing on the target image to obtain the depth information of the area corresponding to the target image, thereby obtaining the location information of the obstacle 200 around the UAV 100.
- The vision chip 20 can then determine obstacle avoidance measures according to the location information of the obstacle 200.
- The flight control chip controls the UAV 100 according to the obstacle avoidance measures.
- The obstacle avoidance measures include controlling the UAV to decelerate, pause or the like.
- The vision chip 20 may further determine the distance of an obstacle and perform three-dimensional reconstruction according to the location information of the obstacle 200.
- The image collection apparatus 30 may include at least one monocular unit or at least one binocular unit (a binocular unit is used as an example below). Each binocular unit obtains a set of target images, and each binocular unit collects the target images at a preset frame rate.
- The vision chip 20 needs to perform image processing on the sets of target images obtained by the binocular units to obtain the depth information corresponding to each set of target images.
- The vision chip 20 also needs to obtain the distribution of obstacles 200 around the UAV 100 according to the depth information corresponding to each set of target images.
- If the frame rate at which the target images are collected is relatively high, the vision chip 20 may be unable to keep up with the processing, which in turn blocks the processing of the sets of target images from an input source, causing image processing blockage and a large delay.
- At least two threads and at least one ring queue may be established according to the execution times of all steps during depth map processing by the controller of the UAV, and when a sum of the execution times of all steps is relatively large, the at least two threads execute all of the steps for processing the depth map.
- Each of the threads can obtain processing results from other threads through the at least one ring queue.
- In this embodiment, the vision chip 20 is disposed in the UAV 100 to obtain the obstacle avoidance measures of the UAV 100 according to the images acquired by the image collection apparatus 30.
- In other embodiments, the UAV 100 may also use another controller to implement the function of the vision chip 20.
- FIG. 3 is a schematic flowchart of a method for processing a depth map according to an embodiment of the present invention.
- The method is used in the controller of the UAV 100 shown in FIG. 1 or FIG. 2.
- The controller may be the vision chip 20 of the UAV 100.
- The method includes the following steps.
- S1: Correct an image collected by the image collection apparatus. The image collection apparatus may be a binocular unit, and the correction includes rectifying the image and calibrating each set of images collected by the binocular unit to acquire the calibration parameters corresponding to each set of images.
- S2: Perform binocular matching on the image to obtain a depth map of the target area.
- Specifically, binocular matching is performed on each set of images to acquire a disparity map corresponding to each set of images, and the depth information of the area corresponding to each set of images is acquired according to the disparity map and the calibration parameters; a worked example of the disparity-to-depth conversion is sketched below.
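To make the disparity-to-depth conversion concrete, the following is a minimal sketch of the standard pinhole-stereo relation Z = f·B/d; the patent itself does not prescribe a formula, so the function and its parameters are purely illustrative.

```cpp
#include <limits>

// Standard stereo geometry (an assumption, not part of the patent text):
// depth Z = focal_length_px * baseline / disparity_px. The result is in
// the same unit as the baseline (e.g. metres).
double disparity_to_depth(double disparity_px,
                          double focal_length_px,
                          double baseline_m) {
    if (disparity_px <= 0.0) {
        // Zero or negative disparity carries no usable depth information.
        return std::numeric_limits<double>::infinity();
    }
    return focal_length_px * baseline_m / disparity_px;
}
```

For example, with a calibrated focal length of 800 pixels and a 0.1 m baseline, a disparity of 20 pixels corresponds to a depth of 800 × 0.1 / 20 = 4 m.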
- S3: Perform data processing according to the depth map to obtain the distribution of obstacles around the UAV, where the distribution of obstacles is, for example, the location information of the obstacles, the distances of the obstacles, or a three-dimensional map of the surrounding environment of the UAV.
- S4: Acquire an execution time of step S1, an execution time of step S2 and an execution time of step S3 before executing step S1, step S2 and step S3.
- Specifically, step S1, step S2 and step S3 are first executed in the controller for a trial period, and the controller measures the execution time of each step to obtain the execution time of step S1, the execution time of step S2 and the execution time of step S3.
- S5: Establish at least two threads and at least one ring queue according to the execution time of step S1, the execution time of step S2 and the execution time of step S3, and respectively execute step S1, step S2 and step S3 by the at least two threads to reduce a total execution time. Of two of the at least two threads that execute adjacent steps, the thread executing the former step transmits its processing result to the ring queue, and the thread executing the latter step fetches the processing result from the ring queue and executes the latter step according to that result.
- Specifically, it is determined, according to the obtained execution time of step S1, the execution time of step S2 and the execution time of step S3, whether at least two threads should be adopted to execute step S1, step S2 and step S3 in parallel.
- The number of threads and ring queues may be determined according to the execution times of the above steps; for example, there may be two threads and one ring queue, or three threads and two ring queues.
- If the sum of the execution times of step S1, step S2 and step S3 is relatively long (for example, greater than 1000/P, where P is the image frame rate at which the image collection apparatus collects images), image blockage may occur; therefore, at least two threads are established when the execution times of step S1, step S2 and step S3 are relatively long.
- The thread executing a former step transmits its processing result to the ring queue, and the thread executing the latter step acquires the processing result from the ring queue, as sketched below.
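To illustrate this handoff, the following is a minimal sketch of a fixed-capacity ring queue shared by two threads, assuming a mutex/condition-variable implementation; the Frame payload, the capacity of 4 and the frame counts are assumptions, not details given in the patent.

```cpp
#include <array>
#include <condition_variable>
#include <cstddef>
#include <mutex>
#include <thread>
#include <vector>

// Illustrative payload standing in for a processing result passed between
// adjacent steps (e.g. a corrected image or a depth map).
struct Frame { std::vector<unsigned char> data; };

// Fixed-capacity ring queue: the thread executing the former step pushes
// its result; the thread executing the latter step pops it.
template <std::size_t Capacity>
class RingQueue {
public:
    void push(Frame f) {
        std::unique_lock<std::mutex> lock(mutex_);
        not_full_.wait(lock, [this] { return count_ < Capacity; });
        slots_[head_] = std::move(f);
        head_ = (head_ + 1) % Capacity;
        ++count_;
        not_empty_.notify_one();
    }
    Frame pop() {
        std::unique_lock<std::mutex> lock(mutex_);
        not_empty_.wait(lock, [this] { return count_ > 0; });
        Frame f = std::move(slots_[tail_]);
        tail_ = (tail_ + 1) % Capacity;
        --count_;
        not_full_.notify_one();
        return f;
    }
private:
    std::array<Frame, Capacity> slots_;
    std::size_t head_ = 0, tail_ = 0, count_ = 0;
    std::mutex mutex_;
    std::condition_variable not_empty_, not_full_;
};

int main() {
    RingQueue<4> first_ring_queue;
    // First thread: stands in for the former steps (e.g. S1 + S2).
    std::thread former([&] {
        for (int i = 0; i < 100; ++i) {
            first_ring_queue.push(Frame{std::vector<unsigned char>(640 * 480)});
        }
    });
    // Second thread: stands in for the latter step (e.g. S3).
    std::thread latter([&] {
        for (int i = 0; i < 100; ++i) {
            Frame f = first_ring_queue.pop();  // blocks until a result arrives
            (void)f;  // ... process the result here ...
        }
    });
    former.join();
    latter.join();
    return 0;
}
```

In this sketch the queue blocks when full or empty, so a temporarily slow consumer throttles the producer instead of dropping frames; other overflow policies would be equally compatible with the scheme described above.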
- For example, a first thread, a second thread and a first ring queue may be established.
- The first thread executes two of step S1, step S2 and step S3, and the second thread executes the remaining one.
- Alternatively, the second thread executes two of step S1, step S2 and step S3, and the first thread executes the remaining one.
- For example, the first thread executes step S1 and step S2, and the second thread executes step S3.
- The first thread transmits the processing result of step S2 to the first ring queue, and the second thread acquires the processing result from the first ring queue and executes step S3.
- Alternatively, the first thread executes step S1, and the second thread executes step S2 and step S3, and so on.
- Assuming that the execution time of step S1 is t1, the execution time of step S2 is t2 and the execution time of step S3 is t3: if t1+t2+t3 > 1000/P, t1+t2 ≤ 1000/P and t3 ≤ 1000/P, the first thread executes step S1 and step S2, and the second thread executes step S3.
- Because the two threads run in a pipeline, the total execution time per frame becomes max(t1+t2, t3) ≤ 1000/P, which reduces the total execution time and effectively avoids image blockage. For example, with P = 20 frames per second the budget 1000/P is 50 ms; if t1 = 20 ms, t2 = 25 ms and t3 = 40 ms, then t1+t2+t3 = 85 ms exceeds the budget, while the pipelined total max(45, 40) = 45 ms meets it.
- If the sum of the execution times of the two steps executed by one thread is still relatively long, the two steps may also be executed separately by two threads.
- For example, the first thread executes steps S1 and S2, and the second thread executes step S3. If t1+t2 > 1000/P and t3 ≤ 1000/P, a third thread and a second ring queue can be established.
- In this case, the first thread executes step S1 and the third thread executes step S2: the first thread transmits the processing result of step S1 to the second ring queue, and the third thread fetches the processing result from the second ring queue and executes step S2.
- Alternatively, the third thread executes step S1, and the first thread executes step S2.
- In addition, if the execution time of a single one of step S1, step S2 and step S3 is relatively long, for example t3 > 1000/P, two or more threads may further be adopted to execute step S3. One possible decision rule covering these cases is sketched below.
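The allocation logic above can be summarized as a decision rule. The sketch below is one possible formulation consistent with the examples; the function and its contiguous-grouping heuristic are assumptions, since the patent only requires that each thread's share of work stay within the 1000/P budget.

```cpp
#include <algorithm>
#include <cstdio>

// Decide how many threads to use for the pipeline S1 -> S2 -> S3 so that
// no thread exceeds the per-frame budget of 1000/P milliseconds. Only
// contiguous groupings are considered, because adjacent steps exchange
// results through a ring queue.
int choose_thread_count(double t1, double t2, double t3, double frame_rate_p) {
    const double budget_ms = 1000.0 / frame_rate_p;
    if (t1 + t2 + t3 <= budget_ms) return 1;          // one thread suffices
    if (std::max(t1 + t2, t3) <= budget_ms) return 2; // {S1,S2} | {S3}
    if (std::max(t1, t2 + t3) <= budget_ms) return 2; // {S1} | {S2,S3}
    return 3;                                         // {S1} | {S2} | {S3}
}

int main() {
    // Hypothetical timings: with P = 20 fps the budget is 50 ms, so the
    // 20/25/40 ms split below calls for two threads.
    std::printf("threads needed: %d\n",
                choose_thread_count(20.0, 25.0, 40.0, 20.0));
    return 0;
}
```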
- The binocular matching may be performed by a hardware acceleration channel disposed in the controller.
- A hardware acceleration channel is an apparatus composed of hardware and interface software that can increase the operating speed of the software.
- When the image collection apparatus includes at least two sets of binocular units, a hardware acceleration channel may be used to increase the operating speed. Frames of the sets of images consecutively captured by the at least two sets of binocular units are sequentially transmitted to the hardware acceleration channel, and each set of images is processed by performing time division multiplexing on the hardware acceleration channel, so as to obtain the depth information corresponding to each set of images. That is, a first set of images is transmitted to the hardware acceleration channel for processing; upon completion of that processing, a second set of images is transmitted for processing, and so on.
- In other words, the hardware acceleration channel is multiplexed by polling, as sketched below.
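The following is a minimal sketch of this polling-based time division multiplexing; hw_channel_run is a purely hypothetical stand-in for the controller's hardware acceleration interface, which the patent does not specify.

```cpp
#include <vector>

struct ImageSet {};  // a set of images from one binocular unit (illustrative)
struct DepthMap {};  // the binocular matching result (illustrative)

// Hypothetical stand-in for submitting one stereo pair to the hardware
// acceleration channel and waiting for the result.
DepthMap hw_channel_run(const ImageSet&) { return DepthMap{}; }

// Time division multiplexing by polling: image sets captured by the
// binocular units are submitted to the single hardware acceleration
// channel one after another, each starting only after the previous one
// has finished.
std::vector<DepthMap> process_by_polling(const std::vector<ImageSet>& sets) {
    std::vector<DepthMap> results;
    results.reserve(sets.size());
    for (const ImageSet& s : sets) {
        results.push_back(hw_channel_run(s));  // blocks until this set is done
    }
    return results;
}
```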
- Alternatively, at least two hardware acceleration channels may be used to increase the operating speed.
- Binocular matching is performed, by using the hardware acceleration channels, on the sets of images obtained by the high-resolution binocular unit and the sets of images obtained by the low-resolution binocular unit, to further increase the operating speed.
- The set of images obtained by one high-resolution binocular unit may be processed by using one hardware acceleration channel, while the sets of images obtained by at least two low-resolution binocular units (for example, two or three binocular units) are processed by using the same hardware acceleration channel.
- The sets of images obtained by the at least two low-resolution binocular units thus share a hardware acceleration channel, which makes full and reasonable use of the hardware acceleration channels without affecting the operating speed of the software when the number of hardware acceleration channels is small.
- In this case, the shared sets of target images may be processed by performing time division multiplexing on the shared hardware acceleration channel.
- In one implementation, the UAV includes two pairs of binocular units with a resolution of 720P and four pairs of binocular units with a resolution of VGA, and there are four hardware acceleration channels in the controller.
- Each high-resolution 720P binocular unit may use one hardware acceleration channel alone, and every two of the low-resolution VGA binocular units share one hardware acceleration channel.
- A correspondence between the binocular units and the hardware acceleration channels may be set in advance, so that the set of images obtained by each binocular unit is transmitted to the corresponding hardware acceleration channel for processing; one possible mapping is sketched below.
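For the configuration just described (two 720P units, four VGA units and four hardware acceleration channels), one possible preset correspondence is sketched below; the concrete unit-to-channel assignment is an assumption, as the patent only requires that some correspondence be set in advance.

```cpp
#include <cstdio>

// Preset binocular-unit-to-channel table: units 0-1 are the 720P pairs and
// each gets a dedicated channel; units 2-5 are the VGA pairs and share a
// channel in twos.
constexpr int kChannelForUnit[6] = {
    0,     // 720P unit 0 -> first hardware acceleration channel
    1,     // 720P unit 1 -> second hardware acceleration channel
    2, 2,  // VGA units 2 and 3 share the third channel
    3, 3,  // VGA units 4 and 5 share the fourth channel
};

int main() {
    for (int unit = 0; unit < 6; ++unit) {
        std::printf("binocular unit %d -> channel %d\n",
                    unit, kChannelForUnit[unit]);
    }
    return 0;
}
```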
- An embodiment of the present invention further provides an apparatus for processing a depth map.
- The apparatus is used in the controller of the UAV 100 shown in FIG. 1 or FIG. 2.
- The controller may be the vision chip 20 of the UAV 100.
- As shown in FIG. 5, the apparatus 500 for processing a depth map includes: an image correction module 501, a depth map acquisition module 502 and an obstacle distribution acquisition module 503.
- The apparatus further includes: a time acquisition module 504 and a thread and ring queue establishing module 505.
- At least two threads and at least one ring queue are established according to the execution times of all steps during depth map processing by the controller of the UAV, and the at least two threads execute all of the steps for processing the depth map.
- Each of the threads can obtain processing results from other threads through the at least one ring queue.
- The thread and ring queue establishing module 505 includes a thread and ring queue establishing submodule 5052 (FIG. 6).
- The first preset condition is that the sum of the execution time of step S1, the execution time of step S2 and the execution time of step S3 is greater than a preset value.
- The preset value is 1000/P, where P is an image frame rate.
- The first thread executes two of step S1, step S2 and step S3, and the second thread executes the remaining one.
- The thread and ring queue establishing submodule 5052 is further configured to establish a third thread and a second ring queue if the execution times of the two steps executed by the first thread meet a second preset condition.
- The second preset condition is that the sum of the execution times of the two steps executed by the first thread is greater than a preset value.
- The preset value is 1000/P, where P is an image frame rate.
- The controller includes a hardware acceleration channel, and the image collection apparatus includes at least two sets of binocular units; the depth map acquisition module 502 is further configured to perform binocular matching on the sets of images through the hardware acceleration channels.
- The hardware acceleration channels may include four channels: a first hardware acceleration channel, a second hardware acceleration channel, a third hardware acceleration channel and a fourth hardware acceleration channel.
- The foregoing apparatus may perform the method provided in the embodiments of this application, and has the corresponding functional modules for performing the method and the beneficial effects thereof. For technical details not described in detail in this embodiment, refer to the method provided in the embodiments of this application.
- FIG. 7 is a schematic structural diagram of hardware of a vision chip 20 in an embodiment of the UAV 100.
- The vision chip 20 includes one or more processors 21 and a memory 22.
- One processor 21 is used as an example in FIG. 7.
- The processor 21 and the memory 22 may be connected through a bus or in other manners; in FIG. 7, a bus connection is used as an example.
- The memory 22 may be configured to store a non-volatile software program and a non-volatile computer-executable program and modules, for example, the program instructions/modules (such as the image correction module 501, the depth map acquisition module 502, the obstacle distribution acquisition module 503, the time acquisition module 504 and the thread and ring queue establishing module 505 shown in FIG. 5) corresponding to the method for processing a depth map in the embodiments of the present application.
- The processor 21 executes various functional applications and data processing of the UAV by running the non-volatile software program, instructions and modules stored in the memory 22, to implement the method for processing a depth map in the foregoing method embodiments.
- The memory 22 may include a program storage area and a data storage area.
- The program storage area may store an operating system and an application program required by at least one function.
- The data storage area may store data created according to use of the vision chip, and the like.
- The memory 22 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory or another non-volatile solid-state storage device.
- The memory 22 optionally includes memories remotely disposed relative to the processor 21, and these remote memories may be connected to the UAV through a network. Examples of the network include, but are not limited to, the Internet, an intranet, a local area network, a mobile communication network and combinations thereof.
- The one or more modules are stored in the memory 22. When executed by the one or more processors 21, the one or more modules perform the method for processing a depth map in any of the foregoing method embodiments; for example, they execute method steps 101 to 105 in FIG. 3 described above, and implement the functions of the modules 501-505 in FIG. 5 and the modules 501-505 and 5051-5052 in FIG. 6.
- The foregoing product may perform the method provided in the embodiments of the present application, and has the corresponding functional modules for performing the method and the beneficial effects thereof.
- For technical details not described in detail in this embodiment, refer to the method provided in the embodiments of the present application.
- An embodiment of the present application provides a non-transitory computer-readable storage medium storing computer-executable instructions that are executed by one or more processors, for example, the processor 21 in FIG. 7 , so that the one or more processors may execute the method for processing a depth map in any of the foregoing method embodiments, for example, execute step 101 to step 105 in the foregoing method in FIG. 3 , and implement functions of the modules 501-505 in FIG. 5 and the modules 501-505 and 5051-5052 in FIG. 6 .
- The device embodiments described above are merely examples.
- The units described as separate parts may or may not be physically separate, and the parts displayed as units may or may not be physical units; they may be located in one position or distributed on a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the objectives of the solutions of the embodiments.
- The embodiments may be implemented by software in combination with a general-purpose hardware platform, and may certainly also be implemented by hardware.
- A person of ordinary skill in the art may understand that all or some of the processes of the methods in the foregoing embodiments may be implemented by a computer program instructing relevant hardware.
- The program may be stored in a computer-readable storage medium, and during execution, the program may include the processes of the foregoing method embodiments.
- The storage medium may be a magnetic disk, an optical disc, a read-only memory (ROM), a random access memory (RAM), or the like.
Applications Claiming Priority (2)

| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201811634006.9A (CN109631853A) | 2018-12-29 | 2018-12-29 | Depth map processing method and apparatus, and unmanned aerial vehicle |
| PCT/CN2019/129562 (WO2020135797A1) | 2018-12-29 | 2019-12-28 | Method and device for processing depth images, and unmanned aerial vehicle |
Publications (3)

| Publication Number | Publication Date |
|---|---|
| EP3889544A1 | 2021-10-06 |
| EP3889544A4 | 2022-01-26 |
| EP3889544B1 | 2023-09-27 |
Family

ID=66054507

Family Applications (1)

| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| EP19905206.9A (EP3889544B1, active) | Method and device for processing depth images, and unmanned aerial vehicle | 2018-12-29 | 2019-12-28 |
Country Status (3)

| Country | Link |
|---|---|
| EP (1) | EP3889544B1 (fr) |
| CN (2) | CN113776503B (zh) |
| WO (1) | WO2020135797A1 (fr) |
Families Citing this family (4)

| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN113776503B (zh) * | 2018-12-29 | 2024-04-12 | 深圳市道通智能航空技术股份有限公司 | Depth map processing method and apparatus, and unmanned aerial vehicle |
| TWI804850B (zh) * | 2020-04-16 | 2023-06-11 | 鈺立微電子股份有限公司 | Fusion method and fusion system for multiple depth information |
| CN113688868B (zh) * | 2021-07-21 | 2023-08-22 | 深圳市安软科技股份有限公司 | Multi-threaded image processing method and apparatus |
| JP2024116011A (ja) | 2023-02-15 | 2024-08-27 | 信越化学工業株式会社 | Pattern forming method |
Family Cites Families (16)

| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20070044103A1 (en) * | 2005-07-25 | 2007-02-22 | Mark Rosenbluth | Inter-thread communication of lock protected data |
| US8443375B2 (en) * | 2009-12-14 | 2013-05-14 | Verisign, Inc. | Lockless queues |
| US10002031B2 (en) * | 2013-05-08 | 2018-06-19 | Nvidia Corporation | Low overhead thread synchronization using hardware-accelerated bounded circular queues |
| US10061619B2 (en) * | 2015-05-29 | 2018-08-28 | Red Hat, Inc. | Thread pool management |
| CN108594851A (zh) * | 2015-10-22 | 2018-09-28 | 飞智控(天津)科技有限公司 | Binocular-vision-based autonomous obstacle detection system and method for an unmanned aerial vehicle, and unmanned aerial vehicle |
| RU2623806C1 (ru) * | 2016-06-07 | 2017-06-29 | Акционерное общество Научно-производственный центр "Электронные вычислительно-информационные системы" (АО НПЦ "ЭЛВИС") | Method and device for processing stereo images |
| CN106358003B (zh) * | 2016-08-31 | 2019-02-19 | 华中科技大学 | Video analysis acceleration method based on thread-level pipelining |
| CN106384382A (zh) * | 2016-09-05 | 2017-02-08 | 山东省科学院海洋仪器仪表研究所 | Three-dimensional reconstruction system and method based on binocular stereo vision |
| US10296393B2 (en) * | 2016-09-19 | 2019-05-21 | Texas Instruments Incorporated | Method for scheduling a processing device |
| WO2018086050A1 (fr) * | 2016-11-11 | 2018-05-17 | 深圳市大疆创新科技有限公司 | Disparity map generation method and unmanned aerial vehicle based thereon |
| CN107636679B (zh) * | 2016-12-30 | 2021-05-25 | 达闼机器人有限公司 | Obstacle detection method and apparatus |
| CN107703951B (zh) * | 2017-07-27 | 2019-02-01 | 上海拓攻机器人有限公司 | Binocular-vision-based obstacle avoidance method and system for an unmanned aerial vehicle |
| CN108874555A (zh) * | 2018-05-23 | 2018-11-23 | 福建天泉教育科技有限公司 | Method and apparatus for writing messages to message middleware |
| CN108733344B (zh) * | 2018-05-28 | 2023-07-04 | 深圳市道通智能航空技术股份有限公司 | Data read/write method and apparatus, and ring queue |
| CN109034018B (zh) * | 2018-07-12 | 2022-01-21 | 北京航空航天大学 | Binocular-vision-based obstacle perception method for a low-altitude small unmanned aerial vehicle |
| CN113776503B (zh) * | 2018-12-29 | 2024-04-12 | 深圳市道通智能航空技术股份有限公司 | Depth map processing method and apparatus, and unmanned aerial vehicle |
2018
- 2018-12-29: CN application CN202111075076.7A filed (publication CN113776503B, active)
- 2018-12-29: CN application CN201811634006.9A filed (publication CN109631853A, pending)

2019
- 2019-12-28: EP application EP19905206.9A filed (publication EP3889544B1, active)
- 2019-12-28: PCT application PCT/CN2019/129562 filed (publication WO2020135797A1, status unknown)
Also Published As

| Publication number | Publication date |
|---|---|
| WO2020135797A1 (fr) | 2020-07-02 |
| CN113776503B (zh) | 2024-04-12 |
| EP3889544A4 (fr) | 2022-01-26 |
| EP3889544B1 (fr) | 2023-09-27 |
| CN109631853A (zh) | 2019-04-16 |
| CN113776503A (zh) | 2021-12-10 |
| US20210325909A1 (en) | 2021-10-21 |
Similar Documents

| Publication | Title |
|---|---|
| EP3889544B1 (fr) | Method and device for processing depth images, and unmanned aerial vehicle |
| US20220091608A1 (en) | Method, apparatus, and electronic device for obstacle avoidance |
| US12025996B2 (en) | Unmanned aerial vehicle path planning method and apparatus and unmanned aerial vehicle |
| US11797028B2 (en) | Unmanned aerial vehicle control method and device and obstacle notification method and device |
| US9632509B1 (en) | Operating a UAV with a narrow obstacle-sensor field-of-view |
| US10754354B2 (en) | Hover control |
| CN108323190B (zh) | Obstacle avoidance method and apparatus, and unmanned aerial vehicle |
| WO2018210078A1 (fr) | Distance measurement method for an unmanned aerial vehicle, and unmanned aerial vehicle |
| US11924538B2 (en) | Target tracking method and apparatus and unmanned aerial vehicle |
| WO2021013228A1 (fr) | Wireless communication method and device, unmanned aerial vehicle, and unmanned aerial vehicle control system |
| CN105974932B (zh) | Unmanned aerial vehicle control method |
| CN109978045A (zh) | Target tracking method and apparatus, and unmanned aerial vehicle |
| US20200012756A1 (en) | Vision simulation system for simulating operations of a movable platform |
| WO2021088684A1 (fr) | Omnidirectional obstacle avoidance method and unmanned aerial vehicle |
| WO2020207411A1 (fr) | Image data processing method and apparatus, image processing chip, and aircraft |
| CN110291481B (zh) | Information prompting method and control terminal |
| WO2018045976A1 (fr) | Flight control method for aircraft and flight control apparatus |
| WO2021027886A1 (fr) | Flight control method for an unmanned aerial vehicle, and unmanned aerial vehicle |
| CA3014963A1 (fr) | Unmanned aerial vehicles |
| WO2021238743A1 (fr) | Flight control method and apparatus for an unmanned aerial vehicle, and unmanned aerial vehicle |
| US12130641B2 (en) | Method, apparatus and unmanned aerial vehicle for processing depth map |
| WO2022027337A1 (fr) | Movable platform control system, control method, device, and storage medium |
| WO2021056144A1 (fr) | Method and apparatus for controlling return of a movable platform, and movable platform |
| US20230073120A1 (en) | Method for controlling an unmanned aerial vehicle for an inspection flight to inspect an object, and inspection unmanned aerial vehicle |
| WO2018098898A1 (fr) | Unmanned aerial vehicle control method based on geomagnetism, and unmanned aerial vehicle |
Legal Events

| Date | Code | Title | Description |
|---|---|---|---|
| | STAA | Information on the status of an EP patent application or granted EP patent | Status: the international publication has been made |
| | PUAI | Public reference made under Article 153(3) EPC to a published international application that has entered the European phase | Original code: 0009012 |
| | STAA | Information on the status of an EP patent application or granted EP patent | Status: request for examination was made |
| 2021-07-01 | 17P | Request for examination filed | |
| | AK | Designated contracting states | Kind code of ref document: A1; designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
| 2022-01-04 | A4 | Supplementary search report drawn up and despatched | |
| | RIC1 | Information provided on IPC code assigned before grant | IPC: G06T 7/593 (2017.01) ALI 20211221 BHEP; G01C 11/04 (2006.01) AFI 20211221 BHEP |
| | DAV | Request for validation of the European patent (deleted) | |
| | DAX | Request for extension of the European patent (deleted) | |
| | GRAP | Despatch of communication of intention to grant a patent | Original code: EPIDOSNIGR1 |
| | STAA | Information on the status of an EP patent application or granted EP patent | Status: grant of patent is intended |
| | RIC1 | Information provided on IPC code assigned before grant | IPC: B64U 101/30 (2023.01) ALN 20230403 BHEP; G06T 7/593 (2017.01) ALI 20230403 BHEP; G01C 11/04 (2006.01) AFI 20230403 BHEP |
| 2023-04-28 | INTG | Intention to grant announced | |
| | GRAS | Grant fee paid | Original code: EPIDOSNIGR3 |
| | GRAA | (Expected) grant | Original code: 0009210 |
| | STAA | Information on the status of an EP patent application or granted EP patent | Status: the patent has been granted |
| | AK | Designated contracting states | Kind code of ref document: B1; designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
| | REG | Reference to a national code | GB: FG4D |
| | REG | Reference to a national code | CH: EP |
| | REG | Reference to a national code | DE: R096; ref document number: 602019038428 |
| | REG | Reference to a national code | IE: FG4D |
| | REG | Reference to a national code | LT: MG9D |
| 2023-12-28 | PG25 | Lapsed in a contracting state | GR: lapse because of failure to submit a translation of the description or to pay the fee within the prescribed time limit |
| 2023-12-20 | PGFP | Annual fee paid to national office | GB; year of fee payment: 5 |
| | PG25 | Lapsed in a contracting state | Lapse because of failure to submit a translation of the description or to pay the fee within the prescribed time limit: SE (2023-09-27), RS (2023-09-27), NO (2023-12-27), LV (2023-09-27), LT (2023-09-27), HR (2023-09-27), GR (2023-12-28), FI (2023-09-27) |
| | PGFP | Annual fee paid to national office | FR: 2023-12-22, year 5; DE: 2023-12-14, year 5 |
| 2023-09-27 | REG | Reference to a national code | NL: MP |
| 2023-09-27 | REG | Reference to a national code | AT: MK05; ref document number: 1615810; kind code: T |
| 2023-09-27 | PG25 | Lapsed in a contracting state | NL: translation/fee not filed within the prescribed time limit |
| 2024-01-27 | PG25 | Lapsed in a contracting state | IS: translation/fee not filed within the prescribed time limit |
| 2023-09-27 | PG25 | Lapsed in a contracting state | AT: translation/fee not filed within the prescribed time limit |
| 2023-09-27 | PG25 | Lapsed in a contracting state | ES: translation/fee not filed within the prescribed time limit |
| | PG25 | Lapsed in a contracting state | Translation/fee not filed within the prescribed time limit: SM (2023-09-27), RO (2023-09-27), IS (2024-01-27), ES (2023-09-27), EE (2023-09-27), CZ (2023-09-27), AT (2023-09-27), SK (2023-09-27), PT (2024-01-29) |
| | PG25 | Lapsed in a contracting state | Translation/fee not filed within the prescribed time limit: PL (2023-09-27), IT (2023-09-27) |
| | REG | Reference to a national code | DE: R097; ref document number: 602019038428 |
| 2023-09-27 | PG25 | Lapsed in a contracting state | DK: translation/fee not filed within the prescribed time limit |
| | REG | Reference to a national code | CH: PL |
| | PLBE | No opposition filed within time limit | Original code: 0009261 |
| | STAA | Information on the status of an EP patent application or granted EP patent | Status: no opposition filed within time limit |
| 2023-12-28 | PG25 | Lapsed in a contracting state | LU: lapse because of non-payment of due fees |
| 2023-09-27 | PG25 | Lapsed in a contracting state | MC: translation/fee not filed within the prescribed time limit |
| 2023-12-31 | REG | Reference to a national code | BE: MM |
| 2024-06-28 | 26N | No opposition filed | |
| | REG | Reference to a national code | IE: MM4A |
| 2023-12-28 | PG25 | Lapsed in a contracting state | IE: lapse because of non-payment of due fees |
| 2023-12-31 | PG25 | Lapsed in a contracting state | BE: lapse because of non-payment of due fees |