WO2018210078A1 - Distance measurement method for unmanned aerial vehicle, and unmanned aerial vehicle - Google Patents

Distance measurement method for unmanned aerial vehicle, and unmanned aerial vehicle

Info

Publication number
WO2018210078A1
WO2018210078A1 (PCT/CN2018/082653; CN2018082653W)
Authority
WO
WIPO (PCT)
Prior art keywords
pixel
pixel block
determining
image
block
Prior art date
Application number
PCT/CN2018/082653
Other languages
English (en)
French (fr)
Inventor
贾宁
雷志辉
Original Assignee
深圳市道通智能航空技术有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市道通智能航空技术有限公司
Priority to US16/615,082 (US20200191556A1)
Publication of WO2018210078A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B11/02 Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness
    • G01B11/026 Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness by measuring distance between sensor and object
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C5/00 Measuring height; Measuring distances transverse to line of sight; Levelling between separated points; Surveyors' levels
    • G01C5/005 Measuring height; Measuring distances transverse to line of sight; Levelling between separated points; Surveyors' levels altimeters for aircraft
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C3/00 Measuring distances in line of sight; Optical rangefinders
    • G01C3/02 Details
    • G01C3/06 Use of electric means to obtain final indication
    • G01C3/08 Use of electric radiation detectors
    • G01C3/085 Use of electric radiation detectors with electronic parallax measurement
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C5/00 Measuring height; Measuring distances transverse to line of sight; Levelling between separated points; Surveyors' levels
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50 Constructional details
    • H04N23/51 Housings
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50 Constructional details
    • H04N23/54 Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50 Constructional details
    • H04N23/55 Optical parts specially adapted for electronic image sensors; Mounting thereof
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/617 Upgrading or updating of programs or applications for camera control
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/67 Focus control based on electronic image sensor signals
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/90 Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U10/00 Type of UAV
    • B64U10/10 Rotorcrafts
    • B64U10/13 Flying platforms
    • B64U10/14 Flying platforms with four distinct rotor axes, e.g. quadcopters
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2101/00 UAVs specially adapted for particular uses or applications
    • B64U2101/30 UAVs specially adapted for particular uses or applications for imaging, photography or videography
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/222 Studio circuitry; Studio devices; Studio equipment
    • H04N5/2224 Studio circuitry; Studio devices; Studio equipment related to virtual studio applications
    • H04N5/2226 Determination of depth image, e.g. for foreground/background separation

Definitions

  • the present application relates to the field of unmanned aerial vehicles, and in particular to a distance measuring method for a drone and a drone using the same.
  • drones can use the vision system to measure the distance to a target and use that distance as flight data. How to improve the measurement efficiency of the vision system in a drone has become an active research topic among those skilled in the art.
  • the embodiments of the present invention provide a distance measurement method for a drone and a drone, which can improve the efficiency of distance measurement of the vision system.
  • an embodiment of the present invention provides a method for measuring a height of a drone, including the following steps:
  • the determining the first pixel block in the first image, and the second pixel block in the second image that matches the first pixel block includes:
  • a pixel block that matches the block feature information in the at least one pixel block with the first block feature information is used as the second pixel block.
  • determining, according to the matching result of the pixel matching, the second pixel block in the at least one pixel block including:
  • a pixel block having the highest degree of matching with the pixel of the first pixel block in the at least one pixel block is used as the second pixel block.
  • a pixel block of the at least one pixel block including a pixel point matching the first pixel point is used as the second pixel block.
  • the matching of the first pixel point against the pixel points included in each of the at least one pixel block comprises:
  • the point feature information of the first pixel point is matched with the point feature information of the pixel point included in each of the pixel blocks.
  • the first pixel point includes a central pixel point of the first pixel block.
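The selection of the second pixel block described in the preceding items can be sketched as follows. This is a minimal illustration that uses a sum-of-absolute-differences score as the "matching degree"; SAD is an assumption for illustration, not necessarily the patent's "block feature information".

```python
import numpy as np

# Hedged sketch of block matching: the first pixel block from the first
# image is compared with candidate blocks from the second image, and the
# candidate with the highest matching degree (here: lowest sum of
# absolute differences, an illustrative criterion) is taken as the
# second pixel block.

def best_matching_block(first_block, candidates):
    """Return the index of the candidate block that best matches first_block."""
    scores = [np.abs(first_block - c).sum() for c in candidates]
    return int(np.argmin(scores))

# hypothetical 2x2 blocks for demonstration
first = np.array([[10, 20], [30, 40]], dtype=float)
cands = [np.zeros((2, 2)), first + 1.0, np.full((2, 2), 100.0)]
print(best_matching_block(first, cands))   # 1
```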
  • the method further includes:
  • the determining the first location information of the first pixel block and the second location information of the second pixel block including:
  • determining the distance between the UAV and the target according to the disparity value of the first pixel block and the second pixel block including:
  • the installation parameters of the first imaging device and the second imaging device include at least one of the following:
  • the distance between the lens optical center of the first imaging device and the lens optical center of the second imaging device, the distance from the optical center of the first imaging device to the drone body, and the distance from the optical center of the second imaging device to the drone body.
  • determining, according to the disparity values of the first pixel block and the second pixel block, the distance between the drone and the target includes:
  • determining, according to the at least two distance values, a distance between the UAV and the target including:
  • an embodiment of the present invention provides a drone, including:
  • processor is configured to perform any one of the first aspects.
  • the embodiment of the present application further provides a vision system, including:
  • At least two imaging devices;
  • At least one processor respectively connected to the at least two imaging devices;
  • a memory communicatively coupled to the at least one processor
  • the memory stores instructions executable by the at least one processor, the instructions being executed by the at least one processor after the user completes interaction via the human-machine interaction unit, such that the at least one processor is capable of performing any method of the first aspect.
  • the embodiment of the present application further provides a non-transitory computer readable storage medium storing computer executable instructions for causing a computer to execute any of the methods of the first aspect.
  • the embodiment of the present application further provides a computer program product, the computer program product comprising a computer program stored on a non-transitory computer readable storage medium, the computer program comprising program instructions which, when executed by a computer, cause the computer to perform any of the methods of the first aspect.
  • the beneficial effects of the embodiments of the present invention are: the UAV height measurement method and the UAV provided in the embodiments need no additional dedicated height measuring device; by presetting the positional relationship between the first imaging device and the second imaging device, real-time image analysis on the drone realizes measurement of the height above the ground.
  • the height measurement procedure is simple, the image matching is stable and reliable, and the height measurement responds quickly.
  • within the set height range, the measurement accuracy is five to ten times that of ultrasonic height measurement.
  • FIG. 1 is a schematic diagram of a drone according to an embodiment of the present invention.
  • FIG. 2 is a schematic flow chart of a method for measuring a distance of a drone according to an embodiment of the present invention.
  • FIG. 3 is a schematic diagram of a method for measuring a height of a drone according to an embodiment of the present invention.
  • FIG. 4 is a schematic diagram of a module of a drone according to an embodiment of the present invention.
  • FIG. 5 is a schematic diagram showing the hardware structure of a vision system in a drone according to an embodiment of the present invention.
  • the distance measuring method and the drone of the drone provided by the embodiment of the invention can improve the efficiency and accuracy of the distance measurement of the visual system of the drone.
  • the drone may include a flight control system (flight controller for short) and a vision system; optionally, the drone may further include systems such as an ultrasound system, an infrared system, a power system, and a data transmission system.
  • the flight control system is the core control device of the drone, and the flight control system may include: a main control module, a signal conditioning and interface module, a data acquisition module, and a steering gear drive module.
  • the flight control system collects the flight state data measured by each sensor in real time, receives the control commands and data transmitted by the ground controller via the radio uplink channel, and outputs control commands to the actuators to control the drone in its various flight modes and to manage and control the mission equipment; at the same time, the state data of the drone (i.e., flight data) and the operating state parameters of the engine, onboard power system and mission equipment are transmitted to the data transmission system in real time and sent back to the ground controller via the radio downlink channel.
  • the flight control system is further configured to: perform high-precision acquisition of multi-channel analog signals, including gyro signals, heading signals, rudder angle signals, engine speed, cylinder temperature signals, dynamic and static pressure sensor signals, power supply voltage signals, and the like.
  • the flight control system outputs switching signals, analog signals and PWM pulse signals, which can be adapted to control different actuators such as the rudder, aileron servo, elevator, air passage and damper. Communication with onboard data terminals, GPS signals, digital sensors, and related mission devices is accomplished using a plurality of communication channels.
  • the software design of the flight control system is divided into two parts, namely the programming of the logic circuit chip and the application design of the flight control system.
  • the logic circuit chip is used to form a digital logic control circuit to complete decoding and isolation as well as A/D, D/A and so on.
  • the application is used to implement the above functions of the flight control system.
  • the vision system includes one or more imaging devices, for example, the vision system can include two imaging devices, in which case the two imaging devices can also be binocular imaging devices.
  • the imaging device includes an optical component and an image acquisition sensor; the optical component is used to image objects, and the image acquisition sensor is used to acquire image data through the optical component.
  • the vision system can also include one or more processors coupled to the one or more imaging devices.
  • the image data collected by the image acquisition sensor may be transmitted to the processor, and the processor processes the image data collected by the image acquisition sensor.
  • the processor can calculate the distance between the target and the drone in the image according to the images captured by the plurality of imaging devices simultaneously.
  • the processor can further transmit the calculated distance to the flight control system, so that the flight control system controls the flight of the drone according to the distance to implement functions such as obstacle avoidance, tracking, hovering and the like.
  • the vision system can be installed at one or more of the front end, the rear end, the upper end, and the lower end of the drone.
  • the embodiment of the present application does not limit the installation position of the vision system on the drone.
  • when installed at the lower end, the vision system can be called a lower vision system, and the lower vision system can measure the height of the drone, which at this time can be understood as the distance between the drone and the ground; the ground is the target.
  • the vision system includes two imaging devices as an example for description.
  • the processor in the vision system can acquire images acquired by the two imaging devices at the same time, that is, the processor can acquire two images, and can perform pixel matching on the two images to obtain matching pixels.
  • the processor can calculate the distance between the target corresponding to the pixel and the drone based on the disparity value between the matched pairs of pixel points.
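The per-pair distance computation described here can be sketched as follows, assuming a rectified horizontal stereo pair with a known focal length in pixels and a known baseline; the coordinates and parameter values are hypothetical, not the patent's.

```python
# Minimal sketch of depth-from-disparity for matched pixel-point pairs.
# Assumes a rectified horizontal stereo pair: f_px is the focal length
# in pixels, baseline_m the distance between the two optical centers.
# The coordinate lists below are hypothetical matched points.

def depth_from_pairs(left_pts, right_pts, f_px, baseline_m):
    """Return one depth value (metres) per matched pixel pair."""
    depths = []
    for (xl, _yl), (xr, _yr) in zip(left_pts, right_pts):
        disparity = xl - xr          # parallax along the baseline axis
        if disparity <= 0:           # zero/negative disparity: no valid depth
            depths.append(float("inf"))
        else:
            depths.append(f_px * baseline_m / disparity)
    return depths

left = [(320.0, 240.0), (100.0, 50.0)]
right = [(310.0, 240.0), (96.0, 50.0)]
print(depth_from_pairs(left, right, f_px=700.0, baseline_m=0.08))
# roughly [5.6, 14.0] metres
```

Note the inverse relationship: a larger disparity means a closer target, which is also why accuracy degrades for distant targets whose disparity shrinks toward zero.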
  • in general, however, the vision system's distance calculation is not efficient and the accuracy is not high; at the same time, the detection range is small, for example, only the distance of a target within 10 meters can be calculated.
  • the distance measurement method and the drone provided in the embodiments of the present application offer high measurement efficiency, high accuracy and a wide measurement range.
  • the flying height of the drone can be obtained by direct calculation in the simplest case, the straight-down viewing orientation.
  • the measurement can also be applied to side-view height measurement of the drone, provided the installation angles of the first imaging device and the second imaging device are known in advance.
  • the unmanned aerial vehicle of the present embodiment includes a fuselage 20, a pan/tilt head 50, propellers 30, 32, and a first imaging device 52 and a second imaging device 54 disposed on the pan/tilt head.
  • the vision system including the first imaging device 52 and the second imaging device 54 shown in FIG. 1 is installed at the position of the main camera. Of course, the vision system can also be installed elsewhere in the drone.
  • the position of the vision system and the orientation of the imaging devices in FIG. 1 are merely exemplary and are not limiting on the embodiments of the present application.
  • the drone further includes a flight control processor and a wireless communication module connected to the flight control processor; the wireless communication module establishes a wireless connection with the ground remote controller (i.e., ground control station) 80, and the flight control processor transmits the UAV flight state parameters (i.e., the above flight data) and image data to the ground remote controller 80 via the wireless communication module.
  • the wireless communication module receives an operation command sent by the ground remote controller 80, and the flight control processor completes flight control of the drone based on the operation instruction.
  • the flight control processor can be a processor in a flight control system.
  • a processor may also be included in the vision system, and the processor may be connected to the first imaging device 52 and the second imaging device 54 using a communication interface, a communication bus, or the like.
  • the embodiment of the present application does not limit the connection manner between the processor and the imaging device in the vision system.
  • the drone of this embodiment is provided with one or more processors.
  • the one or more processors may be included in the flight control system to control the system hardware module to complete functions such as flight, map transmission, height measurement, and attitude adjustment of the drone.
  • the processor is also used to adjust the drone attitude such that the first imaging device 52 and the second imaging device 54 are in the same altimetry orientation when shooting.
  • the processor is further configured to acquire the installation angles of the first imaging device 52 and the second imaging device 54 with respect to a carrier, such as the drone, respectively, and to adjust the optical axis directions of the first imaging device and the second imaging device according to the installation angles until the first imaging device 52 and the second imaging device 54 are in the same altimetry orientation.
  • in addition to flight control hardware such as the battery, processor, memory and wireless communication module, the UAV body is also equipped with the flight control system 22 and the related software of the vision system.
  • the flight control system 22 is coupled to the altimetry unit 30; the altimetry unit 30 can be understood as a processor in the vision system described above.
  • the altimetry unit 30 in the lower vision system can be used to measure the height of the drone, and the altimetry unit 30 in a vision system at another location can be used to measure the distance between the drone and a target.
  • the altimetry unit 30 can acquire the first image and the second image captured simultaneously by the first imaging device 52 and the second imaging device 54 on the drone, in which the scenes partially overlap.
  • the partial overlap of the scene in the first image and the second image can be understood as an overlapping portion of the scene on the image for characterizing the same object when the first image and the second image are superimposed. That is, since the images captured by the two imaging devices have parallax, the positions of the same object in the image are not completely the same, and when they are superimposed together, they do not completely overlap, but partially overlap.
  • the altimetry unit 30 can control the first imaging device to simultaneously capture with the second imaging device, or the first imaging device and the second imaging device can achieve simultaneous shooting according to the respective configured crystal oscillators, and the altimetry unit 30 can acquire two simultaneous images. Images.
  • the altimetry unit 30 can include a matching module 32, an acquisition module 37, and a height estimation module 36.
  • the matching module 32 is configured to perform pixel point matching or pixel block matching to obtain a matched pixel point pair or a pixel block pair.
  • the acquiring module 37 is configured to acquire the first image and the second image, and is further configured to acquire the installation parameters of the first imaging device 52 and the second imaging device 54.
  • in this embodiment, the installation parameter is the baseline length; the imaging focal length is also acquired.
  • the altimetry unit calculates an accurate height of the drone, or the distance between the drone and the target in the current flight, according to the installation parameter (that is, the baseline length), the imaging focal length, and the parallax value between the matched pixel point pair or pixel block pair.
  • the drone further includes an adjustment unit for adjusting the first imaging device and the second imaging device to the altimetry or ranging orientation, ie, correcting the imaging device.
  • Scene matching refers to an image analysis and processing technique that determines a known image area from another corresponding scene area taken by other sensors or finds a correspondence between scene areas. This technology also has important application value in military fields such as navigation and guidance.
  • the scene matching referred to in the present application refers to an image analysis and image processing technique for identifying a reference area of an image from a target area between two images by an image comparison matching algorithm.
  • in binocular vision, the image comparison matching algorithm matches the left and right images captured at the same time; that is, the left image and the right image serve as the reference image and the real-time image, respectively, and the height above the ground is then calculated from the parallax.
  • the matching module 32 includes a same-named point module 33.
  • a same-named point pair refers to the two image points formed by a single point in space on the left and right images; these two image points are same-named points, which can also be understood as a matched pixel point pair.
  • the matching module 32 can also include a pixel block module for matching pixel blocks in the two images to obtain matching pixel block pairs.
  • a matched pair of pixel blocks can include at least one matched pair of pixel points. Alternatively, the pair of pixel blocks can be derived based on other matching methods.
  • the matching module 32 determines an overlapping area of the first image and the second image, and the same-named point module 33 uses a set area in the overlapping area of the first image as a reference area; according to the reference area, pixel matching is performed in the overlapping area of the second image to obtain the response area having the largest response value, and the center point of that response area and the center point of the reference area are same-named points.
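The reference-area search described here can be sketched with normalized cross-correlation (NCC) as the response value; NCC is an illustrative choice, since the text does not name the specific response function.

```python
import numpy as np

# Sketch of the reference-area search: a reference patch from the first
# image is slid over the overlap region of the second image; the window
# with the highest normalized cross-correlation is the response maximum,
# and its center is taken as the "same-named point".

def find_same_named_point(ref_patch, search_img):
    """Return ((cx, cy), score): center of the best-matching window."""
    ref = ref_patch.astype(float)
    ref = ref - ref.mean()
    ph, pw = ref.shape
    best_score, best_xy = -2.0, (0, 0)
    for y in range(search_img.shape[0] - ph + 1):
        for x in range(search_img.shape[1] - pw + 1):
            win = search_img[y:y + ph, x:x + pw].astype(float)
            win = win - win.mean()
            denom = np.sqrt((ref ** 2).sum() * (win ** 2).sum())
            if denom == 0.0:
                continue                       # flat window: no response
            score = (ref * win).sum() / denom  # NCC, in [-1, 1]
            if score > best_score:
                best_score, best_xy = score, (x, y)
    cx, cy = best_xy[0] + pw // 2, best_xy[1] + ph // 2
    return (cx, cy), best_score
```

In practice an optimized routine such as OpenCV's `cv2.matchTemplate` with `TM_CCOEFF_NORMED` performs this search far faster than the explicit loops above; the loops are kept here only to make the response-map idea visible.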
  • the matching module 32 also includes a parallax module 34.
  • the disparity module 34 determines the coordinate value of a same-named point in the first image and the coordinate value of the corresponding same-named point in the second image; the difference between the two coordinate values is the disparity value.
  • disparity module 34 may determine a disparity value between pairs of matched pixel blocks.
  • an implementation manner of matching the binocular altimetry based on the first image and the second image is: first, the first imaging device 52 and the second imaging device 54 simultaneously capture a group of images, and the first The image and the second image are matched by the scene, and the overlapping regions of the two images are obtained. According to the information of the overlapping region, the position height of the current camera, that is, the flying height of the drone, can be measured.
  • the coordinates of the same-named points can be obtained directly by scene matching between the two images formed by the first imaging device 52 and the second imaging device 54.
  • the adjustment unit of the drone adjusts the first imaging device and the second imaging device to an altimetry orientation.
  • the principle of the vision system for height measurement will be described below with reference to the accompanying drawings.
  • in one embodiment, the altimetry orientation is the straight-down position, in which the first imaging device 52 and the second imaging device 54 are each mounted at an angle of 90 degrees with respect to the horizontal plane of the drone body.
  • the adjusting unit adjusts the first imaging device and the second imaging device to this straight-down position such that the optical axis of the first imaging device and the optical axis of the second imaging device are both perpendicular to the ground; that is, both the first imaging device 52 and the second imaging device 54 capture images directly below. In other words, correction of the imaging devices can be performed first.
  • when the first imaging device 52 and the second imaging device 54 face directly downward, the lens plane of each camera is parallel to the ground, and the line connecting the optical center and the image center is parallel to the vertical.
  • the height can be directly solved based on the parallax.
  • when the camera height, that is, the flying height H of the drone, is measured, the following geometric relationship holds:
  • H = f × B / Δx (1)
  • where f is the equivalent focal length;
  • B is the baseline length between the two cameras;
  • Δx is the parallax, that is, the displacement between the same-named points obtained by scene matching.
  • the equivalent focal length f is the ratio of the actual physical focal length of the camera to the physical size of each pixel, and is an attribute parameter of the camera, which can be estimated by the image attributes of the first image and the second image.
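Putting the relationship H = f × B / Δx (formula (1)) together with this definition of the equivalent focal length, a numeric sketch (all values hypothetical, not parameters from the patent):

```python
# Numeric illustration of H = f * B / dx with made-up camera values.

def equivalent_focal_length(focal_length_m, pixel_size_m):
    """Equivalent focal length in pixels: physical focal length / pixel size."""
    return focal_length_m / pixel_size_m

def height_from_parallax(f_px, baseline_m, parallax_px):
    """Flying height H = f * B / dx, in metres."""
    return f_px * baseline_m / parallax_px

f_px = equivalent_focal_length(4e-3, 2e-6)   # 4 mm lens, 2 um pixels -> 2000 px
h = height_from_parallax(f_px, baseline_m=0.10, parallax_px=20.0)
print(round(h, 3))   # 10.0
```

The example also shows why the baseline matters: halving B halves the parallax for the same height, pushing the measurement toward the 1-pixel resolution floor discussed below.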
  • the center point mentioned here is only the center point of the reference area where the scene matching is obtained, that is, the point of the same name.
  • the displacement referred to here refers to the difference between the x-axis coordinate values of two points of the same name.
  • the baseline length among the installation parameters of the present embodiment refers to the distance between the optical centers of the two imaging devices, which can be acquired by the acquisition module 37 from the flight control system.
  • the adjustment unit of the drone adjusts the first imaging device 52 and the second imaging device 54 to the altimetry orientation to capture images for height measurement; the spacing between the lens optical center of the first imaging device and the lens optical center of the second imaging device is a preset installation parameter, that is, the baseline length, which can be acquired by the acquisition module 37 from the flight control system. It can be understood that the installation parameters further include the distance from the optical center of the first imaging device to the UAV body and the distance from the optical center of the second imaging device to the UAV body.
  • the first imaging device 52 and the second imaging device 54 employ sub-pixel image matching, which improves the coordinate accuracy of the same-named points.
  • the disparity value is determined by the difference between pixel coordinate values, and standard pixel-level matching can only reach an accuracy of 1 pixel.
  • with sub-pixel matching, the coordinates of the same-named points can be refined to the sub-pixel level.
  • sub-pixel methods can achieve an accuracy of 0.1 to 0.3 pixel, at least 3 times better than standard pixel-level matching.
  • the coordinate precision of the same-named points increases accordingly, which directly improves the parallax accuracy; according to formula (1), the parallax Δx directly affects the accuracy of the height H, so sub-pixel matching improves the accuracy of the parallax solution and thus of the height measurement.
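One common way to obtain sub-pixel coordinates is to fit a parabola through the matching responses around the integer peak; this is a generic refinement technique offered as an illustration, not necessarily the specific method claimed in the patent.

```python
# Parabola-fit sub-pixel refinement: given the matching response at the
# best integer position and at its two horizontal neighbours, estimate
# the fractional offset of the true peak. Generic technique, shown here
# only to illustrate how matching can exceed 1-pixel accuracy.

def subpixel_peak(r_left, r_peak, r_right):
    """Fractional offset (about -0.5..0.5) of the parabola vertex
    relative to the integer peak position."""
    denom = r_left - 2.0 * r_peak + r_right
    if denom == 0.0:
        return 0.0                      # flat or degenerate: no refinement
    return 0.5 * (r_left - r_right) / denom

# e.g. responses 0.80, 0.95, 0.90: the true peak lies ~0.25 px to the right
print(subpixel_peak(0.80, 0.95, 0.90))
```

Adding this offset to the integer coordinate of the same-named point yields the sub-pixel coordinate, and hence a sub-pixel disparity Δx.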
  • in another embodiment, the altimetry orientation is a side-down direction of the first imaging device 52 and the second imaging device 54.
  • in this case, the processor adjusts the drone attitude so that the first imaging device and the second imaging device are at the same height.
  • the adjusting unit adjusts the posture of the UAV so that the first imaging device and the second imaging device each capture an image with its optical axis perpendicular to the ground.
  • when the altimetry direction is a side-down direction, the processor acquires the mounting angles of the first imaging device and the second imaging device with respect to the drone, respectively, and adjusts the optical axis directions of the first imaging device and the second imaging device according to the mounting angles.
  • when the altimetry direction is the side-down viewing direction, there is a preset mounting angle between the first imaging device and the second imaging device. In the non-straight-down case, if the installation angle is known, the height can be solved similarly. This angle parameter needs to be obtained by other means after installation, for example by calibration in a laboratory environment; the present application assumes this mounting angle is known.
  • the scene-matching drone height measurement of the embodiment of the present application has many technical advantages: it only needs the drone's own two onboard cameras and onboard processing chip, and no other dedicated drone height measuring device needs to be added.
  • compared with ultrasonic height measurement, no ultrasonic equipment needs to be added; the computation is small, simple and fast, and binocular scene matching measures the absolute height of the drone (the height of the drone above the ground) with an accuracy five to ten times that of ultrasonic measurement.
  • the height measurement of the embodiment of the present application applies within a range of about 30 meters.
  • beyond this range, the flight control system automatically switches to measuring altitude using the barometric drone height combined with the GPS drone height to determine the relative height and the height above the ground.
  • the present application also relates to a distance measurement method for a drone, implemented by a computer program run by the altimetry unit 30; the computer program includes code implementing the functions of the height calculation module 36, the matching module 32, the same-name point module 33, and the parallax module 34 shown in FIG. 4.
  • the distance measuring method of the drone includes the following steps:
  • distance measurement by the drone through the vision system can be triggered when the vision system is turned on, or according to a request command sent by the ground remote controller 80.
  • the positions of the first imaging device and the second imaging device may first be rectified. Taking the downward-looking vision system as an example, the mounting angles of the first imaging device and the second imaging device relative to the carrier are acquired; when both mounting angles are 90 degrees, the shooting positions of the first imaging device and the second imaging device are adjusted so that the optical axes of both devices are perpendicular to the ground;
  • the first imaging device and the second imaging device are adjusted to the altimetry orientation; after the altimetry unit 30 receives a start instruction, it controls the first imaging device and the second imaging device to be adjusted on the gimbal to the altimetry orientation, for example with both imaging devices oriented directly downward;
  • Step 101 Acquire a first image captured by a first imaging device on the drone at one time, and a second image captured by the second imaging device on the drone at the time;
  • the altimetry unit 30 can control the first imaging device and the second imaging device to capture images simultaneously, so that it can acquire the first image captured by the first imaging device on the drone and the second image captured at the same moment by the second imaging device.
  • alternatively, the first imaging device and the second imaging device may capture at the same moment according to the same clock crystal or clock control unit, or according to separately configured but synchronized time units or clock crystals, to obtain a group of simultaneously captured images, namely the first image captured by the first imaging device and the second image captured by the second imaging device.
  • the altimetry unit 30 can acquire the set of images taken simultaneously for further processing.
  • Step 102: Determine a first pixel block in the first image, and a second pixel block in the second image that matches the first pixel block; wherein each of the first pixel block and the second pixel block includes at least 2 pixels;
  • the first block of pixels includes at least 2 pixels and the second block of pixels also includes at least 2 pixels.
  • the number of pixels included in the first pixel block may be the same as or different from the number of pixels included in the second pixel block.
  • the first pixel block and the second pixel block can be understood as the matched pixel block pairs described above.
  • alternatively, the first pixel block and the second pixel block may be understood as the partially overlapping scene regions of the two images described in the above embodiments.
  • the first pixel block in the first image can be matched with the second pixel block in the second image by any of the following methods:
  • Method 1: determining the first pixel block in the first image. For example, the pixels within a set area of the first image may constitute the first pixel block; alternatively, the first image and the second image are superimposed and aligned to determine their scene overlap region, which may include an image of the target object, and the pixels within a certain area of the scene overlap region are determined to constitute the first pixel block.
  • the region of the first pixel block in the first image may be used as a reference region to determine an overlap region in the second image associated with the reference region.
  • At least one pixel block in the second image is determined.
  • for example, the second image may be divided into a plurality of pixel blocks; the pixel blocks may share pixels, i.e., the regions of the blocks overlap, or each pixel block may be independent, i.e., contain no shared pixels, with no overlap between the block regions. Alternatively, based on the reference region where the first pixel block is located, one or more regions in the second image related to that reference region are determined, for example at least one overlap region, each containing one pixel block, and the pixel block contained in one of these overlap regions is determined to be the second pixel block. The first block feature information of the first pixel block is then determined, together with the block feature information of each of the at least one pixel block in the second image.
  • here, block feature information refers to features of the pixel block as a whole, as distinguished from point feature information, which indicates the features of a single pixel.
  • the block feature information can characterize the overall state, grayscale, size, target feature, and the like of the block of pixels.
  • the block feature information may be represented by a block feature vector, and the block feature information may include a multi-dimensional block feature vector, which is not limited herein.
  • the block feature information can more accurately describe the features of the target, thereby improving the accuracy of the distance calculation.
  • the first block feature information of the first pixel block is matched with the block feature information of each of the at least one pixel block.
  • a second pixel block that successfully matches the first pixel block is thereby obtained from the at least one pixel block, i.e., the pixel block whose block feature information matches the first block feature information is taken as the second pixel block.
  • for example, the pixel block whose block feature vector has the highest similarity to the first block feature vector is determined from the at least one pixel block to be the second pixel block.
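As a rough illustration of Method 1, block feature vectors can be compared by a similarity measure and the most similar candidate chosen. The feature extraction itself is unspecified above, so the sketch below assumes the block features are already plain numeric vectors; `cosine_similarity` and `best_matching_block` are illustrative names, not from the patent:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def best_matching_block(first_feature, candidate_features):
    """Return the index of the candidate block whose feature vector is
    most similar to the first block's feature vector (Method 1)."""
    sims = [cosine_similarity(first_feature, f) for f in candidate_features]
    return max(range(len(sims)), key=sims.__getitem__)
```

A candidate with a feature vector identical to the first block's scores similarity 1.0 and is selected; any vector-valued block descriptor (grey-level statistics, size, target features) could stand in for these toy vectors.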
  • Method 2: determining the first pixel block in the first image and at least one pixel block in the second image (as in Method 1); performing pixel matching between the first pixel block and the at least one pixel block; and determining the second pixel block among the at least one pixel block according to the matching result.
  • specifically, the above pixel matching means that the pixels of the first pixel block and of one pixel block of the second image are matched row by row or column by column.
  • the relative position of each pair of pixels within its pixel block is the same: if a pixel is located at row x, column y of the first pixel block, the pixel matched against it in the second image is likewise located at row x, column y of the second pixel block.
  • the set of matched pixel points can be further matched according to the point feature information.
  • the point feature information of each pixel point can be represented by a point feature vector.
  • the point feature vector may include a multi-dimensional feature vector, such as a 128-dimensional feature vector or the like.
  • a matching result may be obtained according to the matching degree of the point feature vector, for example, the group of pixel points is successfully matched, or the group of pixel points fails to match.
  • the matching degree may be determined from the proportion of successfully matched pairs among all matching results; the higher the proportion, the higher the matching degree.
  • the first pixel block is sequentially pixel-matched with each of the at least one pixel block determined in the second image.
  • the second pixel block matching the first pixel block can be determined according to the matching result.
  • the pixel block with the highest degree of matching is used as the second pixel block.
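Method 2 can be sketched as follows, assuming grey-value pixel blocks and an illustrative tolerance `tol` for deciding that a pixel pair matches (the patent does not fix a particular point-feature comparison, so this is a simplification):

```python
def matching_degree(block_a, block_b, tol=10):
    """Compare two equal-sized pixel blocks position by position
    (row by row); a pixel pair 'matches' when the grey values differ
    by at most `tol`.  The matching degree is the proportion of
    successfully matched pairs among all pairs."""
    total, hits = 0, 0
    for row_a, row_b in zip(block_a, block_b):
        for pa, pb in zip(row_a, row_b):
            total += 1
            hits += abs(pa - pb) <= tol
    return hits / total

def pick_second_block(first_block, candidates, tol=10):
    """The candidate with the highest matching degree is taken as the
    second pixel block."""
    return max(candidates, key=lambda c: matching_degree(first_block, c, tol))
```

Substituting a comparison of multi-dimensional point feature vectors (e.g., 128-dimensional descriptors) for the grey-value test would follow the same structure.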
  • Method 3: determining a first pixel in the first pixel block of the first image; determining at least one pixel block in the second image; matching the first pixel against the pixels included in each of the at least one pixel block; and
  • taking, as the second pixel block, the pixel block of the at least one pixel block that includes a pixel matching the first pixel.
  • specifically, the first pixel in the first pixel block is determined, where there may be one or more first pixels. That is to say, in this method it is not necessary to match all the pixels in the pixel block, which further improves the matching efficiency.
  • the first pixel point may be a center point of the first pixel block, that is, a point of the same name in the above embodiment.
  • the first pixel may further include other pixels in the first pixel block, which is not limited herein.
  • the pixel points of the first pixel point and the pixel block in the second image can be matched based on the point feature information.
  • the position, relative to the second image, of the pixel matched against the first pixel is the same as the position of the first pixel relative to the first image.
  • a second pixel block matching the first pixel block may be determined according to the matching result.
  • if the first pixel is a single pixel and a pixel matching it is found, the pixel block including that pixel is the second pixel block.
  • when the first pixel includes multiple pixels, the degree of matching between each pixel block and the first pixel block is determined, and the pixel block with the highest matching degree is determined to be the second pixel block.
  • Step 103 Determine a distance between the UAV and the target according to the disparity value of the first pixel block and the second pixel block.
  • in one implementation, the disparity value of the first pixel block and the second pixel block may be determined from first position information of the first pixel block and second position information of the second pixel block. Specifically, the position information of a certain pixel in the first pixel block and the position information of the matching pixel in the second pixel block may be determined, so as to determine the disparity value of the two blocks; for example, the position information of the center pixel of the first pixel block and of the center pixel of the second pixel block may be used. Other parameters may also be acquired for the distance calculation, such as the mounting angles of the imaging devices, the distance between the lens optical centers, and the distances from each optical center to the drone, without limitation here.
  • further, if the first image includes at least 2 first pixel blocks, the second pixel blocks respectively matching the at least 2 first pixel blocks may be obtained in the foregoing manner, and at least 2 disparity values may be calculated; at least 2 distance values may then be determined from the at least 2 disparity values, so that the distance between the drone and the target is determined according to the at least 2 distance values.
  • the determined at least two distance values may be calculated as an average value, and the average value is used as a distance value between the drone and the target, that is, the distance data calculated by the visual system.
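The multi-block averaging above can be sketched with the stereo relation distance = f · B / disparity (formula (1), with the height H generalised to the object distance); the parameter names and values are illustrative only:

```python
def distances_from_disparities(disparities, focal_px, baseline_m):
    """One distance value per matched block pair:
    distance = f * B / disparity."""
    return [focal_px * baseline_m / d for d in disparities if d > 0]

def fused_distance(disparities, focal_px, baseline_m):
    """Average the per-block distance values into a single estimate,
    as described for the case of at least 2 first pixel blocks."""
    ds = distances_from_disparities(disparities, focal_px, baseline_m)
    return sum(ds) / len(ds)
```

With an equivalent focal length of 1000 pixels and a 0.1 m baseline, disparities of 10 and 20 pixels give distances of 10 m and 5 m, which average to 7.5 m.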
  • the following example illustrates how the drone uses the above method to measure its height.
  • the exact altitude of the current flight of the drone is measured.
  • when determining the same-name points, matching the first image and the second image to determine the same-name points includes: determining the overlap region of the first image and the second image, and taking a set area within the overlap region of the first image as a reference region;
  • based on the reference region, pixel matching is performed in the overlap region of the second image to obtain the response region with the largest response value; the center point of the response region and the center point of the reference region are same-name points.
  • when there are multiple set areas, a plurality of height values are calculated from the plurality of same-name points, and their average is taken as the height of the drone.
  • when determining the disparity value, the coordinate value of the same-name point in the first image and the coordinate value of the same-name point in the second image are determined; the difference between the two coordinate values is the disparity value.
  • the first imaging device and the second imaging device employ a sub-pixel image sensor that improves the coordinate accuracy of the point of the same name.
  • the drone attitude is adjusted such that the first imaging device 52 and the second imaging device 54 are in the same altimetry orientation.
  • the altimetry orientation is an oblique downward-looking direction of the first imaging device and the second imaging device. The mounting angles of the first imaging device and the second imaging device relative to the carrier are acquired; when these mounting angles are not 90 degrees, the posture of the carrier is adjusted so that each imaging device captures images with its optical axis perpendicular to the ground.
  • an installation angle of the first imaging device and the second imaging device with respect to the carrier is respectively obtained; and an optical axis direction of the first imaging device and the second imaging device is adjusted according to the mounting angle.
  • when the altimetry orientation is the oblique downward-looking direction, a mounting angle between the first imaging device and the second imaging device is preset.
  • from the principles of vision, as the distance between the observed object and the observer (the first and second imaging devices) increases, the displacement produced by parallax decreases proportionally; the applicable height of the height measurement in the embodiments of the present application is therefore in the range of about 30 meters.
  • when the drone flies above a height of 30 meters, the flight control system automatically switches to barometric height measurement combined with GPS height measurement to determine the relative height and the height above the ground. Since technologies such as terrain following for industrial drones require the height above the ground, while GPS and barometers measure absolute height, the corresponding ground elevation must be subtracted to obtain the drone's height above the ground.
  • the drone height measurement method of this embodiment adopts binocular scene-matching image processing to measure the drone's height above the ground, with a measurement accuracy 5 to 10 times higher than that of ultrasound.
  • the UAV and the UAV height measurement method of this embodiment require no additional height measurement hardware on the UAV; only the existing onboard dual cameras and onboard processing chip are used, and no special equipment needs to be added. Compared with ultrasonic height measurement, no ultrasonic equipment is needed; the computation for the height measurement is small, simple, and fast, the drone's flight height can be fed back in real time, and the height above the ground can be measured quickly and accurately.
  • the same-name point confirmation and disparity value calculation of the image matching in the UAV and UAV height measurement method of this embodiment are stable and reliable, and place low requirements on the scene.
  • the UAV and UAV height measurement method of this embodiment can adopt a sub-pixel image processing scheme, further improving the measurement accuracy by a factor of three; they also have a relatively wide working distance, covering a height range of up to 30 meters.
  • FIG. 5 is a schematic structural diagram of a vision system according to an embodiment of the present application. As shown in FIG. 5, the vision system 600 includes:
  • at least two imaging devices (two in FIG. 5 as an example), namely a first imaging device 610 and a second imaging device 620;
  • a processor 630 connected to each of the at least two imaging devices; and, optionally, a memory 640 and the like.
  • the processor 630 may be implemented by at least one visual processing unit (VPU); or implemented by other processing units, which is not limited herein.
  • the processor 630 and the memory 640 may be connected by a bus or other means; bus connection is taken as the example in FIG. 5. Alternatively, the memory 640 is integrated in the processor 630.
  • the memory 640, as a non-volatile computer-readable storage medium, can be used to store non-volatile software programs, non-volatile computer-executable programs, and modules, such as the program instructions/modules corresponding to the UAV distance measurement method in the embodiments of the present application (for example, the height calculation module 36, matching module 32, same-name point module 33, and parallax module 34 shown in FIG. 4).
  • by running the non-volatile software programs, instructions, and modules stored in the memory 640, the processor 630 executes the various functional applications and data processing of the vision system, i.e., implements the distance measurement method of the drone in the above method embodiments.
  • the memory 640 may include a storage program area and a storage data area, wherein the storage program area may store any of the following: an operating system, an application required for at least one function; the storage data area may store distance data, image data, and the like.
  • memory 640 can include high speed random access memory, and can also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid state storage device.
  • memory 640 can optionally include memory remotely located relative to processor 630, which can be connected to the drone via a network. Examples of such networks include, but are not limited to, the Internet, intranets, local area networks, mobile communication networks, and combinations thereof.
  • the one or more modules are stored in the memory 640 and, when executed by the one or more processors 630, perform the distance measurement method of the drone in any of the above method embodiments, for example performing method steps 101 to 103 in FIG. 2 described above and implementing the functions of the height calculation module 36, the matching module 32, the same-name point module 33, and the parallax module 34 in FIG. 4.
  • the embodiments of the present application provide a non-transitory computer-readable storage medium storing computer-executable instructions that, when executed by one or more processors (for example, one processor 630 in FIG. 5), cause the one or more processors to perform the distance measurement method of the drone in any of the foregoing method embodiments, for example performing method steps 101 to 103 in FIG. 2 described above.
  • the device embodiments described above are merely illustrative; the units described as separate components may or may not be physically separate, and the components displayed as units may or may not be physical units, i.e., they may be located in one place or distributed across multiple network units. Some or all of the modules may be selected according to actual needs to achieve the purposes of the solution of this embodiment.
  • the storage medium may be a magnetic disk, an optical disk, a read-only memory (ROM), or a random access memory (RAM).

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Software Systems (AREA)
  • Electromagnetism (AREA)
  • Measurement Of Optical Distance (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

Disclosed herein are a distance measurement method for an unmanned aerial vehicle (UAV) and a UAV. The distance measurement method includes the following steps: acquiring a first image captured at a moment by a first imaging device on the UAV, and a second image captured at the same moment by a second imaging device on the UAV; determining a first pixel block in the first image and a second pixel block in the second image that matches the first pixel block, where each of the first pixel block and the second pixel block includes at least 2 pixels; and determining the distance between the UAV and a target object according to the disparity value of the first pixel block and the second pixel block. Distance measurement efficiency can thereby be improved.

Description

Distance measurement method for an unmanned aerial vehicle, and unmanned aerial vehicle
This application claims priority to Chinese patent application No. 201710356535.6, filed with the Chinese Patent Office on May 19, 2017 and entitled "UAV height measurement method and UAV", the entire contents of which are incorporated herein by reference.
Technical Field
This application relates to the field of unmanned aerial vehicles, and in particular to a distance measurement method for a UAV and a UAV using the method.
Background
With the development of wireless communication technology, wireless local area networks, image processing technology, and battery technology, the endurance and image processing capabilities of UAVs have become ever more powerful, and more and more users enjoy aerial photography and exploration with UAVs. When a UAV flies beyond the user's field of view, it must feed its flight data back to the ground control station at a certain frequency so that the ground control station can control the UAV to fly around obstacles, or so that the UAV itself can avoid obstacles based on the acquired flight data; such functions are prerequisites for safe UAV flight.
At present, a UAV can use a vision system to measure the distance to a target object and use that distance as flight data. How to improve the efficiency of distance measurement by the UAV's vision system has become a topic of active research for those skilled in the art.
Summary
In view of this, embodiments of the present invention provide a distance measurement method for a UAV and a UAV, which can improve the efficiency of distance measurement by the vision system.
In a first aspect, an embodiment of the present invention provides a UAV height measurement method, including the following steps:
acquiring a first image captured at a moment by a first imaging device on the UAV, and a second image captured at the same moment by a second imaging device on the UAV;
determining a first pixel block in the first image, and a second pixel block in the second image that matches the first pixel block, where each of the first pixel block and the second pixel block includes at least 2 pixels;
determining the distance between the UAV and a target object according to the disparity value of the first pixel block and the second pixel block.
Optionally, determining the first pixel block in the first image and the second pixel block in the second image that matches the first pixel block includes: determining first block feature information of the first pixel block in the first image; determining block feature information of each of at least one pixel block in the second image; matching the first block feature information with the block feature information of each pixel block; and taking, as the second pixel block, the pixel block of the at least one pixel block whose block feature information matches the first block feature information.
Optionally, determining the first pixel block in the first image and the second pixel block in the second image that matches the first pixel block includes: determining the first pixel block in the first image; determining at least one pixel block in the second image; performing pixel matching between the first pixel block and the at least one pixel block; and determining the second pixel block among the at least one pixel block according to the result of the pixel matching.
Optionally, determining the second pixel block among the at least one pixel block according to the result of the pixel matching includes: taking, as the second pixel block, the pixel block of the at least one pixel block whose pixels match those of the first pixel block to the highest degree.
Optionally, determining the first pixel block in the first image and the second pixel block in the second image that matches the first pixel block includes: determining a first pixel in the first pixel block of the first image; determining at least one pixel block in the second image; matching the first pixel against the pixels included in each of the at least one pixel block; and taking, as the second pixel block, the pixel block of the at least one pixel block that includes a pixel matching the first pixel.
Optionally, matching the first pixel against the pixels included in each of the at least one pixel block includes: determining point feature information of the first pixel; determining point feature information of the pixels included in each pixel block; and matching the point feature information of the first pixel with the point feature information of the pixels included in each pixel block.
Optionally, the first pixel includes the center pixel of the first pixel block.
Optionally, the method further includes: determining first position information of the first pixel block and second position information of the second pixel block; and determining the disparity value of the first pixel block and the second pixel block according to the first position information and the second position information.
Optionally, determining the first position information of the first pixel block and the second position information of the second pixel block includes: determining first position information of a second pixel in the first pixel block and second position information of a third pixel in the second pixel block, the second pixel matching the third pixel.
Optionally, determining the distance between the UAV and the target object according to the disparity value of the first pixel block and the second pixel block includes: determining the distance between the UAV and the target object according to installation parameters of the first imaging device and the second imaging device and the disparity value of the first pixel block and the second pixel block.
Optionally, the installation parameters of the first imaging device and the second imaging device include at least one of: the spacing between the lens optical center of the first imaging device and the lens optical center of the second imaging device; the distance from the optical center of the first imaging device to the UAV body; and the distance from the optical center of the second imaging device to the UAV body.
Optionally, if the first image includes at least 2 first pixel blocks, determining the distance between the UAV and the target object according to the disparity values of the first pixel blocks and the second pixel blocks includes: determining at least 2 disparity values; determining at least 2 distance values according to the at least 2 disparity values; and determining the distance between the UAV and the target object according to the at least 2 distance values.
Optionally, determining the distance between the UAV and the target object according to the at least 2 distance values includes: calculating the average of the at least 2 distance values and taking the average as the distance between the UAV and the target object.
In a second aspect, an embodiment of the present invention provides a UAV, including: a first imaging device and a second imaging device; and a processor connected to each of the first imaging device and the second imaging device; wherein the processor is configured to perform any one of the methods of the first aspect.
In a third aspect, an embodiment of the present application further provides a vision system, including: at least 2 imaging devices; at least one processor connected to each of the at least 2 imaging devices; and a memory communicatively connected to the at least one processor; wherein the memory stores instructions executable by the at least one processor, and the instructions are executed by the at least one processor to enable the at least one processor to perform any one of the methods of the first aspect.
In a fourth aspect, an embodiment of the present application further provides a non-volatile computer-readable storage medium storing computer-executable instructions for causing a computer to perform any one of the methods of the first aspect.
In a fifth aspect, an embodiment of the present application further provides a computer program product, including a computer program stored on a non-volatile computer-readable storage medium, the computer program including program instructions that, when executed by a computer, cause the computer to perform any one of the methods of the first aspect.
A beneficial effect of the embodiments of the present invention is that the UAV height measurement method and the UAV provided in this embodiment require no dedicated additional height measurement device: the height above the ground is measured by presetting the relative positions of the first imaging device and the second imaging device and analyzing the UAV's images in real time. The height measurement method is streamlined, the image matching is stable and reliable, the height measurement responds quickly, and within the set measurement range the accuracy is five to ten times higher than that of ultrasonic height measurement.
Brief Description of the Drawings
FIG. 1 is a schematic diagram of a UAV according to an embodiment of the present invention;
FIG. 2 is a schematic flowchart of a distance measurement method for a UAV according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of the principle of the UAV height measurement method according to an embodiment of the present invention;
FIG. 4 is a schematic module diagram of the UAV according to an embodiment of the present invention; and
FIG. 5 is a schematic diagram of the hardware structure of the vision system in the UAV according to an embodiment of the present invention.
Detailed Description
To make the objectives, technical solutions, and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention are described below clearly and completely with reference to the accompanying drawings. Evidently, the described embodiments are some rather than all of the embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the present invention.
The distance measurement method for a UAV and the UAV provided by the embodiments of the present invention can improve the efficiency and accuracy of distance measurement by the UAV's vision system.
An application environment of the embodiments of the present application is illustrated below.
A UAV may include a flight control system (flight controller for short) and a vision system; optionally, it may further include multiple other systems such as an ultrasonic system, an infrared system, a power system, a power supply system, and a data transmission system.
The flight control system is the core control device of the UAV and may include a main control module, a signal conditioning and interface module, a data acquisition module, a servo drive module, and the like.
The flight control system collects in real time the flight state data measured by the sensors, receives the control commands and data transmitted by the ground controller over the radio uplink, computes and processes them, and outputs control instructions to the actuators, thereby controlling the UAV's various flight modes and managing and controlling the mission equipment. At the same time, it transmits the UAV's state data (i.e., flight data) and the working state parameters of the engine, onboard power system, and mission equipment in real time to the data transmission system, which sends them back to the ground controller over the radio downlink.
The flight control system is also used to: perform high-precision acquisition of multiple analog signals, including gyro signals, heading signals, rudder deflection signals, engine speed, cylinder temperature signals, static/dynamic pressure sensor signals, and supply voltage signals; output switching signals, analog signals, and PWM pulse signals suitable for controlling different actuators such as rudder servos, aileron servos, elevator servos, and air-duct and throttle servos; and use multiple communication channels to communicate with the onboard data terminal, GPS signals, digital sensors, and related mission equipment.
The software design of the flight control system is divided into two parts, namely the programming of the logic circuit chip and the application programming of the flight control system. The logic circuit chip forms the digital logic control circuit, performing decoding, isolation, A/D, D/A, and the like. The application program implements the above functions of the flight control system.
The vision system includes one or more imaging devices; for example, it may include two imaging devices, in which case the two imaging devices may also form a binocular imaging device. An imaging device includes an optical assembly and an image acquisition sensor: the optical assembly images objects, and the image acquisition sensor acquires image data through the optical assembly. Further, the vision system may include one or more processors connected to the imaging devices. The image data acquired by the image acquisition sensors can be transmitted to the processor, which processes it; for example, the processor can calculate the distance between a target object in the images and the UAV from images captured simultaneously by multiple imaging devices. The processor can further transmit the calculated distance to the flight control system, which then controls the UAV's flight according to that distance to implement obstacle avoidance, tracking, hovering, and other functions.
The vision system may be installed at one or more of the front, rear, top, and bottom of the UAV; the embodiments of the present application do not limit the installation position of the vision system on the UAV. When a vision system is installed at the bottom of the UAV, it may be called a downward-looking vision system and can measure the UAV's height, which can be understood as the distance between the UAV and the ground, i.e., the ground serves as the target object.
For the other systems of the UAV, reference may be made to current UAV implementations, which are not described here one by one.
The embodiments of the present application are described taking a vision system with two imaging devices as an example. At present, the processor in the vision system can acquire the images captured at the same moment by the two imaging devices, i.e., 2 images, and can perform pixel matching on these 2 images to obtain matched pixel pairs; the processor can then calculate the distance between the UAV and the target object corresponding to the pixels from the disparity value between the matched pixel pairs.
At present, with this approach, the vision system's distance calculation is inefficient and not very accurate, and the range of detectable targets is small; for example, only distances to targets within 10 meters can be calculated.
The distance measurement method for a UAV and the UAV provided in the embodiments of the present application offer high measurement efficiency, good accuracy, and a wide measurement range.
It should be noted that when the distance measurement method provided by the embodiments of the present application is applied in a downward-looking vision system for height measurement, the UAV's flight height can be measured in the straight-down orientation, where the calculation is most direct and simplest; it can also be applied to oblique downward-looking height measurement of the UAV, provided the mounting angles of the first imaging device and the second imaging device are known in advance.
Embodiment 1
Referring to FIG. 1 and FIG. 4, the UAV of this embodiment includes a fuselage 20, a gimbal 50, propellers 30 and 32, and a first imaging device 52 and a second imaging device 54 arranged on the gimbal. It should be noted that in FIG. 1 the vision system comprising the first imaging device 52 and the second imaging device 54 is installed at the position of the main camera; of course, the vision system can also be installed at other positions on the UAV. The position of the vision system and the orientation of the imaging devices in FIG. 1 are merely exemplary and are not limiting in the embodiments of the present application.
The UAV further includes a flight control processor and a wireless communication module connected to it. The wireless communication module establishes a wireless connection with the ground remote controller (i.e., the ground control station) 80 and, under the control of the flight control processor, sends the UAV's flight state parameters (i.e., the flight data above) and image data to the ground remote controller 80. The wireless communication module receives operating instructions sent by the ground remote controller 80, and the flight control processor regulates the UAV's flight based on those instructions. The flight control processor may be the processor of the flight control system.
The vision system may also include a processor, which may be connected to the first imaging device 52 and the second imaging device 54 via a communication interface, a communication bus, or the like; the embodiments of the present application do not limit the connection between the processor of the vision system and the imaging devices.
From the hardware perspective:
The UAV of this embodiment is provided with one or more processors. The one or more processors may be included in the flight control system to control the system hardware modules to perform functions such as flight, image transmission, height measurement, and attitude adjustment.
In this embodiment, a more accurate flight height can be measured when the first imaging device 52 and the second imaging device 54 are in the straight-down orientation; when they are not in the straight-down orientation, the processor is further configured to adjust the UAV's attitude so that the first imaging device 52 and the second imaging device 54 are in the same height measurement orientation.
In another embodiment, the processor is further configured to acquire the mounting angles of the first imaging device 52 and the second imaging device 54 relative to the carrier (for example, the UAV), and to adjust the optical axis directions of the first and second imaging devices according to the mounting angles until the first imaging device 52 and the second imaging device 54 are in the same height measurement orientation.
From the software perspective:
In addition to flight control hardware such as the battery, processor, memory, and wireless communication module, the UAV fuselage also carries the related software of the flight control system 22 and the vision system.
The flight control system 22 is connected to the altimetry unit 30; the altimetry unit 30 can be understood as the processor in the vision system described above. The altimetry unit 30 of a downward-looking vision system can be used to measure the UAV's height, while the altimetry unit 30 of a vision system at another position can be used to measure the distance between the UAV and a target object. The altimetry unit 30 can acquire a first image and a second image, with partially overlapping scenes, captured simultaneously by the first imaging device 52 and the second imaging device 54 on the UAV. "Partially overlapping scenes" can be understood as follows: when the first image and the second image are superimposed, the scene representing the same target object overlaps on the images. That is, because there is parallax between the images captured by the two imaging devices, the same target object does not occupy exactly the same position in both images; when the images are superimposed, they do not coincide completely but overlap partially.
The altimetry unit 30 can control the first imaging device and the second imaging device to capture images simultaneously; alternatively, the two imaging devices can achieve simultaneous capture based on their respective crystal oscillators, and the altimetry unit 30 can acquire the two simultaneously captured images.
The altimetry unit 30 may include a matching module 32, an acquisition module 37, and a height calculation module 36.
The matching module 32 performs pixel matching or pixel block matching to obtain matched pixel pairs or matched pixel block pairs. The acquisition module 37 acquires the first image and the second image, and can also acquire the installation parameters of the first imaging device 52 and the second imaging device 54; in this embodiment the installation parameter is the baseline length, and the imaging focal length f is also acquired. The calculation unit calculates the UAV's exact current flight height, or the distance to the target object, from one or more of the installation parameter (i.e., the baseline length), the imaging focal length, and the disparity value between the matched pixel pairs or pixel block pairs.
The UAV further includes an adjustment unit for adjusting the first imaging device and the second imaging device to the height or distance measurement orientation, i.e., performing rectification of the imaging devices.
Scene matching is an image analysis and processing technique that identifies a known image region within another corresponding scene region captured by other sensors, or finds the correspondence between scene regions; it also has important applications in military fields such as navigation and guidance. Scene matching in this application refers to the image analysis and processing technique of identifying, via an image comparison and matching algorithm, the reference region of one image within the target region of the other, and finding the same-name points between them. In binocular vision, the left and right images captured at the same moment are matched, i.e., the left and right images serve as reference image and real-time image for each other, and the height above the ground is solved from the parallax.
To implement scene matching, the matching module 32 includes a same-name point module 33. Same-name points are the two image points formed by a single spatial point in the left and right images respectively; these two image points are the same-name points, which can also be understood as the matched pixel pair described above. The matching module 32 may also include a pixel block module for matching pixel blocks in the two images to obtain matched pixel block pairs. A matched pixel block pair may include at least one matched pixel pair; alternatively, pixel block pairs may be obtained by other matching methods.
In one implementation, the matching module 32 determines the overlap region of the first image and the second image; the same-name point module 33 takes a set area within the overlap region of the first image as the reference region and, based on that reference region, performs pixel matching within the overlap region of the second image to obtain the response region with the largest response value. The center point of the response region and the center point of the reference region are same-name points.
The matching module 32 further includes a parallax module 34. The parallax module 34 determines the coordinate value of the same-name point in the first image and its coordinate value in the second image; the difference between these two coordinate values is the disparity value. Alternatively, the parallax module 34 may determine the disparity value between matched pixel block pairs.
In summary, one implementation of binocular height measurement based on scene matching of the first and second images is as follows: the first imaging device 52 and the second imaging device 54 first capture a group of images simultaneously; the first image and the second image are scene-matched to obtain the overlap region of the two images; and from the information of the overlap region the current camera height, i.e., the UAV's flight height, can be calculated.
When deriving the coordinate values of the same-name points, since the output of scene matching is precisely the image coordinates of the same-name points, those coordinates can be obtained directly by scene matching the two images formed by the first imaging device 52 and the second imaging device 54.
The UAV's adjustment unit adjusts the first imaging device and the second imaging device to the height measurement orientation. The principle of height measurement by the vision system is described below with reference to the drawings.
Referring to FIG. 3, in this embodiment the height measurement orientation is the straight-down position. When the first imaging device 52 and the second imaging device 54 are each mounted at 90 degrees relative to the horizontal plane of the UAV fuselage, the adjustment unit adjusts them to the straight-down position so that the optical axes of both imaging devices are perpendicular to the ground; that is, both the first imaging device 52 and the second imaging device 54 capture images pointing straight down. In other words, rectification of the imaging devices can be performed first. With the first imaging device 52 and the second imaging device 54 aimed straight down, the camera's mirror plane is parallel to the ground, and the line through the optical center and the image center is parallel to the plumb line.
In the straight-down case, the height can be solved directly from the parallax. As shown in FIG. 3, the image points of ground point P3 in the first image and the second image are O1P3 and O2P3 respectively, whose x-direction coordinates are denoted x_l and x_r, and the parallax is Δx = x_l − x_r. Equivalently, the image points of same-name point P3 in the first and second images are O1P3 and O2P3, with x-direction coordinates x_l and x_r and parallax Δx = x_l − x_r.
The camera's height above the ground, i.e., the UAV's flight height, then obeys the following geometric relation:
H = f · B / Δx        (1)
where f is the equivalent focal length, B is the baseline length between the two cameras, and Δx is the parallax, i.e., the displacement of the same-name points obtained by scene matching. The equivalent focal length f is the ratio of the camera's actual physical focal length to the physical size of each pixel; it is an attribute parameter of the camera and can be derived from the image attributes of the first and second images. It should be noted that the center point referred to here is simply the center point of the reference region obtained in scene matching, i.e., the same-name point, and the displacement referred to here is the difference between the x-axis coordinate values of the two same-name points.
The baseline length among the installation parameters of this embodiment is the distance between the optical centers of the two imaging devices, which the acquisition module 37 can obtain from the flight control system.
In implementation, the UAV's adjustment unit adjusts the first imaging device 52 and the second imaging device 54 to the height measurement orientation, after which images can be captured for height measurement. The spacing between the lens optical center of the first imaging device and the lens optical center of the second imaging device is a preset installation parameter, namely the baseline length, which the acquisition module 37 can obtain from the flight control system. It can be understood that the installation parameters also include the distance from the optical center of the first imaging device to the UAV body and the distance from the optical center of the second imaging device to the UAV body.
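Formula (1) can be checked numerically; the sketch below uses illustrative values for the focal length, pixel size, and baseline (not taken from the patent):

```python
def equivalent_focal_length(focal_mm, pixel_pitch_mm):
    """The equivalent focal length f is the ratio of the camera's
    physical focal length to the physical size of one pixel,
    giving f in pixel units."""
    return focal_mm / pixel_pitch_mm

def height_from_parallax(x_left, x_right, focal_mm, pixel_pitch_mm, baseline_m):
    """Formula (1): H = f * B / delta_x, where delta_x = x_l - x_r is
    the displacement of the same-name point between the two images."""
    delta_x = x_left - x_right                      # parallax in pixels
    f = equivalent_focal_length(focal_mm, pixel_pitch_mm)
    return f * baseline_m / delta_x
```

For example, a 3 mm lens over 3 µm pixels gives f = 1000 px; with a 0.1 m baseline, a same-name point at x_l = 350 and x_r = 345 (Δx = 5 px) yields a height of about 20 m, consistent with the inverse relationship between parallax and height used throughout this embodiment.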
To further improve the height measurement accuracy, the first imaging device 52 and the second imaging device 54 employ sub-pixel image sensors that improve the coordinate precision of the same-name points. It will be understood that the disparity value is determined from the difference between pixel coordinate values, and standard image precision is at best 1 pixel. In a matching algorithm using sub-pixel image precision, the coordinates of the same-name points can be refined to the sub-pixel level; sub-pixel precision can reach 0.1 to 0.3 pixel, at least 3 times better than standard pixel matching. The coordinate precision of the same-name points can therefore be improved accordingly, which directly improves the parallax precision; from formula (1), the parallax Δx has a direct influence on the accuracy with which the height H is solved. Sub-pixel matching thus improves the accuracy of the parallax solution and, in turn, the accuracy of the height solution.
In another embodiment, the height measurement orientation is an oblique downward-looking direction of the first imaging device 52 and the second imaging device 54, and the processor adjusts the UAV's attitude so that the first imaging device and the second imaging device are at the same height. When the mounting angles of the first and second imaging devices relative to the UAV fuselage are not 90 degrees, the adjustment unit adjusts the UAV's attitude so that each imaging device captures images with its optical axis perpendicular to the ground.
When the height measurement orientation is the oblique downward-looking direction, the processor acquires the mounting angles of the first imaging device and the second imaging device relative to the UAV and adjusts the optical axis directions of the two imaging devices according to the mounting angles; a mounting angle between the first imaging device and the second imaging device is preset. In the non-straight-down case, if the mounting angle is known, the solution can proceed similarly. This angle parameter must be obtained by other means after installation, for example by calibration in a laboratory environment; the present invention assumes that the mounting angle is known.
The scene-matching UAV height measurement of the embodiments of the present application has many technical benefits: it only needs the two installed onboard cameras and the UAV's own onboard processing chip, with no dedicated additional height measurement equipment. Compared with ultrasonic height measurement, no ultrasonic equipment needs to be added; the computation is small, simple, and fast, and binocular scene matching measures the UAV's absolute height (its height above the ground) with an accuracy 5 to 10 times higher than that of ultrasound.
From the principles of vision, with the baseline distance fixed, the parallax decreases as the distance between the observed object and the observer (the first and second imaging devices) increases. The applicable height of the height measurement in the embodiments of the present application is therefore in the range of about 30 meters. When the UAV flies above 30 meters, the flight control system automatically switches to barometric height measurement combined with GPS height measurement to determine the relative height and the height above the ground.
Embodiment 2
Referring to FIG. 2 and FIG. 4 together, the present application also relates to a distance measurement method for a UAV, implemented by a computer program run by the altimetry unit 30; the computer program includes code implementing the functions of the height calculation module 36, the matching module 32, the same-name point module 33, and the parallax module 34 shown in FIG. 4.
The distance measurement method for the UAV includes the following steps:
Distance measurement by the UAV through the vision system may be triggered when the vision system is turned on, or by a request command sent by the ground remote controller 80. First, the positions of the first imaging device and the second imaging device may be rectified. Taking the downward-looking vision system as an example: the mounting angles of the first and second imaging devices relative to the carrier are acquired, and when both mounting angles are 90 degrees, the shooting positions of the two imaging devices are adjusted so that the optical axes of both devices are perpendicular to the ground;
The first imaging device and the second imaging device are adjusted to the height measurement orientation: after the altimetry unit 30 receives the start instruction, it controls the first and second imaging devices to be adjusted on the gimbal to the height measurement orientation, for example with both devices oriented straight down;
Step 101: acquiring a first image captured at a moment by the first imaging device on the UAV, and a second image captured at the same moment by the second imaging device on the UAV;
Illustratively, the altimetry unit 30 can control the first and second imaging devices to capture simultaneously, so that it acquires the first image captured by the first imaging device on the UAV and the second image captured at the same moment by the second imaging device.
Alternatively, the first and second imaging devices can capture at the same moment based on the same clock crystal or clock control unit, or based on separately configured but synchronized time units or clock crystals, yielding a group of simultaneously captured images, namely the first image captured by the first imaging device and the second image captured by the second imaging device.
The altimetry unit 30 can acquire this group of simultaneously captured images for further processing.
Step 102: determining a first pixel block in the first image and a second pixel block in the second image that matches the first pixel block, where each of the first pixel block and the second pixel block includes at least 2 pixels;
Illustratively, the first pixel block includes at least 2 pixels and the second pixel block also includes at least 2 pixels; the number of pixels in the first pixel block may be the same as or different from the number of pixels in the second pixel block. The first and second pixel blocks can be understood as the matched pixel block pair described above; alternatively, they can be understood as the partially overlapping scene regions of the two images described in the above embodiments.
The first pixel block in the first image can be matched with the second pixel block in the second image in any of the following ways:
Method 1: determining the first pixel block in the first image.
For example, the pixels within a set area of the first image may be determined to constitute the first pixel block; alternatively, the first and second images are superimposed and compared to determine their scene overlap region, which may include an image of the target object, and the pixels within a certain area of the scene overlap region are determined to constitute the first pixel block. The region of the first pixel block in the first image may be used as a reference region to determine the overlap region in the second image related to that reference region.
At least 1 pixel block in the second image is determined.
For example, the second image may be divided into multiple pixel blocks; the blocks may share pixels, i.e., the regions of the blocks overlap, or the blocks may be mutually independent, i.e., contain no shared pixels, with no overlap between their regions. Alternatively, based on the reference region of the first pixel block, one or more regions of the second image related to that reference region are determined, for example at least one overlap region, each containing one pixel block, and the pixel block contained in one of these overlap regions is determined to be the second pixel block.
The first block feature information of the first pixel block is determined, together with the block feature information of each of the at least one pixel block in the second image.
Here, block feature information refers to features of the pixel block as a whole, as distinguished from point feature information, which indicates the features of a single pixel. Block feature information can characterize the block's overall state, grey level, size, target features, and so on; it may be represented by a block feature vector, which may be multi-dimensional, without limitation here.
Compared with matching point feature information, block feature information describes the target's features more accurately and can thus improve the accuracy of the distance calculation.
Specifically, the first block feature information of the first pixel block is matched against the block feature information of each of the at least one pixel block, thereby obtaining from the at least one pixel block a second pixel block that successfully matches the first pixel block, i.e., the pixel block whose block feature information matches the first block feature information is taken as the second pixel block. For example, the pixel block whose block feature vector has the highest similarity to the first block feature vector is determined from the at least one pixel block to be the second pixel block.
Method 2: determining the first pixel block in the first image; determining at least one pixel block in the second image (these steps may be implemented as in Method 1); performing pixel matching between the first pixel block and the at least one pixel block; and determining the second pixel block among the at least one pixel block according to the matching result.
Specifically, the pixel matching above means matching the pixels of the first pixel block against those of one pixel block of the second image row by row or column by column. The relative position of each pair of pixels within its block is the same: if a pixel is at row x, column y of the first pixel block, the pixel matched against it in the second image is likewise at row x, column y of the second pixel block. Each pixel pair can further be matched according to point feature information; the point feature information of each pixel may be represented by a point feature vector, which may be multi-dimensional, for example a 128-dimensional feature vector.
After each pixel pair is matched, a result is obtained from the matching degree of the point feature vectors, e.g., the pair matched successfully or the pair failed to match. After the first pixel block has been matched against all pixel pairs of a candidate block, the matching degree can be determined as the proportion of successfully matched pairs among all matching results: the higher the proportion, the higher the matching degree.
In this way, the first pixel block is matched in turn against each of the at least one pixel block determined in the second image, and the second pixel block matching the first pixel block can be determined from the matching results; for example, the pixel block with the highest matching degree is taken as the second pixel block.
Method 3: determining a first pixel in the first pixel block of the first image; determining at least one pixel block in the second image; matching the first pixel against the pixels included in each of the at least one pixel block; and taking, as the second pixel block, the pixel block of the at least one pixel block that includes a pixel matching the first pixel.
Specifically, the first pixel in the first pixel block is determined, where there may be one or more first pixels. That is, in this method it is not necessary to match all the pixels of the pixel block, which further improves the matching efficiency.
The first pixel may be the center point of the first pixel block, i.e., the same-name point of the above embodiments; alternatively, it may also include other pixels of the first pixel block, without limitation here.
The first pixel can then be matched against the pixels of the blocks in the second image based on point feature information, where the position of the candidate pixel relative to the second image is the same as the position of the first pixel relative to the first image.
Further, the second pixel block matching the first pixel block can be determined from the matching results. If the first pixel is a single pixel and a pixel successfully matching it exists, the pixel block including that pixel is the second pixel block. If the first pixel includes multiple pixels, the matching degree between each pixel block and the first pixel block is determined, and the block with the highest matching degree is determined to be the second pixel block.
It should be noted that the order of execution of the above steps may be changed.
Step 103: determining the distance between the UAV and the target object according to the disparity value of the first pixel block and the second pixel block.
In one implementation, the disparity value of the first pixel block and the second pixel block can be determined from the first position information of the first pixel block and the second position information of the second pixel block. Specifically, the position information of some pixel in the first pixel block and of the matching pixel in the second pixel block can be determined; for example, the position information of the center pixel of the first pixel block and of the center pixel of the second pixel block may be used.
Of course, other parameters can also be acquired, such as the mounting angles of the imaging devices, the distance between the lens optical centers, and the distances from the optical center of each imaging device to the UAV, to calculate the distance between the UAV and the target object, without limitation here.
Further, if the first image includes at least 2 first pixel blocks, the second pixel blocks matching each of them can be obtained as above, and at least 2 disparity values can be calculated; at least 2 distance values can then be determined from these disparity values, and the distance between the UAV and the target object determined from the at least 2 distance values.
In one implementation, the average of the determined distance values can be computed and taken as the distance between the UAV and the target object, i.e., the distance data calculated by the vision system.
下面举例说明,无人机应用上述方法测高。
在其中一实施例中,测算出无人机当前飞行的准确高度。
在确定同名点时,该匹配第一图像、第二图像以确定同名点包括:
确定该第一图像、第二图像的重叠区域,以第一图像中的重叠区域内的设定区域为基准区域;
根据该基准区域,在第二图像的重叠区域内进行像素匹配,得到响应值最大的响应区域,该响应区域的中心点以及基准区域的中心点为同名点。
其中,该设定区域为多个时,则根据多个同名点测算出多个高度值后取平均值作为无人机的高度。
在确定视差值时,确定该同名点在第一图像中的坐标值以及确定该同名点在第二图像中的坐标值,该同名点在第一图像中的坐标值与该同名点在第二图像中的坐标值之差为视差值。
针对该预存的安装参数,亦即基线长度,在获取该第一成像装置和第二成像装置的安装参数之前,还包括安装该第一成像装置和第二成像装置,使得该第一成像装置的镜头光心和第二成像装置的镜头光心之间的间距为预设的安装参数,保存该安装参数。
为了进一步提高测高精度，该第一成像装置和第二成像装置可以采用亚像素图像传感器，以提高该同名点的坐标精度。
在另一实施例中,调整无人机姿态以使第一成像装置52和第二成像装置54处于相同测高方位。
该测高方位为该第一成像装置和第二成像装置的正下视方向。获取该第一成像装置和第二成像装置分别相对于承载体的安装角度，当第一成像装置和第二成像装置分别相对于承载体的安装角度不为90度时，调整该承载体的姿态，以使第一成像装置和第二成像装置分别在光轴垂直于地面时进行图像拍摄。
或者，获取该第一成像装置和第二成像装置分别相对于承载体的安装角度；根据该安装角度调整第一成像装置和第二成像装置的光轴方向。该测高方位为侧下视方向时，该第一成像装置和第二成像装置之间预设安装角度。
由视觉原理可知，当被观测物体与观测者（第一成像装置和第二成像装置）的距离增加时，由视差产生的位移也成比例地减少。因此，本申请实施例的测高应用高度在30米左右范围。无人机超出30米高度飞行时，飞控系统会自动切换至气压计高度测量结合GPS高度测量来确定相对高度和距地高度。由于行业无人机地形跟随等技术所需要测量的高度是距地高度，而GPS和气压计测量的是绝对高度，因此需要减去相应的地面高度之后，才能得到此时无人机的距地高度。
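上述按30米高度切换测高数据源的逻辑可示意如下（其中气压计高度与GPS高度取简单平均作为融合方式仅为本文假设，实际飞控系统的融合方式可能不同）：

```python
def height_above_ground(stereo_h, baro_h, gps_h, ground_elev, max_stereo=30.0):
    # 30米以内采用双目视觉测得的距地高度；
    # 超过30米（或双目测高不可用）时，改用气压计结合GPS得到的绝对高度
    # 减去相应的地面高度，得到距地高度
    if stereo_h is not None and stereo_h <= max_stereo:
        return stereo_h
    absolute_h = (baro_h + gps_h) / 2.0  # 简单平均仅为融合方式的示意性假设
    return absolute_h - ground_elev
```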
本实施例的无人机高度测量方法，采用双目景象匹配图像处理技术，实现对无人机距地高度的测量，并且测量精度比超声波高5~10倍。
本实施例的无人机以及无人机高度测量方法无需在无人机上增加其它高度测量硬件，只需使用现有的机载双摄像机和机载处理芯片，不需要专门增加其它设备。与超声波测高相比，不需要增加超声波设备；高度测量的计算量小、简单快速，可实时反馈无人机飞行高度，快捷、准确地测量无人机的距地高度。本实施例的图像匹配同名点确认和视差值计算稳定可靠，对场景要求低。并且，本实施例可以采用亚像素图像处理方案，测量精度可以进一步提高三倍；作用距离相对较宽，为30米高度范围。
实施例3
图5是本申请实施例提供的一种视觉系统的结构示意图,如图5所示,该视觉系统600包括:
至少两个成像装置,图5中以两个为例,分别为第一成像装置610,第二成像装置620;
与至少两个成像装置分别连接的处理器630;
当然,还可以包括存储器640等。
其中,处理器630可以由至少一个视觉处理单元(Visual Processing Unit,VPU)实现;或者由其他处理单元实现,在此不予限定。
处理器630、存储器640可以通过总线或者其他方式连接,图5中以通过总线连接为例。或者,存储器640集成在处理器630中。
存储器640作为一种非易失性计算机可读存储介质,可用于存储非易失性软件程序、非易失性计算机可执行程序以及模块,如本申请实施例中的无人机距离测量方法对应的程序指令/模块(例如,附图4所示高度测算模块36、匹配模块32、同名点模块33、视差模块34等)。处理器630通过运行存储在存储器640中的非易失性软件程序、指令以及模块,从而执行视觉系统的各种功能应用以及数据处理,即实现上述方法实施例中的无人机的距离测量方法。
存储器640可以包括存储程序区和存储数据区，其中，存储程序区可存储操作系统、至少一个功能所需要的应用程序；存储数据区可存储距离数据、图像数据等。此外，存储器640可以包括高速随机存取存储器，还可以包括非易失性存储器，例如至少一个磁盘存储器件、闪存器件、或其他非易失性固态存储器件。在一些实施例中，存储器640可选包括相对于处理器630远程设置的存储器，这些远程存储器可以通过网络连接至无人机。上述网络的实例包括但不限于互联网、企业内部网、局域网、移动通信网及其组合。
所述一个或者多个模块存储在所述存储器640中,当被所述一个或者多个处理器630执行时,执行上述任意方法实施例中的无人机的距离测量方法,例如,执行以上描述的图2中的方法步骤101至步骤103,以及实现图4中高度测算模块36、匹配模块32、同名点模块33以及视差模块34的功能。
上述产品可执行本申请实施例所提供的方法,具备执行方法相应的功能模块和有益效果。未在本实施例中详尽描述的技术细节,可参见本申请实施例所提供的方法。
本申请实施例提供了一种非易失性计算机可读存储介质，所述计算机可读存储介质存储有计算机可执行指令，该计算机可执行指令被一个或多个处理器执行，例如图5中的一个处理器630，可使得上述一个或多个处理器执行上述任意方法实施例中的无人机的距离测量方法，例如，执行以上描述的图2中的方法步骤101至步骤103，实现附图4所示高度测算模块36、匹配模块32、同名点模块33以及视差模块34的功能。
以上所描述的装置实施例仅仅是示意性的，其中所述作为分离部件说明的单元可以是或者也可以不是物理上分开的，作为单元显示的部件可以是或者也可以不是物理单元，即可以位于一个地方，或者也可以分布到多个网络单元上。可以根据实际的需要选择其中的部分或者全部模块来实现本实施例方案的目的。
通过以上的实施方式的描述，本领域普通技术人员可以清楚地了解到各实施方式可借助软件加通用硬件平台的方式来实现，当然也可以通过硬件实现。本领域普通技术人员可以理解，实现上述实施例方法中的全部或部分流程可以通过计算机程序来指令相关的硬件完成，所述的程序可存储于一计算机可读取存储介质中，该程序在执行时，可包括如上述各方法的实施例的流程。其中，所述的存储介质可为磁碟、光盘、只读存储记忆体（Read-Only Memory，ROM）或随机存储记忆体（Random Access Memory，RAM）等。
最后应说明的是:以上实施例仅用以说明本申请的技术方案,而非对其限制;在本申请的思路下,以上实施例或者不同实施例中的技术特征之间也可以进行组合,步骤可以以任意顺序实现,并存在如上所述的本申请的不同方面的许多其它变化,为了简明,它们没有在细节中提供;尽管参照前述实施例对本申请进行了详细的说明,本领域的普通技术人员应当理解:其依然可以对前述各实施例所记载的技术方案进行修改,或者对其中部分技术特征进行等同替换;而这些修改或者替换,并不使相应技术方案的本质脱离本申请各实施例技术方案的范围。

Claims (28)

  1. 一种无人机的距离测量方法,其特征在于,包括:
    获取无人机上的第一成像装置在一个时刻拍摄的第一图像,以及所述无人机上的第二成像装置在所述时刻拍摄的第二图像;
    确定所述第一图像中的第一像素块,以及所述第二图像中与所述第一像素块匹配的第二像素块;其中,所述第一像素块和所述第二像素块中的任一者包括至少2个像素点;
    根据所述第一像素块与所述第二像素块的视差值,确定所述无人机与目标物之间的距离。
  2. 根据权利要求1所述的方法,其特征在于,所述确定所述第一图像中的第一像素块,以及所述第二图像中与所述第一像素块匹配的第二像素块,包括:
    确定所述第一图像中的第一像素块的第一块特征信息;
    确定所述第二图像中的至少一个像素块中每个像素块的块特征信息;将所述第一块特征信息与所述每个像素块的块特征信息匹配;
    将所述至少一个像素块中块特征信息与所述第一块特征信息匹配的像素块作为第二像素块。
  3. 根据权利要求1所述的方法,其特征在于,所述确定所述第一图像中的第一像素块,以及所述第二图像中与所述第一像素块匹配的第二像素块,包括:
    确定所述第一图像中的第一像素块;
    确定所述第二图像中的至少一个像素块;
    将所述第一像素块与所述至少一个像素块进行像素匹配;
    根据所述像素匹配的匹配结果,确定所述至少一个像素块中的第二像素块。
  4. 根据权利要求3所述的方法,其特征在于,所述根据所述像素匹配的匹配结果,确定所述至少一个像素块中的第二像素块,包括:
    将所述至少一个像素块中与所述第一像素块的像素匹配程度最高的像素块作为第二像素块。
  5. 根据权利要求1所述的方法,其特征在于,所述确定所述第一图像中的第一像素块,以及所述第二图像中与所述第一像素块匹配的第二像素块,包括:
    确定所述第一图像中的第一像素块中的第一像素点;
    确定所述第二图像中的至少一个像素块;
    将所述第一像素点与所述至少一个像素块中每个像素块包括的像素点进行匹配;
    将所述至少一个像素块中的包括与所述第一像素点匹配的像素点的像素块作为第二像素块。
  6. 根据权利要求5所述的方法,其特征在于,所述将所述第一像素点与所述至少一个像素块中每个像素块包括的像素点进行匹配,包括:
    确定所述第一像素点的点特征信息;
    确定所述每个像素块包括的像素点的点特征信息;
    将所述第一像素点的点特征信息与所述每个像素块包括的像素点的点特征信息进行匹配。
  7. 根据权利要求5或6所述的方法,其特征在于,所述第一像素点包括所述第一像素块的中心像素点。
  8. 根据权利要求1至7任一项所述的方法,其特征在于,所述方法还包括:
    确定所述第一像素块的第一位置信息,以及所述第二像素块的第二位置信息;
    根据所述第一位置信息和所述第二位置信息,确定所述第一像素块与所述第二像素块的视差值。
  9. 根据权利要求8所述的方法,其特征在于,所述确定所述第一像素块的第一位置信息,以及所述第二像素块的第二位置信息,包括:
    确定所述第一像素块中第二像素点的第一位置信息，以及所述第二像素块中第三像素点的第二位置信息，所述第二像素点与所述第三像素点匹配。
  10. 根据权利要求1-9任一项所述的方法,其特征在于,所述根据所述第一像素块与所述第二像素块的视差值,确定所述无人机与目标物之间的距离,包括:
    根据所述第一成像装置和所述第二成像装置的安装参数,以及所述第一像素块与所述第二像素块的视差值,确定所述无人机与目标物之间的距离。
  11. 根据权利要求10所述的方法,其特征在于,所述第一成像装置和第二成像装置的安装参数包括以下至少一个:
    所述第一成像装置的镜头光心和第二成像装置的镜头光心之间的间距,第一成像装置的光心至无人机本体的距离,第二成像装置的光心至无人机本体的距离。
  12. 根据权利要求1-11任一项所述的方法,其特征在于,若所述第一图像中包括至少2个第一像素块,所述根据所述第一像素块与所述第二像素块的视差值,确定所述无人机与目标物之间的距离,包括:
    确定至少2个视差值;
    根据所述至少2个视差值,确定至少2个距离值;
    根据所述至少2个距离值,确定所述无人机与所述目标物之间的距离。
  13. 根据权利要求12所述的方法,其特征在于,所述根据所述至少2个距离值,确定所述无人机与所述目标物之间的距离,包括:
    计算所述至少2个距离值的平均值,并将所述平均值作为所述无人机与所述目标物之间的距离。
  14. 一种无人机,包括:
    第一成像装置以及第二成像装置;
    以及与所述第一成像装置和所述第二成像装置分别连接的处理器;
    其中,所述处理器用于:
    获取无人机上的第一成像装置在一个时刻拍摄的第一图像,以及所述无人机上的第二成像装置在所述时刻拍摄的第二图像;
    确定所述第一图像中的第一像素块，以及所述第二图像中与所述第一像素块匹配的第二像素块；其中，所述第一像素块和所述第二像素块中的任一者包括至少2个像素点；
    根据所述第一像素块与所述第二像素块的视差值,确定所述无人机与目标物之间的距离。
  15. 根据权利要求14所述的无人机,其特征在于,在所述确定所述第一图像中的第一像素块,以及所述第二图像中与所述第一像素块匹配的第二像素块方面,所述处理器具体用于:
    确定所述第一图像中的第一像素块的第一块特征信息;
    确定所述第二图像中的至少一个像素块中每个像素块的块特征信息;将所述第一块特征信息与所述每个像素块的块特征信息匹配;
    将所述至少一个像素块中块特征信息与所述第一块特征信息匹配的像素块作为第二像素块。
  16. 根据权利要求14所述的无人机，其特征在于，在所述确定所述第一图像中的第一像素块，以及所述第二图像中与所述第一像素块匹配的第二像素块方面，所述处理器具体用于：
    确定所述第一图像中的第一像素块;
    确定所述第二图像中的至少一个像素块;
    将所述第一像素块与所述至少一个像素块进行像素匹配;
    根据所述像素匹配的匹配结果,确定所述至少一个像素块中的第二像素块。
  17. 根据权利要求16所述的无人机,其特征在于,在所述根据所述像素匹配的匹配结果,确定所述至少一个像素块中的第二像素块方面,所述处理器具体用于:
    将所述至少一个像素块中与所述第一像素块的像素匹配程度最高的像素块作为第二像素块。
  18. 根据权利要求14所述的无人机,其特征在于,在所述确定所述第一图像中的第一像素块,以及所述第二图像中与所述第一像素块匹配的第二像素块方面,所述处理器具体用于:
    确定所述第一图像中的第一像素块中的第一像素点;
    确定所述第二图像中的至少一个像素块;
    将所述第一像素点与所述至少一个像素块中每个像素块包括的像素点进行匹配;
    将所述至少一个像素块中的包括与所述第一像素点匹配的像素点的像素块作为第二像素块。
  19. 根据权利要求18所述的无人机,其特征在于,在所述将所述第一像素点与所述至少一个像素块中每个像素块包括的像素点进行匹配方面,所述处理器具体用于:
    确定所述第一像素点的点特征信息;
    确定所述每个像素块包括的像素点的点特征信息;
    将所述第一像素点的点特征信息与所述每个像素块包括的像素点的点特征信息进行匹配。
  20. 根据权利要求18或19所述的无人机,其特征在于,所述第一像素点包括所述第一像素块的中心像素点。
  21. 根据权利要求14至20任一项所述的无人机,其特征在于,所述处理器还用于:
    确定所述第一像素块的第一位置信息,以及所述第二像素块的第二位置信息;
    根据所述第一位置信息和所述第二位置信息,确定所述第一像素块与所述第二像素块的视差值。
  22. 根据权利要求21所述的无人机,其特征在于,在所述确定所述第一像素块的第一位置信息,以及所述第二像素块的第二位置信息方面,所述处理器具体用于:
    确定所述第一像素块中第二像素点的第一位置信息，以及所述第二像素块中第三像素点的第二位置信息，所述第二像素点与所述第三像素点匹配。
  23. 根据权利要求14-22任一项所述的无人机，其特征在于，在所述根据所述第一像素块与所述第二像素块的视差值，确定所述无人机与目标物之间的距离方面，所述处理器具体用于：
    根据所述第一成像装置和所述第二成像装置的安装参数,以及所述第一像素块与所述第二像素块的视差值,确定所述无人机与目标物之间的距离。
  24. 根据权利要求23所述的无人机,其特征在于,所述第一成像装置和第二成像装置的安装参数包括以下至少一个:
    所述第一成像装置的镜头光心和第二成像装置的镜头光心之间的间距,第一成像装置的光心至无人机本体的距离,第二成像装置的光心至无人机本体的距离。
  25. 根据权利要求14-24任一项所述的无人机,其特征在于,若所述第一图像中包括至少2个第一像素块,在所述根据所述第一像素块与所述第二像素块的视差值,确定所述无人机与目标物之间的距离方面,所述处理器具体用于:
    确定至少2个视差值;
    根据所述至少2个视差值,确定至少2个距离值;
    根据所述至少2个距离值,确定所述无人机与所述目标物之间的距离。
  26. 根据权利要求25所述的无人机，其特征在于，在所述根据所述至少2个距离值，确定所述无人机与所述目标物之间的距离方面，所述处理器具体用于：
    计算所述至少2个距离值的平均值,并将所述平均值作为所述无人机与所述目标物之间的距离。
  27. 一种非易失性计算机可读存储介质,其中,所述计算机可读存储介质存储有计算机可执行指令,所述计算机可执行指令用于使无人机执行权利要求1-13任一项所述的方法。
  28. 一种计算机程序产品,其中,所述计算机程序产品包括存储在非易失性计算机可读存储介质上的计算机程序,所述计算机程序包括程序指令,当所述程序指令被无人机执行时,实现如权利要求1-13任一项所述的方法。
PCT/CN2018/082653 2017-05-19 2018-04-11 无人机的距离测量方法以及无人机 WO2018210078A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US16/615,082 US20200191556A1 (en) 2017-05-19 2018-04-11 Distance measurement method by an unmanned aerial vehicle (UAV) and UAV

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201710356535.6A CN108965651A (zh) 2017-05-19 2017-05-19 一种无人机高度测量方法以及无人机
CN201710356535.6 2017-05-19

Publications (1)

Publication Number Publication Date
WO2018210078A1 true WO2018210078A1 (zh) 2018-11-22

Family

ID=64273392

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/082653 WO2018210078A1 (zh) 2017-05-19 2018-04-11 无人机的距离测量方法以及无人机

Country Status (3)

Country Link
US (1) US20200191556A1 (zh)
CN (1) CN108965651A (zh)
WO (1) WO2018210078A1 (zh)

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11099539B2 (en) * 2018-05-17 2021-08-24 Ut-Battelle, Llc Multi-sensor agent devices
CN109709982A (zh) * 2018-12-29 2019-05-03 东南大学 一种无人机定高控制系统及方法
US11055864B2 (en) * 2019-05-24 2021-07-06 Nanjing Polagis Technology Co. Ltd Method and apparatus for determining a geographic position of a target object on a street view map
KR102316960B1 (ko) * 2019-11-28 2021-10-22 광운대학교 산학협력단 무인 항공기 영상 내 실시간 객체 검출 방법 및 장치
CN114846295A (zh) * 2020-12-17 2022-08-02 深圳市大疆创新科技有限公司 一种可移动平台的控制方法、装置及可移动平台
CN112686938B (zh) * 2020-12-29 2024-03-15 重庆大学 基于双目图像测距的输电线路净距计算与安全告警方法
CN113014904A (zh) * 2021-02-24 2021-06-22 苏州臻迪智能科技有限公司 一种无人机巡检图像处理的方法、装置、系统和存储介质
KR20220132937A (ko) * 2021-03-24 2022-10-04 한국전자통신연구원 Gnss 기반의 고도 정밀 측위 방법 및 시스템
CN113514013B (zh) * 2021-04-20 2023-02-24 广西电网有限责任公司南宁供电局 弧垂测量方法、装置、计算机设备和存储介质
CN114488328B (zh) * 2021-12-27 2023-08-15 北京自动化控制设备研究所 分布式地质磁异常辨识方法及系统
CN115839962B (zh) * 2023-02-23 2023-05-16 国网山西省电力公司电力科学研究院 一种基于无人机控制的压接金具检测系统及方法
CN117346741A (zh) * 2023-09-04 2024-01-05 中南大学 一种基于无人机的高铁接触网施工接口测量方法

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07234111A (ja) * 1994-02-23 1995-09-05 Matsushita Electric Works Ltd 三次元物体の計測方法
CN101680756A (zh) * 2008-02-12 2010-03-24 松下电器产业株式会社 复眼摄像装置、测距装置、视差算出方法以及测距方法
CN102713509A (zh) * 2010-09-14 2012-10-03 株式会社理光 立体摄影装置、校正方法和程序
CN105627932A (zh) * 2015-12-31 2016-06-01 零度智控(北京)智能科技有限公司 一种基于双目视觉的测距方法及装置
CN105973140A (zh) * 2016-04-29 2016-09-28 维沃移动通信有限公司 一种测量物体空间参数的方法及移动终端
CN106030243A (zh) * 2014-02-25 2016-10-12 株式会社理光 距离测量装置和视差计算系统
CN106153008A (zh) * 2016-06-17 2016-11-23 北京理工大学 一种基于视觉的旋翼无人机三维目标定位方法

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103292710B (zh) * 2013-05-27 2016-01-06 华南理工大学 一种应用双目视觉视差测距原理的距离测量方法
CN104299228B (zh) * 2014-09-23 2017-08-25 中国人民解放军信息工程大学 一种基于精确点位预测模型的遥感影像密集匹配方法
CN104463969B (zh) * 2014-12-09 2017-09-26 广西界围信息科技有限公司 一种对航空倾斜拍摄的地理照片的模型的建立方法
CN105043350A (zh) * 2015-06-25 2015-11-11 闽江学院 一种双目视觉测量方法
CN105225482B (zh) * 2015-09-02 2017-08-11 上海大学 基于双目立体视觉的车辆检测系统和方法
CN105282438B (zh) * 2015-09-18 2018-06-22 贵州省第二测绘院 一种辅助地理国情解译与核查的全景照片采集方法
CN105424006B (zh) * 2015-11-02 2017-11-24 国网山东省电力公司电力科学研究院 基于双目视觉的无人机悬停精度测量方法
CN105654732A (zh) * 2016-03-03 2016-06-08 上海图甲信息科技有限公司 一种基于深度图像的道路监控系统及方法
CN106643518A (zh) * 2016-11-09 2017-05-10 乐视控股(北京)有限公司 一种利用双目摄像装置测量距离和大小的方法和装置
CN106595500B (zh) * 2016-11-21 2019-06-14 云南电网有限责任公司电力科学研究院 基于无人机双目视觉的输电线路覆冰厚度测量方法

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111986248A (zh) * 2020-08-18 2020-11-24 东软睿驰汽车技术(沈阳)有限公司 多目视觉感知方法、装置及自动驾驶汽车
CN111986248B (zh) * 2020-08-18 2024-02-09 东软睿驰汽车技术(沈阳)有限公司 多目视觉感知方法、装置及自动驾驶汽车
CN113772081A (zh) * 2021-09-28 2021-12-10 上海莘汭驱动技术有限公司 一种高性能无人机舵机
CN113772081B (zh) * 2021-09-28 2024-05-14 上海莘汭驱动技术有限公司 一种高性能无人机舵机

Also Published As

Publication number Publication date
CN108965651A (zh) 2018-12-07
US20200191556A1 (en) 2020-06-18

Similar Documents

Publication Publication Date Title
WO2018210078A1 (zh) 无人机的距离测量方法以及无人机
CN108323190B (zh) 一种避障方法、装置和无人机
US11073389B2 (en) Hover control
CN111448476B (zh) 在无人飞行器与地面载具之间共享绘图数据的技术
CN106529495B (zh) 一种飞行器的障碍物检测方法和装置
US10447912B2 (en) Systems, methods, and devices for setting camera parameters
US11057604B2 (en) Image processing method and device
JP5775632B2 (ja) 飛行体の飛行制御システム
CN105182992A (zh) 无人机的控制方法、装置
CN110225249B (zh) 一种对焦方法、装置、航拍相机以及无人飞行器
US20240153122A1 (en) Binocular vision-based environment sensing method and apparatus, and unmanned aerial vehicle
CN108733064A (zh) 一种无人机的视觉定位避障系统及其方法
CN110139038B (zh) 一种自主环绕拍摄方法、装置以及无人机
WO2021259253A1 (zh) 一种轨迹跟踪方法及无人机
WO2020048365A1 (zh) 飞行器的飞行控制方法、装置、终端设备及飞行控制系统
US20230127974A1 (en) Trajectory tracking method and unmanned aerial vehicle
Moore et al. UAV altitude and attitude stabilisation using a coaxial stereo vision system
WO2020062024A1 (zh) 基于无人机的测距方法、装置及无人机
WO2020062356A1 (zh) 控制方法、控制装置、无人飞行器的控制终端
WO2020019175A1 (zh) 图像处理方法和设备、摄像装置以及无人机
CN114973037B (zh) 一种无人机智能检测与同步定位多目标的方法
CN114353667A (zh) 基于ar与无人机单目视觉的地面目标测量方法及其应用
KR20230115042A (ko) 충돌회피 드론 및 제어방법
CN118463976A (zh) 卫星定位系统拒止条件下的无人飞行器悬停侦察方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18801911

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18801911

Country of ref document: EP

Kind code of ref document: A1